The flashcards below were created by a user on FreezingBlue Flashcards.
Access control refers to the control over access to system resources after a user's account credentials and identity have been authenticated and access to the system has been granted. For example, a particular user, or group of users, might be permitted access only to certain files after logging into a system, while being denied access to all other resources.
- 1. Mandatory access control (MAC)
- 2. Discretionary access control (DAC)
- 3. Role-based access control
- 4. Rule-based access control
MAC takes a hierarchical approach to controlling access to resources. Under a MAC-enforced environment, access to all resource objects (such as data files) is controlled by settings defined by the system administrator. All access to resource objects is therefore strictly controlled by the operating system based on administrator-configured settings. It is not possible under MAC enforcement for users to change the access control of a resource.
Unlike Mandatory Access Control (MAC), where access to system resources is controlled by the operating system (under the control of a system administrator), Discretionary Access Control (DAC) allows each user to control access to their own data. DAC is typically the default access control mechanism for most desktop operating systems. Instead of a security label as in MAC, each resource object on a DAC-based system has an Access Control List (ACL) associated with it. An ACL contains a list of the users and groups to which the owner has permitted access, together with the level of access for each user or group. For example, User A may provide read-only access on one of her files to User B, read and write access on the same file to User C, and full control to any user belonging to Group 1.

It is important to note that under DAC a user can only set access permissions for resources which they already own. A hypothetical User A cannot, therefore, change the access control for a file that is owned by User B. User A can, however, set access permissions on a file that she owns. Under some operating systems it is also possible for the system or network administrator to dictate which permissions users are allowed to set in the ACLs of their resources. Discretionary Access Control provides a much more flexible environment than Mandatory Access Control, but it also increases the risk that data will be made accessible to users who should not necessarily be given access.
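The ACL behavior described above can be sketched as a toy Python model. The resource, user, and group names here are invented for illustration and do not correspond to any real operating system API:

```python
# Toy model of a DAC access check; names are made up.
acl = {
    "report.txt": {
        "user_b": {"read"},                  # read-only for User B
        "user_c": {"read", "write"},         # read/write for User C
        "@group1": {"read", "write", "own"}, # full control for Group 1
    }
}
groups = {"user_d": {"@group1"}}  # User D belongs to Group 1

def allowed(user, resource, perm):
    entries = acl.get(resource, {})
    if perm in entries.get(user, set()):
        return True
    # fall back to any group the user belongs to
    return any(perm in entries.get(g, set()) for g in groups.get(user, ()))

print(allowed("user_b", "report.txt", "read"))   # True
print(allowed("user_b", "report.txt", "write"))  # False
print(allowed("user_d", "report.txt", "write"))  # True, via @group1
```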
Role Based Access Control (RBAC), also known as Non discretionary Access Control, takes more of a real world approach to structuring access control. Access under RBAC is based on a user's job function within the organization to which the computer system belongs.
Rule-Based Access Control confusingly shares the same four-letter abbreviation (RBAC) as Role-Based Access Control. Under rule-based access control, access is granted or denied according to administrator-defined rules (for example, the ACL rules on a firewall or router) rather than by job function.
Black box testing
White box testing
Black Box Testing is a software testing method in which the internal structure/ design/ implementation of the item being tested is NOT known to the tester
White Box Testing is a software testing method in which the internal structure/ design/ implementation of the item being tested is known to the tester.
A vulnerability scan (or even a vulnerability assessment) looks for known vulnerabilities in your systems and reports potential exposures.
A penetration test is designed to actually exploit weaknesses in the architecture of your systems.
In short, a technician runs a vulnerability scan, while a hacker performs a penetration test.
IPS (Intrusion Prevention Systems)
NIPS and HIPS
IDS (Intrusion Detection Systems)
NIDS and HIDS
1. NIPS solutions evaluate traffic before it's allowed into a network or subnet. HIPS solutions evaluate packets before they're allowed to enter a computer.
2. Host Intrusion Detection Systems and Network Intrusion Detection Systems (HIDS and NIDS) are computer network security systems used to detect viruses, spyware, malware, and other malicious activity. The difference is that NIDS are installed at certain intersection points of the network, such as near servers and routers, while HIDS are installed on the individual host machines they protect.
RTO and RPO
RTO, or Recovery Time Objective, is the target time you set for the recovery of your IT and business activities after a disaster has struck. The goal here is to calculate how quickly you need to recover, which can then dictate the type of preparations you need to implement and the overall budget you should assign to business continuity.
RPO, or Recovery Point Objective, is focused on data and your company’s loss tolerance in relation to your data. RPO is determined by looking at the time between data backups and the amount of data that could be lost in between backups.
MTBF means Mean Time Between Failures. MTBF is the average time between failures within a process; in other words, it reflects the frequency of failures.
MTTR means Mean Time To Repair: the average time until a failure has been repaired.
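Together, MTBF and MTTR give a simple availability estimate: availability = MTBF / (MTBF + MTTR). A quick sketch with illustrative figures (the hours below are invented, not from the text):

```python
# Availability from MTBF and MTTR; figures are illustrative, in hours.
mtbf = 1000.0  # mean time between failures
mttr = 2.0     # mean time to repair
availability = mtbf / (mtbf + mttr)
print(f"{availability:.4%}")  # 99.8004%
```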
Asymmetric Encryption is a form of Encryption where keys come in pairs. What one key encrypts, only the other can decrypt.
a message encrypted with the public key can be decrypted with the private key
Elliptic Curve Cryptography (ECC) was discovered in 1985 by Victor Miller (IBM) and Neil Koblitz (University of Washington) as an alternative mechanism for implementing public-key cryptography. Public-key algorithms create a mechanism for sharing keys among large numbers of participants or entities in a complex information system. Unlike other popular algorithms such as RSA, ECC is based on the elliptic curve discrete logarithm problem, which is much more difficult to solve at equivalent key lengths.
Web Application Firewall
- Web Application Firewalls (WAF) protect websites and web-based applications (technologies such as ASP, ASPX, PHP, Perl, JSP, and other Common Gateway Interface (CGI) applications) from online attacks
- is an appliance, server plugin, or filter that applies a set of rules to an HTTP conversation. Generally, these rules cover common attacks such as cross-site scripting (XSS) and SQL injection. By customizing the rules to your application, many attacks can be identified and blocked
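The rule-matching idea can be sketched with a couple of toy regex rules. These two patterns are deliberately simplistic illustrations of XSS and SQL injection signatures; real WAF rule sets (for example, the OWASP Core Rule Set) are far larger and more nuanced:

```python
import re

# Minimal sketch of WAF-style rule matching; the rules are made-up
# illustrations, not production signatures.
RULES = {
    "xss": re.compile(r"<\s*script", re.IGNORECASE),
    "sqli": re.compile(r"('|\b)(or|union|select)\b.*(=|\bfrom\b)", re.IGNORECASE),
}

def inspect(param_value):
    # Return the names of every rule that the input triggers.
    return [name for name, rx in RULES.items() if rx.search(param_value)]

print(inspect("<script>alert(1)</script>"))  # ['xss']
print(inspect("' OR 1=1 --"))                # ['sqli']
print(inspect("hello world"))                # []
```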
SLE and ALE
The SLE is the cost of any single loss. The ARO indicates how many times you can expect the loss in a year. The ALE is calculated as SLE x ARO.
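A worked example of the formula above, with illustrative figures (the dollar amount and rate are invented):

```python
# ALE = SLE x ARO, with made-up example numbers.
sle = 2500.0  # single loss expectancy: cost of one occurrence, in dollars
aro = 4       # annualized rate of occurrence: expected incidents per year
ale = sle * aro
print(ale)  # 10000.0
```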
A computer virus is a malware program that, when executed, replicates by inserting copies of itself (possibly modified) into other computer programs, data files, or the boot sector of the hard drive; when this replication succeeds, the affected areas are then said to be "infected". Viruses often perform some type of harmful activity on infected hosts, such as stealing hard disk space or CPU time, accessing private information, corrupting data, displaying political or humorous messages on the user's screen, spamming their contacts, logging their keystrokes, or even rendering the computer useless. However, not all viruses carry a destructive payload or attempt to hide themselves—the defining characteristic of viruses is that they are self-replicating computer programs which install themselves without user consent.
denial of service
zero day attack
In computing, a denial-of-service (DoS) or distributed denial-of-service (DDoS) attack is an attempt to make a machine or network resource unavailable to its intended users.
A zero-day (also known as zero-hour or 0-day) is a computer threat that exposes undisclosed or unpatched computer application vulnerabilities. Zero-day attacks can be considered extremely dangerous because they take advantage of computer security holes for which no solution is currently available
Cross-site scripting (XSS) is a type of computer security vulnerability typically found in Web applications. XSS enables attackers to inject client-side script into Web pages viewed by other users. A cross-site scripting vulnerability may be used by attackers to bypass access controls such as the same-origin policy. Cross-site scripting carried out on websites accounted for roughly 84% of all security vulnerabilities documented by Symantec as of 2007. Their effect may range from a petty nuisance to a significant security risk, depending on the sensitivity of the data handled by the vulnerable site and the nature of any security mitigation implemented by the site's owner.
Christmas tree packets can be used as a method of divining the underlying nature of a TCP/IP stack by sending the packets and then awaiting and analyzing the responses. When used as part of scanning a system, the TCP header of a Christmas tree packet has the SYN, FIN, URG, and PSH flags set. Many operating systems implement their compliance with the Internet Protocol standard (RFC 791) in varying or incomplete ways. By observing how a host responds to an odd packet such as a Christmas tree packet, assumptions can be made regarding the host's operating system. Versions of Microsoft Windows, BSD/OS, HP-UX, Cisco IOS, MVS, and IRIX display behaviors that differ from the RFC standard when queried with said packets.

A large number of Christmas tree packets can also be used to conduct a DoS attack, by exploiting the fact that Christmas tree packets require much more processing by routers and end-hosts than "usual" packets do. Christmas tree packets can be easily detected by intrusion-detection systems or more advanced firewalls. From a network security point of view, Christmas tree packets are always suspicious and indicate a high probability of network reconnaissance activity.
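The flag combination described above is just a bitwise OR of TCP flag bits. A quick sketch using the standard RFC 793 bit positions in the flags byte:

```python
# TCP flag bit values within the flags byte (RFC 793 positions).
FIN, SYN, RST, PSH, ACK, URG = 0x01, 0x02, 0x04, 0x08, 0x10, 0x20

# A Christmas tree scan packet, per the description above,
# sets SYN, FIN, URG, and PSH simultaneously.
xmas = SYN | FIN | URG | PSH
print(hex(xmas))  # 0x2b
```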
Elliptic Curve Diffie-Hellman
is an anonymous key agreement protocol that allows two parties, each having an elliptic curve public–private key pair, to establish a shared secret over an insecure channel. This shared secret may be directly used as a key, or to derive another key which can then be used to encrypt subsequent communications using a symmetric key cipher. It is a variant of the Diffie–Hellman protocol using elliptic curve cryptography.
is a specific method of securely exchanging cryptographic keys over a public channel and was one of the first public-key protocols as originally conceptualized by Ralph Merkle
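The key-agreement math can be illustrated with classic (non-elliptic-curve) Diffie-Hellman using tiny, insecure numbers; real deployments use very large primes or elliptic curves, and the values below are chosen purely to make the arithmetic visible:

```python
# Toy Diffie-Hellman exchange with deliberately tiny numbers.
p, g = 23, 5   # public modulus and generator (insecurely small)
a, b = 6, 15   # Alice's and Bob's private values

A = pow(g, a, p)  # Alice sends A over the public channel
B = pow(g, b, p)  # Bob sends B over the public channel

# Each side combines the other's public value with its own secret.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
print(shared_alice, shared_bob)  # 2 2 — both derive the same secret
```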
Memorandum of Understanding
MOUs are generally loose agreements and therefore may not have strict guidelines in place to protect sensitive data shared between the two entities
While remarkable for its simplicity and speed in software, RC4 has weaknesses that argue against its use in new systems. It is especially vulnerable when the beginning of the output keystream is not discarded, or when nonrandom or related keys are used; some ways of using RC4 can lead to very insecure protocols such as WEP.
TCP Wrapper monitors incoming packets. If an external computer or host attempts to connect, TCP Wrapper checks to see if that external entity is authorized to connect. If it is authorized, then access is permitted; if not, access is denied. The program can be tailored to suit individual user or network needs.
Temporal Key Integrity Protocol
TKIP was designed by the IEEE 802.11i task group and the Wi-Fi Alliance as an interim solution to replace WEP without requiring the replacement of legacy hardware. This was necessary because the breaking of WEP had left WiFi networks without viable link-layer security, and a solution was required for already deployed hardware. TKIP is no longer considered secure and was deprecated in the 2012 revision of the 802.11 standard.
Counter Mode Cipher Block Chaining Message Authentication Code Protocol
is an encryption protocol designed for Wireless LAN products that implement the standards of the IEEE 802.11i amendment to the original IEEE 802.11 standard. CCMP is an enhanced data cryptographic encapsulation mechanism designed for data confidentiality and based upon the Counter Mode with CBC-MAC (CCM) of the AES standard. It was created to address the vulnerabilities presented by WEP, a dated, insecure protocol.
is the practice of concealing a file, message, image, or video within another file, message, image, or video. The word steganography combines the Greek words steganos (στεγανός), meaning "covered, concealed, or protected", and graphein (γράφειν) meaning "writing".
Security Assertion Markup Language
is an XML-based, open-standard data format for exchanging authentication and authorization data between parties, in particular, between an identity provider and a service provider
TOTP VS HOTP
TOTP passwords are short-lived: each one is valid only for a given window of time (typically 30 seconds). HOTP passwords are potentially longer-lived: each one remains valid for an unknown amount of time, because it expires only when it is used and the counter advances.
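Both schemes share the same underlying computation, defined in RFC 4226: an HMAC-SHA1 over a counter, truncated to a short decimal code. TOTP simply derives the counter from the clock. A sketch using only the standard library, checked against the first RFC 4226 test vector:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the 8-byte big-endian counter, then RFC 4226
    # "dynamic truncation" down to a short decimal code.
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# First RFC 4226 test vector: ASCII secret "12345678901234567890".
print(hotp(b"12345678901234567890", 0))  # 755224

# TOTP is HOTP with the counter derived from the clock:
# counter = int(time.time()) // 30 for a 30-second window.
```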
is an older network protocol that ensures a loop-free topology for any bridged Ethernet local area network. The basic function of STP is to prevent bridge loops and the broadcast radiation that results from them. Spanning tree also allows a network design to include spare (redundant) links to provide automatic backup paths if an active link fails, without the danger of bridge loops, or the need for manual enabling/disabling of these backup links.
IPsec is a dual-mode, end-to-end security scheme that operates at Layer 3, the network layer, also known as the internet layer within the internet protocol suite. It is often used with L2TP for VPN tunneling, among other protocols.
is when an attacker attempts to flood the CAM table of a switch with many packets, each of which has a different source MAC address. The CAM table is an area in memory set aside to store MAC-address-to-physical-port translations.
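The attack can be modeled with a toy dictionary standing in for the CAM table. The capacity and port numbers are invented; the point is only that a flood of random source MACs fills the fixed-size table:

```python
import random

# Toy CAM table with a made-up fixed capacity.
CAPACITY = 1024
cam = {}  # learned MAC address -> physical port

def learn(mac, port):
    if mac in cam or len(cam) < CAPACITY:
        cam[mac] = port
        return True
    return False  # table full; many switches then flood frames out every port

rng = random.Random(0)  # seeded so the sketch is repeatable
for _ in range(5000):   # attacker blasts bogus source MACs into port 7
    mac = ":".join(f"{rng.randrange(256):02x}" for _ in range(6))
    learn(mac, 7)

print(len(cam))  # 1024: the table is full, legitimate entries get crowded out
```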
Wi-Fi Protected Access 2
is the most secure of these protocols for connecting to wireless networks; it is more secure than both WPA and WEP
Protected Extensible Authentication Protocol
Transport Layer Security
creates a TLS tunnel by acquiring a PKI certificate from a certificate authority (CA). When used with TLS as the inner authentication method it is known simply as PEAP-TLS, which is similar to EAP-TLS. FTPS is FTP over SSL.
Spear phishing is the attempt at fraudulently obtaining information from specific individuals, usually through e-mail. DNS poisoning is a compromise of a DNS server’s name cache database. Pharming is an attack that redirects a website’s traffic to another illegitimate website. A Fraggle attack contains UDP traffic sent to ports 7 and 19; it is a type of DoS attack.
SOX, or Sarbanes-Oxley, governs the disclosure of financial and accounting data. HIPAA governs the disclosure and protection of health information. GLB, or the Gramm-Leach-Bliley Act of 1999, enables commercial banks, investment banks, securities firms, and insurance companies to consolidate. Top Secret is a data classification level applied to highly sensitive government information, not a disclosure regulation.
Which of the following is the least volatile when performing incident response procedures?
The order of volatility defines any type of registers as the most volatile, and cache and RAM as slightly less volatile. On the other hand, backup tapes are less volatile than hard drives, and optical discs are less volatile as well. Those last two options make for good options if forensics data needs to be stored over the long term.
What kind of security control do computer security audits fall under?
A computer security audit is an example of a detective security control. If a security administrator found that a firewall was letting unauthorized ICMP echoes into the network, the administrator might close the port on the firewall, which is a corrective control and, for the future, a preventive control. The term protective control is not generally used in security circles, as it is a somewhat ambiguous term.
Detective controls These controls are used during an event and can find out whether malicious activity is occurring or has occurred. Examples include CCTV/video surveillance, alarms, NIDSs, and auditing.
Preventive controls These controls are employed before the event and are designed to prevent an incident. Examples include biometric systems designed to keep unauthorized persons out, NIPSs to prevent malicious activity, and RAID 1 to prevent loss of data. These are also sometimes referred to as deterrent controls.
Corrective controls These controls are used after an event. They limit the extent of damage and help the company recover from damage quickly. Tape backup, hot sites, and other fault tolerance and disaster recovery methods are also included here. These are sometimes referred to as compensating controls.
- Class D fire extinguishers are the type used for combustible metal fires, such as those involving magnesium, titanium, and lithium. They are designated with a yellow decagon.
- Class A extinguishers are used for ordinary fires that consume wood.
- Class B extinguishers are used for liquid and gas fires.
- Class C extinguishers are used for electrical fires.
Shielding is part of the TEMPEST standards. TEMPEST is a group of standards that refers to the investigations of conducted emissions from electrical and mechanical devices that may or may not compromise an organization. It is important to shield devices such as air conditioners to prevent electromagnetic interference to network devices and cabling.
System A fails open; System B fails closed.
System A fails open. System B fails closed. System A requires high availability, so it should fail open. For example, if the system were a monitoring system and a portion of it failed, the organization might want it to fail open so that other portions of the monitoring system would still be accessible. However, System B requires security, so it should fail closed. Say System B was a firewall: if it crashed, would we still want network connectivity to pass through it? Probably not, because there would be little or no protection for the network. In general, if you need high availability, the system should fail open. If you need high security, it should fail closed.
username and password
SQL injection is one of the many web attack mechanisms used by hackers to steal data from organizations. It is perhaps one of the most common application-layer attack techniques used today. It is a type of attack that takes advantage of improper coding in your web applications, allowing a hacker to inject SQL commands into, say, a login form to gain access to the data held within your database. In essence, SQL injection arises because the fields available for user input allow SQL statements to pass through and query the database directly.
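A minimal sketch of the login-form scenario, using an in-memory SQLite database. The table and credentials are invented; the contrast is between building the SQL string from user input (injectable) and using parameterized queries (not injectable):

```python
import sqlite3

# Made-up users table in a throwaway in-memory database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, password TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

name, password = "alice", "' OR '1'='1"  # classic injection payload

# Vulnerable: attacker-controlled input becomes part of the SQL text,
# turning the WHERE clause into a condition that is always true.
unsafe = db.execute(
    f"SELECT * FROM users WHERE name = '{name}' AND password = '{password}'"
).fetchall()
print(len(unsafe))  # 1: login bypassed without the real password

# Safe: placeholders keep the input as data, never as SQL.
safe = db.execute(
    "SELECT * FROM users WHERE name = ? AND password = ?", (name, password)
).fetchall()
print(len(safe))  # 0: the payload is just a wrong password
```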
Data Encryption Standard
Also referred to as 3DES, a mode of the DES encryption algorithm that encrypts data three times. Three 64-bit keys are used, instead of one, for a nominal key length of 192 bits (168 bits once the DES parity bits are excluded): the first encryption's output is encrypted with the second key, and the resulting ciphertext is again encrypted with a third key.
Advanced Encryption Standard
AES is a variant of Rijndael which has a fixed block size of 128 bits, and a key size of 128, 192, or 256 bits. By contrast, the Rijndael specification per se is specified with block and key sizes that may be any multiple of 32 bits, both with a minimum of 128 and a maximum of 256 bits.
Secure Hash Algorithms
is a cryptographic hash function designed by the United States National Security Agency and published as a U.S. Federal Information Processing Standard by the United States NIST. SHA-1 produces a 160-bit (20-byte) hash value known as a message digest. A SHA-1 hash value is typically rendered as a hexadecimal number, 40 digits long. SHA stands for "secure hash algorithm".

The four SHA algorithms are structured differently and are named SHA-0, SHA-1, SHA-2, and SHA-3. SHA-0 is the original version of the 160-bit hash function published in 1993 under the name "SHA"; it was not adopted by many applications. Published in 1995, SHA-1 is very similar to SHA-0, but alters the original SHA hash specification to correct alleged weaknesses. SHA-2, published in 2001, is significantly different from the SHA-1 hash function. SHA-1 is the most widely used of the existing SHA hash functions and is employed in several widely used applications and protocols.

In 2005, cryptanalysts found attacks on SHA-1 suggesting that the algorithm might not be secure enough for ongoing use. NIST required many applications in federal agencies to move to SHA-2 after 2010 because of the weakness. Although no successful attacks had yet been reported on SHA-2, it is algorithmically similar to SHA-1. In 2012, following a long-running competition, NIST selected an additional algorithm, Keccak, for standardization as SHA-3. In November 2013 Microsoft announced a deprecation policy under which Windows would stop accepting SHA-1 certificates in SSL by 2017. In September 2014 Google announced a deprecation policy under which Chrome would stop accepting SHA-1 certificates in SSL in a phased way by 2017. Mozilla also planned to stop accepting SHA-1-based SSL certificates by 2017.
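The digest sizes mentioned above are easy to confirm with the standard library, here using the well-known test input "abc":

```python
import hashlib

# SHA-1 yields a 160-bit digest (40 hex characters);
# SHA-256, a SHA-2 variant, yields 256 bits (64 hex characters).
d1 = hashlib.sha1(b"abc").hexdigest()
d2 = hashlib.sha256(b"abc").hexdigest()
print(len(d1), len(d2))  # 40 64
print(d1)  # a9993e364706816aba3e25717850c26c9cd0d89d
```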
A user of RSA creates and then publishes a public key based on the two large prime numbers, along with an auxiliary value. The prime numbers must be kept secret. Anyone can use the public key to encrypt a message, but with currently published methods, if the public key is large enough, only someone with knowledge of the prime numbers can feasibly decode the message. Breaking RSA encryption is known as the RSA problem; whether it is as hard as the factoring problem remains an open question.
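The key math can be made visible with a toy RSA example using tiny primes. These numbers are hopelessly insecure and chosen only for illustration; real RSA uses primes hundreds of digits long:

```python
# Toy RSA with deliberately tiny primes.
p, q = 61, 53
n = p * q                # 3233, part of the public key
phi = (p - 1) * (q - 1)  # 3120, kept secret along with p and q
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent, 2753 (modular inverse; Python 3.8+)

m = 65                   # message encoded as a number smaller than n
c = pow(m, e, n)         # encrypt with the public key (n, e)
print(pow(c, d, n))      # 65: decrypting with d recovers the message
```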
Service Set Identifier
name for your network that is broadcast
is a case-sensitive unique identifier of up to 32 alphanumeric characters attached to the header of packets sent over a wireless local-area network (WLAN). It identifies the network when a mobile device tries to connect to the basic service set (BSS), a component of the IEEE 802.11 WLAN architecture.
Trusted Platform Module
A Trusted Platform Module (TPM) is a specialized chip on an endpoint device that stores RSA encryption keys specific to the host system for hardware authentication. Each TPM chip contains an RSA key pair called the Endorsement Key (EK). The pair is maintained inside the chip and cannot be accessed by software
three primary security controls
operational, technical and management
is a symmetric key cipher where plaintext digits are combined with a pseudorandom cipher digit stream (keystream). In a stream cipher each plaintext digit is encrypted one at a time with the corresponding digit of the keystream, to give a digit of the ciphertext stream. An alternative name is a state cipher, as the encryption of each digit is dependent on the current state. In practice, a digit is typically a bit and the combining operation an exclusive-or (XOR). The pseudorandom keystream is typically generated serially from a random seed value using digital shift registers. The seed value serves as the cryptographic key for decrypting the ciphertext stream.

Stream ciphers represent a different approach to symmetric encryption from block ciphers. Block ciphers operate on large blocks of digits with a fixed, unvarying transformation. This distinction is not always clear-cut: in some modes of operation, a block cipher primitive is used in such a way that it acts effectively as a stream cipher. Stream ciphers typically execute at a higher speed than block ciphers and have lower hardware complexity. However, stream ciphers can be susceptible to serious security problems if used incorrectly (see stream cipher attacks); in particular, the same starting state (seed) must never be used twice.
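The keystream-XOR structure can be sketched in a few lines. Python's `random` module is not cryptographically secure, so this is purely an illustration of the structure, not a usable cipher:

```python
import random

# Toy stream cipher: XOR each plaintext byte with a keystream byte
# produced by a seeded PRNG. The seed plays the role of the key.
def stream_xor(data: bytes, seed: int) -> bytes:
    rng = random.Random(seed)
    return bytes(b ^ rng.randrange(256) for b in data)

ct = stream_xor(b"attack at dawn", seed=42)
pt = stream_xor(ct, seed=42)  # XORing the same keystream again decrypts
print(pt)  # b'attack at dawn'
```

Because encryption and decryption are the same XOR operation, running the function twice with the same seed recovers the plaintext.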
a block cipher is a deterministic algorithm operating on fixed-length groups of bits, called blocks, with an unvarying transformation that is specified by a symmetric key. Block ciphers are important elementary components in the design of many cryptographic protocols, and are widely used to implement encryption of bulk data.

The modern design of block ciphers is based on the concept of an iterated product cipher. Product ciphers were suggested and analyzed by Claude Shannon in his seminal 1949 publication Communication Theory of Secrecy Systems as a means to effectively improve security by combining simple operations such as substitutions and permutations. Iterated product ciphers carry out encryption in multiple rounds, each of which uses a different subkey derived from the original key. One widespread implementation of such ciphers is called a Feistel network, named after Horst Feistel, and notably implemented in the DES cipher. Many other realizations of block ciphers, such as the AES, are classified as substitution-permutation networks.

The publication of the DES cipher by the U.S. National Bureau of Standards (now National Institute of Standards and Technology, NIST) in 1977 was fundamental in the public understanding of modern block cipher design. In the same way, it influenced the academic development of cryptanalytic attacks. Both differential and linear cryptanalysis arose out of studies on the DES design. Today, there is a palette of attack techniques against which a block cipher must be secure, in addition to being robust against brute-force attacks.

Even a secure block cipher is suitable only for the encryption of a single block under a fixed key. A multitude of modes of operation have been designed to allow their repeated use in a secure way, commonly to achieve the security goals of confidentiality and authenticity.
However, block ciphers may also be used as building blocks in other cryptographic protocols, such as universal hash functions and pseudo-random number generators.
Cryptography (or cryptology; from Greek κρυπτός kryptós, "hidden, secret"; and γράφειν graphein, "writing", or -λογία -logia, "study", respectively) is the practice and study of techniques for secure communication in the presence of third parties (called adversaries). More generally, it is about constructing and analyzing protocols that block adversaries; various aspects of information security such as data confidentiality, data integrity, authentication, and non-repudiation are central to modern cryptography. Modern cryptography exists at the intersection of the disciplines of mathematics, computer science, and electrical engineering. Applications of cryptography include ATM cards, computer passwords, and electronic commerce.

Cryptography prior to the modern age was effectively synonymous with encryption, the conversion of information from a readable state to apparent nonsense. The originator of an encrypted message shared the decoding technique needed to recover the original information only with intended recipients, thereby precluding unwanted persons from doing the same. Since World War I and the advent of the computer, the methods used to carry out cryptology have become increasingly complex and its application more widespread.

Modern cryptography is heavily based on mathematical theory and computer science practice; cryptographic algorithms are designed around computational hardness assumptions, making such algorithms hard to break in practice by any adversary. It is theoretically possible to break such a system, but it is infeasible to do so by any known practical means. These schemes are therefore termed computationally secure; theoretical advances (e.g., improvements in integer factorization algorithms) and faster computing technology require these solutions to be continually adapted. There exist information-theoretically secure schemes that provably cannot be broken even with unlimited computing power, such as the one-time pad, but these schemes are more difficult to implement than the best theoretically breakable but computationally secure mechanisms.

The growth of cryptographic technology has raised a number of legal issues in the information age. Cryptography's potential for use as a tool for espionage and sedition has led many governments to classify it as a weapon and to limit or even prohibit its use and export. In some jurisdictions where the use of cryptography is legal, laws permit investigators to compel the disclosure of encryption keys for documents relevant to an investigation. Cryptography also plays a major role in digital rights management and piracy of digital media.
What is TLS/SSL
TLS Enhancement to SSL
One problem when you administer a network is securing data that is being sent between applications across an untrusted network. You can use TLS/SSL to authenticate servers and clients and then use it to encrypt messages between the authenticated parties. The Transport Layer Security (TLS) protocol, the Secure Sockets Layer (SSL) protocol versions 2.0 and 3.0, and the Private Communications Transport (PCT) protocol are based on public key cryptography. The Secure Channel (Schannel) authentication protocol suite provides these protocols. All Schannel protocols use a client/server model.
- The Keyed-Hashing for Message Authentication Code (HMAC) algorithm replaces the SSL Message Authentication Code (MAC) algorithm. HMAC produces more secure hashes than the MAC algorithm: it produces an integrity check value as the MAC does, but with a hash function construction that makes the hash much harder to break. For more information about HMAC, see "Hash Algorithms in The Handshake Layer in TLS/SSL Architecture" in How TLS/SSL Works.
- TLS is standardized in RFC 2246.
- Many new alert messages are added.
- In TLS, it is not always necessary to include certificates all the way back to the root CA; you can use an intermediary authority.
- TLS specifies padding block values that are used with block cipher algorithms. (RC4, which is used by Microsoft, is a streaming cipher, so this modification is not relevant to it.)
- Fortezza algorithms are not included in the TLS RFC, because they are not open for public review. (This is Internet Engineering Task Force (IETF) policy.)
- Minor differences exist in some message fields.
Radio Frequency Identification
is the wireless use of electromagnetic fields to transfer data, for the purposes of automatically identifying and tracking tags attached to objects. The tags contain electronically stored information. Some tags are powered by electromagnetic induction from magnetic fields produced near the reader. Some types collect energy from the interrogating radio waves and act as a passive transponder. Other types have a local power source such as a battery and may operate at hundreds of meters from the reader. Unlike a barcode, the tag does not necessarily need to be within line of sight of the reader and may be embedded in the tracked object. RFID is one method for Automatic Identification and Data Capture (AIDC).
Authentication, Authorization, and Accounting (AAA)
shared secrets to protect communication
Short for Remote Authentication Dial-In User Service, an authentication and accounting system used by many Internet Service Providers (ISPs). When you dial in to the ISP you must enter your username and password. This information is passed to a RADIUS server, which checks that the information is correct, and then authorizes access to the ISP system. Though not an official standard, the RADIUS specification is maintained by a working group of the IETF.
Whitelisting and Blacklisting
Whitelisting and blacklisting are two approaches to addressing application security. Antivirus products use blacklisting to identify, block, and remediate malicious software. Newer endpoint security products use whitelisting to allow trusted software to run and block the rest.
- Whitelisting Approach:
- Default-deny approach with exceptions
- Operates using a list of approved software
- If an application is not on the approved list of software, then the application is denied or restricted
- Blacklisting Approach:
- Default-allow approach with exceptions
- Operates using a list of unapproved software
- If an application is not on the unapproved list of software, then it is approved
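The two approaches can be contrasted in a few lines of Python; the application names below are made up for illustration:

```python
# Sketch of default-deny (whitelist) vs. default-allow (blacklist).
whitelist = {"winword.exe", "excel.exe", "chrome.exe"}
blacklist = {"keylogger.exe", "cryptominer.exe"}

def whitelist_allows(app):
    return app in whitelist      # unknown software is denied

def blacklist_allows(app):
    return app not in blacklist  # unknown software is allowed

print(whitelist_allows("unknown.exe"))  # False
print(blacklist_allows("unknown.exe"))  # True
```

The difference is exactly in how each approach treats software it has never seen.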
public key and private key
Secure Copy Protocol
The SCP protocol is a network protocol, based on the BSD RCP protocol, which supports file transfers between hosts on a network. SCP uses Secure Shell (SSH) for data transfer and uses the same mechanisms for authentication, thereby ensuring the authenticity and confidentiality of the data in transit. A client can send (upload) files to a server, optionally including their basic attributes (permissions, timestamps). Clients can also request files or directories from a server (download). SCP runs over TCP port 22 by default. Like RCP, there is no RFC that defines the specifics of the protocol.
pretty good privacy
Gnu Privacy Guard
is a data encryption and decryption computer program that provides cryptographic privacy and authentication for data communication. PGP is often used for signing, encrypting, and decrypting texts, e-mails, files, directories, and whole disk partitions and to increase the security of e-mail communications. It was created by Phil Zimmermann in 1991. PGP and similar software follow the OpenPGP standard (RFC 4880) for encrypting and decrypting data.
GNU Privacy Guard (GnuPG or GPG) is a free software replacement for the PGP suite of Symantec software. GnuPG is compliant with RFC 4880, which is the current IETF standards track specification of OpenPGP. Current versions of PGP (and Veridis' Filecrypt) are interoperable with GnuPG and other OpenPGP-compliant systems.
salt or salting
a salt is random data that is used as an additional input to a one-way function that hashes a password or passphrase. The primary function of salts is to defend against dictionary attacks run against a list of password hashes and against pre-computed rainbow table attacks. A new salt is randomly generated for each password. In a typical setting, the salt and the password are concatenated and processed with a cryptographic hash function, and the resulting output (but not the original password) is stored with the salt in a database. Hashing allows for later authentication while protecting the plaintext password in the event that the authentication data store is compromised. Cryptographic salts are broadly used in many modern computer systems, from Unix system credentials to Internet security.
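The concatenate-then-hash scheme described above can be sketched as follows (a minimal illustration using SHA-256; real systems should prefer a slow KDF such as PBKDF2 or bcrypt, covered on the next cards):

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Concatenate a fresh random salt with the password, then hash."""
    if salt is None:
        salt = os.urandom(16)               # a new random salt per password
    digest = hashlib.sha256(salt + password.encode()).hexdigest()
    return salt, digest                     # store both; never the plaintext

def verify_password(password, salt, stored_digest):
    # Re-derive with the stored salt and compare.
    return hashlib.sha256(salt + password.encode()).hexdigest() == stored_digest

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("wrong", salt, digest))    # False
```

Because every user gets a different salt, identical passwords hash to different values, defeating a single pre-computed rainbow table.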
- PBKDF2 (Password-Based Key Derivation Function 2)
- The added computational work makes password cracking much more difficult, and is known as key stretching.
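PBKDF2 ships in the Python standard library, so the key-stretching idea is easy to demonstrate; the iteration count below is illustrative:

```python
import hashlib
import os

salt = os.urandom(16)
# 200,000 iterations of HMAC-SHA-256: the repeated hashing is the
# "stretching" that slows down each guess an attacker makes.
key = hashlib.pbkdf2_hmac("sha256", b"correct horse battery staple", salt, 200_000)
print(len(key))  # 32 -- a 32-byte derived key

# The same password, salt, and iteration count always derive the same key.
again = hashlib.pbkdf2_hmac("sha256", b"correct horse battery staple", salt, 200_000)
print(key == again)  # True
```

Raising the iteration count raises the attacker's cost linearly while adding only a one-time delay at login.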
The bcrypt function is the default password hash algorithm for BSD and many other systems. The prefix "$2a$" or "$2b$" (or "$2y$") in a hash string in a shadow password file indicates that the hash string is a bcrypt hash in modular crypt format. The rest of the hash string includes the cost parameter, a 128-bit salt (base-64 encoded as 22 characters), and 184 bits of the resulting hash value (base-64 encoded as 31 characters).
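Those fixed field widths make the modular crypt format easy to pull apart. A small parser, run against a commonly cited example bcrypt hash (the hash value itself is just sample data):

```python
def parse_bcrypt(hash_string):
    """Split a bcrypt modular-crypt-format string into its fields:
    $<ident>$<cost>$<22-char salt><31-char hash>."""
    _, ident, cost, rest = hash_string.split("$")
    return ident, int(cost), rest[:22], rest[22:]

# A commonly cited sample bcrypt hash.
h = "$2a$10$N9qo8uLOickgx2ZMRZoMyeIjZAgcfl7p92ldGxad68LJZdL17lhWy"
ident, cost, salt, digest = parse_bcrypt(h)
print(ident, cost, len(salt), len(digest))  # 2a 10 22 31
```

The cost parameter is a log2 work factor: cost 10 means 2^10 rounds, so incrementing it doubles the hashing work.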
Extensible Authentication Protocol
WEP over EAP-PEAP
WPA2 over EAP-TTLS
EAP is an authentication framework providing for the transport and usage of keying material and parameters generated by EAP methods. There are many methods defined by RFCs, and a number of vendor-specific methods and new proposals exist. EAP is not a wire protocol; instead it only defines message formats. Each protocol that uses EAP defines a way to encapsulate EAP messages within that protocol's messages.
high gain antenna with a narrow radiation pattern
unified threat management
(UTM) is an approach to security management that allows an administrator to monitor and manage a wide variety of security-related applications and infrastructure components through a single management console.
An initialization vector (IV) attack is an attack on wireless networks. It modifies the IV of an encrypted wireless packet during transmission. Once an attacker learns the plaintext of one packet, the attacker can compute the RC4 key stream generated by the IV used.
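The keystream-recovery step above can be demonstrated directly: RC4 encryption is just XOR with the keystream, so one known plaintext under a given IV exposes the keystream for every other packet sent under that IV. A self-contained sketch (key and IV values are illustrative):

```python
def rc4_keystream(key, n):
    """Generate n bytes of RC4 keystream (standard KSA + PRGA)."""
    S = list(range(256))
    j = 0
    for i in range(256):                       # key-scheduling algorithm
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = [], 0, 0
    for _ in range(n):                         # pseudo-random generation
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

# WEP seeds RC4 with IV + key; reusing an IV reuses the whole keystream.
iv_plus_key = b"\x01\x02\x03" + b"secretkey"   # illustrative values
known_plain = b"known plaintext here"
ciphertext = xor(known_plain, rc4_keystream(iv_plus_key, len(known_plain)))

# Knowing one plaintext recovers the keystream for this IV...
keystream = xor(ciphertext, known_plain)
# ...which decrypts any other packet sent under the same IV.
other_ct = xor(b"attack at dawn", rc4_keystream(iv_plus_key, 14))
print(xor(other_ct, keystream[:14]))  # b'attack at dawn'
```

WEP's 24-bit IV space is so small that IVs repeat quickly on a busy network, which is what makes this attack practical.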
TACACS (Terminal Access Controller Access Control System) is an older authentication protocol common to UNIX networks that allows a remote access server to forward a user's logon password to an authentication server to determine whether access can be allowed to a given system.
RADIUS encrypts only the password; TACACS+ encrypts the entire session. TACACS+ uses the more reliable TCP, while RADIUS uses UDP. RADIUS combines authentication and authorization; TACACS+ splits them into separate operations. TACACS+ can interact with an Active Directory environment and use Kerberos.
is a stealthy type of software, typically malicious, designed to hide the existence of certain processes or programs from normal methods of detection and enable continued privileged access to a computer.
Blowfish block cipher
is a symmetric block cipher that can be used as a drop-in replacement for DES or IDEA. It takes a variable-length key, from 32 bits to 448 bits, making it ideal for both domestic and exportable use. Blowfish was designed in 1993 by Bruce Schneier as a fast, free alternative to existing encryption algorithms.
is a piece of code intentionally inserted into a software system that will set off a malicious function when specified conditions are met
in a computer system (or cryptosystem or algorithm) is a method of bypassing normal authentication, securing unauthorized remote access to a computer, obtaining access to plaintext, and so on, while attempting to remain undetected
is a software testing technique, often automated or semi-automated, that involves providing invalid, unexpected, or random data to the inputs of a computer program. The program is then monitored for exceptions such as crashes, or failing built-in code assertions or for finding potential memory leaks. Fuzzing is commonly used to test for security problems in software or computer systems. It is a form of random testing which has been used for testing hardware or software.
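A minimal fuzzer in this spirit: feed random bytes to a target function and count the inputs that trigger exceptions. The parser below is a deliberately buggy hypothetical target, standing in for real parsing code:

```python
import random

def parse_record(data: bytes):
    """Toy length-prefixed parser (hypothetical fuzzing target)."""
    length = data[0]                     # first byte declares payload length
    payload = data[1:1 + length]
    if len(payload) != length:           # truncated input -> error path
        raise ValueError("short read")
    return payload

random.seed(0)                           # deterministic run for illustration
crashes = 0
for _ in range(1000):
    fuzz_input = bytes(random.randrange(256) for _ in range(random.randrange(1, 8)))
    try:
        parse_record(fuzz_input)
    except ValueError:
        crashes += 1                     # fuzzer hit the truncation bug
print(crashes > 0)  # True -- random inputs quickly find the error path
```

Real fuzzers (AFL, libFuzzer, etc.) add coverage feedback and input mutation, but the core loop is the same: generate input, run target, record failures.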
Before allowing an entity to perform certain actions, you must ensure you know who that entity actually is (Authentication) and whether the entity is authorized to perform that action (Authorization). Additionally, you need to ensure that accurate records are maintained showing that the action has occurred, so you keep a security log of the events (Accounting).
random one-time pad
is an encryption technique that cannot be cracked if used correctly. In this technique, a plaintext is paired with a random secret key (also referred to as a one-time pad). Then, each bit or character of the plaintext is encrypted by combining it with the corresponding bit or character from the pad using modular addition. If the key is truly random, is at least as long as the plaintext, is never reused in whole or in part, and is kept completely secret, then the resulting ciphertext will be impossible to decrypt or break. It has also been proven that any cipher with the perfect secrecy property must use keys with effectively the same requirements as OTP keys. However, practical problems have prevented one-time pads from being widely used.
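The combine-with-the-pad step can be sketched with XOR, which is addition modulo 2, so one function both encrypts and decrypts (a toy illustration; the security claims hold only under the strict key conditions listed above):

```python
import os

def otp(data: bytes, pad: bytes) -> bytes:
    """XOR each byte with the pad; applying it twice restores the input."""
    assert len(pad) >= len(data), "pad must be at least as long as the message"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"ATTACK AT DAWN"
pad = os.urandom(len(message))   # truly random, as long as the message, used once

ciphertext = otp(message, pad)
print(otp(ciphertext, pad))      # b'ATTACK AT DAWN'
```

The practical problems mentioned above fall out of this code: the pad must be generated, distributed, and destroyed securely for every message, which is exactly the key-management burden that keeps OTPs out of general use.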