50 Cards in this Set
- Front
- Back
Computer Security
|
The protection of an automated information system in order to preserve confidentiality, integrity, and availability of information system resources.
|
|
Confidentiality
|
Preserving authorized restrictions on information access and disclosure. Includes protecting personal privacy and proprietary information.
|
|
Integrity
|
Guarding against improper information modification or destruction. Includes ensuring information nonrepudiation and authenticity.
|
|
Availability
|
Ensuring timely and reliable access to, and use of, information.
|
|
Authenticity
|
Property of being genuine and being able to be verified and trusted. Includes verifying users are who they say they are.
|
|
Accountability
|
Actions of an entity must be able to be traced to that entity.
|
|
Challenges of computer security
|
– Not as simple as it appears.
– Must always consider potential attacks on security features.
– Security mechanisms are therefore elaborate and sometimes unclear.
– Have to decide where to use them.
– Computer security is a battle of wits: the designer must try to eliminate all weaknesses.
– Security is too often an afterthought.
– Little perceived benefit until something happens.
– Impedes user-friendly operation. |
|
Threat agent
|
An entity that attacks, or is a threat to a system.
|
|
Attack
|
Assault on system security. A deliberate attempt to evade security services.
|
|
Countermeasure
|
An action, device, or procedure imposed by the owner to reduce a threat, a vulnerability, or an attack.
|
|
Risk
|
An expectation of loss: the probability that a particular threat will exploit a vulnerability with a harmful result.
|
|
Asset
|
Data, a service, or a capability that is valuable and is to be protected.
|
|
Threat
|
A potential for violation of security that could cause harm.
A possible danger that might exploit a vulnerability. |
|
Vulnerability
|
A flaw or weakness in the system's design/operation that could be exploited.
|
|
Vulnerabilities: corrupted
|
Does the wrong thing/gives wrong answers.
|
|
Vulnerabilities: leaky
|
Info leaked to unauthorized people.
|
|
Vulnerabilities: unavailable
|
Using the system becomes impossible/impractical.
|
|
Active attacks
|
Attempt to alter system resources or functions.
|
|
Passive attacks
|
Attempt to make use of information from the system – does not affect system resources.
|
|
Euclidean Algorithm
|
If a = qb + r,
then gcd(a, b) = gcd(b, r) |
|
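The card above can be sketched as a short Python function (not from the deck; the example values are my own):

```python
def gcd(a, b):
    """Euclidean algorithm: since a = qb + r, gcd(a, b) = gcd(b, r)."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, r)
    return a

print(gcd(1071, 462))  # → 21
```

Each loop iteration applies the identity once: 1071 = 2·462 + 147, then 462 = 3·147 + 21, then 147 = 7·21 + 0.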
Fermat's Little Theorem
|
if p is prime and p does not divide a,
then a^(p-1) ≡ 1 (mod p) |
|
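A quick numerical check of the theorem, using Python's built-in three-argument `pow` for modular exponentiation (the test values are my own, not from the deck):

```python
def fermat_check(a, p):
    # a^(p-1) mod p should equal 1 when p is prime and p does not divide a
    return pow(a, p - 1, p) == 1

print(fermat_check(2, 7))   # 2^6 = 64 ≡ 1 (mod 7) → True
print(fermat_check(3, 11))  # 3^10 ≡ 1 (mod 11) → True
```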
Euler's Totient Theorem
|
If n and a are coprime,
then a^phi(n) ≡ 1 (mod n) [phi(n) = (p-1)(q-1) when n = pq for distinct primes p and q] |
|
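A brute-force totient and a check of the theorem for n = 15 = 3·5 (a small sketch with values of my own choosing):

```python
import math

def phi(n):
    # Euler's totient by direct counting; fine for small n
    return sum(1 for k in range(1, n + 1) if math.gcd(k, n) == 1)

n = 15   # n = p*q = 3*5, so phi(n) = (3-1)*(5-1) = 8
a = 4    # gcd(4, 15) == 1, so the theorem applies
print(phi(n))             # → 8
print(pow(a, phi(n), n))  # → 1
```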
4 Types of threats
|
Unauthorized disclosure
Deception
Disruption
Usurpation |
|
Unauthorized disclosure
|
Confidential information is exposed, intercepted, inferred, or intruded upon.
|
|
Deception
|
To appear to be somebody or something that you are not.
Masquerade
Falsification
Repudiation |
|
Disruption
|
Attack on availability and/or integrity.
Incapacitation: the system is brought down by means of physical destruction.
Corruption: the result of an attack on system integrity, which results in a system operating in an unintended manner or providing data that is inaccurate.
Obstruction: blocking the ability to communicate. |
|
Usurpation
|
Threat to integrity. Two major consequences:
Misappropriation: a service is denied or inappropriately used.
Misuse: unauthorized access resulting in a range of difficulties, such as stealing passwords, accessing data, and disabling protocols. |
|
Threat agents
|
Trusted people.
People who may cause difficulty.
People offsite.
The inanimate. |
|
Assumptions in modern cryptographic algorithms
|
– The method of encryption is public.
– Security must depend on keeping the key secret.
– Caveat emptor: all messages are decrypted to be read, and were originally prepared, in plain text. These are moments of extreme vulnerability.
– Keys must be remembered somehow. How are they stored? Where?
– An encrypted message should be able to be considered unclassified. |
|
Relation
|
A subset of a Cartesian product (Rod Cooper's definition, anyway).
|
|
State of a computer (consists of...)
|
The contents of the registers.
The control register.
The state word.
Memory. |
|
Discretionary Access Control (DAC)
|
A scheme where users are given the ability to determine the permissions governing access to their own files.
DAC schemes allow users to grant privileges on resources to other users on the same system. |
|
Mandatory Access Control (MAC)
|
A more restrictive scheme that does not allow users to define permissions on files, regardless of ownership. Instead, security decisions are made by a central policy administrator.
|
|
Bell-LaPadula Security Model
|
A classic mandatory access-control model for protecting confidentiality.
|
|
2 Security rules in Bell–LaPadula Model
|
– Simple security property: Can't read up
– Star property: Can't write down |
|
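The two rules can be sketched as comparisons on an ordered set of levels (the level names and functions here are illustrative, not part of the model's formal definition):

```python
# Security levels ordered low → high
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level, object_level):
    # Simple security property: no read up —
    # the subject's level must dominate the object's
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    # Star property: no write down —
    # the object's level must dominate the subject's
    return LEVELS[subject_level] <= LEVELS[object_level]

print(can_read("secret", "confidential"))   # → True  (read down allowed)
print(can_write("secret", "confidential"))  # → False (write down blocked)
```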
State in Bell-LaPadula model (consists of...)
|
A tuple: (b, M, f, H)
b = current access set (s, o, a)
M = access matrix
f = level function
H = hierarchy tree |
|
Current access set: b
|
This is a set of triples of the form (subject, object, access-mode). A triple (s, o, a) means that subject s has current access to object o in access mode a.
|
|
Access matrix: M
|
The matrix element M_ij records the access modes in which subject S_i is permitted to access object O_j.
|
|
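One minimal way to represent M is a nested mapping from subject to object to a set of access modes (the subject, object, and mode names here are hypothetical):

```python
# Hypothetical access matrix: M[subject][object] = set of permitted modes
M = {
    "alice": {"file1": {"read", "write"}, "file2": {"read"}},
    "bob":   {"file1": {"read"}},
}

def permitted(subject, obj, mode):
    """Check whether M records the given mode for (subject, obj)."""
    return mode in M.get(subject, {}).get(obj, set())

print(permitted("alice", "file2", "read"))  # → True
print(permitted("bob", "file1", "write"))   # → False
```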
Level function: f
|
This function assigns a security level to each subject and object.
|
|
Hierarchy: H
|
This is a directed rooted tree whose nodes correspond to objects in the system. The model requires that the security level of an object must dominate the security level of its parent.
|
|
How to change the state of a Bell-LaPadula Model
|
The security state of the system is changed by any operation that causes a change in any of the four components of the system, (b, M, f, H):
• Get access / release access (changes b)
• Change object level / change current level (changes f)
• Give / rescind access permission (changes M)
• Create an object (changes H)
• Delete an object (changes H, and possibly b [release access]) |
|
Biba Integrity Model
|
Has a similar structure to the BLP model, but it addresses integrity rather than confidentiality.
|
|
3 Integrity rules in Biba Integrity Model
|
– Simple integrity: Can't write up.
– Integrity confinement: Can't read down.
– Invocation property: A subject can invoke another subject only if the integrity level of the first subject dominates the integrity level of the second subject. |
|
Clark-Wilson Model
|
Rather than dealing with document confidentiality and/or integrity, the Clark-Wilson (CW) model deals with systems that perform transactions.
|
|
The Chinese Wall Model
|
– Designed for use in the commercial sector to eliminate the possibility of conflicts of interest.
– The model groups resources into “conflict of interest classes.”
– The model enforces the restriction that each user can only access one resource from each conflict of interest class. |
|
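The restriction is easy to sketch as a per-user access history (the class and resource names below are made up for illustration):

```python
# Map each resource to its conflict-of-interest class (hypothetical data)
COI_CLASS = {"bank_a": "banks", "bank_b": "banks", "oil_x": "oil"}

accessed = {}  # user -> {coi_class: resource already accessed}

def request_access(user, resource):
    """Grant access only if the user holds no competing resource
    in the same conflict-of-interest class."""
    cls = COI_CLASS[resource]
    held = accessed.setdefault(user, {})
    if cls in held and held[cls] != resource:
        return False  # conflict: already accessed a competitor
    held[cls] = resource
    return True

print(request_access("u1", "bank_a"))  # → True
print(request_access("u1", "bank_b"))  # → False (same class as bank_a)
print(request_access("u1", "oil_x"))   # → True  (different class)
```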
The Orange Book
|
Formally the Trusted Computer System Evaluation Criteria (TCSEC), produced by the NSA and other US government agencies in the 1980s. Part of the "Rainbow Series".
- Has 4 security divisions, from D (lowest) to A (highest) |
|
The Federal Criteria
|
The one that Rod Cooper likes... It is the BEST.
|
|
The Common Criteria
|
The Common Criteria (CC) for Information Technology Security Evaluation are ISO standards for specifying security requirements and defining evaluation criteria. The aim of these standards is to provide greater confidence in the security of IT products as a result of formal actions taken during the process of developing, evaluating, and operating these products.
|
|
Reference Monitor
|
The reference monitor is a controlling element in the hardware and operating system of a computer that regulates the access of subjects to objects on the basis of security parameters of the subject and object. The reference monitor has access to a file, known as the security kernel database, that lists the access privileges (security clearance) of each subject and the protection attributes (classification level) of each object. The reference monitor enforces the security rules (no read up, no write down).
|
|
Trusted computing base (TCB)
|
A portion of a system that enforces a particular policy. The TCB must be resistant to tampering and circumvention. The TCB should be small enough to be analyzed systematically.
|