What is a database management system (DBMS)
a suite of programs used to manage large sets of structured data with ad hoc query capabilities for many types of users. A DBMS can also control the security parameters of the database.
What is a database
a collection of data stored in a meaningful way that enables multiple users and applications to access, view, and modify data as needed.
What are the characteristics of a database
• It centralizes data instead of having data held on several different servers throughout the network.
• It allows for easier backup procedures.
• It provides transaction persistence.
• It allows for more consistency since all the data are held and maintained in one central location.
• It provides recovery and fault tolerance.
• It allows the sharing of data with multiple users.
• It provides security controls that implement integrity checking, access control, and the necessary level of confidentiality.
What is Transaction persistence
Means the database procedures carrying out transactions are durable and reliable. The state of the database’s security should be the same after a transaction has occurred, and the integrity of the transaction needs to be ensured.
Databases come in several types of models, what are they:
• Relational
• Hierarchical
• Network
• Object-oriented
• Object-relational
What is the most commonly used implementation of the hierarchical model
Lightweight Directory Access Protocol (LDAP) model
Define Relational data model
Uses attributes (columns) and tuples (rows) to contain and organize information. A primary key is a field that links all the data within a record to a corresponding value.
Define Hierarchical data model
Combines records and fields that are related in a logical tree structure. A parent can have one child, many children, or no children. Hierarchical models are useful for mapping one-to-many relationships.
Define network database model
Built upon the hierarchical data model. Instead of being constrained by having to know how to go from one branch to another and then from one parent to a child to find a data element, the network database model allows each data element to have multiple parent and child records.
Define object-oriented database
Designed to handle a variety of data (images, audio, documents, video). An object-oriented database management system (ODBMS) is more dynamic in nature than a relational database, because objects can be created when needed, and the data and procedure (called a method) go with the object when it is requested.
Define an object-relational database
(ORD) or object-relational database management system (ORDBMS) is a relational database with a software front end that is written in an object-oriented programming language.
What is a Record
A collection of related data items.
What is a File
A collection of records of the same type.
What is a Database
A cross-referenced collection of data.
What does DBMS do
Manages and controls the database.
What is a Tuple
A row in a two-dimensional database.
What is an Attribute
A column in a two-dimensional database.
What is a Primary key
Columns that make each row unique. (Every row of a table must include a primary key.)
What is a View
A virtual relation defined by the database administrator in order to keep subjects from viewing certain data.
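The idea can be sketched in a few lines of Python (using the built-in sqlite3 module; the table, view, and column names below are hypothetical examples, not from the text):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")
    conn.execute("INSERT INTO employees VALUES (1, 'Alice', 95000), (2, 'Bob', 72000)")

    # The view hides the sensitive salary column; subjects granted access
    # only to the view never see salaries.
    conn.execute("CREATE VIEW employee_directory AS SELECT id, name FROM employees")

    print(conn.execute("SELECT * FROM employee_directory").fetchall())
    # [(1, 'Alice'), (2, 'Bob')] -- salary is not visible through the view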
What is a Foreign key
An attribute of one table that is related to the primary key of another table.
What is a Cell
An intersection of a row and column.
What is a Schema
Defines the structure of the database
What is a Data dictionary
Central repository of data elements and their relationships.
What is Open Database Connectivity (ODBC)
An application programming interface (API) that allows an application to communicate with a database either locally or remotely. The application sends requests to the ODBC API.

ODBC tracks down the necessary database-specific driver for the database to carry out the translation, which in turn translates the requests into the database commands that a specific database will understand.
What is the function of Object Linking and Embedding Database (OLE DB)
Separates data into components that run as middleware on a client or server. It provides a low level interface to link information across different databases and provides access to data no matter where it is located or how it is formatted. It allows different applications to access different types and sources of data.
What is ActiveX Data Objects (ADO)
An API that allows applications to access back-end database systems. It is a set of ODBC interfaces that exposes the functionality of data sources through accessible objects. ADO uses the OLE DB interface to connect with the database and can be developed with many
different scripting languages. It is commonly used in web applications and other client/server applications.
What is Java Database Connectivity (JDBC)
An API that allows a Java application to communicate with a database. The application can bridge through ODBC or directly to the database.
What are the characteristics of JDBC
• It is an API that provides the same functionality as ODBC but is specifically designed for use by Java database applications.
• It has database-independent connectivity between the Java platform and a
wide range of databases.
• JDBC is a Java API that enables Java programs to execute SQL statements.
What is Data definition language (DDL)
Defines the structure and schema of the database. The structure could mean the table size, key placement, views, and data element relationship. The schema describes the type of data that will be held and manipulated, and its properties. It defines the structure of the database, access operations, and integrity procedures.
What is Data manipulation language (DML)
Contains all the commands that enable a user to view, manipulate, and use the database (view, add, modify, sort, and delete commands).
What is Query language (QL)
Enables users to make requests of the database.
What is Report generator
Produces printouts of data in a user-defined manner.
What is a data dictionary
is a central collection of data element definitions, schema objects, and reference keys. The schema objects can contain tables, views, indexes, procedures, functions, and triggers. A data dictionary can contain the default values for columns, integrity information, the names of users, the privileges and roles for users, and auditing information. It is a tool used to centrally manage parts of a database by controlling data about the data (referred to as metadata) within the database. It provides a cross-reference between groups of data elements and the databases.
What is a primary key
is an identifier of a row and is used for indexing in relational databases. Each row must have a unique primary key to properly represent the row as one entity. When a user makes a request to view a record, the database tracks this record by its unique primary key. If the primary key were not unique, the database would not know which record to present to the user.
What is a foreign key
If an attribute in one table has a value matching the primary key in another table and there is a relationship set up between the two of them, this attribute is considered a foreign key. This foreign key is not necessarily the primary key in its current table. It only has to
contain the same information that is held in another table’s primary key and be mapped to the primary key in this other table.
What is semantic integrity
Is a mechanism that makes sure structural and semantic rules are enforced. These rules pertain to data types, logical values, uniqueness constraints, and operations that could adversely affect the structure of the database.
What is Referential integrity
A mechanism that ensures no record contains a reference to the primary key of a nonexisting record or to a NULL value.
What is Entity integrity
A mechanism that ensures that the tuples are uniquely identified by primary key values. For example, if the primary keys of a table are the names of dogs, no two dogs could have the same name. For the sake of entity integrity, every tuple must contain one primary key. If it does not have a primary key, it cannot be referenced by the database.
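A minimal sketch in Python (built-in sqlite3 module; the owners/dogs tables are hypothetical) of a primary key enforcing entity integrity and a foreign key enforcing referential integrity:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
    conn.execute("CREATE TABLE owners (owner_id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("""
        CREATE TABLE dogs (
            dog_id   INTEGER PRIMARY KEY,  -- entity integrity: unique identifier per row
            name     TEXT,
            owner_id INTEGER REFERENCES owners(owner_id)  -- foreign key
        )""")

    conn.execute("INSERT INTO owners VALUES (1, 'Julio')")
    conn.execute("INSERT INTO dogs VALUES (1, 'Rex', 1)")  # OK: owner 1 exists

    try:
        # Referential integrity violation: owner 99 does not exist
        conn.execute("INSERT INTO dogs VALUES (2, 'Fido', 99)")
    except sqlite3.IntegrityError as e:
        print("rejected:", e)  # rejected: FOREIGN KEY constraint failed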
What is rollback
an operation that ends a current transaction and cancels the current changes to the database. These changes could have taken place with the data itself or with schema changes that were typed in. When a rollback operation is executed, the changes are cancelled, and the database returns to its previous state. A rollback can take place if the database has some type of unexpected glitch or if outside entities disrupt its processing sequence.
What is commit
An operation that completes a transaction and executes all changes just made by the user. As its name indicates, once the commit command is executed, the changes are committed and reflected in the database. These changes can be made to data or schema information. Because these changes are committed, they are then available to all other applications and users. If a user attempts to commit a change and it cannot complete correctly, a rollback is performed. This ensures that partial changes do not take place and that data are not corrupted.
What are save points
Savepoints are used to make sure that if a system failure occurs, or if an error is detected, the database can attempt to return to a point before the system crashed or hiccupped. Savepoints are easy to implement within databases and applications, but a balance must be struck between too many and not enough savepoints. Having too many savepoints can degrade the performance, whereas not having enough savepoints runs the risk of losing data and decreasing user productivity because the lost data would have to be reentered.
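A minimal sketch in Python (built-in sqlite3 module; the accounts table is hypothetical) showing commit, rollback, and a savepoint:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
    conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
    conn.commit()  # commit: changes are now durable and visible to other users

    try:
        conn.execute("UPDATE accounts SET balance = balance - 500 WHERE id = 1")
        raise RuntimeError("glitch detected mid-transaction")  # simulated failure
    except RuntimeError:
        conn.rollback()  # rollback: cancel changes, return to previous state

    # Savepoint: a named point the transaction can fall back to
    conn.execute("SAVEPOINT before_risky_step")
    conn.execute("UPDATE accounts SET balance = balance + 10 WHERE id = 1")
    conn.execute("ROLLBACK TO SAVEPOINT before_risky_step")  # undo back to the savepoint
    conn.commit()

    print(conn.execute("SELECT balance FROM accounts").fetchone())  # (100.0,)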
What are Checkpoints
Checkpoints are very similar to savepoints. When the database software fills up a certain amount of memory, a checkpoint is initiated, which saves the data from the memory segment to a temporary file. If a glitch is experienced, the software will try to use this information to restore the user’s working environment to its previous state.
What is a two phase commit
A two-phase commit mechanism is yet another control that is used in databases to ensure the integrity of the data held within the database. Databases commonly carry out transaction processes, which means the user and the database interact at the same
time. The opposite is batch processing, which means that requests for database changes are put into a queue and activated all at once—not at the exact time the user makes the request. In transactional processes, many times a transaction will require that more
than one database be updated during the process. The databases need to make sure each database is properly modified, or no modification takes place at all. When a database change is submitted by the user, the different databases initially store these changes
temporarily. A transaction monitor will then send out a “pre-commit” command to each database. If all the right databases respond with an acknowledgment, then the monitor sends out a “commit” command to each database. This ensures that all of the necessary information is stored in all the right places at the right time.
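A minimal sketch of the two-phase commit idea in plain Python (the Participant class and database names are hypothetical, not a real DBMS API):

    # A transaction monitor coordinating several databases.
    class Participant:
        def __init__(self, name, healthy=True):
            self.name, self.healthy = name, healthy

        def pre_commit(self, change):   # phase 1: store change temporarily, acknowledge
            return self.healthy

        def commit(self):               # phase 2: make the change permanent
            print(f"{self.name}: committed")

        def rollback(self):
            print(f"{self.name}: rolled back")

    def two_phase_commit(participants, change):
        # Phase 1: send "pre-commit" and collect acknowledgments
        if all(p.pre_commit(change) for p in participants):
            for p in participants:      # Phase 2: everyone acknowledged -> commit
                p.commit()
            return True
        for p in participants:          # any failure -> no modification at all
            p.rollback()
        return False

    two_phase_commit([Participant("orders_db"), Participant("billing_db")], "UPDATE ...")
    two_phase_commit([Participant("orders_db"), Participant("billing_db", healthy=False)], "UPDATE ...")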
What are the two main database security issues
aggregation and inference.
Define aggregation
Aggregation happens when a user does not have the clearance or permission to access specific information, but she does have permission to access components of this information. Aggregation is the act of combining information from separate sources. The combination of the data forms new information, which the subject does not have the necessary rights to access. The combined information has a sensitivity that is greater than that of the individual parts.
What is inference
Inference is the intended result of aggregation. The inference problem happens when a subject deduces the full story from the pieces he learned of through aggregation. This is seen when data at a lower security level indirectly portrays data at a higher level. Inference is the ability to derive information not explicitly available.
What is Context-dependent access control
means that the software “understands” what actions should be allowed based upon the state and sequence of the request. So what
does that mean? It means the software must keep track of previous access attempts by the user and understand what sequences of access steps are allowed.

(example: In a context-dependent access control situation, it would be more like this: “Does Julio have access to File A?” The system then reviews several pieces of data: What other access attempts has Julio made? Is this request out of sequence of how a
safe series of requests takes place? Does this request fall within the allowed time period of system access (8 A.M. to 5 P.M.)? If the answers to all of these questions are within a set of preconfigured parameters, Julio can access the file. If not, he needs to go find
something else to do.)
How do we keep any subject, or any application or process acting on behalf of that subject, from indirectly being able to act in an inferable way
Context-dependent access control
Content-dependent access control
What is Content-dependent access control
based on the sensitivity of the data. The more sensitive the data, the smaller the subset of individuals who can gain access to the data.

(EXAMPLE: Content-dependent access control can go like this: “Does Julio have access to File A?” The system reviews the ACL on File A and returns with a response of “Yes, Julio can access the file, but can only read it.”)
What are Common attempts to prevent inference attacks
cell suppression, partitioning the database, and noise and perturbation.
Define Cell suppression
is a technique used to hide specific cells that contain information that could be used in inference attacks.
Define Partitioning
Partitioning a database involves dividing the database into different parts, which makes it much harder for an unauthorized individual to find connecting pieces of data that can be brought together and other information that can be deduced or uncovered.
Define Noise and perturbation
is a technique of inserting bogus information in the hopes of misdirecting an attacker or confusing the matter enough that the actual attack will not be fruitful.
Define database views
Permit one group or a specific user to see certain information, while restricting another group
from viewing it altogether.
Define Polyinstantiation
is a process of interactively producing more detailed versions of objects by populating variables with different values or other
variables. It is often used to prevent inference attacks.

This enables a table that contains multiple tuples with the same primary keys, with each instance distinguished by a security level. When this information is inserted into a database, lower-level subjects must be restricted from it. Instead of just restricting access, another set of data is created to fool the lower-level subjects into thinking the information actually means something else.
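A minimal sketch in Python (built-in sqlite3; the ship/cargo scenario is a hypothetical illustration) of a table holding two tuples with the same logical key, distinguished by security level:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    # The primary key is (name, security_level), so the same ship name can
    # appear once per security level -- that is polyinstantiation.
    conn.execute("""
        CREATE TABLE ships (
            name TEXT, cargo TEXT, security_level INTEGER,
            PRIMARY KEY (name, security_level)
        )""")
    conn.execute("INSERT INTO ships VALUES ('Oklahoma', 'nuclear weapons', 2)")  # secret tuple
    conn.execute("INSERT INTO ships VALUES ('Oklahoma', 'food supplies', 1)")    # cover story

    def view_ship(clearance):
        # Return the most sensitive tuple the subject's clearance allows.
        return conn.execute(
            "SELECT name, cargo FROM ships WHERE security_level <= ? "
            "ORDER BY security_level DESC LIMIT 1", (clearance,)).fetchone()

    print(view_ship(1))  # ('Oklahoma', 'food supplies')  -- lower-level subject is fooled
    print(view_ship(2))  # ('Oklahoma', 'nuclear weapons')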
What is the purpose of Online transaction processing
(OLTP) is generally used when databases are clustered to provide fault tolerance and higher performance. OLTP provides mechanisms that watch for problems and deal with them appropriately when they do occur. For example, if a process stops functioning, the monitor mechanisms within OLTP can detect this and attempt to restart the process. If the process cannot be restarted, then the transaction taking place will be rolled back to ensure that no data is corrupted and that a partial transaction does not occur.

The main goal of OLTP is to ensure that transactions either happen properly or don’t happen at all. Transaction processing usually means that individual indivisible operations are taking place independently. If one of the operations fails, the rest of the operations need to be rolled back to ensure that only accurate data is entered into the database.
OLTP records transactions as they occur (in real time), which usually updates more than one database in a distributed environment. This type of complexity can introduce many integrity threats, so the database software should implement the characteristics of
what’s known as the ACID test: define ACID
• Atomicity Divides transactions into units of work and ensures that all modifications take effect or none takes effect. Either the changes are committed or the database is rolled back.

• Consistency A transaction must follow the integrity policy developed for that particular database and ensure all data are consistent in the different databases.

• Isolation Transactions execute in isolation until completed, without interacting with other transactions. The results of the modification are not available until the transaction is completed.

• Durability Once the transaction is verified as accurate on all systems, it is committed, and the databases cannot be rolled back.
Define data warehousing
Data warehousing combines data from multiple databases or data sources into a large database for the purpose of providing more extensive information retrieval and data analysis. Data from different databases is extracted and transferred to a central data
storage device called a warehouse. The data is normalized, which means redundant information is stripped out and data are formatted in the way the data warehouse expects it. This enables users to query one entity rather than accessing and querying different databases.
Define Data mining
Data mining is the process of analyzing a data warehouse using tools that look for trends, correlations, relationships, and anomalies without knowing the meaning of the data. Metadata is the result of storing data within a data warehouse and mining the data with tools. Data goes into a data warehouse and metadata comes out of that data warehouse.
What is Metadata:
Data about data. Data produced by data mining tools to find associations and correlations.
Data mining is also known as knowledge discovery in database (KDD), and is a combination of techniques to identify valid and useful patterns. Different types of data can have various interrelationships, and the method used depends on the type of data and the patterns sought. What are three approaches used in KDD systems to uncover these patterns:
• Classification Groups together data according to shared similarities.

• Probabilistic Identifies data interdependencies and applies probabilities to their relationships.

• Statistical Identifies relationships between data elements and uses rule discovery.
What is fuzzy logic
An approach that uses set theory and expert systems to perform mathematical functions and look for patterns in data that are not readily apparent.
What is the goal of data warehouses and data mining
to be able to extract information to gain knowledge about the activities and trends within the organization. With this knowledge, people can detect deficiencies or ways to optimize operations. For example, if we worked at a retail store company, we would want consumers to spend gobs and gobs of money there.
Define A garbage collector
identifies blocks of memory that were once allocated but are no longer in use and deallocates the blocks and marks them as free. It also gathers scattered blocks of free memory and combines them into larger blocks. It helps provide a more stable environment and
does not waste precious memory.
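A short illustration using Python's built-in gc module, which exposes the interpreter's garbage collector:

    import gc

    class Node:
        def __init__(self):
            self.ref = None

    # Build a reference cycle, then drop all outside references to it.
    a, b = Node(), Node()
    a.ref, b.ref = b, a
    del a, b

    # The cycle is unreachable ("allocated but no longer in use"); the garbage
    # collector identifies it, deallocates the blocks, and marks them as free.
    unreachable = gc.collect()
    print(f"collector found {unreachable} unreachable objects")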
What is the difference between Verification vs. Validation
Verification determines if the product accurately represents and meets the specifications. After all, a product can be developed that does not match the original specifications, so this step ensures the specifications are being properly met.

Validation determines if the product provides the necessary solution for the intended real-world problem. In large projects, it is easy to lose sight of the overall goal. This exercise ensures that the main goal of the project is met.
What are the System Life Cycle Phases
• Project initiation
• Functional design analysis and planning
• System design specifications
• Software development
• Installation
• Maintenance support
• Disposal
What happens under Project initiation phase
• Conception of project definition
• Proposal and initial study
• Initial risk analysis
What happens under Functional design analysis and planning
• Requirements uncovered and defined
• System environment specifications determined
• Formal design created
What happens under System design specifications
• Functional design review
• Functionality broken down
• Detailed planning put into place
• Code design
What happens under Software development
• Developing and programming software
What happens under Installation
• Product installation and implementation
• Testing and auditing
What happens under Maintenance support
• Product changes, fixes, and minor modifications
What happens under Disposal
• Replace product with new product
Over the years several Software Development Methods (SDMs) have been created to meet the various requirements of software developers and vendors. These methods help developers through the different stages (analysis, design, programming, maintenance)
of program creation. What are the common SDM
Waterfall model
Spiral model
what are the common testing approaches:
• Unit testing Testing an individual component in a controlled environment, where programmers validate data structure, logic, and boundary conditions.

• Integration testing Verifying that components work together as outlined in design specifications.

• Acceptance testing Ensuring that the code meets customer requirements.

• Regression testing After a change to a system takes place, retesting to ensure functionality, performance, and protection.
Define the Waterfall model
A classical method using discrete phases of development that require formal reviews and documentation before moving into the next phase of the project.
define the Spiral model
A method that builds upon the waterfall method with an emphasis
on risk analysis, prototypes, and simulations at different phases of the
development cycle. This method periodically revisits previous stages to
update and verify design requirements.
Define Structured Programming Development
A programming methodology that involves the use of logical blocks to achieve system design using procedural programming. A structured program layout minimizes the use of arbitrary
transfer control statements such as GOTO and emphasizes single points of entry and exit. This hierarchical approach makes it easier for the program to be understood and modified later on. Structured programming encourages module reuse to allow for better memory utilization.
Define Iterative Development
A method that follows a cyclic approach to software development. Unlike traditional models, iterative development focuses on mapping out project milestones through continually assessing
the current state of the project with the initial objectives on the basis of resources, timeframes, and execution plan. Iterative development provides a dynamic method of evaluating a project’s overall status and allows corrective amendments to improve project effectiveness.
Define Modified Prototype Model (MPM)
A method specifically designed to confront challenges in web application development, the modified prototype model allows developers to swiftly translate client requirements into a
displayable product or prototype. Modified prototypes are generally used when both the developer and the client are unsure of the final nature of the product. Using modifiable prototypes allows the final product to be carved out as the system specifications become less hazy.
Define the Exploratory Model
A method that is used in instances where clearly defined project objectives have not been presented. Instead of focusing on explicit tasks, the exploratory model relies on covering a set of specifications that are likely to encase the final product’s working. Testing is an important part of
exploratory development as it ascertains that the current phase of the project is compliant with likely implementation scenarios.
Define Joint Analysis Development (JAD)
A method that uses a team approach in application development in a workshop-oriented environment.
Define Rapid Application Development (RAD)
A method of determining user requirements and developing systems quickly to satisfy immediate needs.
Define Reuse Model
A model that approaches software development by using progressively developed models. Reusable programs are evolved by gradually modifying pre-existing prototypes to customer specifications. Since the reuse model does not require programs to be built from scratch, it drastically reduces both development cost and time.
Define Cleanroom
An approach that attempts to prevent errors or mistakes by following structured and formal methods of developing and testing. This approach is used for high-quality and critical applications that will be put through a strict certification process.
Define Component-Based Development
A model that refers to the use of independent and standardized modules that are assembled into serviceable programs. Each standard module consists of a functional algorithm or instruction set and is provided with an interface to communicate with one another. A common example of these modules are “objects,” which are frequently used in object oriented programming. Component-based development adds reusability and pluggable functionality to programs and is widely used in modern programming to augment program coherence and substantially reduce software maintenance costs.
Define Extreme Programming
A methodology that is generally implemented in scenarios requiring rapid adaptations to changing client requirements. Extreme programming emphasizes client feedback to evaluate project outcomes and to analyze project domains that may require further attention.
The coding principle of extreme programming throws out the traditional long-term planning carried out for code reuse and instead focuses on creating simple code optimized for the contemporary assignment. Extreme programming recognizes the fact that customer requirements are likely to change significantly throughout the project life cycle and adapts the development process to these changes.
Define Computer-aided software engineering ( CASE)
involves the use of tools to create and manage software. “CASE tools” is a general term for many types of tools used by programmers, developers, project managers, and analysts that help them make application development faster and with fewer errors. This is because many of the manual tasks are taken care of through automation with the use of CASE tools. Different tools provide managerial, administrative, and technical help in software projects. The first CASE tools were translators, compilers, assemblers, linkers, and loaders.
Define Attack surface analytics
Attack surface analysis is carried out once a threat model has been created. The use of attack surface analysis techniques provides a structured process for analyzing program entry points as well. Attack surface analytics focus on documenting possible entry points irrespective of their defined privileges. Once these entry points have been documented, they can be used to specify program granularity as the program matures
through its development cycle.
What are the six steps in the change control process
The following are some necessary steps for a change control process:
1. Make a formal request for a change.
2. Analyze the request.
A. Develop the implementation strategy.
B. Calculate the costs of this implementation.
C. Review any security implications.
3. Record the change request.
4. Submit the change request for approval.
5. Develop the change.
A. Recode segments of the product and add or subtract functionality.
B. Link these changes in the code to the formal change control request.
C. Submit software for testing and quality approval.
D. Repeat until quality is adequate.
E. Make version changes.
6. Report results to management.
What is the Capability Maturity Model (CMM)
describes procedures, principles, and practices that underlie software development process maturity. This model was developed to
help software vendors improve their development processes by providing an evolutionary path from an ad hoc “fly by the seat of your pants” approach, to a more disciplined and repeatable method that improves software quality, reduces the life cycle of development,
provides better project management capabilities, allows for milestones to be created and met in a timely manner, and takes a more proactive approach than the less effective reactive approach. This model provides policies, procedures, guidelines, and best practices to allow an organization to develop a standardized approach to software development that can be used across many different groups. The goal is to continue to review and improve upon the processes to optimize output, increase capabilities, and provide higher-quality software at a lower cost.
What are the five maturity levels of the Capability Maturity Model (CMM)
• Initial Development process is ad hoc or even chaotic. The company does not use effective management procedures and plans. There is no assurance of consistency, and quality is unpredictable.

• Repeatable A formal management structure, change control, and quality assurance are in place. The company can properly repeat processes throughout each project. The company does not have formal process models defined.

• Defined Formal procedures are in place that outline and define processes carried out in each project. The organization has a way to allow for quantitative process improvement.

• Managed The company has formal processes in place to collect and analyze qualitative data, and metrics are defined and fed into the process improvement program.

• Optimizing The company has budgeted and integrated plans for continuous process improvement.
Define software escrow
In a software escrow, a third party keeps a copy of the source code, and possibly other materials, which it will release to the customer
only if specific circumstances arise, mainly if the vendor who developed the code goes out of business or for some reason is not meeting its obligations and responsibilities. This procedure protects the customer, because the customer pays the vendor to develop software code for it, and if the vendor goes out of business, the customer otherwise would no longer have access to the actual code. This means the customer code could never be updated or maintained properly.
Define Machine language:
Is in a form that the computer and processor can understand and work with directly
Define Assembly language:
Cannot be understood directly by the system; it must be processed by an assembler, which results in machine code.
Define High-level language:
Cannot be understood directly by the system; it must be translated by a compiler or interpreter, which results in machine code.
What are the five generation of program languages
• Generation one: Machine language
• Generation two: Assembly language
• Generation three: High-level language
• Generation four: Very high-level language
• Generation five: Natural language

Mnemonic: MAHVN (MAHVN Gaye)
Various programs are used to turn high-level programming code (or source code) into object or machine code. They work as translators. These programs are interpreters, compilers, and assemblers. Define the responsibility of each
Interpreters translate one command at a time during execution,

Compilers translate large sections of code at a time.

Assemblers translate assembly language into machine language.

Most applications are compiled, whereas many scripting languages are interpreted.
What is Object-oriented programming (OOP)
OOP structures are made of classes, which define the characteristics (attributes) and behaviors of objects. An object encapsulates its characteristics, and objects are reusable. Commands are performed by methods, and objects communicate using messages. The data are hidden (data hiding) from all programs outside the object. Information hiding means there is no need for other components to know how each object works internally.
Define Abstraction
Abstraction is the ability to suppress unnecessary details so that the important ones are examined and reviewed. It looks at the whole picture instead of concentrating on small details. It allows separation of the conceptual aspects of the system.
Note: Data hiding is provided by encapsulation, which protects an object’s private data from outside access. No object should be allowed to, or have the need to, access another object’s internal data or processes.
Define Polymorphism
Polymorphism is when two objects receive the same message but have different results. In other words, objects respond to the same command in different ways. It is possible because the objects belong to different classes. “Two objects can receive the same input and have different outputs.”

(example: As a simplistic example of polymorphism, suppose three different objects receive the input “Bob.” Object A would process this input and produce the output “43-year-old white male.” Object B would receive the input “Bob” and produce the output “Husband of Sally.” Object C would produce the output of “Member of User group.” Each object received the same input, but responded with a different output.)
define Object-oriented analysis (OOA)
is the process of classifying objects that will be appropriate for a solution. A problem is analyzed to determine the classes of objects to be used in the application.
Define Object-oriented design (OOD)
creates a representation of a real-world problem and maps it to a software solution using OOP. The result of an OOD is a design that modularizes data and procedures. The design interconnects data objects and processing operations.
What is a structured analysis approach.
A full-structured analysis approach looks at all objects and subjects of an application and maps the interrelationships, communications paths, and inheritance properties.
What is data modeling
Data modeling considers data independently of the way the data are processed and of the components that process the data. A data model follows an input value from beginning to end and verifies that the output is correct.
What is a data structure
is a representation of the logical relationship between elements of data. It dictates the degree of association between elements, methods of access, processing alternatives, and the organization of data elements.
Define Cohesion and coupling
Cohesion reflects how many different types of tasks a module can carry out. If a module carries out only one task (subtraction of values) or several tasks that are very similar (subtract, add, multiply), it is described as having high cohesion, which is a good thing. The higher the cohesion, the easier it is to update or modify the module without affecting other modules that interact with it.

A module with low cohesion carries out multiple unrelated tasks, which increases the complexity of the module and makes it harder to maintain and reuse.
Coupling is a measurement that indicates how much interaction one module requires to carry out its tasks. If a module has low (loose) coupling, this means the module does not need to communicate to many other modules to carry out its job. High (tight) coupling means a module depends upon many other modules to carry out its tasks. Low coupling is more desirable because the modules are easier to understand, easier to reuse, and changes can take place and not affect many modules around it.
NOTE Modules should be self-contained and perform a single logical function, which is high cohesion. Modules should not drastically affect each other, which is low coupling.
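A minimal sketch in Python (hypothetical modules) contrasting high and low cohesion, with a note on coupling:

    # High cohesion: the module does one logical job (arithmetic) and nothing else.
    class Calculator:
        def add(self, a, b): return a + b
        def subtract(self, a, b): return a - b

    # Low cohesion: unrelated tasks lumped together; harder to maintain and reuse.
    class KitchenSink:
        def add(self, a, b): return a + b
        def send_email(self, to, body): ...
        def parse_config(self, path): ...

    # Low (loose) coupling: Calculator needs no other module to do its job.
    # A class that called KitchenSink.send_email() from inside its math routines
    # would be tightly coupled to it -- a change there could break this module.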
What is Common Object Request Broker Architecture (CORBA)
An open object-oriented standard architecture developed by the Object Management Group (OMG). CORBA allows different applications written in different languages to communicate by defining a standard set of interfaces and APIs for systems to communicate with various ORBs.
Define object request brokers (ORBs)
ORB is the middleware that establishes the client/server relationship between objects. When a client needs to access an object on a server for that object to perform an operation or method, the ORB intercepts the request and is responsible for finding the
object. Once the object is found, the ORB invokes a method (or operation), passes the parameters, and returns the result to the client. The client software does not need to know where the object resides or go through the trouble of finding it. That is the ORB’s job.
Define Component Object Model(COM)
is a model that allows for interprocess communication within one application or between applications on the same computer system. The model was created by Microsoft and outlines standardized APIs, component naming schemes, and communication standards. So if I am a developer and I want my application to be able to interact with the Windows operating system and the different applications developed for this platform, I will follow the COM outlined standards.
Define Distributed Component Object Model (DCOM)
Supports the same model for component interaction, and also supports distributed interprocess communication (IPC). COM enables applications to use components on the same systems, while DCOM enables applications to access objects that reside in different parts of a network. This is how client/server-based activities are carried out by COM-based operating systems and/or applications. Without DCOM, programmers would have to write much more complicated code to find necessary objects, set up network sockets, and incorporate the services necessary to allow communication. DCOM works as the middleware that enables distributed processing and provides
developers with services that support process-to-process communications across networks
what is Simple Object Access Protocol SOAP
What if we need programs running on different operating systems to communicate over web-based communication methods? We would use Simple Object Access Protocol (SOAP). SOAP is an XML-based protocol that encodes messages in a web service setup. So if you have a Windows 2000 computer, for instance, and you need to access a Windows 2008 computer that offers a specific web service,
the programs on both systems can communicate using SOAP without running into interoperability issues. This communication most commonly takes place over HTTP because it is readily available in basically all computers today.
Define Enterprise JavaBeans (EJB)
is a structural design for the development and implementation of distributed applications written in Java. EJB provides interfaces and methods to allow different applications to be able to communicate across a networked environment.
Define Object linking and embedding (OLE)
provides a way for objects to be shared on a local personal computer and to use COM as their foundation. OLE enables objects—such as graphics, pictures, and spreadsheets—to be embedded into documents
Define Distributed Computing Environment (DCE)
DCE / Distributed Computing Environment: Is a set of management services with a communication layer based on RPC.
Is a layer of software that sits on top of the network layer and provides services to the applications above it.
Uses universal unique identifier, UUID, to uniquely identify users, resources and components within an environment.
The RPC function collects the arguments and commands from the sending program and prepares them for transmission over the network.
The DFS / Distributed File Services provides a single integrated file system that all DCE users can use to share files.
What are Expert systems / knowledge based systems
The programs that can emulate human expertise in specific domains are called expert systems.
An expert system is a computer program containing a knowledge base and a set
of algorithms and rules used to infer new facts from data and incoming requests. Expert systems use artificial intelligence to emulate human knowledge and solve problems.
Define Rule-based programming
is a common way of developing expert systems. The rules
are based on if-then logic units and specify a set of actions to be performed for a given
situation.
Define pattern matching
A mechanism in which facts and incoming data are matched against the if-then rule patterns to determine which rules apply to a given situation.
Define inference engine
A mechanism that automatically matches facts against patterns and determines which rules are applicable. The actions of the corresponding rules are executed when the inference engine is instructed to begin execution.
NOTE Expert systems use automatic logical processing, inference engine processing, and general methods of searching for problem solutions.
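A minimal forward-chaining sketch in plain Python (the rules are hypothetical) of an inference engine matching facts against if-then rules:

    # Each rule: (set of required facts, new fact to infer)
    rules = [
        ({"fever", "cough"}, "flu_suspected"),
        ({"flu_suspected", "high_risk_patient"}, "recommend_antivirals"),
    ]

    def infer(facts):
        facts = set(facts)
        changed = True
        while changed:                      # keep firing rules until nothing new is inferred
            changed = False
            for conditions, conclusion in rules:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)   # rule fires: a new fact is inferred
                    changed = True
        return facts

    print(infer({"fever", "cough", "high_risk_patient"}))
    # includes the inferred facts 'flu_suspected' and 'recommend_antivirals'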
What is an Artificial Neural Network
An artificial neural network (ANN) is a mathematical or computational model based on the neural structure of the brain. Computers perform activities like calculating large numbers, keeping large ledgers, and performing complex mathematical functions, but they cannot recognize patterns or learn from experience as the brain can. ANNs contain many units that simulate neurons, each with a small amount of memory. The units work on data that are input through their many connections. Via training rules, the systems are able to learn from examples and have the capability to generalize.
NOTE Decisions by ANNs are only as good as the experiences they are given.
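A tiny sketch in plain Python (hypothetical training data) of a single artificial neuron learning the logical OR function via a simple training rule:

    # A single artificial neuron (perceptron) learning logical OR from examples.
    examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
    w = [0.0, 0.0]
    bias = 0.0
    rate = 0.1

    for _ in range(20):                         # training passes over the examples
        for (x1, x2), target in examples:
            output = 1 if w[0]*x1 + w[1]*x2 + bias > 0 else 0
            error = target - output
            w[0] += rate * error * x1           # training rule: adjust weights by the error
            w[1] += rate * error * x2
            bias += rate * error

    for (x1, x2), _ in examples:
        print((x1, x2), "->", 1 if w[0]*x1 + w[1]*x2 + bias > 0 else 0)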
Define Parameter validation
is where the values that are being received by the application are validated to be within defined limits before the server application processes them within the system.
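A minimal sketch in plain Python (the parameter names and limits are hypothetical) of validating received values against defined limits before processing:

    def validate_params(params):
        """Reject requests whose values fall outside the defined limits."""
        errors = []
        age = params.get("age")
        if not isinstance(age, int) or not 0 <= age <= 130:
            errors.append("age must be an integer between 0 and 130")
        username = params.get("username", "")
        if not (1 <= len(username) <= 32) or not username.isalnum():
            errors.append("username must be 1-32 alphanumeric characters")
        return errors

    print(validate_params({"age": 30, "username": "julio"}))  # [] -- safe to process
    print(validate_params({"age": 500, "username": "x; DROP TABLE users"}))
    # both checks fail -- the request is rejected before the server processes it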

Code that can be transmitted across a network, to be executed by a system or device on the other end, is called what
mobile code.
Java is an object-oriented, platform-independent programming language. It is employed as a full-fledged programming language and is used to write complete programs and short programs, which run in a user’s browser and are called
Applets
Other languages are compiled to object code for a specific operating system and processor. This is why a particular application may run on Windows but not on Macintosh. An Intel processor does not necessarily understand machine code compiled for an Alpha processor, and vice versa. Java is platform independent because it creates intermediate code, which is not processor-specific. What is this code called
Bytecode
When an applet is executed, the JVM will create a virtual machine within a secure environment. This virtual machine is an enclosed environment in which the applet carries out its activities. This secure environment is called
a sandbox.
Note:
Browser Settings Java applets and the actions they perform can be prevented and controlled by specific browser settings. These settings do not affect full-fledged Java applications running outside of the browser.
Define ActiveX
is a Microsoft technology composed of a set of OOP technologies and tools based on COM and DCOM. A programmer uses these tools to create ActiveX controls, which are self-sufficient programs similar to Java applets. ActiveX controls can be reused by many applications within one system, or different systems, on the network. Instead of trying to keep ActiveX controls in a safe area for various computations and activities, this technology practices security by informing the user where the program came from. The user can decide whether or not to trust this origin. Unlike Java applets, ActiveX components are downloaded to a user’s hard drive when he chooses to add the functionality the component provides. This means the ActiveX component has far greater access to the user’s system compared to Java applets. ActiveX uses Authenticode technology, which relies on digital certificates and trusting certificate authorities.
What is a virus
a small application, or string of code, that infects applications. The main function of a virus is to reproduce, and it requires a host application to do this. In other words, viruses cannot replicate on their own. A virus infects a file by inserting or attaching a copy of itself to the file. The virus may also cause destruction by deleting system files, displaying graphics, reconfiguring systems, or overwhelming mail servers.
What is a macro virus
A virus written in a macro language (such as those used by Microsoft Office products); it is platform independent. Macro viruses infect and replicate in templates and within documents. They are common because they are extremely easy to write, and Office products are in wide use.
What are boot sector viruses
Some viruses infect the boot sector (boot sector viruses) of a computer and either move data within the boot sector or overwrite the sector with new information. Some boot sector viruses have part of their code in the boot sector, which can initiate the virus,
and the rest of their code in sectors on the hard drive it has marked off as bad. Because the sectors are marked as bad, the operating system and applications will not attempt to use those sectors; thus, they will not get overwritten.
What are compression viruses
Other types of viruses append themselves to executables on the system and compress them by using the user’s permissions. When the user chooses to use that executable, the system automatically decompresses it and the malicious code, which usually causes the malicious code to initialize and perform its dirty deeds.
What is a stealth viruses
A stealth virus hides the modifications it has made to files or boot records. This can be accomplished by monitoring system functions used to read files or sectors and forging the results. This means that when an antivirus program attempts to read an infected
file or sector, the original uninfected form will be presented instead of the actual infected form. The virus can hide itself by masking the size of the file it is hidden in or actually move itself temporarily to another location while an antivirus program is carrying
out its scanning process.
What is a polymorphic virus
produces varied but operational copies of itself. This is done in the hopes of outwitting a virus scanner. Even if one or two copies are found and disabled, other copies may still remain active within the system. The polymorphic virus can use different encryption schemes requiring different decryption routines. This would require an antivirus scan for several scan strings, one for each possible decryption method, in order to identify all copies of this type of virus. These viruses can also vary the sequence of their instructions by including noise, or bogus instructions, with other useful instructions. They can also use a mutation engine and a random-number generator to change the sequence of their instructions in the hopes of not being detected. A polymorphic virus has the capability to change its own code, enabling the virus to have hundreds or thousands of variants.
What is a multipart virus
infects both the boot sector of a hard drive and executable files. The virus first becomes resident in memory and then infects the boot sector. Once it is in memory, it can infect the entire system.
What is a self-garbling virus
attempts to hide from antivirus software by garbling its own code. As the virus spreads, it changes the way its code is formatted. A small portion of the virus code decodes the garbled code when activated.
What are Meme viruses
are not actual computer viruses but types of e-mail messages that are continually forwarded around the Internet. They can be chain letters, e-mail hoax virus alerts, religious messages, or pyramid selling schemes. They are replicated by humans,
not software, and can waste bandwidth and spread fear.
What are Script viruses
have been quite popular and damaging over the last few years. Scripts are files that are executed by an interpreter—for example, Microsoft Windows Script Host, which interprets different types of scripting languages.
What is a tunneling virus
A tunneling virus attempts to install itself under the antivirus program. When the antivirus goes around doing its health check on critical files, file sizes, modification dates, and so on, it makes a request to the operating system to gather this information. If the virus has tunneled beneath the antivirus software, it can intercept these calls and return uninfected-looking information, hiding the infection.
Define Worm
Worms can reproduce on their own, with no need for a host application; they are self-contained programs.
Define Logic bomb
Will execute a program, or string of code, when a certain event happens.
Define Trojan horse
Is a program disguised as another program.
Define DoS / Denial of Service
An attack that consumes the victim’s bandwidth or resources, causing the system to crash or stop processing other packets.
Define Smurf attack and its Countermeasures
Requires three players: the attacker, the victim and the amplifying network. The attacker spoofs, or changes the source IP address in a packet header, to make an ICMP ECHO packet seem as though it originated at the victim’s system. This ICMP ECHO message
is broadcast to the amplifying network, which will reply to the message in full force. The victim’s system and network are overwhelmed.


Countermeasures
• Disable direct broadcast functionality at border routers to make sure a certain
network is not used as an amplifying site.
• Configure perimeter routers to reject as incoming messages any packets that
contain internal source IP addresses. These packets are spoofed.
• Allow only the necessary ICMP traffic into and out of an environment.
• Employ a network-based IDS to watch for suspicious activity.
• Some systems are more sensitive to certain types of DoS, and patches have
already been released. The appropriate patches should be applied.
Define a Fraggle attack
Uses UDP as its weapon of choice. The attacker broadcasts a spoofed UDP packet to the amplifying network, which in turn replies to the victim’s system

Countermeasures
• Disable direct broadcast functionality at perimeter routers to make sure a certain network is not used as an amplifying site.
• Packets that contain internal source IP addresses should not be accepted by perimeter routers as incoming messages. These packets are spoofed.
• Allow only the necessary UDP packets into and out of the environment.
• Employ a network-based IDS to watch for suspicious activity.
• Some systems are more sensitive to certain types of DoS, and certain patches may have already been released. The appropriate patches should thus be applied.
Define a SYN Flood
The attacker continually sends the victim SYN messages with spoofed packets. The victim commits the necessary resources to set up each communication socket and sends its SYN/ACK message, waiting for an ACK message that never arrives.

Countermeasures
• Decrease the connection-established timeout period. (This only lessens the
effects of a SYN attack.)
• Increase the size of the connection queue in the IP stack.
• Install vendor-specific patches, where available, to deal with SYN attacks.
• Employ a network-based IDS to watch for this type of activity and alert the
responsible parties when this type of attack is under way.
• Install a firewall to watch for these types of attacks and alert the administrator
or cut off the connection.
Define a Teardrop
An attack in which very small packets are sent, causing a system to freeze or reboot. It exploits the fact that some systems make sure packets are not too large, but do not check whether a packet is too small.

Countermeasures
• Install the necessary patch or upgrade the operating system.
• Disallow malformed fragments of packets to enter the environment.
• Use a router that combines all fragments into a full packet prior to routing it to the destination system.
Define a DDoS / Distributed Denial of Service
A logical extension of the DoS. The attacker creates master controllers that can in turn control slaves/zombie machines.

Countermeasures
• Use perimeter routers to restrict unnecessary ICMP and UDP traffic.
• Employ a network-based IDS to watch for this type of suspicious activity.
• Disable unused subsystems and services on computers.
• Rename the administrator account and implement strict password
management so systems cannot be used unknowingly.
• Configure perimeter routers to reject as incoming messages any packets that
contain internal source IP addresses. These packets are spoofed.
Define a DNS DoS Attacks
A record at a DNS server is replaced with a new record pointing at a fake/false IP address. In cache poisoning, the attacker inserts data into the cache of the server instead of replacing the actual records.
Define Remote Access Trojans (RATs)
are malicious programs that run on systems and allow intruders to access and use a system remotely. They mimic the functionality of legitimate remote control programs used for remote administration, but are used for sinister purposes instead of helpful activities. They are developed to allow for stealth installation and operation, and are usually hidden in some type of mobile code, such as Java applets or ActiveX controls, that are downloaded from web sites.
NOTE The Slammer worm was released in 2003. It took advantage of a buffer overflow within the Microsoft SQL Server 2000 software and caused excessive Denial-of-Service attacks. Several documented accounts have estimated the resulting damages to industry to be over $1 billion.
What is Signature-based detection (also called fingerprint detection)
is an effective way to detect malicious software, but there is a delayed response time to new threats. Once a virus is detected, the antivirus vendor must study it, develop and test a new signature, release the signature, and all customers must download it.
CAUTION Diskless workstations are still vulnerable to viruses, even though they do not have a hard disk and a full operating system. They can still get viruses that load and reside in memory. These systems can be rebooted remotely (remote booting) to bring the memory back to a clean state, which means the virus is “flushed” out of the system.
NOTE The virtual machine or sandbox is also sometimes referred to as an emulation buffer. They are all the same thing—a piece of memory that is segmented and protected so that if the code is malicious, the system is protected.
Define Bayesian filtering
Bayesian logic reviews prior events to predict future events, which is basically quantifying uncertainty. Conceptually, this is not too hard to understand. If you run into a brick wall three times and fall down, you should conclude that your future attempts
will result in the same painful outcomes. What is more interesting is when this logic is performed on activities that contain many more variables.
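A minimal naive Bayes sketch in plain Python (the training counts are hypothetical, and the smoothing is deliberately crude) showing how prior events quantify the spam probability of a new message:

    from math import prod

    # Hypothetical counts from previously classified mail (the prior events).
    spam_count, ham_count = 40, 60
    word_in_spam = {"free": 30, "meeting": 2}
    word_in_ham  = {"free": 3,  "meeting": 30}

    def p_spam_given(words):
        # Bayes: P(spam|words) is proportional to P(spam) * product of P(word|spam)
        p_spam = spam_count / (spam_count + ham_count)
        p_ham  = ham_count / (spam_count + ham_count)
        ls = p_spam * prod(word_in_spam.get(w, 1) / spam_count for w in words)
        lh = p_ham  * prod(word_in_ham.get(w, 1) / ham_count for w in words)
        return ls / (ls + lh)

    print(round(p_spam_given(["free"]), 2))     # high -> likely spam
    print(round(p_spam_given(["meeting"]), 2))  # low  -> likely legitimate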
What is the goal of patch management?
To introduce patches and hot fixes into the production environment in a controlled fashion and to always have a rollback plan.