51 Cards in this Set
- Front
- Back
Programming-in-the-many
|
The product is implemented by a team, working at the same time on different components of the product
|
|
Meaningful variable names
|
meaningful from the viewpoint of future maintenance programmers
|
|
Consistent variable names
|
Use of meaningful variable names
|
|
Self-documenting code
|
The implication is that their variable names are chosen so carefully and their code crafted so exquisitely that there is no need for comments
|
|
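A minimal sketch of the self-documenting idea, contrasting an opaque function with one whose names carry the intent. All names here (`f`, `days_until_expiry`, and so on) are illustrative, not from the text.

```python
def f(x, y):
    # Opaque: a maintenance programmer needs a comment to know what this does
    return x * 365 + y

def days_until_expiry(years_remaining, extra_days):
    # Self-documenting: the names are chosen so carefully that no comment
    # explaining the purpose is needed
    return years_remaining * 365 + extra_days

print(days_until_expiry(2, 10))  # 740
```

Both functions compute the same value; only the second is readable from the viewpoint of a future maintenance programmer.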
Prologue Comments
|
Comments at the beginning of a code artifact in which every variable name is explained
|
|
Integration
|
Code and test each code artifact separately, then link together all the code artifacts and test the product as a whole
|
|
Stub
|
An empty code artifact that stands in for an artifact called by the artifact under test
|
|
Driver
|
A code artifact that calls the artifact under test one or more times, if possible checking the values returned by the artifact under test
|
|
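A sketch of a stub and a driver working together, assuming a hypothetical artifact `compute_tax` under test that calls a not-yet-implemented `lookup_rate` artifact; all names are invented for illustration.

```python
def lookup_rate(region):
    # Stub: stands in for the real, not-yet-implemented rate lookup;
    # returns a fixed value so compute_tax can be tested in isolation
    return 0.1

def compute_tax(amount, region):
    # The artifact under test
    return amount * lookup_rate(region)

def driver():
    # Driver: calls the artifact under test and checks the returned values
    assert compute_tax(100, "north") == 10.0
    assert compute_tax(0, "south") == 0.0
    return "all checks passed"

print(driver())
```

The stub lets the caller be tested top down before its callees exist; the driver lets a callee be tested bottom up before its callers exist.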
Top-down integration
|
The product is implemented and integrated top down, starting with the logic artifacts and working down toward the operational artifacts
|
|
Logic artifacts
|
Essentially incorporates the decision-making flow of control aspects of the product
|
|
Operational artifacts
|
Perform the actual operations of the product
|
|
Defensive programming
|
The type of design in which a calling artifact includes safety checks on what it passes to, and receives from, the called artifact
|
|
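A minimal sketch of defensive programming: the artifact checks its input before acting on it rather than trusting the caller. The function name and error message are illustrative assumptions.

```python
import math

def safe_sqrt(x):
    # Safety check: validate the argument instead of trusting the caller
    if not isinstance(x, (int, float)) or x < 0:
        raise ValueError(f"safe_sqrt requires a non-negative number, got {x!r}")
    return math.sqrt(x)

print(safe_sqrt(9.0))  # 3.0
```

Without the check, a negative argument would surface later as a harder-to-trace `math domain error` inside `math.sqrt`.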
Sandwich Integration
|
Logic artifacts are integrated top down and operational artifacts bottom up; when all artifacts have been appropriately integrated, the interfaces between the two groups of artifacts are tested one by one. There is fault isolation at all times during this process
|
|
Implementation workflow
|
To implement the target software product in the selected implementation language
|
|
Code Artifacts
|
The pieces of code into which the subsystems are broken down
|
|
Unit testing
|
As soon as a code artifact has been coded, the programmer tests it
|
|
Non-execution-based testing
|
The artifact is reviewed by a team
|
|
Execution-based testing
|
The artifact is run against test cases
|
|
Test to specifications
|
the code itself is ignored; the only information used in drawing up test cases is the specification document; black-box, behavioral, data driven, functional, and input/output-driven testing
|
|
Test to code
|
Ignore the specification document when selecting test cases; glass-box, white-box, structural, logic-driven, and path-oriented testing
|
|
Reliable
|
A testing technique is reliable if it consistently detects a fault when one is present. Path-oriented testing is not reliable: products exist for which some data exercising a given path detect a fault and different data exercising the same path do not
|
|
Valid
|
Path-oriented testing is valid because it does not inherently preclude selecting test data that might reveal the fault
|
|
Equivalence class
|
A set of test cases such that any one member of the class is as good a test case as any other
|
|
Boundary value analysis
|
Selecting test cases on and just to either side of the boundary of an equivalence class, to maximize the chances of finding a fault
|
|
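A sketch of equivalence classes and boundary value analysis for a hypothetical specification "accept an exam mark between 0 and 100 inclusive"; the function and the chosen classes are illustrative assumptions.

```python
def is_valid_mark(mark):
    return 0 <= mark <= 100

# One representative test case per equivalence class:
# below the range, within the range, above the range
equivalence_cases = {-5: False, 50: True, 105: False}

# Boundary value analysis: test on and just to either side of each boundary
boundary_cases = {-1: False, 0: True, 1: True, 99: True, 100: True, 101: False}

for mark, expected in {**equivalence_cases, **boundary_cases}.items():
    assert is_valid_mark(mark) == expected
print("all cases pass")
```

Faults such as writing `0 < mark` instead of `0 <= mark` are caught only by the boundary cases, which is why they are tested explicitly.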
Functional testing
|
The methods implemented in the code artifact under test are identified, and test data are devised to test each method separately
|
|
Functional Analysis
|
To determine faults
|
|
Statement coverage
|
Running a series of test cases during which every statement is executed at least once
|
|
Branch coverage
|
Running a series of tests to ensure that all branches are tested at least once
|
|
Structural tests
|
Techniques such as statement or branch coverage
|
|
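A sketch of why branch coverage is stronger than statement coverage, using the classic absolute-value example (the function itself is an illustrative assumption, not from the text).

```python
def absolute(n):
    result = n
    if n < 0:
        result = -n
    return result

# absolute(-3) alone executes every statement (statement coverage),
# but it never takes the False branch of `n < 0`.
# Branch coverage also requires a case such as absolute(3),
# which exercises the path where the if-body is skipped.
assert absolute(-3) == 3
assert absolute(3) == 3
print("both branches exercised")
```

A fault in the untaken branch (say, initializing `result` incorrectly) would slip past statement coverage but not branch coverage.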
Path coverage
|
Testing all paths
|
|
All-definition-use-path-coverage
|
Each occurrence of a variable in the source code is labeled either as a definition of the variable or a use of the variable; test cases are then selected that exercise every path from each definition of a variable to every use of that definition
|
|
Complexity
|
An aid in determining which code artifacts are most likely to have faults
|
|
Cyclomatic complexity
|
The number of binary decisions plus 1
|
|
Structured testing
|
Cyclomatic complexity can be used as a metric for the number of test cases needed for branch coverage of a code artifact
|
|
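A sketch of counting cyclomatic complexity as binary decisions plus 1, and of structured testing's use of that number as the test-case count for branch coverage. The `grade` function and its thresholds are illustrative assumptions.

```python
def grade(score):
    if score >= 90:      # binary decision 1
        return "A"
    if score >= 60:      # binary decision 2
        return "pass"
    return "fail"

binary_decisions = 2
cyclomatic_complexity = binary_decisions + 1  # = 3

# Structured testing: use the cyclomatic complexity as the number of
# test cases needed for branch coverage of this artifact
tests = [(95, "A"), (70, "pass"), (40, "fail")]
for score, expected in tests:
    assert grade(score) == expected
print(cyclomatic_complexity)  # 3
```

The three test cases between them take every branch of both decisions, matching the complexity count.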
Cleanroom technique
|
A combination of a number of different software development techniques, including an incremental life-cycle model, formal techniques for analysis and design, and non-execution-based unit-testing techniques, such as code reading and code walkthroughs and inspections
|
|
Testing fault rate
|
The total number of faults detected per thousand lines of code
|
|
Debugging
|
Detection of the fault and correction of the code
|
|
Integration testing
|
Each new code artifact must be tested when it is added to what has already been integrated
|
|
Product testing
|
When the integration process is complete, the product as a whole is tested
|
|
Acceptance testing
|
When the developers are confident about the correctness of every aspect of the product, it is handed over to the client, who determines whether the product satisfies its specifications
|
|
Stress testing
|
Making sure that it behaves correctly when operating under a peak load, such as all terminals trying to log on at the same time
|
|
Volume testing
|
Making sure that it can handle large input files
|
|
Tool
|
A software product that assists with just one aspect of software production, such as an online interface checker or a build tool
|
|
Workbench
|
A collection of tools that supports one or two activities within the software process, such as configuration control or coding
|
|
Environment
|
Provides computer-aided support for most of the software process
|
|
Integrated environment
|
An environment whose tools work together to support the development and maintenance effort
|
|
User interface integration
|
All the tools in the environment share a common user interface
|
|
Tool integration
|
All the tools communicate via the same data format
|
|
Process integration
|
An environment that supports one specific software process
|
|
Technique-based environment
|
An environment of this type supports only a specific technique for developing software, rather than a complete process
|
|
Portable Common Tool Environment (PCTE)
|
An infrastructure that provides the services needed by CASE tools, in much the same way that UNIX provides the operating system services needed by user products
|