34 Cards in this Set
- Front
- Back
Testing and debugging are different |
Testing finds defects; debugging finds, analyzes, and removes the causes of failures |
|
Verification |
Does the system meet its specified requirements |
|
Validation |
Does the system meet user needs |
|
Error Defect Failure |
A developer makes an error (mistake), which introduces a defect (fault) in the code; when the defective code executes, a failure is observed |
|
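The error → defect → failure chain can be seen in a tiny hypothetical example (the function and values are illustrative, not from the cards):

```python
# The developer's ERROR (mistake): typing `-` instead of `+`.
# That mistake puts a DEFECT (fault) into the code.
# Executing the defective code produces a FAILURE (wrong result).

def add(a, b):
    return a - b  # defect: should be a + b

result = add(2, 3)
if result != 5:
    print(f"Failure observed: expected 5, got {result}")
```

Note the defect exists even before the code runs; the failure only appears once the defective statement executes.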
Test planning |
Identify the objectives of testing and the testing approach; make the test schedule and define suitable test techniques |
|
Test monitoring and control |
Ongoing comparison of actual progress against planned progress using metrics. Taking actions necessary to meet objectives of test plan |
|
Test analysis |
Test basis is analyzed to identify testable features and define associated test conditions. Capturing traceability. Work products are test conditions |
|
Test design |
High-level test cases, sets of test cases, and testware are created. The test environment is designed and test data identified. Test cases are the work products |
|
Test implementation |
Test cases are sequenced into test procedures. Automated test scripts are created. The test environment is built and test data prepared. Test suites are the work products |
|
Testing steps acronym PMCADIE |
Planning Monitoring Control Analysis Design Implementation Execution People make cakes and dine in excellence |
|
Functional tests |
Tests of what the system does (its functions and behaviour) |
|
Non functional tests |
Performance testing, usability testing, etc |
|
Absence-of-errors fallacy |
The mistaken belief that finding and fixing a large number of defects will ensure the success of a system |
|
Benefit obtained by having testers involved in requirements reviews or user story refinement |
Reduces risk of incorrect or untestable functionality |
|
Quality management |
Includes both quality assurance (QA) and quality control (QC) |
|
Test condition |
Something that can be tested |
|
Main tasks of work product review process |
Planning; Initiate review; Individual review; Issue communication and analysis; Fixing and reporting |
|
In a formal review, who executes the control decisions in the event of inadequate outcomes? |
Manager |
|
Which step of the review process do we check that entry criteria are met? |
Planning |
|
Which step of review is the work product distributed? |
Initiate review |
|
Which step of review process do we note potential recommendations and questions? |
Individual review |
|
Which step of review process do we gather the metrics? |
Fixing and reporting |
|
Which type of review is commonly used in agile |
Pair review |
|
What review type may take the form of scenarios, dry runs, or simulations? |
Walkthrough |
|
When should the expected results of a test case be defined? |
When the test case is written, prior to execution |
|
Which black box testing technique focuses on covering all combinations of triggering conditions? |
Decision table testing |
|
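A minimal sketch of decision table testing, using a hypothetical discount rule (the conditions and percentages are assumptions for illustration): the table lists every combination of the triggering conditions, and one test is derived per column.

```python
# Conditions: member?, order over 100?  ->  Action: discount %
def discount(member: bool, over_100: bool) -> int:
    if member and over_100:
        return 20
    if member or over_100:
        return 10
    return 0

# One test case per decision-table column (all condition combinations):
table = [
    # member, over_100, expected discount
    (True,  True,  20),
    (True,  False, 10),
    (False, True,  10),
    (False, False,  0),
]

for member, over_100, expected in table:
    assert discount(member, over_100) == expected
```

With two Boolean conditions the full table has 2² = 4 columns, which is why the technique guarantees coverage of every combination.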
What do use cases describe? |
Process flows |
|
Types of structure based testing |
Decision testing; Statement testing |
|
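The difference between the two structure-based techniques can be shown on a hypothetical one-branch function (an illustrative sketch, not from the cards): a single test can execute every statement, yet decision testing still demands both outcomes of the branch.

```python
def apply_cap(value, cap):
    if value > cap:   # decision point
        value = cap   # statement inside the true branch
    return value

# Statement testing: one test that takes the true branch executes
# every statement, giving 100% statement coverage on its own:
assert apply_cap(15, 10) == 10

# Decision testing additionally requires the FALSE outcome of the
# decision, so a second test is needed:
assert apply_cap(5, 10) == 5
```

This is why 100% decision coverage implies 100% statement coverage, but not the other way around.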
Types of experienced based testing |
Error guessing; Exploratory testing; Checklist-based testing |
|
Best describes task partition between test managers and tester |
The manager plans, organizes, and controls the testing activity, while the tester specifies, automates, and executes tests |
|
True statement about test planning |
It should be a continuous activity |
|
Risk based testing is |
Analytical |
|
Configuration management system would not normally provide |
Facilities to compare test results with expected results |
|
Another attribute to indicate priority |
Urgency |