41 Cards in this Set


(Section 1.1) Explain why testing is necessary and support that explanation with examples and evidence

Testing is necessary because software can fail and produce unexpected behavior, making it unusable. For example, if software produces incorrect results after a user's input, it creates unreliable information and may even cause business failure; if a NASA rocket fails, much money and possibly lives are lost. Testing works to prevent defects or reduce the risk of failure to an acceptable level.

(Section 1.1) Discuss how testing supports quality

When testing finds defects and those defects are repaired, the quality of the system increases. Strictly speaking, though, debugging, not testing, is the activity that changes the quality of the system. The rate of defect discovery, the number of known defects, the extent of test coverage, and the percentage of tests passed all reflect the quality of the system.

(Section 1.2) Relate how testing finds and prevents defects

- finding defects, such as the identification of failures during a test execution that lead to the discovery of the underlying defects


- preventing defects- such as when early test activities such as requirements reviews or early test design identify defects in requirements specification that are removed before they cause defects in the design specifications and subsequently the code itself.

(Section 1.3) Explain the fundamental principles of testing

- testing shows the presence of defects - however testing cannot prove that there are no defects left undiscovered.


- exhaustive testing is impossible - infinite number of tests available


- early testing - test early in the process


- defect clustering - 20% of modules accounting for 80% of defects


- pesticide paradox - running the same set of tests over and over on a system will detect fewer and fewer defects, may no longer provide meaningful coverage, and will give false confidence in the product


- testing is context dependent - testing safety-critical software differs from testing an e-commerce site


- absence-of-errors fallacy - many systems have been built that failed in user acceptance testing or in the marketplace even though they had low levels of defects, because they did not meet user needs
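The defect clustering principle can be made concrete with a toy calculation. In this sketch the module names and defect counts are invented purely for illustration of the 20/80 pattern:

```python
# Toy illustration of defect clustering (Pareto-style distribution).
# Module names and defect counts are hypothetical, for demonstration only.
defects_per_module = {
    "payment": 45, "auth": 35,          # two "hot" modules
    "search": 5, "profile": 4, "export": 3,
    "settings": 3, "help": 2, "logging": 2, "admin": 1,
}

total = sum(defects_per_module.values())           # 100 defects overall
hot = sorted(defects_per_module.values(), reverse=True)[:2]
share = sum(hot) / total                           # fraction held by ~20% of modules

print(f"{share:.0%} of defects sit in 2 of {len(defects_per_module)} modules")
# → 80% of defects sit in 2 of 9 modules
```

In practice this kind of per-module defect count is what drives risk-based test prioritization: the clustered modules get tested first and hardest.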

(Section 1.4) Describe the fundamental test processes

- test planning and control


- test analysis and design


- test implementation and execution


- evaluating exit criteria and reporting


- test closure activities

test planning and control

- like planning a trip: a road map pinpoints the start, the final destination, and the route between them. Test control is replanning when testing deviates from the plan

test analysis and design

review the test basis, design the test environment

test implementation and execution

ideally, test implementation tasks are completed before execution begins; otherwise valuable execution time is spent on preparation

evaluating exit criteria and reporting

check test logs and write test summary report for stakeholders

test closure activities

we collect data from completed test activities to consolidate experience, testware, facts and numbers

(Section 1.5) Explain the psychology of testing and how people influence testing success

clear objectives for testing, the proper roles and balance of self-testing and independent testing, clear, courteous communication and feedback on defects

(Section 1.5) Explain and contrast the mindset of testers and programmers and why they often conflict

- testers know there are defects to find in the code, while developers must have confidence that they are competent to create well-functioning code.


- don't have an adversarial relationship with developers. you are on the same team.

(Section 1.6) Demonstrate an understanding of the ISTQB Code of Ethics

-Public - act consistently with the public interest


-Client and employer - act in a manner that is in the best interests of their client and employer, consistent with the public interest


-Product - shall ensure that the deliverables they provide meet the highest standards possible


-Judgement - shall maintain integrity and independence in their professional judgement


-Management - test managers and leaders shall subscribe to and promote an ethical approach to the management of software testing


-Profession - advance the integrity and reputation of the profession consistent with the public interest


-Colleagues - shall be fair to and supportive of their colleagues, and promote cooperation with software developers


-Self - participate in lifelong learning

test control

a test management task that deals with developing and applying a set of corrective actions to get a test project on track when monitoring shows a deviation from what was planned

debugging

the process of finding, analyzing and removing the causes of failures in software

regression testing

testing of a previously tested program following modification, to ensure that defects have not been introduced or uncovered in unchanged areas of the software as a result of the changes made
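A minimal sketch of a regression test in practice. The `apply_discount` function and its test values are hypothetical; the point is that previously passing checks are re-run after a modification to catch side effects in unchanged behaviour:

```python
# Hypothetical function under test: a price calculation that was recently modified.
def apply_discount(price: float, percent: float) -> float:
    """Return the price after discount, rounded to 2 decimal places."""
    return round(price * (1 - percent / 100), 2)

# Regression tests: previously passing checks, re-run after every change to
# make sure the modification has not broken unchanged areas of the software.
def test_no_discount():
    assert apply_discount(100.0, 0) == 100.0

def test_rounding_unchanged():
    assert apply_discount(19.99, 10) == 17.99

test_no_discount()
test_rounding_unchanged()
```

Because regression suites are re-run so often, they are a prime candidate for automation.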

exit criteria

the set of generic and specific conditions, agreed upon with stakeholders, for permitting a process to be officially completed
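Exit criteria are usually expressed as measurable thresholds. A minimal sketch, where the pass-rate, coverage, and blocker thresholds are invented examples of what a team might agree with stakeholders:

```python
# Hypothetical exit criteria: agreed thresholds that must all hold before
# the test phase is declared officially complete.
def exit_criteria_met(pass_rate: float, coverage: float, open_blockers: int) -> bool:
    return pass_rate >= 0.95 and coverage >= 0.80 and open_blockers == 0

# Example status at the end of a test cycle (invented numbers):
ready = exit_criteria_met(pass_rate=0.97, coverage=0.85, open_blockers=0)
print("Exit criteria met:", ready)   # → Exit criteria met: True
```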

(Section 2.1) Explain the relationship between development and testing within a development life cycle

development creates the product, while testing evaluates it throughout the life cycle, either through iterative testing or by integrating verification and validation activities into each phase

(Section 2.2) Relate the typical levels of testing with respect to their major objectives (related to v-model )

-Component testing: testing software components that are separately testable


-integration testing: tests interfaces btw components, different parts of the system and hardware. How everything works together


-system testing: verification against specified requirements


-acceptance testing: validation testing with respect to user
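To make the lowest of these levels concrete, a component (unit) test exercises one separately testable piece in isolation. A sketch, where the `to_cents` helper is hypothetical:

```python
# Component (unit) testing: exercising one separately testable piece in isolation.
# The function and its test values are hypothetical.
def to_cents(amount: str) -> int:
    """Convert a decimal money string such as '12.34' to integer cents."""
    dollars, _, cents = amount.partition(".")
    return int(dollars) * 100 + int(cents.ljust(2, "0")[:2])

assert to_cents("12.34") == 1234
assert to_cents("5") == 500       # no decimal point: whole dollars
assert to_cents("0.5") == 50      # single decimal digit padded to '50'
```

The higher levels then check progressively larger assemblies: interfaces between components, the whole system against its requirements, and finally fitness for the user.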

(Section 2.2) Identify which persons perform the testing activities at various test levels

Component testing: usually involves the programmer who wrote the code, sometimes by a different programmer depending on the level of risk


Integration testing: may be carried out by the developers, but can be done by a separate team of specialist integration testers, or non-functional specialist


System testing: executed by the development organization in a properly controlled environment


Acceptance testing: may be run at the developer's site or by independent testers, and then by users in a real-world situation

(Section 2.3) Relate the four major types of test (functional, non-functional, structural and change-related) and show concrete examples for each

1. Functional testing focuses on suitability, interoperability, security, accuracy and compliance.


2. non-functional testing includes performance testing, load testing, stress testing, usability testing, maintainability testing, reliability testing, and portability testing. For example, in performance testing we can measure transaction throughput, resource utilization, and response times.


3. structural - testing the structure of the component or system. At component integration level it may be based on the architecture of the system, such as the calling hierarchy.


4. change related- confirmation testing (re-testing), regression testing
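As a concrete non-functional example, a crude response-time measurement can be sketched as below. The workload is a stand-in and the 50 ms budget is an invented threshold, not a standard figure:

```python
import time

def operation_under_test() -> int:
    # Stand-in for the real transaction being measured (hypothetical workload).
    return sum(i * i for i in range(10_000))

# Crude performance check: time a batch of calls and compare the average
# response time against an agreed (here: invented) budget.
runs = 100
start = time.perf_counter()
for _ in range(runs):
    operation_under_test()
avg_ms = (time.perf_counter() - start) / runs * 1000

print(f"average response time: {avg_ms:.2f} ms")
assert avg_ms < 50, "response-time budget exceeded"
```

Real performance testing would use a dedicated tool, realistic load, and a controlled environment; this only illustrates the idea of measuring against a threshold.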

Functional testing from two perspectives:

requirements-based: use the requirements as the test basis; list what needs to be tested and what doesn't, and prioritize and run the riskiest tests first




business process-based: follow a realistic scenario, e.g. someone gets hired, gets paid biweekly, then leaves the company

non-functional testing expanded:

--Functionality:suitability, accuracy, security, interoperability, and compliance


--Reliability: maturity (robustness), fault-tolerance, recoverability and compliance


--Usability: understandability, learnability, operability, attractiveness and compliance


--Efficiency: time behavior (performance), resource utilization, and compliance


--Maintainability: analyzability, changeability, stability, testability and compliance


--Portability: adaptability, installability, co-existence, replaceability and compliance

(Section 2.4) Compare maintenance testing with testing new applications

A ‘catching-up’ operation is frequently required when systems are maintained, i.e. tested after deployment. This involves migration testing and also conversion testing, where data from another application is migrated into the system being maintained. The same levels of testing are carried out: component, integration, system, and acceptance testing.

(Section 2.4) Identify triggers and reasons for maintenance testing

maintenance testing is triggered by modifications, migration, or retirement of the system

verification

is concerned with evaluating a work product, component or system to determine whether it meets the requirements set.

validation

is concerned with evaluating a work product, component or system to determine whether it meets the user needs and requirements

v-model

a framework to describe the software development lifecycle activities from requirements specification to maintenance.

incremental development model

a development lifecycle where a project is broken into a series of increments, each of which delivers a portion of the functionality in the overall project requirements


eg. prototyping, RAD, RUP, and agile development

iterative development model

a development lifecycle where a project is broken into a usually large number of iterations. An iteration is a complete development loop resulting in a release of an executable product (which grows from iteration to iteration to become the final product)

black-box testing (specification based testing)

testing, either functional or non-functional, without reference to the internal structure of the component or system

white-box testing (structural testing, structure-based testing)

testing based on an analysis of the internal structure of the component or system

Rapid Application Development (RAD)

is formally a parallel development of functions and subsequent integration

agile software development

is a group of software development methodologies based on iterative incremental development

eg. Scrum

integration testing (if testing how component A and B integrate)

testing the communication between the components, not the functionality of either one
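A sketch of that idea, assuming two hypothetical components: the integration test verifies the call from A to B, not the internal logic of either component (a mock stands in for B):

```python
from unittest.mock import Mock

# Two hypothetical components: an order handler (A) that should notify
# a mailer (B) whenever an order is placed.
class OrderHandler:
    def __init__(self, mailer):
        self.mailer = mailer

    def place_order(self, order_id: str):
        # ... order-processing logic would live here ...
        self.mailer.send_confirmation(order_id)   # the interface under test

# Integration-focused check: verify the communication from A to B,
# not the functionality inside either component.
mailer = Mock()
OrderHandler(mailer).place_order("A-42")
mailer.send_confirmation.assert_called_once_with("A-42")
```

Using a mock for B keeps the test focused on the interface; B's own behaviour is covered by its component tests.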

161. In the context of wireless signal propagation, the phenomenon that occurs when an electromagnetic wave encounters an obstruction and splits into secondary waves. The secondary waves continue to propagate in the direction in which they were split.

d. diffraction

162. An 802.11 frame type that is responsible for carrying data between stations. Two other frame types include management frames, which are involved in association and reassociation, and control frames, which are related to medium access and data recovery.

c. data frame

163. A method used by wireless stations to detect the presence of an access point. Using this method, the station issues a probe to each channel in its frequency range and waits for the access point to respond.

a. active scanning

164. The act of driving around an area while running a laptop configured to detect and capture wireless data transmissions.

i. war driving

165. A security exploit in which a WPS PIN is discovered by means of a brute force attack, giving the attacker access to the network's WPA2 key. The PIN feature in WPS should be disabled if possible.

j. WPS attack

166. The throughput experienced at the application level, such as the quality of a video feed or the speed of a Web page loading in the browser.

e. goodput