293 Cards in this Set

What is the common theme of the English and Redman definitions of data quality?

The common theme of the English and Redman definitions of data quality is "meeting customer expectations".:
1.1 - Ref: IDMA 2, DQFG, C.14, IDW C.2, O.1

What is the primary difference between the English and Redman definitions of data?
English uses the term "fact" in defining data while Redman prefers not to use the word "fact". Redman defines data in terms of data models and data values, which would be data in context and more similar to what English calls information.:
1.2 - Ref: IDMA 2, DQFG, C.14, IDW C.2, O.1

How does English distinguish between data and information?
Information is data in context. While data is a "fact" about a "thing", information consists of data that is defined and presented in such a way that the "fact" about a "thing" becomes understandable:
1.3 - Ref: IDMA 2, IDW, C.2, P.19, O.1

What is inherent information quality?
Inherent information quality is, simply stated, data accuracy. It is the degree to which data accurately reflects the real-world object that the data represents.:
1.4 - Ref: IDMA 2, IDW, C.2, P.22, O.2

What are the components of information quality as defined by English?
The components of information quality are data definition quality, data content quality, and data presentation quality.:
1.5 - Ref: IDMA 2, IDW, C.2, P.27, O.2

Why are CEOs aware of the impact of data quality?
CEOs are aware of the impact of data quality because CEOs know that so much is at stake for the business and its customers.:
1.6 - Ref: IDMA 2, DQFG, C.1, P.5, O.3

Data consumers have certain expectations about websites. According to Redman, these expectations can be separated into six categories. What are they?
1. Privacy
2. Content
3. Quality of Values
4. Presentation
5. Improvement
6. Commitment
:
1.7 - Ref: IDMA 2, DQFG, C.2, P.8 and 9, O.3

What are Internet users most concerned about?
The quality issue of greatest importance is privacy.:
1.8 - Ref: IDMA 2, DQFG, C.2, P.10, O.3

Why is the quality of Internet data so important?
The quality of internet data is important for gaining and sustaining competitive advantage.:
1.9 - Ref: IDMA 2, DQFG, C.2, P.13, O.3

Why are CFOs concerned about data quality?
CFOs are concerned about data quality because all their knowledge of company finances comes from the data.:
1.10 - Ref: IDMA 2, DQFG, C.3, P.15, O.3

Over-billing can lead to a host of customer-related problems. According to Redman, what other problem may over-billing indicate?
If an organization over-bills, it is probably under-billing as well, by the same amount, or more, and that is seldom reported by customers.:
1.11 - Ref: IDMA 2, DQFG, C.3, P.15, O.3

According to Redman, what is the primary reason for billing issues?
The primary reason for billing issues is that there are so many hand-offs in the end-to-end billing chain that are not properly managed.:
1.12 - Ref: IDMA 2, DQFG, C.3, P.19, O.3

It is said that leading companies, consciously or not, pursue one of three strategies – customer intimacy, product leadership, or price leadership. From a data quality perspective, why is it important to determine which of these applies to your company?
Identifying the strategy being pursued by your company is important because the most important data are those required for executing the most important strategies, and that's where data quality efforts should be concentrated.:
1.13 - Ref: IDMA 2, DQFG, C.4, P.22, O.3

According to Redman, what is the most frequently-cited issue with the quality of customer data?
The most frequently cited issue with the quality of customer data is the high rate of return for direct mail.:
1.14 - Ref: IDMA 2, DQFG, C.4, P.23, O.3

According to Redman, which of these describes the goal of getting to a common definition of customer?
A. Almost impossible
B. Necessary to the business
C. Easy, but only if done the right way
D. Most important to companies pursuing the product leadership strategy
A. Almost impossible:
1.15 - Ref: IDMA 2, DQFG, C.4, P.25, O.3

For what data is the CIO responsible?
The CIO is responsible for data resource and data warehouse data.:
1.16 - Ref: IDMA 2, DQFG, C.5, P.28, O.3

Redman identified five reasons why building data quality into a data warehouse is so challenging. What are they?
1. There are different customers for data warehouses than for operational systems
2. The underlying decision-making chains may be more poorly understood than are operational processes, further complicating the understanding of customer needs
3. In operational systems, new data tends to be more important than historical data – in data warehouses, historical data is important too
4. Many warehouses draw data from numerous operational sources, which can be difficult to standardize
5. Data must be cleaned-up both upon entry into the data warehouse and day-in and day-out, which can be a daunting challenge
:
1.17 - Ref: IDMA 2, DQFG, C.5, P.31, O.3

According to Redman, which of these ensures that metadata is well-defined, kept up-to-date, and made easily available to all?
A. Data flow diagrams
B. Data documentation standards
C. Data dictionary
D. Data resource chain
D. Data resource chain:
1.18 - Ref: IDMA 2, DQFG, C.5, P.33, O.3

According to Redman, why should everyone be concerned about data quality?
Everyone should be concerned about data quality because the only people who need not worry about it are those who neither create nor use data. No one participating in any modern economy can make that claim.:
1.19 - Ref: IDMA 2, DQFG, C.6, P.35, O.3

Redman states that although data underlies almost everything we do, we still don't think much about data or data quality, and so poor data quality can be insidious. He lists three reasons for this — what are they?
1. Although virtually every activity the Information Age organization undertakes requires data, the primary results are not data.
2. Data are invisible — you don't really touch data per se.
3. Individuals may recognize that poor data hinder their work, but few are concerned enough to worry about how the next person will be impacted by the data he or she creates.
:
1.20 - Ref: IDMA 2, DQFG, C.8, P.43 and 44, O.4

Which of these summarizes Redman's statement about poor data quality?
A. It is impossible to cost
B. It is a contributing cause to the organization’s most important business problems
C. It can be seen and reported on by everyone in the organization
D. Most companies affected by poor data quality have quantified the impact, but believe that fixing it is too expensive
B. It is a contributing cause to the organization’s most important business problems:
1.21 - Ref: IDMA 2, DQFG, C.8, P.46, O.4

According to Redman, what financial results can quality improvement projects show?
Many successful projects report, somewhat informally, cost reductions of 66% to 75% as a result of quality improvement projects.:
1.22 - Ref: IDMA 2, DQFG, C.9, P.47, O.4

What is Redman's recommendation for developing a case for data quality?
You can develop a case for data quality by demonstrating how improvements in data quality will lead to competitive advantage.:
1.23 - Ref: IDMA 2, DQFG, C.9, P.49, O.4

According to English, how does information quality improvement reduce business costs?
Information quality improvements reduce business costs by eliminating costly scrap and rework caused by defective data.:
1.24 - Ref: IDMA 2, IDW, C.1, P.3, O.4

According to English, what is the cause of data warehouse failures?
Data warehousing projects fail for many reasons, all of which can be traced to a single cause: nonquality.:
1.25 - Ref: IDMA 2, IDW, C.1, P.4, O.4

According to English, why have information quality issues not been addressed?
Data quality is not a sexy topic, and management has either deemed the costs of the status quo and the current level of low-quality data as acceptable and normal costs of doing business or they are unaware of the real costs of nonquality data.:
1.26 - Ref: IDMA 2, IDW, C.1, P.6, O.4

What is the origin of quality cost measurement and the necessity of continuous improvement?
A. Manufacturing industries
B. Service industries
C. Financial industries
D. The quality gurus
A. Manufacturing industries:
1.27 - Ref: IDMA 2, IDW, C.1, P.6 and 7, O.4,

According to English, what is the estimated business cost of nonquality data?
A. 10% to 25% of IT budget
B. 10% to 25% of revenue
C. 3% to 5% of revenue
D. 8% to 10% of profits
B. 10% to 25% of revenue:
1.28 - Ref: IDMA 2, IDW, C.1, P.12, O.4

According to English, what is the purpose for improving information quality?
There is and must be only one purpose for improving information quality: to improve customer and stakeholder satisfaction by increasing the efficiency and effectiveness of the business processes.:
1.29 - Ref: IDMA 2, IDW, C.1, P.13, O.4

In "Insurance Data Management - Defining The Profession", there are four data management functional areas identified in which core data managers are involved. What are they?
1. External Data Reporting
2. Internal Data Coordination
3. Information Systems Development
4. Data Administration:
2.1 - Ref: IDMA 2,R.1, P.2, O.1

According to "Insurance Data Management - Defining The Profession", which of the following tasks is included in the external data reporting function?

A. Charting the relationship between data needs and resources
B. Preparing system specifications
C. Prepare basic cost analyses
D. Working with systems developers to maximize use of technology
C. Prepare basic cost analyses:
2.2 - Ref: IDMA 2, R.1, P.2, O.1

List and briefly describe the five critical success factors of an insurance data management function, as identified in "Insurance Data Management - Defining The Profession".

1. Perception - to anticipate and understand business conditions as they relate to reporting data - is essential to success.
2. Communication skills - creating new ideas, listening, writing, speaking, encoding, and understanding.
3. Staffing development - training, education, goal-setting, and instilling a stakeholder attitude to understand various types of data.
4. Technological awareness - up-to-date knowledge and access to new information.
5. Ability to control data - developing procedures and tests for data integrity - is the only way to ensure data accuracy, availability, and security.:
2.3 - Ref: IDMA 2, R.1, P.3 and 4, O.1

What are the 11 basic key skills and knowledge of a data manager, as identified in "Insurance Data Management - Defining The Profession"?
1. Extensive knowledge of insurance company operations
2. Understand the business needs for data
3. Skilled in the application of data definition and data quality audit standards
4. Comprehend and follow logical procedures in data base organization
5. Skilled in leading data modeling and enterprise modeling sessions
6. Understand theory and technology
7. Display initiative and creativity in strategic data planning
8. Skilled in presenting clear alternatives and solutions
9. Ensure constructive participation from colleagues
10. Handle diverse assignments
11. Ability to balance the ideal and the practical:
2.4 - Ref: IDMA 2, R.1, P.5, O.1

What is the purpose of the Data Quality Certification Model for Insurance Data Management?
a. A framework for use in attesting to the quality of an organization's data.
b. Guidelines for the data manager to use in controlling, monitoring, and measuring the validity, accuracy, reasonability, and completeness of data.
It is not intended to be a manual of procedures.:
2.5 - Ref: IDMA 2, R.2, P.1, O.2

What is the scope of the Data Quality Certification Model for Insurance Data Management?
A. All insurance policy information
B. Statistical data
C. Materially significant data, such as data used in strategic planning
D. Data used in reports or products provided to customers
B. Statistical data:
2.6 - Ref: IDMA 2, R.2, P.1, O.2

List and describe the six major components of the Data Quality Certification Model commentary.
1. Disclosure of performance results
2. List of performance reports and/or monitoring tools used
3. Review and analysis of significant data problems
4. Plan of action to correct data problems
5. Statement certifying that commentary is true, accurate, and complete
6. Assessment of materiality of data elements:
2.7 - Ref: IDMA 2, R.2, P.1, O.2

According to the Data Quality Certification Model for Insurance Data Management, what are the primary considerations for the data manager in determining the materiality of data elements?
The data manager considers the intended use of the data, and the importance of the data element to that use:
2.8 - Ref: IDMA 2, R.2, P.1, O.2

Who is responsible for developing the standards to be used in the Data Quality Certification Model commentary?
A. Business managers
B. Users of data
C. Corporate data managers
D. Internal audit
B. Users of data:
2.9 - Ref: IDMA 2, R.2, P.1, O.2

According to the Data Quality Certification Model for Insurance Data Management, the results of functional audits are a principal tool in ascertaining which of the following?
A. Validity
B. Accuracy
C. Reasonability
D. Completeness
B. Accuracy:
2.10 - Ref: IDMA 2, R.2, P.2, O.3

The Data Quality Certification Model for Insurance Data Management lists four tools or means that can be used to ascertain the accuracy of data. What are they?
1. Checks that independently compare the reported data to the source data
2. Periodic tests to guarantee the accuracy of any encoding process
3. Premium and claim matches to verify the consistency of the reported data
4. Results of functional audits such as premium audits, market conduct examinations, and claims audits:
2.11 - Ref: IDMA 2, R.2, P.2, O.3
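The first accuracy tool above, an independent comparison of reported data back to source data, can be sketched in a few lines. This is an illustrative sketch only; the record IDs and field names (`premium`, `state`) are hypothetical, not drawn from any statistical plan.

```python
# Hypothetical sketch of accuracy check 1: independently compare each
# reported record to its source record, field by field, and collect
# mismatches for follow-up.

def compare_to_source(reported, source, fields):
    """Return a list of (record_id, field, reported_value, source_value) mismatches."""
    mismatches = []
    for rec_id, rep in reported.items():
        src = source.get(rec_id)
        if src is None:
            mismatches.append((rec_id, "<missing in source>", rep, None))
            continue
        for f in fields:
            if rep.get(f) != src.get(f):
                mismatches.append((rec_id, f, rep.get(f), src.get(f)))
    return mismatches

source = {"P001": {"premium": 1200, "state": "NY"},
          "P002": {"premium": 850, "state": "NJ"}}
reported = {"P001": {"premium": 1200, "state": "NY"},
            "P002": {"premium": 580, "state": "NJ"}}  # transposed digits

errors = compare_to_source(reported, source, ["premium", "state"])
```

In practice the comparison would run against a sample of records (per the Model's sampling tests) rather than the full file.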

The Data Quality Certification Model lists five tools or means that can be used to ascertain the reasonability of data. What are they?
1. Distributional analysis
2. Profiles of expected results
3. Trend analysis
4. Average rate checks
5. Loss ratio analysis:
2.12 - Ref: IDMA 2, R.2, P.2, O.3
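Two of the reasonability tools listed above, average rate checks and loss ratio analysis, reduce to computing a ratio and flagging values outside an expected band. The sketch below assumes hypothetical tolerance bands; the Model itself does not prescribe numeric thresholds.

```python
# Illustrative reasonability checks: flag an average rate or loss ratio
# that falls outside an assumed expected band. Band values are examples,
# not values from the Data Quality Certification Model.

def average_rate(premium, exposure):
    return premium / exposure

def loss_ratio(losses, premium):
    return losses / premium

def reasonability_flags(premium, exposure, losses,
                        rate_band=(50.0, 150.0), lr_band=(0.4, 0.9)):
    """Return the names of checks whose result falls outside its band."""
    flags = []
    rate = average_rate(premium, exposure)
    if not rate_band[0] <= rate <= rate_band[1]:
        flags.append("average_rate")
    lr = loss_ratio(losses, premium)
    if not lr_band[0] <= lr <= lr_band[1]:
        flags.append("loss_ratio")
    return flags

# Average rate 125.0 is in band; loss ratio 0.95 exceeds the band.
flags = reasonability_flags(premium=100000, exposure=800, losses=95000)
```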

The Data Quality Certification Model lists two tools or means that can be used to ascertain the completeness of data. What are they?

1. Identification of the significant discrepancies in the reconciliation results and monitoring of the corrective actions
2. Documentation that compares statistical data with financial data and explains the differences:
2.13 - Ref: IDMA 2, R.2, P.3, O.3

What is the three-fold purpose of the Actuarial Standard of Practice No. 23 on Data Quality?

Gives guidance to the actuary in:
1. selecting the data that underlie the actuarial work product;
2. reviewing these data for appropriateness, reasonableness, and comprehensiveness;
3. making appropriate disclosures.:
2.14 - Ref: IDMA 2, R.3, P.1, O.4

According to the Actuarial Standard of Practice No. 23 on Data Quality, what is the definition of comprehensive data?
Data obtained from inventory or sampling methods that contain each data element or record needed for the analysis.:
2.15 - Ref: IDMA 2, R.3, P.1, O.4

What are the options used by actuaries in dealing with the quality of the data they are using and what is recommended by the Actuarial Standard of Practice No. 23 on Data Quality?
Informed judgment is used by actuaries to determine what kinds of data are appropriate for a particular analysis. The standard does not recommend that an actuary audit data.:
2.16 - Ref: IDMA 2, R.3, P.2, O.4

According to the Actuarial Standard of Practice No. 23 on Data Quality, what are the six considerations the actuary should use in selecting the data for an actuarial analysis?
1. consider the data elements that are desired, and possible alternative data elements;
2. data are sufficiently current;
3. reasonableness and comprehensiveness of the necessary data elements, with particular attention to internal and external consistency;
4. any limitations of the data, and modifications or assumptions needed in order to use the data;
5. the cost and feasibility of alternatives, including the ability to obtain the information in a reasonable time frame (the benefit to be gained from an alternative data element or data source should be balanced with its relative availability and the cost to collect and compile it);
6. sampling methods, if used to collect data.:
2.17 - Ref: IDMA 2, R.3, P.2, O.5

According to the Actuarial Standard of Practice No. 23 on Data Quality, what are the six disclosures that should be included in the actuary's report?
1. the source(s) of the data;
2. the materiality of any potential biases of which the actuary is aware that are due to imperfect data;
3. adjustments or modifications made because of imperfect data, other than routine corrections made by reference to source documents;
4. the extent of reliance on data supplied by others;
5. in the event that the actuary has not sufficiently reviewed the data, any resulting limitation on the use of the actuarial work product; and
6. any unresolved concern the actuary may have about the data that could have a material effect on the actuarial work product.:
2.18 - Ref: IDMA 2, R.3, P.3, O.5

According to the CAS's "White Paper on Data Quality", what are the five benefits of recognizing that data is an asset?
1. improving business opportunities (e.g., for marketing purposes);
2. greater fraud detection;
3. enhanced underwriting review (e.g., via motor vehicle reports);
4. greater evaluation of loss control factors or risk management procedures; and,
5. greater ability to use the data in actuarial analyses (e.g., for pricing, loss reserve analyses).:
2.19 - Ref: IDMA 2, R.4, P.1, O.6

Explain the concepts of absolute accuracy, effective accuracy, and relative accuracy as described in the CAS's "White Paper on Data Quality".
1. Absolute accuracy means the data are 100% correct - every data element on each and every transaction record is properly and accurately coded.
2. Effective accuracy means that there are some imperfections in the data, but the data are generally usable in most analyses. There are two types of effective accuracy: 1) the erroneous data are at a level, or in a data element, that does not impact the analysis at hand; and 2) the imperfect data do not have a material impact on the results of the analysis.
3. Relative accuracy means the data are coded inaccurately as to their definition but are reported consistently over time.:
2.20 - Ref: IDMA 2, R.4, P.7 and 8, O.7

What are the five benefits of audits identified in the CAS's "White Paper on Data Quality"?
1. Accuracy and completeness of the data;
2. Ensure consistent handling;
3. Determine the quality of systems control procedures;
4. Measure and improve timeliness of data;
5. Increase the reliability of results.:
2.21 - Ref: IDMA 2, R.4, P.8, O.8

What are the eight elements or characteristics of successful audits identified in the CAS's "White Paper on Data Quality"?

1. are properly planned;
2. measure results according to established standards;
3. are statistically sound, regarding the sampling techniques;
4. perform data checks from source to end product and end product back to source;
5. verify data according to their intended use and definition, including assuring that all data elements resulting from calculations, mappings, and other programming algorithms are correct as intended;
6. audit the data preparation and data entry processes, and review all program and output controls (assuring that the input and output data balance, as well as reconcile with prior data processed);
7. determine whether the company's entire process detects errors adequately and corrects them properly; and finally,
8. provide adequate documentation of the results with recommendations for improvement (if any) and follow-up implementation review.:
2.22 - Ref: IDMA 2, R.4, P.8, O.8

Describe the six basic components of the Statistical Data Monitoring System (SDMS) identified in the CAS's "White Paper on Data Quality" and explain why they are a good model for addressing the reliability of data and the disclosure of any data quality issues.
1. Process description and review of control procedures - provides understanding of the process and a basis for researching causes of error and corrective actions
2. Detailed data verification via sampling tests - provides validity, accuracy, and completeness measures and follow-up on the cause and correction of problems
3. Summary data verification via reasonability reviews - provides consistency and reasonability reviews of summary data and therefore provides confidence in the overall integrity of the data
4. Financial reconciliation - provides for the completeness and financial integrity of the data
5. Annual review and certification - provides confidence that an ongoing process to ensure the reliability of the data is being carried out
6. Review and evaluation by state examiners - provides an independent review by outside auditors and is a basis for improvement of the audit process:
2.23 - Ref: IDMA 2, R.4, P.9, O.8

According to the CAS's "White Paper on Data Quality", what are the four key tests or checks that should be considered in a review of the reasonableness of the data?
1. Distributional edit review;
2. Consistency checks;
3. Statistical tests, such as chi-square goodness-of-fit tests or non-parametric rank tests;
4. Industry comparisons, including reasonable range of results comparisons.:
2.24 - Ref: IDMA 2, R.4, P.10, O.9
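The chi-square goodness-of-fit test mentioned above can be computed without any statistics library: compare this period's observed distribution of a coded field against the distribution expected from prior reporting. The class codes and counts below are hypothetical; the 5% critical value for 2 degrees of freedom (5.991) is a standard table value.

```python
# Minimal sketch of a chi-square goodness-of-fit reasonableness check:
# does this period's distribution of a coded field differ materially
# from the distribution seen in prior reporting?

def chi_square_stat(observed, expected):
    """Sum of (O - E)^2 / E over all categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [480, 310, 210]           # counts by (hypothetical) class code this period
total = sum(observed)
expected_share = [0.50, 0.30, 0.20]  # prior period's distribution
expected = [total * p for p in expected_share]

stat = chi_square_stat(observed, expected)
CRITICAL_5PCT_DF2 = 5.991            # chi-square table, alpha=0.05, df=2
distribution_shifted = stat > CRITICAL_5PCT_DF2
```

Here the statistic is about 1.63, well under the critical value, so the distribution would pass the reasonableness review.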

According to the CAS's "White Paper on Data Quality", what are the six steps that an insurance data manager should take if errors or imperfections are detected in the data?

1. Determine the reasons and cause(s) for the error.
2. Inform the actuary undertaking the current study and incorporate needed adjustments, modifications, or corrections to the source data for use in the current analysis.
3. Stop the error by fixing the system or revising the data handling and collection process.
4. Quantify, if possible, the impact and magnitude of the error on the data underlying the current study.
5. Describe whether the error may materially impact prior analyses and whether those prior analyses may need to be retroactively corrected.
6. If materially significant, make appropriate disclosures regarding past analyses.:
2.25 - Ref: IDMA 2, R.4, P.12, O.9

What are the three current technology topics that the CAS's "White Paper on Data Quality" identified as affecting future data quality efforts?

1. Data Warehouse Concept - which allows broad use of data in great detail by many areas of the company;
2. Greater use of complementary databases - ZIP code, motor vehicle reports, geographic mapping - in improving data validation and accuracy;
3. Pattern Recognition/Expert Systems/Fuzzy Logic Systems - that enhance automation efforts and allow graphical views of the data.:
2.26 - Ref: IDMA 2, R.4, P.13, O.10

The paper "Guidance Regarding Management Data and Information" identified three areas in which actuaries could be making significant contributions. List and describe them.
1. Data collection - the appropriate data must be collected, with data collection capabilities considered.
2. Data design - data must be managed as a critical resource: not redundant; consistently derived and defined; sharable; and efficiently organized to maximize use, value, and flexibility in responding to different requests.
3. Management information - considerations include how the data will be used and the different types of data needed, in different levels of detail, for different purposes.
:
2.27 - Ref: IDMA 2, R.5, P.1-8, O.11

Describe the two components of data collection identified in the paper "Guidance Regarding Management Data and Information" and the quality principles for each of them.
1. Data capture
• Data requirements should be compatible and consistent
• Data elements should have only one meaning
• Common data elements should be defined similarly
• Flexibility should accommodate expansion of data elements
• Codes should meaningfully represent information
• Consider frequency of update
• Use codes that are established and understood when possible
2. Data Quality Control
• Data quality function should be established
• Standards of data quality should be developed and monitored
• Critical processing points should be identified and control procedures at these points should be developed
• Edits should be installed to check accuracy, validity, and reasonableness and these should be performed as closely as possible to the data entry with error correction as close to the point of discovery as possible
• Balancing and reconciliation procedures and standards should be established
• Monitor data quality on an ongoing basis
• Changes must be thoroughly tested
:
2.28 - Ref: IDMA 2, R.5, P.1-3, O.11
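The data quality control principle above, edits for accuracy, validity, and reasonableness performed as close to data entry as possible, can be sketched as a small edit function. The field names, code table, and premium band below are illustrative assumptions, not from the paper.

```python
# Hypothetical entry-edit sketch: validity, completeness, and
# reasonableness checks applied at the point of data entry, so errors
# can be corrected close to where they are discovered.

VALID_STATE_CODES = {"NY", "NJ", "CT", "PA"}   # illustrative code table
PREMIUM_RANGE = (0, 1_000_000)                 # assumed reasonableness band

def entry_edits(record):
    """Return a list of edit failures for one entered record."""
    failures = []
    if record.get("state") not in VALID_STATE_CODES:
        failures.append("validity: unknown state code")
    premium = record.get("premium")
    if premium is None:
        failures.append("completeness: premium missing")
    elif not PREMIUM_RANGE[0] <= premium <= PREMIUM_RANGE[1]:
        failures.append("reasonableness: premium out of range")
    return failures

ok = entry_edits({"state": "NY", "premium": 1200})
bad = entry_edits({"state": "ZZ", "premium": -50})
```

A clean record returns an empty list; a record with a bad code and a negative premium fails both the validity and reasonableness edits.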

What are the five data design principles and concepts identified in the paper "Guidance Regarding Management Data and Information"?
1. Manage data as a critical resource
2. Data should not be redundant
3. Data should be consistently defined, consistently derived, and shareable
4. Organize data to maximize use and value
5. Designs should be flexible to respond to different requests:
2.29 - Ref: IDMA 2, R.5, P.2, O.11

What are the six data design concepts identified in the paper "Guidance Regarding Management Data and Information"?

1. Central data base - the ideal repository of collected data is a single central location.
2. Detailed data base - the database should contain all data elements needed for internal and external users.
3. Data dictionary - helps to ensure consistency among the various users of a system.
4. Data base design - the design or organization of the data should address low redundancy of data, fast processing, flexible access to data, and low storage costs; summarized or segmented data should be updated automatically from the central source.
5. Non-standard requests - the database should be flexible and organized to facilitate ad hoc report requests and direct user access.
6. Retention period - needs to accommodate meaningful analysis as well as legal and regulatory requirements.
:
2.30 - Ref: IDMA 2, R.5, P.2 and 3, O.11

Briefly describe the kinds of data and information needed for each of the six major insurance functions as listed in the paper "Guidance Regarding Management Data and Information".

1. Ratemaking - premium and exposure on a written or earned basis; loss and claim information in the same business categories showing historical loss development patterns; and expense information, all by calendar/accident year, report year, or policy year.
2. Reserving - premium reserves, which vary for each type of reserve, and loss reserves by accident date, policy effective and expiration date, and booking date, with their historical development by various data groupings and showing various counts and money amounts.
3. Underwriting/Marketing - distribution of current book of business and trends; underwriting results by distribution system; transaction types, use of modifications, changes in premium.
4. Claims - claim counts by type of transaction, and open claims and closed claims by various business data groups, reported weekly, monthly, quarterly, year-to-date, and latest twelve months, and showing lag times.
5. Financial Analysis/Investments - all current cash items and liabilities with trends; operating results such as mix of current investments; premium income; loss and loss expense dividends, taxes and other expenses.
6. Financial Reporting - information to meet financial reporting obligations of statutory reporting, trade associations and bureaus, shareholder reporting, and income tax reporting, such as direct and net calendar period premiums, losses, expenses, and investments.
:
2.31 - Ref: IDMA 2, R.5, P.5-8, O.12

For which of the major insurance functions are premium, exposure, and loss information showing historical loss development patterns MOST needed?
A. Ratemaking
B. Reserving
C. Underwriting/marketing
D. Claims processing
A. Ratemaking:
2.32 - Ref: IDMA 2, R.5, P.3 and 4, O.12

What are the eight basic questions used in ascertaining insurance information needs identified in the paper "Planning for Insurance Data Quality"?
1. What are the basic types of data and information access needs?
2. Who are the current and potential users of the information?
3. What process should be used to select the data and method of collection?
4. What are the timeliness considerations?
5. How accurate do the data need to be?
6. How complete do the data need to be?
7. Are all the needed data and information available and when?
8. What are the access considerations?:
2.33 - Ref: IDMA 2, R.6, P.1, O.13

What are the six specific questions identified in the paper "Planning for Insurance Data Quality" that are used to ascertain information needs for ratemaking?
1. What level of detail? Variation by line of insurance?
2. What basis for aggregation? Accident Year? Policy Year? Calendar Year?
3. What supplementary data are necessary for adjusting premiums / exposure / losses / claims?
4. What expense information is needed?
5. What calculations and inferred fields need be made?
6. What is the process from receipt of data to final product?
:
2.34 - Ref: IDMA 2, R.6, P.2 and 3, O.13

What are the four specific questions identified in the paper "Planning for Insurance Data Quality" that are used to ascertain information needs for reserving?
1. How are the data to be aggregated by report period and evaluation date?
2. What are the detailed groupings?
3. What calculations are to be done?
4. What is the degree of standardization? Ad hoc reports?
:
2.35 - Ref: IDMA 2, R.6, P.2, O.13

What are the four specific questions identified in the paper "Planning for Insurance Data Quality" that are used to ascertain information needs for underwriting and marketing?
1. What type of statistical information and level of detail are necessary to accept or reject the risk?
2. What non-insurance information (e.g., credit reports) is necessary?
3. What information is required to assist in determining the underwriting management philosophy of the company?
4. What unique requirements does the marketing area have?
2.36 - Ref: IDMA 2, R.6, P.3 and 4, O.13

What is the difference between transactional statistical plans and summarized statistical plans?
Transactional statistical plans collect data on a 'unit transaction' basis. This means that one or more premium and/or loss records are generated each time a policy is written or a loss occurs.

Summarized statistical plans require less detailed coding on records and generally require reporting on an annual basis. Summarized plans allow unit transaction records containing the same coding information to be combined and reported as one summarized record.:
2.37 - Ref: IDMA 2, R.6, P.4, O.14
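The summarization described above, combining unit transaction records that share the same coding into one summarized record, is essentially a group-and-sum. The coding fields (state, line, class) and amounts in this sketch are illustrative, not from any actual statistical plan.

```python
# Sketch of summarized statistical-plan reporting: unit transaction
# records with identical coding combine into a single summarized record.

from collections import defaultdict

def summarize(unit_records, coding_fields=("state", "line", "class")):
    """Combine unit records with identical coding into summary records."""
    totals = defaultdict(float)
    for rec in unit_records:
        key = tuple(rec[f] for f in coding_fields)
        totals[key] += rec["premium"]
    return [dict(zip(coding_fields, key)) | {"premium": amount}
            for key, amount in totals.items()]

units = [
    {"state": "NY", "line": "auto", "class": "01", "premium": 100.0},
    {"state": "NY", "line": "auto", "class": "01", "premium": 250.0},
    {"state": "NJ", "line": "auto", "class": "01", "premium": 75.0},
]
summary = summarize(units)
```

The two NY records share identical coding and collapse into one summarized record; the NJ record stays separate.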

Which of the following is identified in the reading "Planning for Insurance Data Quality" as a data quality recommendation for internal data collection mechanisms?
A. Establish a committee of the business and systems groups involved in managing the data collection systems to coordinate data requirements
B. Have all data collection systems use the central data dictionary for editing and to display the meaning of codes
C. Standards and guidelines for data elements should be strictly followed wherever possible
D. Combine all like data entry processes to centralize the management and control of data collection mechanisms
C. Standards and guidelines for data elements should be strictly followed wherever possible:
2.38 - Ref: IDMA 2, R.6, P.5, O.14

What are the eight considerations for a well-planned, quality insurance data system as identified in the reading "Planning for Insurance Data Quality"?
1. accuracy, so that the data represent what's intended;
2. completeness, so that all the data required is coded, processed and available;
3. timeliness, so that data are posted (and available to users) as currently as possible;
4. validity, so that the proper codes are stored and used;
5. evaluation, so that performance of the system and the data management process are monitored on an ongoing basis;
6. communication, so that the current information is known correctly throughout the business organization;
7. flexibility, so that changing information needs can be planned properly; and
8. simplicity, so that the system and process can be well understood by the broadest number of users of the organization.
:
2.39 - Ref: IDMA 2, R.6, P.6, O.15
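Of the eight considerations, "validity" (proper codes are stored and used) is the most directly automatable. Below is a minimal edit-check sketch, with invented field names and code tables standing in for a real central data dictionary:

```python
# Hypothetical valid-code tables, as a central data dictionary might supply.
VALID_CODES = {
    "state": {"01", "02", "03"},
    "coverage": {"BI", "PD", "COMP"},
}

def validity_edit(record):
    """Return edit errors for any field whose code is not in the dictionary."""
    errors = []
    for field, valid in VALID_CODES.items():
        if record.get(field) not in valid:
            errors.append(f"invalid {field} code: {record.get(field)!r}")
    return errors

# A record with an unknown coverage code fails the edit.
print(validity_edit({"state": "01", "coverage": "XX"}))
```

Driving the edit from a shared code table rather than hard-coded checks mirrors the earlier recommendation that data collection systems use the central data dictionary for editing.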

What are the 10 tools and products identified in the reading "Planning for Insurance Data Quality" as being available for data quality checking of company reports to external organizations?
1. Company Edit Packages (CEP), which are available for most statistical plans;
2. Pre-edit service for submissions;
3. Submission Analysis Reports (SAR), which are provided regarding submissions;
4. Detailed error listings of submitted data;
5. Company Performance Reports, which are issued quarterly or annually depending on the agent or bureau;
6. Pre-delinquency lists and policy information lists, which maintain a level of quality control in reported data;
7. Distributional Edit Reports (DER), which compare aggregate submitted data to past reporting;
8. Reconciliation reports to annual statement experience;
9. Data quality criticisms on Unit Statistical Reports (USR); and
10. User guides for statistical reporting, which are available from statistical agents.
:
2.40 - Ref: IDMA 2, R.6, P.7, O.15

Which of the following is NOT listed in the reading "Planning for Insurance Data Quality" as one of the three types of documentation that need to be maintained in support of data quality?

A. Data dictionary
B. Regressive tests
C. Systems development
D. Data quality program
B. Regressive tests:
2.41 - Ref: IDMA 2, R. 6, P.7, O.15

Which of the following is NOT one of the four latest industry information trends listed in the reading "Planning for Insurance Data Quality"?

A. Internet access and on-line availability of information
B. Data warehouse approach to data design
C. Increased use of cloud-based computing
D. Non-insurance information as part of the decision-making process
C. Increased use of cloud-based computing:
2.42 - Ref: IDMA 2, R.6, P.8, O.16

According to English, in the end who determines quality?
A. Stakeholders
B. Knowledge workers
C. Customers
D. Regulators
C. Customers:
3.1 - Ref: IDMA 2, IDW, C.3, P.34, O.1

According to English, what are the two categories of information customers or markets?
1. External customers and stakeholders
2. Internal knowledge workers
:
3.2 - Ref: IDMA 2, IDW, C.3, P.34, O.2

According to English, what are the components of process definition?
Process definition includes:
• Process definition – what should be done
• Process objectives – why it should be done
• Location – where it should be performed as well as where the input is coming from
• Roles – who should do it
• Triggers or dependence – when it should be performed
• Procedures or steps – how it should be performed
• Quality measures – how to know it was performed successfully
:
3.3 - Ref: IDMA 2, IDW, C.3, P.40, O.1

According to English, what process improvement requires most of all is:
A. Reorganization
B. Process redesign
C. Teamwork
D. Process documentation
C. Teamwork:
3.4 - Ref: IDMA 2, IDW, C.3, P.41, O.1

According to English, what are the three requirements of statistical quality control, also called statistical process control?
1. Process management
2. Process measurement
3. Process improvement
:
3.5 - Ref: IDMA 2, IDW, C.3, P.42, O.1

“Shewhart” is primarily associated with which of the following?
A. PDCA
B. The Fishbone Diagram
C. Kaizen
D. Total Quality Control
A. PDCA:
3.6 - Ref: IDMA 2, IDW, C.3, P.42, O.1

Juran identified a trilogy of managerial processes required for quality. What are they?
1.Quality planning
2. Quality control
3. Quality improvement
:
3.7 - Ref: IDMA 2, IDW, C.3, P.44, O.1

Identify the particular concept primarily associated with each of these quality leaders: Shewhart, Deming, Juran, Ishikawa, Imai, and Crosby.
• Shewhart – PDCA (plan, do, check, act)
• Deming – 14 Points of Quality
• Juran – Trilogy of managerial processes
• Ishikawa – saw quality as a contribution to the national economy, not just an enterprise problem
• Imai – Kaizen (continuous process improvement)
• Crosby – Quality is free:
3.8 - Ref: IDMA 2, IDW, C.3, P.42–47, O.1

ISO 9000 contains:
A. Principles of quality improvement
B. Standards for quality management systems
C. An approach for certifying quality processes
D. Standards for product quality
B. Standards for quality management systems:
3.9 - Ref: IDMA 2, IDW, C.3, P.48, O.1

What are the seven Categories of Performance Excellence associated with the Malcolm Baldrige National Quality Award?
1. Leadership
2. Strategic planning
3. Customer and market focus
4. Information and analysis
5. Human resource development and management
6. Process management
7. Business results
:
3.10 - Ref: IDMA 2, IDW, C.3, P.50–52, O.1

Which of the following correctly identifies one of the major differences between ISO 9000 and the Malcolm Baldrige National Quality Award?
A. ISO 9000 involves an assessment by outside parties, but the Baldrige Award does not
B. The Baldrige Award process involves an assessment of both systems and results, but ISO 9000 assesses systems only
C. ISO 9000 involves an assessment framework, whereas the Baldrige Award involves standards
D. ISO 9000 involves a triad of assessment processes, whereas the Baldrige Award does not
B. The Baldrige Award process involves an assessment of both systems and results, but ISO 9000 assesses systems only:
3.11 - Ref: IDMA 2, IDW, C.3, P.48–52, O.1

What are the three types of information customers identified by English?
1. End customers
2. Internal knowledge workers
3. External stakeholders
:
3.12 - Ref: IDMA 2, IDW, C.3, P.52, O.3

According to English, what is information float?
Information float is a delay in the potential accessibility of data. It is introduced by data intermediaries.:
3.13 - Ref: IDMA 2, IDW, C.3, P.53, O.3

List and describe the nine roles that people play in the information quality process, as identified by English.
1. Strategic and tactical knowledge workers – the absolute starting point for data definition and information quality
2. Operational knowledge workers – includes both downstream knowledge workers as well as the immediate or departmental knowledge workers; because knowledge workers perform the business processes of the enterprise, every application and database must be designed to capture information to meet their needs
3. Information producers – as the actual originators of information, they are key to information quality
4. Data intermediaries – they must assist with data definition (to the extent of defining what they need to know in order to perform their jobs efficiently) and in the application design process (to provide input as to the human factors required to facilitate efficient and effective data transcription)
5. Information managers, architects, modelers, and analysts – they create information and data models, and must help in data definition so that the needs of all information stakeholders are met
6. Database designers and administrators – they transform conceptual or "logical" data models into physical database designs and databases; they are accountable for the integrity of the physical design of databases, and for recoverability and security
7. Business and systems analysts – they facilitate the definition of the business processes and prepare specifications for applications
8. Application designers and developers – they design and implement the defined requirements and specifications
9. Data warehouse architects and designers – they design and plan data warehouses, data marts, and executive information systems; they support strategic and tactical processes versus operational processes
:
3.14 - Ref: IDMA 2, IDW, C.3, P.54–60, O.4

Which of the following is the BEST description of knowledge workers?
A. Information customers
B. Information custodians
C. Information producers
D. Information managers
A. Information customers:
3.15 - Ref: IDMA 2, IDW, C.3, P.60, O.4

According to English, why is the information value chain important?
The information value chain is important because it documents the customer-supplier relationships. It identifies who supplies which information to which customers.:
3.16 - Ref: IDMA 2, IDW, C.3, P.61, O.3

According to English, what are the two reasons why the principles of quality, although they are well known, have not been applied vigorously to information quality?
1. Senior management has not fully understood the ramifications of data and information as strategic enterprise resources, and they have not understood the principles for managing the information resources like other enterprise resources.
2. Senior management is also unaware of the real costs of non-quality information.
:
3.17 - Ref: IDMA 2, IDW, C.3, P.65 and 66, O.3

What are the three reasons for measuring the costs of poor quality information cited by English?
1. To determine the real business impact of information quality problems
2. To establish the business case for information quality improvement initiatives
3. To provide a baseline for measuring the effectiveness of information quality improvement projects:
4.1 - Ref: IDMA 2, IDW, C.7, P.199, O.1

English divides costs into two types. Name and define them.
1. Fixed – costs to create the capacity to produce something or create a record, which do not vary with the number of items produced or records created
2. Variable – costs per item or per record which increase incrementally as more items are produced or records are created:
4.2 - Ref: IDMA 2, IDW, C.7, P.200, O.2

According to English, information costs can be broken out into two areas. Name and define them.
1. Cost basis – costs of developing and maintaining the infrastructure
2. Value basis – costs of applying information:
4.3 - Ref: IDMA 2, IDW, C.7, P.200, O.2

Name the five processes in the resource life cycle as identified by English and, for each process, identify the cost area to which it belongs.
1. Plan – cost basis
2. Acquire – cost basis
3. Maintain – cost basis
4. Dispose – cost basis
5. Apply – value basis:
4.4 - Ref: IDMA 2, IDW, C.7, P.201, O.3

Two of the processes in the resource life cycle as identified by English are “maintain resource” and “apply resource”. Of the following, which correctly identifies the cost areas to which they belong?
A. Both represent the cost basis of the resource
B. “Maintain resource” represents the cost basis of the resource, and “apply resource” represents the value basis
C. “Maintain resource” represents the value basis of the resource, and “apply resource” represents the cost basis
D. Both represent the value basis of the resource
B. “Maintain resource” represents the cost basis of the resource, and “apply resource” represents the value basis:
4.5 - Ref: IDMA 2, IDW, C.7, P.201, O.3

What are the three requirements that English identifies for maximizing information value?
1. Well-defined information architecture designed to meet all knowledge worker needs
2. Information create processes that acquire all data needed about the business events or objects processed
3. Information maintenance processes that update data on the most feasible and timely basis:
4.6 - Ref: IDMA 2, IDW, C.7, P.206, O.3

According to English, there are four ways in which an unmanaged information life cycle leads to significantly increased costs of information. What are they?
1. Creating application-specific databases without regard to enterprise information architecture or downstream knowledge worker needs
2. Developing application-specific programs that only capture data required by the immediate beneficiaries or departments and not to meet the downstream knowledge worker needs
3. Acquiring application software packages without paying attention to the data architecture of the application
4. Creating unnecessary interface programs to transform data:
4.7 - Ref: IDMA 2, IDW, C.7, P.207 and 208, O.3

What are the three categories of information quality costs that English identifies?
1. Nonquality information costs
2. Information quality assessment or inspection costs
3. Information quality process improvement and defect prevention costs:
4.8 - Ref: IDMA 2, IDW, C.7, P.209, O.4

According to English, which of the following is a category of information quality costs that includes business rework costs?
A. Information quality process improvement and defect prevention costs
B. Nonquality information costs
C. Information quality management costs
D. Information quality assessment or inspection costs
B. Nonquality information costs:
4.9 - Ref: IDMA 2, IDW, C.7, P.209, O.4

According to English, what is the guidance for selection in identifying business performance measures for measuring nonquality information costs?
Select those measures that are most significantly hindered as a result of poor quality information.:
4.10 - Ref: IDMA 2, IDW, C.7, P.215, O.5

According to English, in calculating information costs it's important to break down the costs into three categories. What are they?
1. Infrastructure basis or investment
2. Value basis
3. Cost-adding basis
:
4.11 - Ref: IDMA 2, IDW, C.7, P.216, O.5

According to English, there are five kinds of resources for which cost data is needed in order to calculate nonquality information costs. What are they?
1. Time
2. Money
3. Materials
4. Facilities and equipment
5. Computing:
4.12 - Ref: IDMA 2, IDW, C.7, P.222 and 223, O.5

According to English, what is the objective of calculating information value?
The objective is to establish the relative importance of information to the business and to establish an economic justification for information quality improvement.:
4.13 - Ref: IDMA 2, IDW, C.7, P.231, O.5

The term “obligation to the customer never ceases” is a critical aspect of which of the following Points of Information Quality?
A. #7 Institute leadership for information quality
B. #5 Continually improve the process of data development and information production
C. #1 Create constancy of purpose for information quality improvement
D. #14 Accomplish the transformation for information and business quality
C. #1 Create constancy of purpose for information quality improvement:
5.1 - Ref: IDMA 2, IDW, C.11, P.340, O.1

“The cheapest and most effective way to assure information quality is at the source” – this is the message of which of the following Points of Information Quality?
A. #6 Institute training on information quality
B. #3 Cease dependence on information product quality assessments alone
C. #5 Continually improve the process of data development and information production
D. #8 Drive out fear of data uncertainty
B. #3 Cease dependence on information product quality assessments alone:
5.2 - Ref: IDMA 2, IDW, C.11, P.348, O.1

What are the three bad practices that are affected by Point 4 of Information Quality (“End the practice of awarding information business on the basis of cost alone”)?
1. Reward project development for on-time and within-budget alone
2. Model and build databases based upon the application requirements alone
3. Capture data where it is convenient for a given business area or application
:
5.3 - Ref: IDMA 2, IDW, C.11, P.350–355, O.1

Which of the following is so important that it is the subject of two of the 14 Points of Information Quality?
A. Vision
B. Training
C. Communication
D. Teamwork
B. Training:
5.4 - Ref: IDMA 2, IDW, C.11, P.364, O.1

According to English, what are the differences between common causes and special causes of quality problems?
Common causes of problems are those within the system or process, and can be corrected by process improvement. Special causes are those outside the system – for example, producing a duplicate customer record because a computer system was down and the record was manually entered.:
5.5 - Ref: IDMA 2, IDW, C.11, P.368 and 369, O.1

In the discussion of Point 10 of Information Quality (“Eliminate slogans and exhortations”), it is noted that three things must occur so that setting targets for information quality will have more than a temporary impact. What are they?
1. Processes to improve the processes to eliminate data-defect causes are put in place
2. Incentives are changed to incent quality production rather than simple “productivity”
3. Management and process owners have accountability for the quality of the information products produced by the processes in their charge
:
5.6 - Ref: IDMA 2, IDW, C.11, P.387, O.1

Which of the following Points of Information Quality involves the PDCA cycle?
A. #14 Accomplish the transformation for information and business quality
B. #5 Continually improve the process of data development and information production
C. #9 Break down barriers between information systems and business areas
D. #6 Institute training on information quality
A. #14 Accomplish the transformation for information and business quality:
5.7 - Ref: IDMA 2, IDW, C.11, P.397, O.1

Differentiate the three “generations” of data quality systems, as described by Redman.
1. First-generation systems are characterized by finding and correcting errors
2. Second-generation systems are characterized by preventing errors
3. Third-generation systems are characterized by making it (nearly) impossible to make errors:
5.8 - Ref: IDMA 2, DQFG, C.15, P.75 and 76, O.2

List, and state the rationale for, the 12 management infrastructure elements of Redman’s second-generation data quality systems.
1. Data quality council
• Emphasizes senior management commitment to quality
• Provides the cross-functional coordination and support necessary to carry out the policy and projects
2. Data quality vision
• Forces organization to think about its long-term data and information needs and their connection to business success
• Broad communication and alignment
• Motivates action in the right direction
• Provides basis for decision making
3. Data quality policy
• Forces organization to think broadly and deeply about quality
• Provides insiders and outsiders a superior form of “predictability”
• Broad communication and alignment
• Reduces “lone ranger” mentality
4. Business case for data quality
• Forces the enterprise to be clear about its objectives
- Cost reduction
- Customer service
- Improved decision making
• Demonstrates need through cost/benefit analysis
• Helps align decision makers
5. Data supplier management
• Much data comes from suppliers; it is difficult to find and correct errors downstream
• Predictable input into information chains
6. Information chain management
• Data and information cross organizational boundaries as information products are created
• Most “problems” and/or opportunities occur on boundaries
• Proven methods for making and sustaining improvements
• “Control” yields predictable performance
7. Innovation
• New data, or exactly the right data, is a source of competitive advantage
• New ways of looking at data (data mining) yield new value from “old” data
• “Informationalize” product and service
8. Standardization
• Common definitions of key terms promote communication and alignment
• Other common definitions can create new opportunities
• Other standards promote cross-organizational comparison
9. Management of data culture
• Most enterprises/organizations have first-generation data quality systems; second-generation systems require them to think and act differently
• Experience shows that change is always risky, but risks can be managed and/or reduced
• Data engender politics and passions like no other resource
10. Database of record
• Enterprise can assure itself that certain data are of high quality
• Helps manage and reduce redundancy
• Especially useful for common data such as customer and employee data
• Stewardship provides a means to coordinate the needs of the various information chains traversing a given database
• Stewards have clear standards to work toward
11. Strategic data quality management
• Data and information are the critical assets of the Information Age; those who manage them have a distinct advantage
• Data and information are related to strategy and not bound up with information technology
12. Training and education
• People must have the background and tools to carry out their assignments
:
5.9 - Ref: IDMA 2, DQFG, C.15, P.78–84, O.3

List, and state the rationale for, the 15 technical capabilities elements of Redman’s second-generation data quality systems.
1. Identification of information chains
• Focuses attention on the most important assets
2. Information chain description
• The simple act of describing what is actually happening often surfaces incongruities
• Improvement consists of changes to the information chain, to better meet needs of specific customers
• Key steps can be measured and controlled
3. Customer needs analysis
• The customer is the final arbiter of quality
• Alignment of those working on information chains in the direction of customer needs
• Identifies needed planning and improvement projects
4. Measurement
• Replaces opinion with fact
• Quantification of costs, benefits, trade-offs, etc.
• Permits quality control of information chain performance
5. Quality control
• Predictability
• Assures information chain performance to standards
6. Quality planning
• Helps assure that information chains can consistently meet customer needs
7. Quality improvement
• Experience suggests that improvement works best “project by project”
• Systematic process helps make improvement routine
• Inculcation of “quality culture”
8. Information chain (re)design
• New information chains are often needed to satisfy new customer needs
• Redesign is an important component of reengineering
• Well-planned and designed information chains perform better and faster and at lower total cost, and ease and speed development and implementation of supporting information technologies
9. Inspection and test (data editing)
• Prevents bad data from proceeding further into or through an information chain
10. Quality assurance
• To provide management confidence that the organization is performing at optimal levels of effectiveness and efficiency
• To provide management confidence that control plans are being executed and the quality system is deployed and functioning
• To highlight areas for improvement
11. Document assurance
• To mitigate the use of non-applicable data policies, procedures, documents, standards, etc.
12. Rewards and recognition
• To continue to achieve improvements
• To maintain enthusiasm and commitment to meeting customer needs
• To help assure that standards are consistently met
• To enhance the extent to which staff embrace the data quality system
13. Domain knowledge
• Good quality systems always take advantage of “domain information”
• Data and information are elusive concepts, which need to be clarified and disseminated
• They differ in some critical ways from other assets
14. Standards
• Help make future performance predictable
• Acceptance/rejection of an information product
15. Quality handbook
• Codifies the quality system
• Increases the “reach” of the quality council
• Many enterprises become comfortable with the concepts through the creation of the handbook
:
5.10 - Ref: IDMA 2, DQFG, C.15, P.78, 85–92, O.3

Which of the following is NOT one of the elements of a second-generation data quality system that are common to most successful programs?
A. Data quality policy
B. Strategic data quality management
C. Management of data culture
D. Information chain management
B. Strategic data quality management:
5.11 - Ref: IDMA 2, DQFG, C.15, P.78, O.3

Which of the following is NOT one of the elements of a second-generation data quality system that are common to most successful programs?
A. Quality handbook
B. Quality improvement
C. Quality control
D. Quality planning
A. Quality handbook:
5.12 - Ref: IDMA 2, DQFG, C.15, P.78, O.3

Which of the following is one of the elements of a second-generation data quality system that are common to most successful programs?
A. Quality handbook
B. Domain knowledge
C. Information chain management
D. Strategic data quality management
C. Information chain management:
5.13 - Ref: IDMA 2, DQFG, C.15, P.78, O.3

Which of the following is one of the elements of a second-generation data quality system that are common to most successful programs?
A. Data quality vision
B. Training and education
C. Quality assurance
D. Customer needs analysis
D. Customer needs analysis:
5.14 - Ref: IDMA 2, DQFG, C.15, P.78, O.3

Of the reasons below, which is NOT one of the seven reasons why quality improvement initiatives fail?
A. Failure to fully implement an oversight team
B. Failure to fully understand the concept
C. Lack of management understanding and active involvement
D. Failure to manage change
A. Failure to fully implement an oversight team:
6.1 - Ref: IDMA 2, IDW, C.13, P.423, O.1

What is information fraud?
When people hoard information for their own benefit rather than for the benefit of the enterprise, it is tantamount to fraud. It forces the enterprise to spend money and time to re-acquire information that is already known.:
6.2 - Ref: IDMA 2, IDW, C.13, P.425, O.1

Identify and briefly describe the seven critical success factors for implementing an information quality environment.
1. Understand fully what information quality improvement is and why you are doing it
The purpose of using information quality improvement as a tool must be to make the business perform better and to accomplish its mission successfully. Information quality improvement for any other reason will fail.
2. Implement information quality improvement effectively
Information quality improvement must be implemented as a culture change and habit.
3. Implement information quality improvement on the right problem
Improving trivial information will cause the initiative to fail. Don’t improve a process that should never have been performed in the first place.
4. Training and communication
Training is paramount. Discover the concerns of people whose procedures may be changing and get their feedback.
5. Incentives for information quality
Management must replace the performance measures that encourage the creation of nonquality information with performance measures that encourage creation of quality information.
6. Management commitment to information quality improvement as a management tool
Permanent management commitment can be obtained when the information quality improvement initiatives help management to achieve and sustain their business objectives. This commitment can be sustained when management fully comprehends the relationship between information quality improvement and the accomplishment of their business objectives.
7. Managing change
Change in behavior at its best disrupts enterprise function, and at its worst threatens enterprise survival. You must assure that change is “managed”: plan, organize, lead, and control for change.
:
6.3 - Ref: IDMA 2, IDW, C.13, P.425 and 426, O.2

Which of the following steps in English’s action plan for implementing information quality comes before the others?
A. Conduct an information quality management maturity assessment with senior management and provide formal education
B. Identify and empower an information quality leader to take some action and get started
C. Define information quality principles, processes, and objectives
D. Define the business problem to be solved, and the measures for the information quality improvement project success
B. Identify and empower an information quality leader to take some action and get started:
6.4 - Ref: IDMA 2, IDW, C.13, P.426 and 427, O.3

Which of the following steps in English’s action plan for implementing information quality comes after the others?
A. Establish a regular mechanism of communication and education with senior management to sustain their involvement and commitment
B. Conduct a customer satisfaction survey of the information stakeholders to find out their frustrations and barriers as a result of nonquality information
C. Analyze the systemic barriers to information quality and recommend changes
D. Calculate customer lifetime value, if you can, to estimate missed and lost opportunity resulting from nonquality information
A. Establish a regular mechanism of communication and education with senior management to sustain their involvement and commitment:
6.5 - Ref: IDMA 2, IDW, C.13, P.426 and 427, O.3

What are the five stages of maturity in the information quality management maturity grid?
1. Stage 1 – Uncertainty (Initial)
2. Stage 2 – Awakening (Repeatable)
3. Stage 3 – Enlightenment (Defined)
4. Stage 4 – Wisdom (Managed)
5. Stage 5 – Certainty (Optimizing):
6.6 - Ref: IDMA 2, IDW, C.13, P.429–432, O.4

Identify and briefly describe the five measurement categories in the information quality management maturity grid.
1. Management understanding and attitude
Measures management’s comprehension of information quality management as a tool for managing and improving the business.
2. Information quality organization status
Measures whether the enterprise is organized in a way to lead it to make quality information happen.
3. Information quality problem handling
Measures how the organization acts or reacts to information quality problems.
4. Cost of information quality as a percent of revenue or operating budget
Measures how well the organization knows its cost of information quality.
5. Information quality improvement actions
Measures how well the organization progresses in implementing improvement initiatives.:
6.7 - Ref: IDMA 2, IDW, C.13, P.432–437, O.5

Which of the following is NOT one of the three categories of costs of quality?
A. Costs of information quality assessments
B. Costs of information quality remediation
C. Costs of information process quality improvement and data-defect prevention
D. Costs of information scrap and rework, and process failure
B. Costs of information quality remediation:
6.8 - Ref: IDMA 2, IDW, C.13, P.435, O.6

If an organization has just implemented the 14 Points of Information Quality, what stage of information quality management maturity has it reached?
A. Stage 2
B. Stage 5
C. Stage 3
D. Stage 4
C. Stage 3:
6.9 - Ref: IDMA 2, IDW, C.13, P.448, O.7

Which one of the items below is one of the tasks that English identifies as being involved in sustaining information quality?
A. Communicate information regarding the cause of any information quality initiative “failure” to the information quality team
B. Quantify the additional costs involved in the improvement process
C. Develop a system of incentives to reward ideas for improving information quality
D. Quantify the costs of the status quo before the quality initiative begins
D. Quantify the costs of the status quo before the quality initiative begins:
6.10 - Ref: IDMA 2, IDW, C.13, P.449 and 450, O.8

What are the six key information quality job functions?
1. Information quality manager or leader
2. Information architecture quality analyst
3. Data cleanup coordinator, data quality coordinator, or data warehouse quality coordinator
4. Information quality analyst
5. Information quality process improvement facilitator
6. Information quality training coordinator:
6.11 - Ref: IDMA 2, IDW, C.13, P.451–453, O.9

Which of the following is NOT one of the three steps Redman lists that can be taken to advance the data culture?
A. Manage change actively
B. Be proactive in soliciting ideas for data quality improvement
C. Establish an appropriate relationship between data, information, and information technology
D. Embrace the organization’s current culture and work (most of the time) within it
B. Be proactive in soliciting ideas for data quality improvement:
6.12 - Ref: IDMA 2, DQFG, C.34, P.197, O.10

"Which of the following is NOT one of the four components of a successful change listed by Redman?
A. Actionable first steps
B. A clear, shared vision
C. A sense of urgency
D. An emphasis on communication
D. An emphasis on communication:
6.13 - Ref: IDMA 2, DQFG, C.34, P.200, O.11

"For each of the four components of a successful change, describe what happens to the change process when that component is missing.
1. Missing a sense of urgency – result is a low-priority process with little activity
2. Missing a clear, shared vision – result is a fast start that fizzles, directionless
3. Missing the capacity to change – result is anxiety, frustration
4. Missing actionable first steps – result is haphazard efforts, false starts:
6.14 - Ref: IDMA 2, DQFG, C.34, P.200, O.11

"Define “information stewardship”.
Information stewardship is the willingness to be accountable for a set of business information for the well-being of the larger organization by operating in service, rather than in control of those around us.:
7.1 - Ref: IDMA 2, IDW, C.12, P.402, O.1

"Compare and contrast “ownership” versus “stewardship”.
An owner possesses the rights to something. A steward has accountability for managing something that belongs to someone else.:
7.2 - Ref: IDMA 2, IDW, C.12, P.402, O.1

"Identify and describe the two key information stewardship teams.
1. Business information stewardship team, which consists of all or a subset of the business information stewards. Typical responsibilities include:
• resolving information-related issues across information groups
• providing support and knowledge transfer among business information stewards
• providing or recommending education for stewards at all levels
• identifying or recommending information policies
• identifying, recommending, or approving data standards
2. Executive information steering team, made up of the senior management team. Typical responsibilities include:
• establishing enterprise vision, mission, values, and strategies that address information as a business resource
• being accountable to the information stakeholders
• establishing and issuing information policy
• resolving major information-related issues that cannot be resolved by the business information stewardship team
• establishing performance measures for information
• effecting culture change for information accountability and information quality as a management tool:
7.3 - Ref: IDMA 2, IDW, C.12, P.413, O.2

"Redman identified several support tools for information stewards. Which of the following is NOT one of these tools?
A. Data definition conflict resolution process
B. Training
C. Information stewardship guidelines
D. Information policy
A. Data definition conflict resolution process:
7.4 - Ref: IDMA 2, IDW, C.12, P.416 and 417, O.3

"What are the four factors that Redman says complicate the attempt to understand customer needs?
1. Customers don’t know what they want
2. Customers can have a stunning array of needs
3. Many customers have different needs, which may conflict
4. Customer needs change all the time:
7.5 - Ref: IDMA 2, DQFG, C.17, P.101, O.4

"The step “rate existing information product” is part of which chain?
A. Customer needs chain
B. Product value chain
C. Quality improvement chain
D. Business value chain
A. Customer needs chain:
7.6 - Ref: IDMA 2, DQFG, C.17, P.103, O.5

"List the four steps involved in creating and applying edits.
1. Specify the domains of allowed values
2. Translate them into business rules
3. Apply them to the data
4. For each rule, list all records that fail:
7.7 - Ref: IDMA 2, DQFG, C.20, P.120, O.6
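
Redman's four steps can be sketched in a few lines of Python. This is a minimal illustration, not Redman's own implementation; the field names, domains, and records below are all invented for the example.

```python
# 1. Specify the domains of allowed values (hypothetical fields)
domains = {
    "state": {"NY", "NJ", "CT"},
    "policy_type": {"auto", "home"},
}

# 2. Translate them into business rules: one predicate per field
# (the default argument binds each field's domain at definition time)
rules = {field: (lambda value, allowed=allowed: value in allowed)
         for field, allowed in domains.items()}

# 3. Apply the rules to the data
records = [
    {"id": 1, "state": "NY", "policy_type": "auto"},
    {"id": 2, "state": "TX", "policy_type": "home"},   # fails the state rule
    {"id": 3, "state": "NJ", "policy_type": "boat"},   # fails the policy_type rule
]

# 4. For each rule, list all records that fail
failures = {
    field: [r["id"] for r in records if not check(r[field])]
    for field, check in rules.items()
}
print(failures)  # {'state': [2], 'policy_type': [3]}
```

The same predicates could be applied to an existing database, as input criteria, or in-chain at data entry — the three uses Redman identifies.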

"What are the three ways Redman identifies for how data edits may be used?
1. Editing can be applied to an existing database
2. Edits can be applied as input criteria to a database
3. Edits can be applied as people enter data (in-chain editing):
7.8 - Ref: IDMA 2, DQFG, C.20, P.121, O.7

"Compare and contrast reengineering versus data quality improvement.
In some respects, reengineering is the large-scale version of data quality improvement. Reengineering aims for quantum improvements by considering all aspects of large information chains and potential new roles for information technology … [D]ata quality improvement proceeds incrementally.:
7.9 - Ref: IDMA 2, DQFG, C.22, P.131 and 132, O.8

"Which of the following is NOT part of the data quality improvement cycle that Redman describes?
A. Document costs of status quo
B. Conduct root cause analysis
C. Hold the gains
D. Form and charter project team
A. Document costs of status quo:
7.10 - Ref: IDMA 2, DQFG, C.22, P.132, O.9

"Which of the following is NOT one of the important components of setting targets for improved quality listed by Redman?
A. Clear linkage to business purpose
B. Focus on rate of improvement
C. Focus on quality levels
D. Clear and aggressive time-frames
C. Focus on quality levels:
7.11 - Ref: IDMA 2, DQFG, C.23, P.137 and 138, O.10

"What are the five categories of principles of information chain design identified by Redman?
1. Overall focus
2. Design principles
3. Design needed measurements into the new information chain
4. Design for error handling
5. Data design:
7.12 - Ref: IDMA 2, DQFG, C.24, P.143 and 144, O.11

"Which of the following is NOT one of the factors Redman lists for why reengineering has gone out of favor?
A. Reengineering is often too incremental rather than transformative
B. High failure rate
C. Came to be associated with down-sizing
D. Many organizations failed to realize how hard it was
A. Reengineering is often too incremental rather than transformative:
7.13 - Ref: IDMA 2, DQFG, C.25, P.147, O.12

"List the seven situations for which Redman says reengineering should be considered.
1. The gap between customer needs and current performance is too large (and growing)
2. A competitor has achieved a significant advantage through a better information chain
3. It is clear that thinking differently is required
4. One already has an idea that can, if successfully implemented, produce enormous advantages
5. Your organization can’t routinely deliver a (correct) printed invoice and credit the proper customer
6. A competitor is using the internet to invoice and receive payment and has a tremendous cost advantage
7. Customers are fleeing to your competitor in droves:
7.14 - Ref: IDMA 2, DQFG, C.25, P.148, O.13

"What are the two major approaches to improving data quality as identified by Redman?
1. Finding and fixing errors (clean-up)
2. Preventing errors:
8.1 - Ref: IDMA 2, DQFG, Part C, P.51, O.1

"What are the four choices Redman identifies for dealing with a “polluted” database?
1. Deal with the impacts of the erred data -- don’t correct it
2. Conduct major database clean-ups
3. Conduct small database clean-ups daily; for example, cleaning data as it is added to the database, or clean data prior to its use
4. Determine the sources of erred data upstream, in the information chains that create new data:
8.2 - Ref: IDMA 2, DQFG, C.10, P.54, O.2

"What does Redman identify as the most common choice for dealing with a “polluted” database, and why?
The most common choice is to perform complete database clean-ups. The reasons for the “popularity” of this choice include:
• the large number of good computer tools that can automate error detection
• error correction can often be handled by relatively cheap staff
• it is easy to rally support for it
• it can be fairly well-delineated and completed in a reasonable amount of time:
8.3 - Ref: IDMA 2, DQFG, C.10, P.54 and 55, O.2

"What are the two most pertinent conclusions identified by Redman that can be derived by reviewing the plots of database and information chain quality levels over time?
1. Over time, clean-ups do not address the upstream information chains, so those chains never improve
2. Preventing errors does not address existing data although, over time, as new high-quality data replaces old, erred data, the quality of the database improves:
8.4 - Ref: IDMA 2, DQFG, C.11, P.57, O.3

"Which of the following is NOT one of the three data-related factors that Redman identifies as bearing on the choice of approach to data quality improvement?
A. Rate of new data creation
B. Extent of existing data quality problems
C. Utility of data
D. Expected lifetime of data
B. Extent of existing data quality problems:
8.5 - Ref: IDMA 2, DQFG, C.12, P.61, O.3

"Redman identifies two kinds of information chains that create new data – what are they?
1. Those that define new types of data (i.e., new data models)
2. Those that create new data values:
8.6 - Ref: IDMA 2, DQFG, C.12, P.61, O.3

"Which of the following is one of the rules Redman identifies for selecting the best approach to addressing data quality issues?
A. Look for quick fixes first
B. Prevent, then clean
C. Don’t ignore any data -- one can’t predict what will be important in the future
D. Data clean-up alone is almost never a viable short-term strategy
B. Prevent, then clean:
8.7 - Ref: IDMA 2, DQFG, C.13, P.66, O.4

"What are the three categories of information product improvement identified by English?
1. Data cleansing of source data in place
2. Data cleansing for data conversion
3. Data cleansing and reengineering for data warehousing:
8.8 - Ref: IDMA 2, IDW, C.8, P.238, O.5

"Which of the following statements concerning the difference between information product improvement and information process improvement is MOST correct?
A. Ishikawa charts are rarely used in information process improvement
B. Eliminating domain value redundancy is part of information process improvement
C. Data cleansing and reengineering for data warehousing is not one of the categories of information product improvement
D. Information process improvement attacks the root causes of defective data
D. Information process improvement attacks the root causes of defective data:
8.9 - Ref: IDMA 2, IDW, C.8, P.238, O.6

"Which of the following is one of the four components of a data warehouse identified by English?
A. Data analysis and presentation processes
B. Object data store
C. Operational database
D. Data mart
A. Data analysis and presentation processes :
8.10 - Ref: IDMA 2, IDW, C.8, P.240 and 241, O.7

"List and describe the two categories of data defects identified by English.
1. Data definition and architecture defects: defects in product/data specification, such as inconsistent definitions of attributes
2. Data content defects: defects in the actual produced information products, such as missing or incorrect data values:
8.11 - Ref: IDMA 2, IDW, C.8, P.242–244, O.8

"What are the nine steps in English’s “reengineering and data cleansing process”?
1. Identify data sources
2. Extract and analyze source data
3. Standardize data
4. Correct and complete data
5. Match and consolidate data
6. Analyze data defect types
7. Transform and enhance data into target
8. Calculate derivations and summary data
9. Audit and control data extract, transformation and loading:
8.12 - Ref: IDMA 2, IDW, C.8, P.245–276, O.9
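
Step 5, "match and consolidate data", is the most mechanical of the nine and lends itself to a small sketch. The records and the match rule (last name plus ZIP) below are deliberately crude inventions for illustration; production matching uses far more sophisticated keys and scoring.

```python
# Hypothetical customer records containing a duplicate entry
records = [
    {"name": "J. Smith",   "zip": "10001", "phone": None},
    {"name": "John Smith", "zip": "10001", "phone": "555-0100"},
    {"name": "Ann Lee",    "zip": "07030", "phone": None},
]

def match_key(rec):
    # Match on last name plus ZIP (a deliberately crude rule for illustration)
    return (rec["name"].split()[-1].lower(), rec["zip"])

consolidated = {}
for rec in records:
    key = match_key(rec)
    if key not in consolidated:
        consolidated[key] = dict(rec)
    else:
        # Consolidate: fill missing values from the matched duplicate
        for field, value in rec.items():
            if consolidated[key][field] is None:
                consolidated[key][field] = value

print(len(consolidated))  # 2 surviving records
```

Note how consolidation can enrich the surviving record: the first "Smith" entry picks up the phone number its duplicate carried.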

"Which of the following is NOT an example of nonstandardized data values?
A. Format inconsistencies
B. Literal (non-numeric) data values
C. Embedded meaning in data values
D. Domain value redundancy
B. Literal (non-numeric) data values:
8.13 - Ref: IDMA 2, IDW, C.8, P.253 and 254, O.9

"Listed below are general categories of data. Which of the combinations shown below are the three general categories of data identified by English that have specific cleansing techniques?
I Name and address data
II Event entity types
III Data attribute types
IV Permanent object entity types
A. I, II, IV
B. I, III, IV
C. II, III, IV
D. I, II, III
A. I, II, IV:
8.14 - Ref: IDMA 2, IDW, C.8, P.258, O.9

"What are the eight data transformation categories identified by English?
1. Simple data extraction
2. Domain value conversion
3. Codify or classify textual data
4. Vertical filter
5. Horizontal filter
6. Matching and consolidation
7. Data evaluation and selection
8. Integration:
8.15 - Ref: IDMA 2, IDW, C.8, P.269–273, O.9
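
Three of these categories — domain value conversion, vertical filter, and horizontal filter — can be shown together in one small sketch. The source layout, the legacy code values, and the NY-only selection rule are all assumptions made up for the example.

```python
# Source rows with a legacy gender code (hypothetical layout)
source = [
    {"id": 1, "sex": "1", "state": "NY", "premium": 500},
    {"id": 2, "sex": "2", "state": "TX", "premium": 750},
    {"id": 3, "sex": "1", "state": "NY", "premium": 300},
]

# Domain value conversion: map legacy codes to the target domain
sex_map = {"1": "M", "2": "F"}

# Vertical filter: keep only the columns the target needs (drop "state")
# Horizontal filter: keep only the rows that qualify (NY business here)
target = [
    {"id": r["id"], "sex": sex_map[r["sex"]], "premium": r["premium"]}
    for r in source
    if r["state"] == "NY"
]
print(target)
```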

"What are the two fundamental approaches English lists for dealing with reoccurring problems?
1. Be reactive to problems and fix the symptoms or consequences
2. Be proactive by analyzing the cause of problems and eliminating the cause:
8.16 - Ref: IDMA 2, IDW, C.9, P.286, O.10

"What are the three categories English identifies for the costs of quality?
1. Cost of information scrap and rework
2. Cost of information quality assessment
3. Cost of information process quality improvement:
8.17 - Ref: IDMA 2, IDW, C.9, P.289, O.10

"What are the five steps in English’s “improve information process quality” process?
1. Select process for information quality improvement
2. Develop plan for information quality improvement
3. Implement information quality improvements
4. Check impact of information quality improvements
5. Act to standardize information quality improvements:
8.18 - Ref: IDMA 2, IDW, C.9, P.289–300, O.10

"What are the four “M’s” in a cause-and-effect diagram?
1. Manpower
2. Materials
3. Machines
4. Methods:
8.19 - Ref: IDMA 2, IDW, C.9, P.294, O.10

"The Fishbone Diagram is associated with which of the following?
A. Shewart
B. Taguchi
C. Ishikawa
D. English
C. Ishikawa:
8.20 - Ref: IDMA 2, IDW, C.9, P.294, O.10

"Which of the following is NOT one of the characteristics of a high-quality database listed by English?
A. Reusable
B. Stable
C. Flexible
D. Single-platform
D. Single-platform:
8.21 - Ref: IDMA 2, IDW, C.9, P.302, O.10

"English identified four categories of best practices for information quality. Into which of these categories does English place the best practice of “appointing business information stewards for critical data”?
A. Business process and application design
B. Data definition and information architecture
C. Management and environment
D. Business procedures and data capture
B. Data definition and information architecture:
8.22 - Ref: IDMA 2, IDW, C.9, P.302 and 303, O.11

"What are the six requirements English identifies for the effective use of information quality tools?
1. Understanding the problem you are solving
2. Understanding the kinds of technologies available and their general functionality
3. Understanding the capabilities of the tools
4. Understanding the limitations of the tools
5. Selecting the right tools based on your requirements
6. Using the tools properly:
9.1 - Ref: IDMA 2, IDW, C.10, P.312, O.1

"Describe information quality analysis tools.
Information quality analysis tools are tools that extract data from a database or process, measure its quality, such as validity or conformance to business rules, and report its analysis.:
9.2 - Ref: IDMA 2, IDW, C.10, P.312, O.2

"Describe business rule discovery tools.
Business rule discovery tools are tools that analyze data to discover patterns and relationships in the data itself. The purpose is to identify business rules as actually practiced by analyzing patterns in the data.:
9.3 - Ref: IDMA 2, IDW, C.10, P.312, O.2

"Describe data reengineering, cleansing, and transformation tools.
Data reengineering, cleansing, and transformation tools are data “correction” tools that extract, standardize, transform, correct (where possible) and enhance data, either in place of or in preparation for migrating the data into a data warehouse.:
9.4 - Ref: IDMA 2, IDW, C.10, P.312, O.2

"Describe information quality defect prevention tools.
Information quality defect prevention tools are tools that prevent data errors or violations of business rules from getting into a database in the first place. Application programs that create and update data call defect-prevention product modules or routines. These products apply business rule and quality tests during the create and update processes.:
9.5 - Ref: IDMA 2, IDW, C.10, P.312, O.2

"Describe metadata management and quality tools.
Metadata management and control tools are tools that provide quality management of metadata, such as definition and control of business rules, data transformation rules, or provide for quality assessment or control of metadata itself, such as conformance to data naming standards.:
9.6 - Ref: IDMA 2, IDW, C.10, P.312 and 313, O.2

"Which of the following information quality tool categories includes tools that are involved in the most parts of the information value chain?
A. Data reengineering, cleansing, and transformation tools
B. Business rule discovery tools
C. Information quality defect prevention tools
D. Information quality analysis tools
D. Information quality analysis tools:
9.7 - Ref: IDMA 2, IDW, C.10, P.313, O.2

"What are the nine information quality tool classifications?
1. Analysis
2. Cleansing
3. Cleansing: general data types
4. Cleansing: name and address data
5. Metadata quality
6. Data defect prevention
7. Rule discovery
8. Service provider also
9. Service provider only:
9.8 - Ref: IDMA 2, IDW, C.10, P.314, O.3

"In which of the following information quality tool categories will you find tools that are used to match and consolidate duplicate data?
A. Data reengineering, cleansing, and transformation tools
B. Metadata management and quality tools
C. Business rule discovery tools
D. Information quality analysis tools
A. Data reengineering, cleansing, and transformation tools:
9.9 - Ref: IDMA 2, IDW, C.10, P.320, O.2

"In which of the following information quality tool categories will you find tools that check that data names and abbreviations conform to data standards, but have the limitation that they cannot assess whether the data standards are “good” standards?
A. Information quality analysis tools
B. Business rule discovery tools
C. Information quality defect prevention tools
D. Metadata management and quality tools
D. Metadata management and quality tools:
9.10 - Ref: IDMA 2, IDW, C.10, P.325, O.2

"Of the following, which is NOT one of the steps English listed in evaluating information quality tools?
A. Determine whether an in-house tool meets requirements
B. Determine what category/categories of information quality function automation are required
C. Define all the business problems you are solving
D. Work with information stakeholders to define requirements
A. Determine whether an in-house tool meets requirements:
9.11 - Ref: IDMA 2, IDW, C.10, P.326, O.4

"Of the following, which is NOT one of the information quality management technique categories?
A. Information gathering and analysis techniques
B. Information quality control techniques
C. Problem-solving and improvement techniques
D. Cost/benefit analysis techniques
D. Cost/benefit analysis techniques:
9.12 - Ref: IDMA 2, IDW, C.10, P.330 and 331, O.5

"What are the three essential data quality tools that Redman identified?
1. Good measurement tool to quantify data quality levels
2. Good, basic statistical package
3. Basic project management tool:
9.13 - Ref: IDMA 2, DQFG, C.29, P.171, O.6

"Of the following, which is NOT one of the data quality tools Redman identified?
A. ETL tools (extract, transform and load)
B. Scorecard tools
C. Data repositories
D. In-line editors
A. ETL tools (extract, transform and load):
9.14 - Ref: IDMA 2, DQFG, C.29, P.171, O.6

"What are five ways that measurements can help address complaints such as “There are too many missing values” and “These data have too many errors,” as identified by Redman?
1. Measurements replace anecdote with fact
2. Measurements inform management of the depth of a problem, allowing it to be meaningfully compared with other problems
3. Measurements help localize sources of important problems
4. Measurements confirm that solutions really work
5. Measurements help customers understand what they really get:
10.1 - Ref: IDMA 2, DQFG, C.18, P.107, O.1

"The right measurements depend on the maturity of an organization. Describe the right measurement for an organization just starting out.
Organizations just starting out do not need sophisticated, scientifically defensible measurements. They need simple measures that indicate where they are, the impact(s), and the first couple of opportunities for improvement.:
10.2 - Ref: IDMA 2, DQFG, C.18, P.108, O.2

"Redman identified eight steps in a simple measurement chain that could be used to answer the question: “Is data quality an issue?” What are they?
1. Expected action/by whom
2. Select business operation
3. Select needed data fields
4. Draw small sample
5. Inspect sampled records
6. Estimate impact on business operation
7. Summarize and present results
8. Follow-up:
10.3 - Ref: IDMA 2, DQFG, C.18, P.108, O.3
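
Steps 4 through 6 of this chain — draw a small sample, inspect it, estimate the business impact — reduce to a few lines of code. This sketch assumes the inspection result is already recorded per record and invents the cost-per-error and annual-volume figures; in practice the inspection in step 5 is manual and the impact parameters come from the business.

```python
import random

# Hypothetical records; "is_correct" stands in for the result of manual
# inspection of each sampled record against its source
random.seed(0)
records = [{"id": i, "is_correct": random.random() > 0.1} for i in range(1000)]

# Step 4: draw a small sample; Step 5: inspect the sampled records
sample = random.sample(records, 50)
errors = sum(1 for r in sample if not r["is_correct"])
error_rate = errors / len(sample)

# Step 6: estimate impact on the business operation (assumed figures)
cost_per_error = 25.00
annual_volume = 100_000
estimated_annual_cost = error_rate * annual_volume * cost_per_error
print(f"error rate {error_rate:.1%}, est. annual cost ${estimated_annual_cost:,.0f}")
```

The point of so simple a measurement, per Redman, is not scientific defensibility but replacing anecdote with a number someone is expected to act on.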

"According to Redman, what question should be asked before measurement is undertaken?
Who is expected to do what with the results?:
10.4 - Ref: IDMA 2, DQFG, C.18, P.108–110, O.3

"Which of the following is NOT one of the four types of changes or errors to data fields listed by Redman?
A. Operations changes
B. Digitization changes
C. Normalization changes
D. Translation changes
B. Digitization changes:
10.5 - Ref: IDMA 2, DQFG, C.19, P.114 and 115, O.4

"In Redman’s discussion of data tracking, a department’s change of a data value because it doubts the work performed at a previous step is identified as:
A. A transposition change
B. An operations change
C. A normalization change
D. A translation change
B. An operations change:
10.6 - Ref: IDMA 2, DQFG, C.19, P.115, O.4

"What are the five key steps to the data tracking chain as identified by Redman?
1. Select a sample of records to track
2. Obtain the information needed for each
3. Determine which changes represent errors
4. Analyze the errors and cycle time data, present results
5. Use results to establish control and/or make improvements:
10.7 - Ref: IDMA 2, DQFG, C.19, P.116, O.4
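
The mechanical core of data tracking — comparing a record's field values at successive steps of the chain and flagging the changes (steps 2 and 3 above) — can be sketched briefly. The step names and field values here are invented; deciding which flagged changes are genuine errors rather than legitimate operations changes still requires human judgment.

```python
# Snapshots of one tracked record at three hypothetical processing steps
snapshots = [
    ("intake",  {"zip": "10001", "amount": 100}),
    ("rating",  {"zip": "10001", "amount": 100}),
    ("billing", {"zip": "10010", "amount": 100}),  # zip changed downstream
]

# Compare each step's values with the previous step's and record changes
changes = []
for (prev_step, prev), (step, cur) in zip(snapshots, snapshots[1:]):
    for field in cur:
        if cur[field] != prev[field]:
            changes.append((step, field, prev[field], cur[field]))

print(changes)  # [('billing', 'zip', '10001', '10010')]
```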

"The paper “Monitoring the Quality of Data” notes that indications on four aspects of a company’s data are produced by periodic audits performed using the guidelines outlined in ISO’s Quality of Data Audit Guide. Which of the following is NOT one of the indications listed?
A. How current the processing of data is
B. How accurately and completely a company’s data reflects the true composition of the risks insured
C. How actual processing cycle-time compares with the cycle-time displayed in the procedures documentation
D. Differences between statistical and other insurance company data to be reconciled
C. How actual processing cycle-time compares with the cycle-time displayed in the procedures documentation:
10.8 - Ref: IDMA 2, R.1, P.2, O.5

"What are the seven steps involved in auditing data within an organization, as listed in ISO’s Quality of Data Audit Guide?
1. Audit preparation
2. Test completeness (reconciliation)
3. Select sample criteria
4. Secure and test sample
5. Error detection and correction
6. Analysis recommendation and report
7. Follow-up:
10.9 - Ref: IDMA 2, R.1, P.2–4, O.6

"As shown in ISO’s Quality of Data Audit Guide, which of the following is the step in performing a data quality audit in which “audit data entry and transfer” is carried out?
A. Error detection and correction
B. Audit preparation
C. Select sample criteria
D. Secure sample and test
D. Secure sample and test:
10.10 - Ref: IDMA 2, R.1, P.3, O.6

"As shown in ISO’s Quality of Data Audit Guide, which of the following is the step in performing a data quality audit in which the adequacy of re-entry procedures and re-entry controls is determined?
A. Error detection and correction
B. Audit preparation
C. Select sample criteria
D. Secure sample and test
A. Error detection and correction:
10.11 - Ref: IDMA 2, R.1, P.3, O.6

"ISO’s Quality of Data Audit Guide identifies three types of accuracy ratios commonly used in audits. Which of the following is NOT one of these ratios?
A. Data element accuracy ratio
B. Policy accuracy ratio
C. Relationship edit accuracy ratio
D. Transaction accuracy ratio
C. Relationship edit accuracy ratio:
10.12 - Ref: IDMA 2, R.1, P.6, O.7
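
Two of these ratios can be illustrated with a toy audit sample. The counts below are invented, and the formulas shown (correct units divided by units audited, at the element and transaction level) are a plausible reading of how such ratios are computed, not a quotation from the guide.

```python
# Audited transactions: each lists how many of its data elements were in error
transactions = [
    {"id": 1, "elements": 10, "elements_in_error": 0},
    {"id": 2, "elements": 10, "elements_in_error": 2},
    {"id": 3, "elements": 10, "elements_in_error": 0},
    {"id": 4, "elements": 10, "elements_in_error": 1},
]

total_elements = sum(t["elements"] for t in transactions)
element_errors = sum(t["elements_in_error"] for t in transactions)

# Data element accuracy ratio: correct elements / elements audited
element_accuracy = (total_elements - element_errors) / total_elements

# Transaction accuracy ratio: error-free transactions / transactions audited
clean = sum(1 for t in transactions if t["elements_in_error"] == 0)
transaction_accuracy = clean / len(transactions)

print(element_accuracy, transaction_accuracy)  # 0.925 0.5
```

Note how the two ratios diverge: a few scattered element errors barely dent element accuracy but can halve transaction accuracy.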

"According to ISO’s Quality of Data Audit Guide, what is possibly one of the most critical steps of the entire audit?
A. Error detection and correction
B. Test completeness (reconciliation)
C. Audit preparation
D. Follow-up
B. Test completeness (reconciliation):
10.13 - Ref: IDMA 2, R.1, P.6, O.7

"According to ISO’s Quality of Data Audit Guide, which method of sampling makes it most likely that every element of the population has an equal chance of getting into the sample?
A. Interval sampling
B. Random number sampling
C. Stop-or-go sampling
D. All of the above
B. Random number sampling:
10.14 - Ref: IDMA 2, R.1, P.8, O.7
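
The "equal chance" property of random number sampling, and how interval sampling differs, can be shown in a couple of lines. The population size, sample size, and interval below are arbitrary choices for the example.

```python
import random

# Population of 1,000 hypothetical record keys
population = list(range(1000))

# Random number sampling: random.sample draws without replacement,
# giving every element an equal chance of selection
rng = random.Random(42)
sample = rng.sample(population, 25)

# Contrast, interval (systematic) sampling: take every 40th record, so
# only records at those fixed offsets can ever be selected
interval_sample = population[::40]  # 25 records

print(len(sample), len(interval_sample))  # 25 25
```

If the file happens to be ordered (by date, by office, by policy type), the fixed offsets of interval sampling can bias the draw — which is why the guide favors random number sampling.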

"According to ISO’s Quality of Data Audit Guide, to conduct a thorough data audit for accuracy and completeness, it is important to check the data from the source document to the finished report and vice versa. Explain the ways in which accuracy and completeness are assured utilizing this approach.
To assure accuracy it is necessary to draw a sample of data reported, obtain the corresponding source documents and verify the accuracy directly to the source. Inaccuracies should be traced to determine the cause.

To assure completeness it is necessary to draw a second sample from source documents to determine their disposition. How many from the sample appear on the valid data file? How many appear on the error listing? How many are missing?:
10.15 - Ref: IDMA 2, R.1, P.9, O.7

"List and explain the four areas of audit performed in this step of the process described in ISO’s Quality of Data Audit Guide: Step 4 – Secure Sample and Test.
1. 4.1 – Audit data preparation, which ensures that all data is prepared using current information and is processed completely
2. 4.2 – Audit data entry and transfer, which ensures that all data input for processing is entered properly and according to documented control procedures; and transferred data is moved intact and on time
3. 4.3 – Audit program controls, which ensures that only authorized data is entered for processing; data is completely processed accurately in a controlled environment; and that the process is reflected by an accurate audit trail
4. 4.4 – Audit output controls, which ensures that controls over output are accurate; that all required outputs are accurately produced; and that outputs are distributed in a timely manner:
10.16 - Ref: IDMA 2, R.1, P.10–12, O.7

"According to ISO’s Quality of Data Audit Guide, edit location in the processing stream ensures which of the following?
A. That errors are controlled, corrected and fixed in a timely manner
B. That edit errors are accurately corrected and re-entered into the appropriate place in the system in a timely manner
C. That all data is edited as early as possible in the processing system
D. That reliance on front-end manual edits is minimized
C. That all data is edited as early as possible in the processing system:
10.17 - Ref: IDMA 2, R.1, P.13, O.7

"ISO’s Quality of Data Audit Guide notes that the contents of the audit memorandum should be in a standardized format and contain at least seven items. What are they?
1. A definition of the population, identifying its attributes
2. The numbers and type of data elements audited
3. The number of errors encountered
4. The quantitative conclusions based on the sample
5. Substantive general conditions
6. A comparison with similar and/or previous reviews
7. Recommendations based on the audit findings:
10.18 - Ref: IDMA 2, R.1, P.15, O.7

"What is the purpose of Step 7 (Follow-up to the audit process) in the audit process described in ISO’s Quality of Data Audit Guide?
To ensure that the audit and resulting audit report are not merely staff exercises.:
10.19 - Ref: IDMA 2, R.1, P.16, O.7

"List the two major responsibilities regulators have for overseeing the operations of property/casualty insurers that are most relevant to statistical collection, as identified in the NAIC Statistical Handbook.
1) To ensure that the rates meet statutory standards; i.e., that rates are not inadequate, excessive or unfairly discriminatory
2) To monitor the market structure and performance and act if necessary to restore competition or remedy the problems caused by market failure:
11.1 - Ref: IDMA 2, R.1, P.1, O.1

"Explain the difference in how the regulators utilize financial versus statistical data, as discussed in the NAIC Statistical Handbook.
The financial data that insurers must report focuses on quarterly or annual performance as well as current financial status. Regulators can use this financial data as a “snap-shot” view of a financial picture that is both larger in scope and longer in duration. With this kind of information, regulators evaluate financial solvency and decide whether to take regulatory action to conserve an insurer’s assets and protect the interests of policyholders.

Regulators use statistical data to evaluate the rates and rating structures used by insurers in a state. In most cases, calendar year financial “snapshots” do not provide the necessary match of premiums and losses for such an analysis. Statistical data address this and other information needs by providing the essential match of premiums and losses for comparable policies.:
11.2 - Ref: IDMA 2, R.1, P.1, O.2

"According to the NAIC Statistical Handbook, why is combining and aggregating statistics important in insurance pricing and regulatory monitoring?
Virtually no insurer has enough loss experience to produce a credible data base for all aspects of its own pricing decisions. To improve statistical credibility, it is necessary that insurers’ data be combined into aggregate databases. To produce more reliable analyses of historic experience and predictions of future costs, both insurers and regulators must commonly look to pooled data.:
11.3 - Ref: IDMA 2, R1, P.2, O.3

"The selling of insurance is commerce and, therefore, can be governed by federal laws regulating interstate commerce -- this was established by which of the following?
A. Merritt Committee Report of 1910
B. NAIC All-Industry Model Rating Law
C. McCarran-Ferguson Act
D. Southeastern Underwriters Association case
D. Southeastern Underwriters Association case:
11.4 - Ref: IDMA 2, R.1, P.2, O.4

"The NAIC Statistical Handbook discusses four cornerstones in the statutory foundation for statistical reporting. In which of the following do they appear in the correct chronological order (oldest to most recent)?
1) NAIC All-Industry Model Rating Law
2) Merritt Committee Report of 1910
3) McCarran-Ferguson Act
4) Southeastern Underwriters Association case
A. 4, 2, 3, 1
B. 2, 4, 3, 1
C. 4, 2, 1, 3
D. 2, 4, 1, 3
B. 2, 4, 3, 1:
11.5 - Ref: IDMA 2, R.1, P.2, O.4

"According to the NAIC Statistical Handbook, why are statistical plans or similar tools important?
The statistical plans instruct insurers how to code and submit their premium and loss data to the statistical agent.

Although “statistical plans” in the traditional sense may be superseded for some insurers and statistical agents by other tools designed to accomplish the same results, it remains imperative that statistical plans and/or such other procedures result in data that can be meaningfully combined. For data from different insurers to be meaningfully combinable, it must conform to common data definitions. Standard definitions provide for stable and reliable databases and are, therefore, the basis of meaningful aggregated insurance data. In addition, standard coverage programs, where prevalent, permit the collection of comparable statistics and help aggregate statistics to be a valid starting point for regulatory monitoring.

The Statistical Information (C) Task Force [of the NAIC] has adopted a uniform set of suggested minimum statistical reporting requirements for all insurers. Statistical agents have designed their data collection procedures to ensure that they are able to at least meet these minimum requirements. This provides regulators with the ability to aggregate the experience of all insurers using a common set of classifications and definitions, or they can request the statistical agents to do this for them.:
11.6 - Ref: IDMA 2, R.1, P.3 and 4, O.5

"What are the methods used by regulators to examine statistical accuracy?
• Through visits to a company’s processing location(s), known as field reviews
• Having files forwarded to a centralized Audit team location (i.e., a company’s home office or insurance department), known as in-house reviews
• On-line viewing of input validity, known as automated edit reviews
• Combinations of any or all of these methods:
11.7 - Ref: IDMA 2, R.2, P.1, O.6

"Briefly describe each of the five types of statistical examinations.
1. Work-in-progress
This examination is focused on the current state of data quality. The exam looks at source documents and coding activities that are within a short period before the examiner's review. This type of exam does not provide a front-to-back verification of data; however, it does provide timely data entry error ratios and can detect problems before serious error conditions occur.
2. Interval Sampling
This examination focuses on use of an automated sampling system, which selects individual or groups of records meeting specified criteria for examination. The sampling is performed against records that are in the reporting data of the company.
3. Automated System Review
This examination approach reviews the unique system configurations of each company’s operating system.
4. Automated System Inquiries
The intent of this type of examination is to sample system source information and test for the following:
• Error rates in data transfer
• Input or translation errors
• Adherence to approved methods, practices and procedures
5. Stop-and-Go Examination
This type of examination does not require the use of sampling methodologies. The examiner defines the information (population) to be examined and then selects the population criteria. The entire population is examined.:
11.8 - Ref: IDMA 2, R.2, P.2, O.7

"Which type of statistical examination focuses on the use of an automated sampling system, which selects individual or groups of records meeting specified criteria for examination?
A. Stop-and-Go
B. Work-in-progress
C. Interval sampling
D. Automated system review
C. Interval sampling:
11.9 - Ref: IDMA 2, R.2, P.2, O.7

"The intent of the Automated System Inquiries examination is to sample system source information. What does the examination test for and what two basic system sampling techniques are used?
The examination tests for:
• Error rates in data transfer
• Input or translation errors
• Adherence to approved methods, practices and procedures

The two basic sampling techniques are:
1. Master file update transaction sampling
2. Master file record sampling:
11.10 - Ref: IDMA 2, R.2, P.2, O.7

"What is the difference in focus between rate examinations and market conduct examinations?
A rate examination focuses on an in-depth review of policy handling, including underwriting, rating and policy processing during all phases. It must be determined that a company is accurately applying the rules, rates and forms filed by it or any advisory organization. Attention is given to compliance with insurance laws and/or regulation as they pertain to policies and rates.

A market conduct examination reviews a company’s practices in relation to a state’s unfair trade practices laws and regulations. This review may cover a wide spectrum of activity. This type of examination may also have an error tolerance standard.:
11.11 - Ref: IDMA 2, R.2, P.3, O.8

"Market conduct examinations review a company’s practices across a wide spectrum of activity. Which of the following is NOT one of the covered practices listed in the Reading?
A. Hiring practices
B. Agent cancellations
C. Licensing
D. Advertising practices
A. Hiring practices:
11.12 - Ref: IDMA 2, R.2, P.3, O.8

"Identify the three types of costs associated with examinations and provide a brief description of each.
1. Administrative penalties
• Administrative penalties are levied for errors identified
2. Direct examination costs
• The costs associated with the examination, such as workspace, supplies, telephones, etc., examiner per diem charges
3. Indirect (hidden) examination costs
• Disruptions experienced in the company’s offices during the examination
• Costs associated with required re-rating due to policy errors
• Staff assistance and consultation time
• Retrieval of, duplication of, access to countless records and reports
• Time and personnel required to review questions and defend business practices:
11.13 - Ref: IDMA 2, R.2, P.5 and 6, O.9

"What are the five features and benefits of the Statistical Data Monitoring System?
1. Informs the company of how its controls over errors compare with respect to other companies in the industry through error rate reporting
2. Provides regulators with a measure of the accuracy of data they are using, from the perspectives of an individual company’s data, as well as the collective summaries prepared by statistical agents through error rate reporting
3. Provides statistical agents with the tools needed to monitor the quality of the data received from each individual company, as well as a means to control the accuracy of its own operations
4. Provides the regulator with the tools to verify adherence by each company and statistical agent to the stated requirements of annual certification and periodic independent review
5. Provides each company with the tools needed to control the accuracy of its own statistical data through the use of the procedural control checklist, sampling procedures, reasonability testing procedures, and financial reconciliation:
11.14 - Ref: IDMA 2, R.3, P.2, O.10

"What are the four responsibilities of statistical agents under the Statistical Data Monitoring System?
1. Collecting and summarizing specific reports and documentation from individual companies
2. Performing required monitoring system tests and maintaining documentation of its own operations
3. Adhering to monitoring requirements and performance standards
4. Self-certifying performance and adherence:
11.15 - Ref: IDMA 2, R.3, P.2, O.10

"List the five duties a company must perform in order to satisfy the requirements of the Statistical Data Monitoring System.
1. Determine the specific sections of the Statistical Data Monitoring System (SDMS) that apply to their environment
2. Perform the required tests and complete documentation for each applicable section of the SDMS
3. Perform an annual review to confirm compliance with all provisions of the SDMS
4. Certify to the effective completion of required tests and documentation
5. Retain the required documentation for a period of five years, and have it available for review by regulatory examiners:
11.16 - Ref: IDMA 2, R.3, P.3, O.10

"Under the Statistical Data Monitoring System, sample sizes are set to assure those data element errors which affect one percent or more of the transactions are discovered with a certain probability over the course of the year. What is that specified probability?
A. 90%
B. 80%
C. 99.99%
D. 95%
D. 95%:
11.17 - Ref: IDMA 2, R.3, P.4, O.10
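The 95% figure in the card above follows from basic sampling arithmetic. A minimal sketch, assuming independent draws where each transaction is affected with a fixed error rate (the function name and the simple model are illustrative, not the SDMS's actual sample-size tables):

```python
import math

def min_sample_size(error_rate: float, detect_prob: float) -> int:
    """Smallest n such that a sample of n transactions contains at least
    one error-affected transaction with probability >= detect_prob,
    assuming independent draws (illustrative model, not from the SDMS)."""
    # P(detect) = 1 - (1 - error_rate)**n  >=  detect_prob
    return math.ceil(math.log(1 - detect_prob) / math.log(1 - error_rate))

# Errors affecting 1% of transactions, detected with 95% probability:
n = min_sample_size(0.01, 0.95)   # -> 299 records
```

Under this model, roughly 300 sampled records per period suffice to detect a 1%-frequency error with 95% confidence.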

"Briefly describe the objective of reasonability testing under the Statistical Data Monitoring System.
Under the Statistical Data Monitoring System, the objective of reasonability testing is to determine whether a company’s data submitted to its statistical agent is reasonably consistent with prior periods, or that apparent inconsistencies can be explained. Reasonability testing provides a means of detecting gross errors at the coverage, class and territory level.:
11.18 - Ref: IDMA 2, R.3, P.4, O.10
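Reasonability testing as described above is a period-over-period comparison at the coverage/class/territory level. A minimal sketch, assuming simple dictionaries keyed by cell (the 25% threshold and all names are illustrative, not SDMS requirements):

```python
def reasonability_flags(prior: dict, current: dict, tolerance: float = 0.25) -> list:
    """Flag cells whose current-period value deviates from the prior
    period by more than `tolerance`; flagged cells require explanation.
    Threshold and structure are illustrative only."""
    flags = []
    for cell, prior_value in prior.items():
        current_value = current.get(cell, 0)
        if prior_value and abs(current_value - prior_value) / prior_value > tolerance:
            flags.append(cell)
    return flags

prior = {("Auto", "Class1", "Terr01"): 100_000}
current = {("Auto", "Class1", "Terr01"): 40_000}   # gross drop -> flagged
flagged = reasonability_flags(prior, current)
```

This is the sense in which reasonability testing detects gross errors: it does not validate individual records, only implausible swings in aggregates.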

"Under the Statistical Data Monitoring System, several of the requirements are different based on the size of the system versions used by the company. What is the current threshold (or minimum volume) for a “Small” written premium system version as specified in the Reading?
A. $300 million
B. $170 million
C. $250 million
D. $999 million
A. $300 million:
11.19 - Ref: IDMA 2, R.3, P.5, O.10

"Under the Statistical Data Monitoring System, the Financial Reconciliation must be completed and sent to the company’s statistical agent following the close of the calendar year. What is the due date for the reconciliation?
A. April 15
B. June 30
C. January 31
D. March 31
B. June 30:
11.20 - Ref: IDMA 2, R.3, P.7, O.10

"Unit Report Control (URC) Program first reports may be corrected in which way(s)?
1. Submitting a replacement report
2. Submitting a second report
3. Submitting a correction report
4. Submitting a subsequent report at any time
A. 3 only
B. 1, 2 and 3 only
C. 3 and 4 only
D. 1 and 3 only
D. 1 and 3 only:
11.21 - Ref: IDMA 2, R.3, P.8, O.11

"Identify what indications may be provided through analysis of the NAIC Market Conduct Annual Statement data.
The intent of the data collection and analysis is to provide indications such as:

Life/Annuity Companies
• Indications of possible churning or unusual replacement activity
• Consumer complaint ratios
• Rates of contested claims
• Payment timeframes for death claims
Property/Casualty Companies
• Who takes the longest time to pay claims
• The highest rates of first-party and third-party lawsuits
• Changes in private passenger automobile underwriting
• Changes in homeowners underwriting:
11.22 - Ref: IDMA 2, R.4, P.1 and 2, O.12

"Which of the following lines of business is/are included in the Market Conduct Annual Statement?
1. General Liability
2. Private Passenger
3. Commercial Property
4. Life
A. 2
B. 2 and 4
C. 1, 2, and 3
D. 1, 2, 3, and 4
B. 2 and 4:
11.23 - Ref: IDMA 2, R.4, P.1, O.12

"Identify and briefly explain the six essentials of software testing.
1. The quality of the test process determines the success of the test effort – this means that an immature test process can result in an unproductive, chaotic environment that produces low-quality results.
2. Prevent defect migration by using early life-cycle testing techniques – more than half the errors are usually introduced in the requirements phase. The cost of errors is minimized if they are detected in the same phase as they are introduced, and an effective test program prevents the migration of errors from any development phase to any subsequent phases.
3. The time for software testing tools is now – automated testing products are mature and available from a variety of vendors, and can help shorten the test-cycle time and discover whether a product has been thoroughly tested.
4. A real person must take responsibility for improving the testing process – software testing is a process that requires people who take responsibility for its improvement.
5. Testing is a professional discipline requiring trained, skilled people – testing is not an entry-level job; it should not be subservient to, or positioned for easy overrule by the product development team.
6. Cultivate a positive team attitude of creative destruction – good testing is devising and executing tests that discover defects in a product; it requires ingenuity, and may be viewed as destructive. Establishing the proper “test to break” mental attitude has a profound effect on testing success.:
12.1 - Ref: IDMA 2, STRW, C.1, P.3-6, O.1

"What is the difference between debugging and software testing?
Debugging is fixing the known bugs in the software, while software testing is detecting bugs in the software.:
12.2 - Ref: IDMA 2, STRW, C.2, P.8, O.2

"Kit uses four terms to describe a “general failure to translate successfully” -- Error, Failure, Fault, and Mistake (presented here in alphabetical order). In which of the following are they presented in the sequence Kit identifies?
A. Mistake, Failure, Error
B. Error, Fault, Failure
C. Failure, Fault, Error
D. Mistake, Error, Fault
A. Mistake, Failure, Error:
12.3 - Ref: IDMA 2, STRW, C.4, P.18, O.3

"According to Kit, how should software testing be positioned in the development life-cycle, and why?
Testing should be done as a continuous activity throughout the development life-cycle, to prevent undetected errors from migrating downstream.:
12.4 - Ref: IDMA 2, STRW, C.4, P.20, O.4

"According to Kit, what is the best definition of testing, for the tester?
The purpose of testing is to discover errors. Testing is the process of trying to discover every conceivable fault or weakness in a work product.:
12.5 - Ref: IDMA 2, STRW, C.4, P.22, O.5

"Kit identified several methods used to detect errors. Which of the following is NOT one of those methods?
A. Examining the users’ requirements
B. Examining external structure and design
C. Examining the functional user interface
D. Executing code
B. Examining external structure and design:
12.6 - Ref: IDMA 2, STRW, C.4, P.24, O.5

"According to Kit, risk-based considerations in testing focus include all of the following, EXCEPT:
A. Focus on the parts of the system or program most likely to have errors
B. Focus on the parts of the system that will be frequently used
C. Focus on the parts of the system that are executed first, because errors there will affect the downstream part of the system
D. Focus on the parts of the system where failure has greater consequences
C. Focus on the parts of the system that are executed first, because errors there will affect the downstream part of the system:
12.7 - Ref: IDMA 2, STRW, C.5, P.27, O.6

"Testing can be separated into two basic forms – verification and validation. According to IEEE/ANSI, how do they differ?
Verification is the process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase. It is the process of evaluating, reviewing, inspecting, and doing desk checks of work products such as requirements specifications, design specifications, and code. For code, it means the static analysis of the code – a code review – not the dynamic execution of the code. It is also called “human” testing because it involves looking at documents on paper.
Validation is the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements. It normally involves executing the actual software or a simulated mock-up. Validation is a computer-based testing process, and usually exposes symptoms of errors.:
12.8 - Ref: IDMA 2, STRW, C.5, P.29 and 30, O.7

"What are the six phases identified by Kit in the software development life-cycle?
1. Concept
2. Requirements
3. Design
4. Implementation (or Code)
5. Test
6. Operation and maintenance:
12.9 - Ref: IDMA 2, STRW, C.5, P.31, O.8

"In which phase or phases of the typical software development life-cycle is validation testing usually performed?
I Requirements
II Design
III Implementation (or code)
A. I
B. I and II
C. II and III
D. I, II, and III
C. II and III:
12.10 - Ref: IDMA 2, STRW, C.5, P.29–31, O.7, O.8

"According to Kit, what three products must be produced in the software development process to enable an effective testing effort?
1. Requirements specifications
2. Functional design specifications
3. Internal design specifications:
12.11 - Ref: IDMA 2, STRW, C.5, P.33, O.9

"According to Kit, what are the seven questions that should be asked in determining the cost-effectiveness of the testing effort?
1. Do we know what testing is really costing us?
2. How is the testing effort affecting development time?
3. Do we know what percentage of our development resource testing represents?
4. Are testing tools saving us time?
5. Are testing tools paying for themselves?
6. Are we trained to exploit these tools to the full, or have they become shelfware?
7. Are we using our testing resources to find errors that represent the biggest financial risk?:
12.12 - Ref: IDMA 2, STRW, C.5, P.34, O.9

"What is the Software Engineering Institute? Describe its mission.
The Software Engineering Institute (SEI) is a federally-funded research and development center, sponsored by the Department of Defense. It was established by Congress in 1984 to address a two-fold shortage of trained software professionals and quality software, produced on schedule and within budget. The mission of the SEI is to provide leadership to advance the state-of-the-practice of software engineering to improve the quality of systems that depend on software.:
12.13 - Ref: IDMA 2, STRW, C.6, P.38, O.10

"Which of the following is NOT one of the five levels in the SEI’s Capability Maturity Model?
A. Repeatable
B. Defined
C. Final
D. Managed
C. Final:
12.14 - Ref: IDMA 2, STRW, C.6, P.38, O.10

"To which of the following parts of Configuration Management (CM) does “prevent unnecessary or marginal changes” belong?
A. Configuration status accounting
B. Configuration audit
C. Configuration identification
D. Configuration and change control
D. Configuration and change control:
12.15 - Ref: IDMA 2, STRW, C.6, P.42, O.11

"Configuration Management (CM) answers all of the following questions EXCEPT:
A. What is our current hardware configuration?
B. What is our current software configuration?
C. How do we control changes to our configuration?
D. Do anyone else’s changes affect our software?
A. What is our current hardware configuration?:
12.16 - Ref: IDMA 2, STRW, C.6, P.43, O.11

"Which of the following is NOT one of the organizations identified by Kit as being primarily concerned with designing and promoting the use of standards?
A. IEEE
B. MCSE
C. ANSI
D. SPICE
B. MCSE:
12.17 - Ref: IDMA 2, STRW, C.6, P.45–48, O.12

"Which of the following is NOT one of the three formal document deliverables in software development?
A. Functional design specification
B. Requirements specification
C. User interface specification
D. Internal design specification
C. User interface specification:
12.18 - Ref: IDMA 2, STRW, C.6, P.48 and 49, O.13

"What is testware?
Testware includes verification checklists, verification error statistics, test data, and supporting documentation like test plans, test specifications, test procedures, test cases, and test reports. Testware is so named because it has a life beyond its initial use, and should be placed under the control of a configuration management system, saved, and maintained.:
12.19 - Ref: IDMA 2, STRW, C.6, P.49, O.13

"According to Kit, verification is a “human” examination or review of the work product, and inspections are generally considered the most formal of the verification methods. Name and describe two other types of verification methods.
1. Walkthroughs – less formal than inspections; the participants come to the presenter’s meeting to detect defects and become familiar with the material.
2. Buddy checks – simply giving a document to someone else and asking him or her to look at it closely.:
13.1 - Ref: IDMA 2, STRW, C.7, P.57–61, O.1

"According to Kit, which of the following is NOT a characteristic of a walkthrough?
A. Main goal is to familiarize others with the product
B. Reviews in walkthroughs are more objective than in inspections
C. Walkthroughs can cover more material than inspections
D. Limited workload for the other participants
B. Reviews in walkthroughs are more objective than in inspections:
13.2 - Ref: IDMA 2, STRW, C.7, P.61, O.1

"According to Kit, which of the following is NOT true of verification testing?
A. Requirements verification offers the biggest potential savings to software development efforts
B. In general, buddy checks and walkthroughs are recommended more than formal reviews as verification methods
C. Verification allows for the finding and detecting of defects at the earliest possible time
D. An important tool for verification testing is the checklist
B. In general, buddy checks and walkthroughs are recommended more than formal reviews as verification methods:
13.3 - Ref: IDMA 2, STRW, C.7, P.61–63, O.1

"According to Kit, all of the below are properties of good requirements specifications EXCEPT:
A. Frozen – so that the programmers are not presented with a moving target
B. Consistent – conflicting requirements are prioritized
C. Testable – so that during program development and acceptance testing, it will be possible to determine whether the item has been satisfied
D. Traceable – so that during program development and testing, it will be possible to trace each item through the various stages of development
A. Frozen – so that the programmers are not presented with a moving target:
13.4 - Ref: IDMA 2, STRW, C.7, P.64 and 65, O.2

"According to Kit, what is the goal in verifying a functional design?
The goal in verifying a functional design is to determine how successfully the user requirements have been incorporated into the functional design. Embracing the concept of traceability, the tester should see that every paragraph in the requirements is reflected in the functional design specifications.:
13.5 - Ref: IDMA 2, STRW, C.7, P.68, O.2

"According to Kit, what is one of the most common failings in functional design specifications? What should the inspector/reviewer do to discover such failings?
One of the most common failings of the functional design specifications is incompleteness. Good inspectors/reviewers read more than just what’s in front of them; they must constantly ask, “What’s missing?”:
13.6 - Ref: IDMA 2, STRW, C.7, P.68, O.2

"Kit identified three critical success factors for implementing verification -- list and describe them.
1. Process Ownership – the process needs a champion, someone who truly cares about the process and will take ownership. This can be a developer, a quality assurance person, or a process expert.
2. Management Support – managers must be well briefed on inspections and their benefits, since it is unreasonable to expect them to spend resources and support the effort if they don’t see the long-term gains.
3. Training – training is crucial, including specific instruction on how to perform reviews and inspections, covering costs, benefits, and how to deal with human and cultural issues.:
13.7 - Ref: IDMA 2, STRW, C.7, P.74 and 75, O.3

"According to Kit, what are the eight axioms that apply to all validation testing?
1. Testing can be used to show the presence of errors, but never their absence.
2. One of the most difficult problems in testing is knowing when to stop.
3. Avoid unplanned, non-reusable, throw-away test cases unless the program is truly a throw-away program.
4. A necessary part of a test case is a definition of the expected output or result. Always carefully compare the actual versus the expected results of each test.
5. Test cases must be written for invalid and unexpected, as well as valid and expected, input conditions. “Invalid” is defined as a condition that is outside the set of valid conditions and should be diagnosed as such by the program being tested.
6. Test cases must be written to generate desired output conditions. Less experienced testers tend to think only from the input perspective. Experienced testers determine the inputs required to generate a pre-designed set of outputs.
7. With the exception of unit and integration testing, a program should not be tested by the person or organization that developed it. Practical cost considerations usually require that developers do unit and integration testing.
8. The number of undiscovered errors is directly proportional to the number of errors already discovered.:
13.8 - Ref: IDMA 2, STRW, C.8, P.77, O.4
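Axioms 4 and 5 above can be illustrated with a toy example: every test case pairs an input with an expected result, and invalid inputs are tested alongside valid ones (the `parse_age` function is hypothetical, invented only for this sketch):

```python
def parse_age(text: str) -> int:
    """Toy function under test: parse a non-negative human age."""
    value = int(text)              # raises ValueError on non-numeric input
    if value < 0 or value > 130:
        raise ValueError(f"age out of range: {value}")
    return value

# Axiom 4: each case defines the expected output, compared to the actual.
valid_cases = [("0", 0), ("42", 42), ("130", 130)]
# Axiom 5: invalid/unexpected inputs, which the program must diagnose.
invalid_inputs = ["-1", "131", "forty", ""]

for text, expected in valid_cases:
    assert parse_age(text) == expected
for text in invalid_inputs:
    try:
        parse_age(text)
        raise AssertionError(f"{text!r} should have been rejected")
    except ValueError:
        pass   # rejected, as axiom 5 requires
```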

"According to Kit, what are the two keys in testing to determine whether a program meets its requirements?
1. Developing tests that will determine whether the product satisfies the user’s requirements, as stated in the requirements specification.
2. Developing tests that will determine whether the product’s actual behavior matches the desired behavior, as described in the functional design specifications.:
13.9 - Ref: IDMA 2, STRW, C.8, P.78, O.5

"Kit identified two fundamental validation testing strategies–black-box testing and white-box testing. Describe them
1. Black-box tests are derived from the functional design specification, without regard to internal program structure. Black-box testing tests the product against the external end-user specifications. Ideally black-box testing is done with little or no knowledge of the code.
2. White-box testing requires knowledge of the internal program structure and is derived from the internal design specification or the code.:
13.10 - Ref: IDMA 2, STRW, C.8, P.79 and 80, O.6
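The two strategies can be contrasted on a toy rating function (the discount rule and all names here are invented for illustration, not from the Reading):

```python
def discount(premium: float, claim_free_years: int) -> float:
    """Code under test: 10% discount after 3+ claim-free years
    (hypothetical external specification)."""
    if claim_free_years >= 3:
        return premium * 0.9
    return premium

# Black-box: cases derived from the external spec alone -- probe the
# stated boundary at 3 years, without reading the code.
assert discount(100.0, 2) == 100.0
assert discount(100.0, 3) == 90.0

# White-box: cases derived by reading the internal structure -- one
# test per branch of the `if`, so every path is exercised.
assert discount(200.0, 10) == 180.0   # true branch
assert discount(200.0, 0) == 200.0    # false branch
```

Here the two strategies happen to produce similar cases; on real code, white-box testing exposes paths (error handling, optimizations) that no external specification mentions.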

"According to Kit, which of the following statements is MOST accurate regarding validation strategies?
A. Requirements-based tests should employ the black-box strategy
B. Function-based tests should employ the white-box strategy
C. Internals-based tests should employ the black-box strategy
D. Externals-based tests should employ the white-box strategy
A. Requirements-based tests should employ the black-box strategy:
13.11 - Ref: IDMA 2, STRW, C.8, P.81, O.6

"According to Kit, validation activities can be divided into low-level testing and high-level testing. Describe low-level testing and its two forms.
Low-level testing involves testing individual program components, one at a time or in combination. It requires knowledge of the program’s internal structure and is therefore most appropriately performed by development.
Forms include:
1. Unit (module) testing – the process of testing the individual program components to discover discrepancies between the module’s interface specification and its actual behavior; and
2. Integration testing – the process of combining and testing multiple components together, to discover errors in the interface between the components.:
13.12 - Ref: IDMA 2, STRW, C.8, P.93–95, O.7
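The two forms of low-level testing above can be sketched with two hypothetical modules (both functions are invented for illustration):

```python
def rate_premium(base: float, factor: float) -> float:
    """Module A: apply a rating factor to a base premium."""
    return round(base * factor, 2)

def format_bill(premium: float) -> str:
    """Module B: render a premium amount for a billing document."""
    return f"${premium:,.2f}"

# Unit test: module A alone, checked against its interface specification.
assert rate_premium(1000.0, 1.15) == 1150.0

# Integration test: A and B combined, checking the interface between
# them (does B correctly consume what A produces?).
assert format_bill(rate_premium(1000.0, 1.15)) == "$1,150.00"
```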

"According to Kit, which of the following statements is MOST accurate regarding low-level and high-level testing?
A. Low-level testing involves testing multiple components simultaneously
B. Low-level testing includes function testing
C. Low level-testing requires no knowledge of the program’s internal structure and can therefore be performed by technical staff or end-users
D. Low-level testing tests that modules work -- high-level testing tests that they work together
A. Low-level testing involves testing multiple components simultaneously:
13.13 - Ref: IDMA 2, STRW, C.8, P.93-95, O.7

"Describe high-level testing and list its four forms.
High-level testing involves testing whole, complete products, and is most appropriately performed by an independent test group outside the development organization. Forms of high-level testing include:
1. Usability testing
2. Function testing
3. System testing
4. Acceptance testing:
13.14 - Ref: IDMA 2, STRW, C.8, P.96, O.7

"List and describe four usability characteristics that can be tested.
1. Accessibility – Can users enter, navigate, and exit with relative ease?
2. Responsiveness – Can users do what they want, when they want, in a way that’s clear?
3. Efficiency – Can users do what they want in a minimum amount of steps and time?
4. Comprehensibility – Do users understand the product structure, its help system, and the documentation?:
13.15 - Ref: IDMA 2, STRW, C.8, P.97, O.7

"System testing is the process of attempting to demonstrate that a program or system does not meet its original requirements and objectives as stated in the requirements specification. According to Kit, all of the following are types/goals of system testing EXCEPT:
A. Usability testing, to identify those operations that will be difficult or inconvenient for users
B. Security testing, to determine whether the program’s security requirements can be subverted
C. Resource usage testing, to determine whether the program uses resources at levels which exceed requirements
D. Data definition testing, to determine whether the data definitions in the program/system meet the organization’s data dictionary requirements
D. Data definition testing, to determine whether the data definitions in the program/system meet the organization’s data dictionary requirements:
13.16 - Ref: IDMA 2, STRW, C.8, P.100–102, O.7

"Distinguish regressive testing from progressive testing.

Progressive testing is the process of testing new code to determine whether it contains errors. Regressive testing is the process of testing a program to determine whether a change has introduced errors in the unchanged code – that is, whether the quality of the program has regressed.:
13.17 - Ref: IDMA 2, STRW, C.8, P.105 and 106, O.8
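A minimal sketch of the distinction (the `normalize_zip` function is invented for illustration):

```python
def normalize_zip(zip_code: str) -> str:
    """Toy function: trim whitespace and left-pad a ZIP code to 5 digits."""
    return zip_code.strip().zfill(5)

# Progressive test: exercises the newly written padding behavior.
assert normalize_zip("2134") == "02134"

# Regression test: re-run after every later change, to confirm that
# previously working behavior (whitespace handling) has not regressed.
assert normalize_zip(" 90210 ") == "90210"
```

In practice the regression suite is simply the accumulated set of earlier progressive tests, kept and re-executed on every change.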


"According to Kit, what is testware?
Testware is the collection of major work products (deliverables) of testing.:
13.18 - Ref: IDMA 2, STRW, C.9, P.110, O.9

"According to Kit, what is the PRIMARY objective of testware?
A. To minimize costs by developing testing software in-house
B. To maximize the testing yield by maximizing the potential for detecting errors and minimizing the number of tests required
C. To make it easier to check results, although it requires higher-level staffers to check them
D. To standardize testing procedures across projects
B. To maximize the testing yield by maximizing the potential for detecting errors and minimizing the number of tests required:
13.19 - Ref: IDMA 2, STRW, C.9, P.110, O.9

"According to Kit, which of the following is the BEST way to control the costs of validation testing?
A. Avoid automation in post-run checking of test results
B. Avoid re-use of testware
C. Build testware only if absolutely necessary (buy instead)
D. Modify programs with temporary testing hooks to make testing easier
C. Build testware only if absolutely necessary (buy instead):
13.20 - Ref: IDMA 2, STRW, C.9, P.115, O.9

"What are the five risk management considerations Kit identifies for master test planning?
1. Size and complexity of the product to be tested
2. Criticality of the product: could its failure impact safety or cause a large financial or social loss?
3. (SEI) development process maturity level
4. Form of testing (full, partial, endgame, or audit)
5. Staffing, expertise, and organization:
14.1 - Ref: IDMA 2, STRW, C.10, P.119, O.1

"What are the six considerations Kit identifies for verification test planning?
1. The verification activity to be performed
2. The methods to be used (inspections, walkthroughs, etc.)
3. The specific areas of the work product that will and will not be verified
4. The risks associated with any areas of the work product that will not be verified
5. Prioritizing areas of the work product to be verified
6. Resources, schedule, facilities, tools, responsibilities:
14.2 - Ref: IDMA 2, STRW, C.10, P.121, O.2

"According to Kit, what are the verification execution deliverables?
A. Inspection report and verification test report
B. Verification test report and summary report
C. Incident report and inspection report
D. Defect analysis report and risk assessment
A. Inspection report and verification test report:
14.3 - Ref: IDMA 2, STRW, C.10, P.122, O.2

"The validation activities that Kit lists include unit testing, usability testing, function testing, system testing, and acceptance testing. What are the six test planning considerations Kit identified for validation testing?
1. Test methods
2. Facilities (for test development vs. test execution)
3. Test automation
4. Support software (shared by development and test)
5. Configuration management
6. Risks (budget, resources, schedule, training, etc.):
14.4 - Ref: IDMA 2, STRW, C.10, P.124, O.3

"According to Kit, what are the four objectives in designing testware?
1. Detect as many errors as possible
2. Minimize test development costs
3. Minimize test execution costs
4. Minimize test maintenance costs:
14.5 - Ref: IDMA 2, STRW, C.10, P.126, O.4

"The tasks listed by Kit under test execution include test case selection, pre-run setup, execution, post-run analysis, recording activities, results, and incidents, determining whether failures are caused by product errors or errors in the tests, and measuring internal logic coverage. Identify and describe the purpose of the three key test execution deliverables.
1. Test logs – provide a chronological record of relevant details about the execution of tests
2. Test incident reports – document any test execution event that requires further investigation
3. Logic coverage report (tool output) – documents how much of the software logic has been tested:
14.6 - Ref: IDMA 2, STRW, C.10, P.131–133, O.5

According to Kit, what are the two classic test completion criteria?
1. The clock runs out (allocated testing time elapses)
2. All tests run to completion without detecting any errors:
14.7 - Ref: IDMA 2, STRW, C.10, P.134, O.5

Kit identifies a model for monitoring test execution status by counting the number of test cases in four categories and tracking them against time. List and describe these four categories.
1. Planned – test cases that are planned to be developed
2. Available – planned test cases available for execution
3. Executed – available test cases that have been executed
4. Passed – executed test cases whose most recent executions have no detected errors:
14.8 - Ref: IDMA 2, STRW, C.10, P.134 and 135, O.5

According to Kit, there are three ways to categorize test tools. What are they?
1. By the testing activity or task in which it is employed
2. By descriptive functional keyword, such as capture/playback
3. By major areas of classification – high-level classes or groupings of tools:
14.9 - Ref: IDMA 2, STRW, C.11, P.144, O.6

According to Kit, testing tools can be categorized based on the testing activity or task in which they are employed. What are the five types of testing activities he identified?
1. Reviews and inspections
2. Test planning
3. Test design and development
4. Test execution and evaluation
5. Test support:
14.10 - Ref: IDMA 2, STRW, C.11, P.144 and 145, O.7

Kit identified all of the following as tool types used for reviews and inspections EXCEPT:
A. Coverage analysis
B. Complexity analysis
C. Code comprehension
D. Syntax and semantic analysis
A. Coverage analysis:
14.11 - Ref: IDMA 2, STRW, C.11, P.145, O.8

Kit identified all of the following as tool types used for test planning EXCEPT:
A. Templates for test plan documentation
B. Syntax and semantic analysis
C. Test schedule and staffing estimates
D. Complexity analyzer
B. Syntax and semantic analysis:
14.12 - Ref: IDMA 2, STRW, C.11, P.146, O.9

What are the four tool types Kit identified for use in test design and development?
1. Test data generator
2. Requirements-based test design tool
3. Capture/playback
4. Coverage analysis:
14.13 - Ref: IDMA 2, STRW, C.11, P.147, O.10

List and briefly describe the four main types of tools Kit identified for test execution and evaluation.
1. Capture/Playback – automates the execution of tests so they can run unattended for hours, overnight, or 24 hours a day; captures user operations including keystrokes, mouse activity, and display output.
2. Coverage Analysis – a way to find out if the software is being thoroughly tested, by telling us which parts of the product have been executed (covered) by our current tests
3. Memory testing – tools that have the ability to detect memory problems, overwriting/overreading array bounds, other memory issues
4. Simulators and performance – simulators take the place of software or hardware that interacts with the software to be tested, used frequently with telecommunications and networks:
14.14 - Ref: IDMA 2, STRW, C.11, P.148–151, O.11
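
Coverage analysis (item 2 above) boils down to a ratio of executed program parts to total program parts. A toy illustration of the idea, not a real tool:

```python
def coverage_percent(executed_lines: set[int], all_lines: set[int]) -> float:
    """Percentage of the product's executable lines covered by current tests."""
    return 100.0 * len(executed_lines & all_lines) / len(all_lines)

# e.g. the current tests executed 6 of 8 executable lines
print(coverage_percent({1, 2, 3, 5, 6, 8}, set(range(1, 9))))  # 75.0
```

A real coverage tool instruments the product to record which lines (or branches) actually ran; the reporting step is just this arithmetic.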

Identify and describe the two types of tools Kit identified as being required for software testing support.
1. Problem Management – also known as defect tracking tools, bug management tools, incident control systems, etc. These tools are used to record, track, and assist with the management of defects and enhancements throughout the lifecycle of software products.
2. Configuration Management – the key to managing, controlling, and coordinating changes to documents and anything else that is important to the software development:
14.15 - Ref: IDMA 2, STRW, C.11, P.151, O.12

Which of the following is a tool type that Kit identified as being required for more than one testing activity?
A. Coverage analysis
B. Memory testing
C. Data generator
D. Code comprehension
A. Coverage analysis:
14.16 - Ref: IDMA 2, STRW, C.11, P.145–151, O.8-O.12

What are the five useful testing measures identified by Kit?
1. Measuring complexity
2. Measuring verification efficiency
3. Measuring test coverage
4. Measuring/tracking test execution status
5. Measuring/tracking incident reports:
14.17 - Ref: IDMA 2, STRW, C.12, P.155–158, O.13

What are the two common methods Kit identifies for measuring complexity?
A. Procedure analysis and function points
B. Lines of code and function points
C. Procedure analysis and code comprehension factor
D. Lines of code and semantic analysis
B. Lines of code and function points:
14.18 - Ref: IDMA 2, STRW, C.12, P.155 and 156, O.13
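
Of the two methods, lines of code is trivial to automate. A minimal sketch (the blank-line and comment handling here is an assumption; real LOC counters differ in what they exclude):

```python
def count_loc(source: str) -> int:
    """Count non-blank, non-comment lines as a crude complexity measure."""
    loc = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            loc += 1
    return loc

sample = """
# setup
x = 1

y = x + 1
"""
print(count_loc(sample))  # 2
```

Function points, by contrast, are counted from the specification (inputs, outputs, files, inquiries, interfaces) rather than from the code, which is why they require trained human judgment.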

According to Kit, what is the simplest way of measuring/tracking test execution status?
Test execution tracking is performed most simply by using a spreadsheet. The columns are a time stamp and the four test categories (planned, available, executed, and passed). Each row is a periodic observation of the number of test cases in each category.:
14.19 - Ref: IDMA 2, STRW, C.12, P.157, O.13
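
Kit's spreadsheet model is easy to mimic in code: rows are periodic observations, with one column per category. A minimal sketch (the dates and counts are illustrative):

```python
from datetime import date

# One column for the time stamp, then the four test-case categories.
COLUMNS = ("timestamp", "planned", "available", "executed", "passed")

# Each row is a periodic observation; counts should satisfy
# passed <= executed <= available <= planned.
rows = [
    (date(2024, 1, 8),  100, 40, 10,  8),
    (date(2024, 1, 15), 100, 70, 45, 40),
    (date(2024, 1, 22), 100, 95, 80, 74),
]

def latest_status(rows):
    """Return the most recent observation as a dict keyed by column name."""
    return dict(zip(COLUMNS, rows[-1]))
```

Plotting the four counts against time gives the monitoring picture described in card 14.8: the lines converge toward "planned" as testing matures.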

Which of the following is listed by Kit as a principle of measuring/tracking incident reports?
A. Incident reports should be maintained in functionally-related repositories
B. Every incident must be reported immediately, informally if necessary, with formal reporting to follow if the incident cannot be resolved quickly
C. Incident reports should be filed by customer/end-users only, as only they can appropriately measure the impact -- all incidents must be funneled through them
D. Incident reports must rigorously identify the software configuration in which the incident occurred
D. Incident reports must rigorously identify the software configuration in which the incident occurred:
14.20 - Ref: IDMA 2, STRW, C.12, P.157 and 158, O.13
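
The principle in the correct answer, rigorously identifying the software configuration, could be captured in a record like this (all field names are hypothetical, not from Kit):

```python
from dataclasses import dataclass

@dataclass
class IncidentReport:
    """A test incident tied rigorously to the configuration it occurred in."""
    incident_id: str
    description: str
    # Configuration identification: exactly which build was under test.
    product_version: str
    build_id: str
    platform: str

inc = IncidentReport("INC-42", "Crash on save", "2.1.0", "b1187", "Win11 x64")
```

Making the configuration fields mandatory (no defaults) is one way to enforce the principle: an incident simply cannot be filed without them.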

List and describe the six organizational structural design elements that Kit discussed as part of the process of developing a testing structure.
1. Tall or flat – reflects the number of levels between the CEO and the person on the shipping floor; more levels is a “tall” organization, fewer is “flat”
2. Market or product – the organization may be structured to serve different markets or different products
3. Centralized or decentralized – helps determine if the test organization should be centralized or not
4. Hierarchical or diffused – an organization may have successively higher levels of authority and rank (hierarchical) or may have authority widely spread or scattered or matrixed (diffused)
5. Line or staff – organization will have a certain mix of line and/or staff roles
6. Functional or project – organization may have functional or project orientations:
15.1 - Ref: IDMA 2, STRW, C.13, P.166, O.1

Kit identified seven approaches to organizing testing that reflect the evolution of a maturing development organization. List and describe them.
1. Testing is each person’s responsibility – occurs often in the real world; product developers are responsible for testing their own code
2. Testing is each unit’s responsibility – product developers within a group are responsible for testing each other’s code
3. Testing is performed by a dedicated resource – product developers are converted to full-time test developers
4. The test organization in QA – place the test organization within the Quality Assurance (QA) area
5. The test organization in development – place the test organization in the development organization
6. Centralized test organization – create a central test organization that lives within and serves a product development division
7. Centralized test organization with a test technology center – in addition to a test development group, this approach provides a test technology group, as part of software engineering, to assure consistency of test methods across areas within the company:
15.2 - Ref: IDMA 2, STRW, C.13, P.166, O.1

What did Kit identify as the major disadvantage of the approach to testing in which testing is each unit’s responsibility?
It is asking too much of a product developer to find the time to understand the job of the software testing professional; ultimately, these developers will concentrate their efforts on the area of responsibility for which they are evaluated – product developer.:
15.3 - Ref: IDMA 2, STRW, C.13, P.167 and 168, O.2

All of the following are advantages Kit identified as being gained by the creation of a centralized test organization within the product development division EXCEPT:
A. Establishes a career path for first line test managers
B. Permits senior test manager to find and hire strong first line test managers
C. Ensures that test managers can concentrate on validation testing rather than verification testing
D. Permits coordination of consistent training for test managers
C. Ensures that test managers can concentrate on validation testing rather than verification testing:
15.4 - Ref: IDMA 2, STRW, C.13, P.171 and 172, O.2

Kit listed six responsibilities of a test technology group. What are they?
1. Leading and managing testing process and testing productivity improvement efforts
2. Driving and coordinating testing training programs
3. Coordinating the planning and implementation of testing tool programs
4. Documenting test process, standards, policies, and guidelines as necessary
5. Recognizing and leveraging best practices within the testing groups
6. Recommending, obtaining consensus for, and implementing key testing measurements:
15.5 - Ref: IDMA 2, STRW, C.13, P.173, O.2

The traditional type of user interface that preceded GUI is known as:
A. Green-screen interface
B. Character-based user interface
C. ASCII interface
D. IEEE/ANSI interface
B. Character-based user interface:
15.6 - Ref: IDMA 2, STRW, C.14, P.176, O.3

Describe the process of Usage Testing.
Usage testing reflects expected operational use, and thus the emphasis in testing is on detecting errors that are most likely to occur in operational use. The process involves:
• initial testing based on estimated usage patterns
• measuring (collecting data on) actual usage (after the product is complete and functional), and developing an operational profile
• adjusting priorities, developing new tests, and retesting, based on the operational profile:
15.7 - Ref: IDMA 2, STRW, C.14, P.177, O.4

According to Kit, usage testing would most commonly be performed in which of the following testing phases?
A. Pilot testing
B. Unit testing
C. Integration/system testing
D. Acceptance testing
D. Acceptance testing:
15.8 - Ref: IDMA 2, STRW, C.14, P.177, O.4

Of the following tester-to-developer ratios, which does Kit identify as the most typical?
A. 1:10
B. 1:15
C. 1:7
D. 1:3
D. 1:3:
15.9 - Ref: IDMA 2, STRW, C.14, P.178, O.5

What are the seven “best projects” findings of the benchmark software measures and practices study conducted in the joint initiative by Xerox Corporation and Software Quality Engineering?
1. Best projects emphasize strong up-front planning and close tracking and reporting of status on an ongoing basis
2. Best projects rely on management fundamentals (teamwork, communication, controls), not on technology and state-of-the-art methodologies
3. Best projects emphasize reviews, inspections, and very strong and independent high-level testing
4. Measurement used to track progress, quality problems, and issues
5. There were seven practices in common use related to planning and up-front requirements and design specifications
6. Best projects utilize recommended practices – the investment required to achieve a more disciplined and structured process is evident in the results
7. Best projects excel in different areas and emphasize different phases of the life cycle; no single project was superior in all areas:
15.10 - Ref: IDMA 2, STRW, C.14, P.178-180, O.6

Which of the following is NOT one of the basic issues Kit identified as being a key to success in achieving software excellence?
A. Ongoing measurements
B. Good controls
C. State-of-the-art technology
D. Good management
C. State-of-the-art technology:
15.11 - Ref: IDMA 2, STRW, C.14, P.180, O.7

Kit identified three categories of sources for help in advancing software testing within an organization. List and describe them.
1. Software testing books and newsletters – current books on the subject, and a classic or two such as ‘The Art of Software Testing’ by Glenford Myers; there are also journals and newsletters that are a valuable source of information on practice and general comparisons of experience within the testing community.
2. Consulting and training services – they can help instigate change and promote a neutral and unbiased viewpoint without political risk. A consultant can provide an industry-wide perspective including experiences drawn from many different organizations. Training provides people at all levels with a sound knowledge of the techniques and concepts they need, and is a critical measure of the commitment of the organization to invest in developing the expertise of its staff.
3. Software testing conferences – attendees include software testers, managers, QA specialists, systems analysts, project managers, and programmers. Since many companies send senior staff, networking opportunities can be very valuable.:
15.12 - Ref: IDMA 2, STRW, C.15, P.182 and 183, O.8

According to Kit, what are the four elements typically featured in a software testing conference?
1. Pre-conference tutorials – experts focus on selected topics for one-half to two days
2. General sessions – experts and industry leaders share technical insights and perspectives on trends in the field and offer recommendations to attendees
3. Testing tool exhibit – held for one or two days of the conference, drawing 20 to 30 vendors and displaying products and services that support software testing; most vendors prepare demonstrations of their latest software testing tools
4. Presentation tracks – where multiple track sessions run in parallel; attendees select tracks that emphasize practitioner presentations on topics of interest:
15.13 - Ref: IDMA 2, STRW, C.15, P.183 and 184, O.8