
47 Cards in this Set

  • Front
  • Back

State the purpose of a testing program.

To ensure a quality testing process is implemented to effectively assess the trainee's achievement of learning objectives.

State the roles and responsibilities of the following for an effective testing program:



Naval Education Training Command (NETC)

Testing policy and guidance

State the roles and responsibilities of the following for an effective testing program:



NETC N7

Testing policy and guidance oversight and monitors compliance by Centers

State the roles and responsibilities of the following for an effective testing program:



Learning Center CO

Serves as the CCA, manages sites and DETs, resolves differences, and incorporates TYCOM test banks, as appropriate.

State the roles and responsibilities of the following for an effective testing program:



Learning Center Director of Training

Ensures testing program(s) are conducted, oversees development of testing plans

State the roles and responsibilities of the following for an effective testing program:



Learning Center Learning Standards Officer

Provides guidance to curriculum developers on testing; monitors Total Quality Indicators (TQI), test item analysis, and remediation programs. Approves KTAGs and PTAGs.

State the roles and responsibilities of the following for an effective testing program:



Course Curriculum Model Manager (CCMM)

Approves test design, maintains master test item bank.

State the roles and responsibilities of the following for an effective testing program:



Curriculum Developer

Designs and develops the testing plan, admin guides, and the tests.

State the roles and responsibilities of the following for an effective testing program:



Learning Site CO/OIC

Implements testing plan, designates Testing Officer(s), designates the course supervisor.

State the roles and responsibilities of the following for an effective testing program:



Learning Site Testing Officer

Test administration, oversees grading, secures tests, maintains test bank(s), coordinates/manages revisions, conducts IS training.

State the roles and responsibilities of the following for an effective testing program:



Course Supervisor

Ensures, monitors, and validates admin, security, and test item analysis.

State the roles and responsibilities of the following for an effective testing program:



Participating Activities

Provides comments, feedback, and new test items, and maintains test and test item analysis data.

State the primary course source data for creating test items.

JDTA


OCCSTDS


CTTL / PPP Table


COI

List usable course source data to be used when the primary course source data is not available or has not been created.

Will bridge the absence of JDTA data using data elements from a combination of:



OCCSTDS


CTTL


PPP Table


COI

What is a Formal test?

Test is graded and is used in the calculation of the trainee's final grade.

What is an Informal test?

May or may not be graded. Regardless, the grade will not be used in the calculation of the trainee's final grade.

For the below items, define the three proficiency levels contained within:



Skill

Level 1: Imitation


Level 2: Repetition


Level 3: Habit

For the below items, define the three proficiency levels contained within:



Knowledge

Level 1: Knowledge/Comprehension


Level 2: Application/Analysis


Level 3: Synthesis/Evaluation

List the five categories for performance and knowledge tests.

Pre-Test - For Validation of Material, Acceleration, Pre-requisite, Advanced Organizer



Progress - Test Blocks of instruction



Comprehensive Test - Within Course or Final Exam



Oral Test - Normally by board. Assesses the trainee's comprehension



Quiz - Short test to assess achievement of recently taught material

Discuss the process of piloting a test.

It is a review process to assess test reliability and validity and make corrective adjustments before actually collecting data from the target population.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Job Sheet

Directs the trainees in the step by step performance of a practical task they will encounter in their job assignment.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Problem Sheet

Presents practical problems requiring analysis and decision making similar to those encountered on the job.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Assignment Sheet

Designed to direct the study or homework efforts of trainees.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Multiple-choice

Is the most versatile of all knowledge test item formats.

Describe the use of each test instrument as they relate to knowledge and performance tests:



True or false

Provide only two answers.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Matching

Defined as two lists of connected words, phrases, pictures, or symbols.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Completion

Free response test items in which the trainees must supply the missing information from memory.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Labeling

Used to measure the trainee's ability to recall facts and label parts in pictures, schematics, diagrams, or drawings.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Essay

Require trainees to answer a question with a written response.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Case Study

Should be used when posing a complex issue, when a comprehensive understanding of material is required.

Describe the use of each test instrument as they relate to knowledge and performance tests:



Validation of Test Instruments

After test instruments have been constructed, and before they are actually assembled into a test, the content must be validated.

What are the two types of testing methods used in testing?

Criterion-Referenced Test - Assesses whether required level of skill or knowledge is met.



Norm-Referenced - Estimates individual skill or knowledge in relation to a group norm (e.g., Navy Advancement Exams).
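
The two methods differ in the reference point for scoring: a fixed standard versus the performance of a group. A minimal sketch contrasting them (the cut score, cohort scores, and function names are illustrative, not from the source):

```python
from bisect import bisect_left

def criterion_referenced(score, cut_score=80):
    """Criterion-referenced: pass/fail against a fixed mastery standard."""
    return score >= cut_score

def norm_referenced(score, cohort_scores):
    """Norm-referenced: percentile rank of a score within a group norm."""
    ranked = sorted(cohort_scores)
    below = bisect_left(ranked, score)  # count of cohort scores below this one
    return 100.0 * below / len(ranked)

# Hypothetical cohort of ten trainee scores
cohort = [55, 62, 70, 74, 80, 83, 88, 91, 95, 97]
print(criterion_referenced(84))     # True: meets the required standard
print(norm_referenced(84, cohort))  # 60.0: outscored 6 of 10 peers
```

The same score of 84 passes the criterion-referenced check outright, while the norm-referenced result depends entirely on how the rest of the group performed.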

Discuss test failure policies and associated grading criteria within your learning environment.

Test (if failed), Re-train, Re-test. If passed, the highest score the student can receive is an 80%.

Discuss during performance test design how the skill learning objective criticality is determined.

Will be developed using job sheets. Problem sheets are normally not used as a means of performance assessment, but may be used to evaluate achievement of less critical learning objectives.



Levels of Criticality -


High - value of 3 - Skill used during job performance.


Moderate - value of 2 - Skill influences job performance.


Low - value of 1 - Skill has little influence on job performance.

Discuss during knowledge test design how the knowledge learning objective criticality is determined to perform a task.

Will be developed using test items.




What are the ten sections of a testing plan?

Course Data


Course Roles and Responsibilities


Course Waivers


Test Development


Test Administration


Course Tests and Test Types


Grading Criteria


Remediation


Test and Test Item Analysis


Documentation

What is the purpose of test and test item analysis?

To determine statistical validity, test and test item analysis techniques are required.



The three types of analysis discussed and required for use are: difficulty index, index of discrimination, and effectiveness of alternatives.
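
The first two analyses can be sketched numerically using the standard psychometric definitions: the difficulty index is the proportion of trainees answering an item correctly, and the index of discrimination is the upper group's proportion correct minus the lower group's (commonly the top and bottom 27% by total score). A minimal example with hypothetical response data:

```python
def difficulty_index(responses):
    """Proportion of trainees answering the item correctly (0.0-1.0).
    responses: list of booleans, one per trainee."""
    return sum(responses) / len(responses)

def discrimination_index(item_correct, total_scores, fraction=0.27):
    """Upper-group minus lower-group proportion correct on one item.
    Groups are the top/bottom `fraction` of trainees by total score."""
    n = max(1, round(fraction * len(total_scores)))
    ranked = sorted(range(len(total_scores)),
                    key=lambda i: total_scores[i], reverse=True)
    upper = [item_correct[i] for i in ranked[:n]]   # highest scorers
    lower = [item_correct[i] for i in ranked[-n:]]  # lowest scorers
    return sum(upper) / n - sum(lower) / n

# Hypothetical data: one item's results and each trainee's total score
item = [True, True, False, True, False, True, True, False, True, False]
totals = [95, 88, 52, 91, 60, 84, 79, 48, 97, 55]
print(difficulty_index(item))               # 0.6
print(discrimination_index(item, totals))   # 1.0
```

Here the item discriminates well: every top scorer answered it correctly and every bottom scorer missed it, yielding the maximum index of 1.0.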

In a remediation program, what are the primary and secondary goals?

Primary goal is to motivate and assist trainees in achieving the critical learning objectives of a course by providing additional study time.



Secondary goal is to remove barriers to learning.



Because trainees learn in different ways, it may be necessary to use different methods of remediation to realize the most effective results.

What are the three methods of remediation available to instructors?

Targeted


Scalable


Iterative

Discuss targeted remediation.

Is designed to assist the trainee who is having difficulty in accomplishing an objective(s) and/or understanding the material during normal classroom time.



Involves limited one-on-one mentorship or SME engagement on the objective area(s) the trainee is having difficulty with, using text and/or lab material.

Discuss scalable remediation.

Designed to assist the trainee who is having difficulty in accomplishing objectives or understanding the material for a major portion of a course, during normal classroom time.



Involves one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, using a total recall approach with one or a combination of: text, lab material, flashcards, and mentor question-and-answer sessions.

Discuss iterative remediation.

Involves one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, using a total recall approach with one or a combination of: text, lab material, flashcards, and mentor question-and-answer sessions.



To complete remediation, the trainee must complete a minimum of 20 questions per each objective area with a minimum score of 80% and/or successfully complete two practice exercises or scenarios per each objective area.
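
The completion criteria can be expressed as a simple check. A hedged sketch of the two paths to completion for one objective area (the function name, parameters, and the or-interpretation of "and/or" are illustrative assumptions, not from the source):

```python
def iterative_remediation_complete(questions_answered, score_pct,
                                   practice_items_passed):
    """Check the stated iterative-remediation criteria for one objective area:
    at least 20 questions with a minimum score of 80%, and/or two
    successfully completed practice exercises or scenarios."""
    question_path = questions_answered >= 20 and score_pct >= 80
    practice_path = practice_items_passed >= 2
    return question_path or practice_path

print(iterative_remediation_complete(20, 85, 0))  # True: question path met
print(iterative_remediation_complete(20, 75, 1))  # False: neither path met
print(iterative_remediation_complete(0, 0, 2))    # True: practice path met
```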

Define the retest section of the remediation program.

When the trainee does not achieve a test's minimum passing grade, the retest may cover the portion of the test the trainee had difficulty with or the entire test. This decision should be based on the degree of difficulty the trainee had with the test.

Define the setback section of the remediation program.

A setback occurs when the trainee is unable to complete the training in the designated time allotted for a course.



Setbacks are classified as either academic or non-academic.



Because setbacks are costly, they should be granted only after all methods of remediation have been exhausted and/or there is an indication that a setback is in the best interest of the Navy and trainee.

Define the "Drop from training and attrites" section of the remediation program.

Every effort will be made to help trainees succeed. However, there are times when the trainee is clearly unsuited, unable, and/or unwilling to complete the course.



If this occurs, the trainee is dropped from the training. Trainees may be classified as an academic drop, nonacademic drop, or disenrollment.



Trainees who are discharged from the Navy will be classified as attrites.

Define the counseling section of the remediation program.

Preventive counseling will be instituted in "A" and "C" schools and should include counseling for performance and personal problems.

Define the Academic Review Boards (ARBs) section of the remediation program.

Will be convened when other means of academic counseling, remediation, and an initial academic setback have failed to improve trainee performance.