Use LEFT and RIGHT arrow keys to navigate between flashcards;
Use UP and DOWN arrow keys to flip the card;
H to show hint;
A reads the text aloud (text-to-speech);
81 Cards in this Set
- Front
- Back
Measurement
|
Concerned with establishing characteristics of individuals or groups.
*weight 110 lbs *science 83 |
|
Evaluation
|
Outcome of measurement after value has been added
*weight= just right, depends on height and/or age, need context *83%, 83/83, 83/200 |
|
Educational Measurement
|
Process of establishing characteristics of individuals or groups, where the characteristics have educational relevance
|
|
Assessment
|
Refers to a related series of measurements
*weight over time *several exams over year |
|
Tests
|
Vehicle or instrument used to observe the characteristics of focus
* scale * test and/or exam |
|
Test Score
|
Indication of what was observed thru the test
*weight = 110 lbs *83/83 = 100% |
|
Preliminary
|
At beginning of year or unit to see what is already known
|
|
Formative
|
Occurs during instruction to monitor progress
ex) quiz, homework, classwork, informal Q's -- occurs more frequently |
|
Summative
|
Occurs at end of a unit of instruction --> summary check of learning, test, project
|
|
Diagnostic
|
A learning problem exists; try to determine the cause of the problem
|
|
Criterion-referenced
|
Well-defined content domain --> can determine which aspects are mastered (used a lot in elem schools). "checklist"
|
|
Norm-referenced
|
Comparing student performance to the performance of others. "making the cut". some win, some lose
|
|
Neither
|
Not criterion or norm referenced (used in elem)
|
|
Growth-referenced
|
Improvements in performance, used to motivate students, not to give them grades
|
|
Ability-referenced
|
Comparing a student's current performance to their potential performance --> How do you know potential? Not used for grades
|
|
Why have measurable objectives?
|
Teachers must have focus for learning; objectives help focus planning, teaching, & assessing
|
|
Student Learning
|
Not directly observable. Can't tell by looking at student what he/she knows
|
|
Objectives
|
Help to describe student learning in observable terms.
ex) identify nouns in sentences |
|
Categories of Learning Outcomes
|
A) Declarative Knowledge
B) Procedural Knowledge C) Problem Solving D) Attitudes E) Motor Skills |
|
Declarative Knowledge
|
Refers to info that can be stated verbally, such as recall of specific facts, principles, trends, etc.
ex) Area of a triangle: 1/2bh, know def of noun, state capitals |
|
Procedural Knowledge
|
Knowledge of how to do things; involves using what you know.
ex) identifying nouns, using formulas |
|
Problem Solving
|
Exists when you know your goal but haven't identified a means for reaching the goal.
ex) science project, lizard temp |
|
Attitudes
|
Learned mental states that influence choice of behavior
ex) work on challenge instead of giving up |
|
Motor Skills
|
Coordinated patterns of muscle movement.
ex) handwriting, holding pencil, etc |
|
3 Parts to Performance Objective
|
1. Performance
2. Condition 3. Criterion |
|
Performance
|
Statement of observable behavior for the student to do. Don't use words like "think", "know", "understand"; use "identify", "will read aloud"
|
|
Condition
|
Describe circumstances under which performance will occur. "Presented with stack of sight words"
|
|
Criterion
|
How well performance must be completed for us to know mastery has occurred. "Missing no more than 1 of 10 words"
|
|
Example of Performance Objective
|
"When presented with a stack of sight words, student will read words aloud, missing no more than 1 of 10 words"
|
|
Selecting/Writing Performance Objectives
|
1. Objective should describe the results of student learning
2. Assess/describe all critical aspects of knowledge - may have multiple objectives for a given lesson or topic 3. Help to assess an appropriate sample of knowledge |
|
Two Critical Attributes of Assessment
|
1. Reliability
2. Validity |
|
Reliability
|
Refers to the consistency of measurement of any instrument, refers to the degree to which instrument is free of measurement error
|
|
Measurement Error
|
What you know +/- other factors that affect your grade
ex) bathroom scales giving you different readings |
|
True Score
|
Measurement of what you really know
|
|
Observed Score
|
True score +/- measurement error
|
|
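The true score / observed score relationship on the cards above can be sketched numerically; this is a hypothetical simulation (the error standard deviation is made up), using the bathroom-scale example:

```python
import random

random.seed(0)  # reproducible demo

def observed_score(true_score, error_sd=2.0):
    """Observed score = true score +/- random measurement error.
    error_sd is a hypothetical amount of instrument noise."""
    return true_score + random.gauss(0, error_sd)

# Like a bathroom scale: the same 110-lb "true" weight gives a
# slightly different reading on each administration.
readings = [observed_score(110) for _ in range(5)]
```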
Examples of Measurement Error
|
Wording of questions, personal factors, environmental factors, class coverage and/or inconsistent coverage, sampling error, format, interdependence of items, inappropriate point distribution, not enough or too much on test, guessing
|
|
Three Types of Random Influence on Reliability
|
1. Test taker may change from one day to the next
2. Conditions may change 3. Small sample of Q's, poor sample, clarity issue |
|
Three Types of Reliability
|
1. Stability
2. Alternate-Form Reliability 3. Internal Consistency Reliability |
|
Stability
|
Consistency of test results over time; want assessments to yield similar results at different administrations. Asked in same way - clarity of Q's, minimize opportunities for guessing - have open-ended Q's or have more MC, don't use true/false
|
|
Alternate-Form Reliability
|
The degree to which supposedly equivalent forms of the same test are actually equivalent - mix up Q's
|
|
Internal Consistency Reliability
|
Deals with extent to which items on a test are functioning in a consistent fashion (content, structure, difficulty of items)
ex) adding fractions: 3/5 + 1/5, 2 1/8 + 4 3/7 |
|
Validity
|
Does this question (or test) measure what it's supposed to measure? The most important attribute of assessment
|
|
Why do teachers assess students?
|
To find out what students know. If summative, what was learned -> helps teachers certify that learning has occurred
|
|
Types of Validity
|
1. Content Validity
2. Criterion-Related Validity 3. Construct-Related Validity |
|
Content Validity
|
Degree to which assessment procedure adequately represents the content standard being measured.
|
|
Curriculum Alignment
|
Does every aspect of curriculum align or go together? Should be no disconnect or surprises
|
|
Criterion-Related Validity
|
Degree to which performance on an assessment accurately reflects a student's performance on another assessment or measure of the same type
|
|
Construct-Related Validity
|
Degree to which evidence confirms that an inferred construct (an abstract idea) exists and the assessment procedures measure it. Looks for indicators of intelligence
|
|
Validity vs Reliability
|
Can't have validity without reliability. Can have reliability without validity. Options:
reliable & valid, reliable & not valid, or not reliable & not valid |
|
Test Analysis to improve test items
|
1. Improve quality of test items (& tests)
2. Identifies instructional problems 3. Components of Test Analysis |
|
Components of Test Analysis
|
A) Item difficulty
B) Item Discrimination C) Distractor Analysis |
|
Item Difficulty
|
Proportion of students answering item correctly.
- # answering item correctly divided by total # of students in testing group; the higher the #, the more students answered right, the easier the item |
|
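The difficulty index on this card (# correct divided by total # of students) is a one-line calculation; the response data here is made up:

```python
def item_difficulty(responses):
    """Proportion of students answering the item correctly:
    # answering correctly divided by total # in the testing group."""
    return sum(responses) / len(responses)

# Hypothetical item: 8 of 10 students answered correctly -> an easy item
p = item_difficulty([True] * 8 + [False] * 2)  # 0.8
```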
Item Discrimination
|
The ability of the item to sort people who know the material from those who don't.
- Identify two subgroups of test takers: top half and bottom half --> on a specific test. |
|
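One common discrimination index (an assumption here; the card only names the top-half/bottom-half split) is the difference in proportion correct between the two subgroups:

```python
def item_discrimination(top_half, bottom_half):
    """D = p(top) - p(bottom): difference in proportion correct
    between the top-scoring and bottom-scoring halves of the group.
    Positive D means high scorers get the item right more often."""
    p_top = sum(top_half) / len(top_half)
    p_bottom = sum(bottom_half) / len(bottom_half)
    return p_top - p_bottom

# Hypothetical: 9 of 10 top-half students correct, 4 of 10 bottom-half correct
d = item_discrimination([True] * 9 + [False], [True] * 4 + [False] * 6)  # 0.5
```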
Distractor Analysis
|
Looking at options (pattern of responses) for a given item - % of students choosing each option
|
|
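The % of students choosing each option can be tallied as a sketch (the item, options, and responses below are hypothetical):

```python
from collections import Counter

def distractor_analysis(choices):
    """Percent of students choosing each option for one item."""
    counts = Counter(choices)
    total = len(choices)
    return {option: 100 * n / total for option, n in counts.items()}

# Hypothetical responses from 10 students to a 4-option item ('B' is the key):
choices = ["B", "B", "B", "B", "B", "B", "A", "C", "D", "D"]
pcts = distractor_analysis(choices)  # {'B': 60.0, 'A': 10.0, 'C': 10.0, 'D': 20.0}
```

A distractor that nobody picks (0%) is doing no work and is a candidate for rewriting.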
Portfolios
|
- Widely used in elem schools
- Contains sample of student work, selected by student, based on established guidelines |
|
Characteristics of Portfolios
|
1. Adaptable to individualized instructional goals
2. Focus on Assessment of products 3. Meant to identify student strengths rather than weaknesses 4. Actively involve students in evaluation process 5. Communicate student achievement to others 6. Time Intensive |
|
Adaptable to individualized instructional goals (portfolios)
|
Each student prepares own portfolio & has individual conferences w/teacher, allows for individualization
|
|
Focus on Assessment of Products (portfolios)
|
Samples of student work are examined; but don't observe process of learning
|
|
Identifies Strengths rather than weaknesses (portfolios)
|
Be careful not to overlook weaknesses
|
|
Involve Students in Evaluation Process (portfolios)
|
Teacher develops guidelines, students select own work to put in portfolio
|
|
Communicate Student Achievement (portfolios)
|
Good framework for showing work in progress, improvement
|
|
Time Intensive (portfolio)
|
Periodic review --> every 4 to 6 weeks, 30 - 60 mins per review
|
|
Designing Portfolios
|
a) Establish how portfolio will be used
b) Center portfolio on instructional goals c) Translate goals into student performance d) Plan student involvement in the assessment process e) Take steps to make review efficient |
|
Authentic Assessment
|
All are performance assessments. Involves using a skill in a real-world context, doing something the way "real people" do.
|
|
Characteristics of Performance Assessments
|
1. Specific behaviors or outcomes observed
2. Is it possible to judge appropriateness of behaviors, or at least whether one behavior is more appropriate than another 3. Target outcome cannot be measured with a pencil & paper task |
|
Categories of Performance Assessments
|
1. Process and/or product
(may evaluate both) 2. Simulated vs real settings |
|
Process
|
Looking at procedure student used to complete task
ex) presentation |
|
Product
|
Tangible outcome that may be a result of completing the process
ex) paper & visual for presentation |
|
Simulated vs Real Settings
|
May use simulations for performance assessments because real situations may be too dangerous, costly, or not available. Classroom assessments often use simulations
|
|
Advantages of Performance Assessments
|
- Allow evaluation of complex skills
- Positive effect on learning & instruction: kids may learn more & work harder - Can be used to evaluate both process & product |
|
Limitations to Performance Assessments
|
- Considerable time to administer and score
- Student responses cannot easily be scored later - Scoring is susceptible to rater error |
|
Creating a Performance Assessment
|
1. Identify concise goals to be assessed
2. Make a list of important components of task 3. Develop a scoring plan 4. Develop clear, detailed instructions for students |
|
Creating PA: Identify Concise Goals
|
- Usually broader than objectives, may be several objectives
- Must be expressed as observable student performance ex) oral presentation skills <-- goal; look at several objectives: eye contact, posture, etc |
|
Creating PA: Making a List of Important Components
|
- Content - what is/are the target behavior(s). ex) eye contact
- Process and/or product: steps to complete task and/or outcome of steps - Situation and Materials: what conditions or circumstances must exist to perform task, what materials are needed to complete task ** includes hints/cues provided |
|
Creating PA: Develop Scoring Plan
|
- May be check list, rating scale, or rubric
- Useful to share with students; must be developed in advance of student doing task |
|
Analytic Scoring
|
Judging appropriateness of response on basis of several distinct attributes
ex) checklist, rating scale, or rubric |
|
Holistic Scoring
|
Can't break performance into distinct attributes, so the performance is scored as a whole
ex) rubrics, grading scales |
|
Checklist
|
Indicates whether a behavior is present or not; can cover several behaviors or items
|
|
Rating Scale
|
Allows you to rate student performance on one or more continua
- Similar to checklist except there is a range of responses for each item - Descriptive words must be used carefully so they are interpreted the same way by everyone - Want to have only one characteristic per scale (or one part of a performance). If same description is used for multiple scales, you can key descriptions to #'s & place #'s on scales - Be sure entire scale is useful - don't include categories you won't use |
|
Rubric
|
- Include descriptions of expected behaviors & standards for evaluating those behaviors
- Help teachers clearly define expectations & can help students meet expectations - Help w/student self-assessment of work - Allow teachers to evaluate work consistently & accurately - Allow teachers to give detailed feedback to students - When? * w/complex task to be evaluated * multiple tasks student must complete & they demonstrate at varying levels |
|
Creating a Scoring Rubric
|
1. Determine major goals or outcomes of the assessment and be sure they are linked w/goals of course (validity)
2. List all criteria or indicators that are important for completing a task 3. Determine how many performance levels are needed for each indicator. Clearly describe attributes for each level of performance. 4. Determine relative importance of one outcome to another & weight accordingly 5. Determine total possible points on rubric & equate to some sort of grading scale 6. Present in organized, easy-to-read format 7. Field test rubric on old assignments or a few new assignments 8. Revise as necessary
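Steps 4-5 above (weighting indicators and totaling points) can be sketched as a simple data structure; the indicator names, levels, and weights below are all made up for illustration:

```python
# Hypothetical rubric for an oral presentation: each indicator is scored
# on performance levels 0-3 and carries a relative weight (step 4).
RUBRIC_WEIGHTS = {
    "eye contact": 1,
    "organization": 2,
    "content": 3,
}
MAX_LEVEL = 3

def total_points(scores):
    """Weighted total for one student's level scores (step 5)."""
    return sum(RUBRIC_WEIGHTS[indicator] * level
               for indicator, level in scores.items())

# Maximum possible points, to equate to a grading scale (step 5):
max_points = sum(w * MAX_LEVEL for w in RUBRIC_WEIGHTS.values())  # 18
points = total_points({"eye contact": 3, "organization": 2, "content": 3})  # 16
```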
2. List all criteria or indicators that are important for completing a task 3. Determine how many performance levels are needed for each indicator. Clearly describe attributes for each level of performance. 4. Determine relative importance of one outcome to another & weigh accordingly 5. Determine total possible points on rubric & equate to some sort of grading scale 6. Present in organized, easy to read format 7. Field test rubric on old assignments or few new assignments 8. Revise as necessary |