96 Cards in this Set

What is the purpose of assessment?
to determine the extent to which students have learned
What important questions do teachers need to ask themselves related to assessment?
"Did my students learn?"
"What evidence do I have for that learning?"
What are the 6 assessment guidelines for high-quality standards-based assessments?
1) Plan for assessment--part of planning involves communicating the assessment, along with criteria for the different performance levels, to the students prior to the assessment.

2) Ensure that the assessment is valid, reliable, objective, and standardized--think about the meaning of these words!

3) Allow students to select an assessment tool from several that represent their learning modality or primary intelligence--this allows students to select different assessment tools in order to utilize their strengths. Providing different assessment options for a single standard is called Multiple Measures.

4) Ensure that student dignity is respected throughout the assessment process--protect students' self-esteem and provide the most supportive environment for students to demonstrate their learning.

5) Use embedded assessment and record the best evidence of the student's performance related to each standard:means assessing students while they are actually practicing--as a part of instruction or simultaneously with instruction. This type of documentation provides evidence of student growth over the instructional unit.

6) Use assessment to inform students and parents of student progress toward the achievement of the standards, and to enhance instruction--this serves 2 important purposes: first, determining an individual student's progress toward achievement of the standards; second, providing the teacher with information on the quality of his/her instruction, especially in formative assessment.
What are the steps of the assessment development process?
Step 1-- Identify the standard noting the verb, type of verb, and skill, activity, or content

Step 2-- Choose the assessment tools that best allow students to demonstrate their achievement relative to the standard

Step 3-- State the criteria for competence

Step 4-- Describe the levels of quality

Step 5-- Develop samples of student work that illustrate each level of quality
Step 1--Identify the Standard
Part 1-- restate the grade or unit-level standard

Part 2-- identify the verb, along with the verb type; doing this provides direction for the selection of the assessment tool.

Verb Types (domains of learning):
Cognitive-------Explain, Describe, etc. (thinking/analyzing)

Psychomotor---- Demonstrate (motor movement)

Affective--- Accepts (feelings, thoughts/reflection)
Step 2-- Choose the Assessment Tool
Most important: select 1 or more tools that match the type of verb and the actual verb.
Step 3-- State the Criteria for Competence
performance indicators for the standard, or steps of description for the standard
Step 4-- Describe the Levels of Quality
Description of the levels of Quality or Skill Achievement for that standard
Step 5-- Develop Samples of Student Work
Also known as: Exemplars or Anchors

Provides students with concrete examples of the quality of work they need to produce for a specific level or grade.
What are the 3 points in the Assessment Process or Implementation?
Diagnostic Assessment= beginning of instructional process; starting line

Formative Assessment= during the process; along the way to ensure the correct path

Summative= at the destination
Diagnostic Assessment
AKA: pre-assessment; measures student achievement of the unit level standard prior to the start of instruction

--Results provide teacher with instructional starting points

--Additionally provides teacher with baseline data so that, at the end of the learning process, everyone can see how much the students have progressed.
Formative Assessment
Provides information on the effectiveness of the instructional process during that unit.

Data secured informs students, teachers, and parents about student progress toward mastery of the standards throughout the school year.
Summative Assessment
Answers the question: Has the student achieved the standards and is the student ready to move on?
How can you know that your test is valid?
Ask yourself: does it measure what it is supposed to measure?

Validity: the relationship between the attributes that are intended to be measured by the test and the actual attributes measured by the test.
How will you know if your test is reliable?
Ask: what is the relationship between scores given by different raters (interrater reliability), scores given by the same rater over time (intrarater reliability), or scores on the same assessment over time?
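The interrater idea above can be sketched numerically: two raters score the same students, and the correlation between their score lists indexes their agreement. A minimal Python sketch; the rater names and ratings are hypothetical.

```python
# Interrater reliability sketch: Pearson correlation between two raters
# scoring the same six students (ratings are hypothetical).
def pearson(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

rater_a = [4, 3, 5, 2, 4, 3]
rater_b = [4, 3, 4, 2, 5, 3]
r = pearson(rater_a, rater_b)  # ~0.82; values near 1.0 indicate high agreement
```

The same calculation applied to one rater's scores on two occasions would estimate intrarater reliability.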
What factors tell you that your test is objective?
Ask: can results be measured the same way by any qualified professional?

Assessment tool should not be subjective or simply based on an individual's opinion.
How do you know that your assessment is standardized?
Ask yourself: is the assessment performed in the same way each time?

A set of consistent procedures should be used for administering and scoring an assessment.
Tests that require students to use higher-order thinking skills and to actually apply them are called?
performance-based assessments
An assessment that resembles a "real-life" situation in which students must apply skills, knowledge, and attitudes in scenarios that reflect the ambiguities of life
Authentic Assessment
When students use different types of media for creating a project, the type of media used is typically rotated for the next project, so that all students have access to the different types of media
Types of Assessment tools:

Journals- records student perceptions, reflections, and/or feelings. (Affective)

Logs- data about a specific behavior collected over time. (Summative & Affective)

Motor-Skill/Fitness tests- On-demand performance of a motor skill or fitness test, typically in a closed environment. (Psychomotor)

Reports/Projects- involve researching and writing. (Cognitive)

Role-playing/Simulations- Assessments for which students are given a scenario and asked to play a particular role. (Cognitive & Psychomotor)

Structured Observation- Observation and evaluation of a student's performance based on stated criteria. (Psychomotor & Cognitive)

Written Tests- tests in which students respond in writing. (Cognitive)
What are the types and styles of written tests?
Styles: Forced Choice (FC) or Supply Items (SI)
short answer (SI)
essay (SI)
open ended questions(SI)
multiple-choice(FC)
True/False(FC)
Matching Items(FC)
Fill-in-the-blank (SI)
5 Step assessment development process: Step 1
Identify the Standard:
-"__________"
What is the verb in the sentence?
What is the type of verb?
What level of proficiency is stated?
What is the skill or content?
5 Step assessment development process: Step 2
Choose the assessment tool
5 step Assessment Development process: Step 3
State Criteria for competence
5 step Assessment Development process: Step 4
Describe levels of quality
5 step Assessment Development process: Step 5
Describe the Anchors
What are the multiple choice question development guidelines?
Include as much of the item content in the stem as possible
State the stem in the positive
Have distractors (incorrect answers) that are plausible alternative answers, and parallel in form and length.

Avoid overusing "always", "never", "all of the above", and "both A and C" in the distractors

Randomly select the position of the correct answer

Have only one correct or best answer
What are the development guidelines for Short Answer questions?
Write questions that are very specific

Requested answers are brief and specific

Only one desired answer is appropriate
What are the development guidelines for Open Ended questions?
Determine a real-world context for the question

Define the question as specifically as possible

Include two or more questions that are more specific and shorter instead of asking a single longer question

Inform student of the grading criteria and conditions (i.e. expected length of the answer)
What are some guidelines to follow when writing rubrics?
Find a balance between being too general and too detailed

Consider quality and quantity

Use meaningful criteria that clearly define each level

Keep the distance between levels equal

Ensure that the elements are appropriate for everyone

Expect to revise
Diagnosis
Through measurement, you can assess the weaknesses (needs) and strengths of a group of individuals

May cause you to alter your initial approach to what you are teaching

Can identify who in a group has good technique, which enables you to devote more time to those who cannot perform the skill.

Diagnostic testing after a group has participated in an activity may determine why some individuals are not progressing as they should
Classification
There may be occasions when classifying individuals into similar groups eases instruction

People feel more comfortable when performing with others of similar skill... homogeneous grouping
Achievement
The most common reason for measurement & assessment is to determine the degree of achievement of program objectives and personal goals

Assessment of each student's skill at the beginning of an activity unit helps you determine the effectiveness of previous instruction and programs, and at what point you should begin your instruction
Evaluation of Instruction and Programs:
Prediction
measurement to predict future performance
Evaluation of Instruction and Programs:
Research
used to find meaningful solutions to problems and as means to expand a body of knowledge
Evaluation of Instruction and Programs:
Analyze and Interpret Data
Statistical analysis can provide you with a more meaningful evaluation of your measurement and better inform all participants of the test results
Statistical Terms:
Data
result of measurement; numerical result of measurement
Statistical Terms:
Variable
trait or characteristic of something that can assume more than 1 value
Evaluation of Instruction and Programs:
Determine the Worth of a Test-- Validity and Reliability
Validity-- refers to the degree to which a test measures what it claims to measure.

Reliability- refers to the consistency of a test

By knowing how to interpret statements about these characteristics, you are more likely to select the appropriate tests to administer to your subjects.
Statistical Terms:
Population
includes all subjects (members) within a defined group
Statistical Terms:
Sample
part or subgroup of the population from which the measurements are actually obtained
Statistical Terms:
Random sample
one in which every subject in the population has an equal chance of being included in the sample
Statistical Terms:
Parameter
a value, a measurable characteristic, that refers to a population; population mean is a parameter
Statistical Terms:
Statistic
value, a measurable characteristic, that refers to a sample; used to estimate the parameters of a defined population
Statistical Terms:
Descriptive statistics
methods used to describe a group; every member of a group is measured and no attempt is made to generalize to a larger group
Statistical Terms:
Inferential statistics
when a random sample is measured and projections or generalizations are made about a larger group; able to use data generated from a sample to make inferences about the entire population
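The parameter/statistic and inferential-statistics cards above fit together in a few lines of Python; the "population" of fitness scores below is simulated and the numbers are hypothetical.

```python
import random

# Sketch: a statistic (sample mean) estimating a parameter (population mean).
random.seed(1)
population = [random.gauss(70, 10) for _ in range(500)]
parameter = sum(population) / len(population)   # population mean: a parameter

# A random sample: every member has an equal chance of inclusion.
sample = random.sample(population, 50)
statistic = sum(sample) / len(sample)           # sample mean: a statistic
# Inferential statistics uses the statistic to make inferences
# about the entire population.
```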
Statistical Terms:
Discrete data
values are limited to certain numbers and cannot be reported as fractions
Statistical Terms:
Continuous data
can have any value within a certain range; values can be reported as fractions;
examples: time & distance
Statistical Terms:
Ungrouped data
measures not arranged in a meaningful manner; raw scores used for calculations
Statistical Terms:
Grouped data
measures arranged in some meaningful manner to facilitate calculations
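One common way to group data is a frequency distribution with fixed-width class intervals. A minimal Python sketch; the raw scores and interval width are hypothetical.

```python
from collections import Counter

# Sketch: turning ungrouped raw scores into grouped data, i.e. a frequency
# distribution with class intervals of width 5 (scores are hypothetical).
raw_scores = [62, 67, 71, 74, 75, 78, 81, 83, 83, 88, 91, 95]

def group(scores, width=5):
    # Map each score to the lower bound of its interval, then count per interval.
    return Counter((s // width) * width for s in scores)

freq = group(raw_scores)  # freq[75] == 2: the interval 75-79 holds 75 and 78
```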
What are the 4 types of Scales of Measurement?
Nominal

Ordinal

Interval

Ratio
Scales of Measurement:
Nominal
the lowest and most elementary scale; numbers are used to represent variables, but the numbers do not have numerical value

Numbers represent categories. They do not distinguish groups or reflect differences in magnitude
Scales of Measurement:
Ordinal
provides some information about the order or rank of variables; doesn't indicate how much better or worse one score is than another

numbers indicate rank of measurements but not the magnitude of the interval between measures
Scales of Measurement:
Interval
information about the order of variables, using equal units of measurement; the distance between scale points is always the same

possible to say how much better one score is than another

has no true 0
Scales of Measurement:
Ratio
possesses all the characteristics of the interval scale and has a true zero point

In most statistical studies interval & ratio scales are analyzed in the same way

numbers represent equal units between measurements, and there is an absolute 0
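The four scales can be summarized by which central-tendency measures are meaningful at each level. A small Python sketch; the mapping follows the definitions above, and the function name is illustrative.

```python
# Which central-tendency measures are meaningful at each scale of measurement.
VALID_STATS = {
    "nominal": ["mode"],                     # categories only: count frequencies
    "ordinal": ["mode", "median"],           # rank order, unequal intervals
    "interval": ["mode", "median", "mean"],  # equal units, no true zero
    "ratio": ["mode", "median", "mean"],     # equal units and an absolute zero
}

def appropriate_stats(scale):
    return VALID_STATS[scale.lower()]
```

As the ratio card notes, interval and ratio data are analyzed the same way in most statistical studies, so their entries match.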
Affective
Learning dispositions concerning how students act and feel, including attitudinal, emotional, and valuational responses of the learner.
Alignment
Clear and direct relationship among standards, curricula, instructional materials, instructional methods, and assessments.
Alternative assessment
assessment tools that engage students in the learning process and assess higher-order cognitive processes; this type of assessment requires students to generate a response to a question rather than choose from a set of responses given to them.
Analytic rubric
A procedure in which performances are evaluated for selected dimensions, with each dimension receiving a separate score.
Anchor
A sample of student work that exemplifies a specific level of performance.
Authentic Assessment
Processes of gathering evidence and documentation of a student's learning and growth in ways that resemble "real life" as closely as possible.
Checklist
Specifies whether or not criteria for successful performance of an activity are met.
Concurrent validity
The relationship between scores on a test and scores on an established criterion measure administered at about the same time
Construct validity
the relationship between what is actually being measured and what was intended to be measured
Content Standards
What students should know and be able to do in various subjects.
Criteria
The elements contained in a rubric or scoring guide that identify factors necessary for evaluating performance.
Criterion-referenced assessment
Describes how well a student performs in comparison with a predetermined and specific standard of performance.
Curriculum
A planned sequence of formal instructional experiences that defines the content to be taught (what) and the methods to be used (how and when).
Embedded assessment
Assessment that occurs simultaneously with instruction.
Exemplar
A sample of student work that illustrates specific levels of performance
Exit Standards
Expectations of what all students should know and be able to do at graduation from high school.
Formative Assessment
Ongoing assessment that can provide information to guide instruction and improve performance.
Grade-level standards
Expectations of what all students should know and be able to do at the end of a grade level; these are sometimes referred to as benchmarks
Norm-referenced assessment
Describes how well a student performs in comparison to other students.
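The contrast between criterion-referenced and norm-referenced assessment can be shown by interpreting the same score both ways. A minimal Python sketch; the cutoff and the peer scores are hypothetical.

```python
# Sketch: one score, two interpretations (data are hypothetical).
peer_scores = [55, 60, 64, 70, 72, 75, 80, 85, 90, 95]

def criterion_referenced(score, cutoff=70):
    # Comparison with a predetermined, specific standard of performance.
    return "meets standard" if score >= cutoff else "below standard"

def norm_referenced(score, group):
    # Comparison with other students: percent of the group scoring below.
    below = sum(1 for s in group if s < score)
    return 100 * below / len(group)

level = criterion_referenced(72)               # "meets standard"
percentile = norm_referenced(72, peer_scores)  # 40.0: 40% of peers scored below
```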
Objective
A specific learning outcome intended to be accomplished in a short period of time (i.e. 1 lesson)
Peer Observation
Provision of feedback on performance to a peer, based on criteria supplied by the teacher.
Performance indicator
A measure that describes how good is good enough.
Pre-assessment
Occurs at the beginning of a unit of instruction and allows for the planning of appropriate development sequences.
Process criteria
Criteria that refer to how the performance or product is completed.
Product criteria
Criteria that refer to what a student produces or does.
Qualitative analytic rubric
A type of analytic rubric that requires the scorer to determine a level of quality for each dimension assessed using a verbal description for that dimension.
Quantitative analytic rubric
A type of analytic rubric that requires the scorer to determine a level of quality for each dimension assessed using a number.
Rating score or scale
Specifies the extent to which, or frequency with which, criteria for successful performance of an activity are met.
Reliability
The degree to which the results for an assessment are dependable and yield consistent results.
Standardization
A set of consistent procedures for administering and scoring an assessment.
Standards-based assessment
Criterion-referenced assessment that shows clear and direct relationship between the assessment and the identified standards.
Standards-based Curriculum
A curriculum designed to produce student understanding and work that demonstrates achievement of the identified standards.
Structured Observation
An assessment that includes observation and evaluation of a performance based on criteria
Summative assessment
Assessment that takes place at the end of instruction.
Validity
The degree to which an assessment measures what it is supposed to measure.
Guidelines for Rubric Development
six-point & four-point
Six-Point Rubric
6: Fully achieves purposes of the task
5: Accomplishes the purpose of the task
4: Completes the purpose of the task substantially (meets criteria)
3: Purposes of the task not fully achieved
2: Omits important purposes or movements
1: Fails to achieve purposes of the task

Four-Point Rubric
4: Fully achieves purposes of the task
3: Completes the purpose of the task substantially (meets criteria)
2: Omits important purposes of the task
1: Fails to achieve purposes of the task
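A rubric like the four-point one above can be applied mechanically once criteria are checked off. A Python sketch: the level descriptors come from the rubric above, but the cut points mapping a criteria ratio to a level are hypothetical.

```python
# Sketch: assigning a four-point rubric level from the share of criteria met.
FOUR_POINT = {
    4: "Fully achieves purposes of the task",
    3: "Completes the purpose of the task substantially (meets criteria)",
    2: "Omits important purposes of the task",
    1: "Fails to achieve purposes of the task",
}

def rubric_level(criteria_met, criteria_total):
    # Hypothetical cut points; a real rubric defines these per dimension.
    ratio = criteria_met / criteria_total
    if ratio == 1.0:
        return 4
    if ratio >= 0.75:
        return 3
    if ratio >= 0.5:
        return 2
    return 1

level = rubric_level(6, 8)  # 3: meets criteria substantially
```

In practice the verbal descriptors, not numeric ratios, define each level; this sketch only shows how scoring against a rubric stays consistent across raters.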
Mean (average) Characteristics
1) It is the most sensitive of all the measures of central tendency. It will always reflect any change within a distribution of scores.

2)Most appropriate measure of central tendency to use for ratio data and may be used on interval data.

3)Considers all the information about the data and is used to perform other important statistical calculations.

4) Influenced by extreme scores, especially if the distribution is small. This is one major disadvantage of the mean.
Median (middle) Characteristics
1)Not affected by extreme scores; more representative measure of central tendency than mean when extreme scores are in the distribution.

2) Measure of position; determined by the number of scores and their rank order. Appropriately used with ordinal and interval data

3) Not used for additional statistical calculations.
Mode(occurs most frequent) Characteristics
1)Least used measure of central tendency. Indicates the score attained by the largest number of subjects. Used to indicate most popular item or product used by a group.

2)Not used for additional statistical calculations

3)Not affected by extreme scores, total number of scores, or their distance from the center of the distribution.
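The characteristics above are easy to demonstrate: adding one extreme score pulls the mean but leaves the median and mode unchanged. A Python sketch with hypothetical scores.

```python
import statistics

# The mean shifts with an extreme score; the median (a measure of position)
# and the mode (most frequent score) do not. Scores are hypothetical.
scores = [70, 72, 74, 74, 76, 78]
with_outlier = scores + [150]

mean_clean = statistics.mean(scores)              # 74.0
mean_outlier = statistics.mean(with_outlier)      # ~84.9, pulled up by 150
median_outlier = statistics.median(with_outlier)  # 74: unaffected
mode_outlier = statistics.mode(with_outlier)      # 74: unaffected
```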
Range(highest and lowest values) Characteristics
1)Dependent on the extreme scores

2)Least useful measure of variability
Standard Deviation Characteristics
1) Square root of the variance; the most useful and sophisticated measure of variability. Describes the scatter of scores around the mean

2)Applicable to interval and ratio level data, includes all scores, most reliable measure of variability.

3) Used with the mean; in a normal distribution, +1 and -1 SD from the mean should include 68.26% of the scores.

4) A small SD indicates the group being tested has little variability; homogeneous. A large SD indicates the group has much variability; heterogeneous

5) Used to perform other statistical calculations. SD is especially important for comparing differences between means.
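The homogeneous/heterogeneous contrast from point 4 can be shown directly: two groups with the same mean but very different scatter. A Python sketch; the score lists are hypothetical, and statistics.pstdev treats each list as a full population.

```python
import statistics

# Two hypothetical groups with the same mean 74: one homogeneous
# (small SD), one heterogeneous (large SD).
homogeneous = [72, 73, 74, 75, 76]
heterogeneous = [50, 62, 74, 86, 98]

sd_small = statistics.pstdev(homogeneous)    # ~1.41: little variability
sd_large = statistics.pstdev(heterogeneous)  # ~16.97: much variability
# Identical means, so the SD is what distinguishes the two groups.
```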