22 Cards in this Set

Education Assessment:
*Objectives
Who, What, Where, When, Why, How much

Example: "After one year, residents of southern Washington County will provide adequate facilities for delivery of health department services."
Program Management
*Organizational Chart
*PERT chart
*Gantt chart
*ORG: Drawn first, to see what staff you'll have to pay before setting tasks and budget. Only paid positions are listed; solid lines for salaried staff, dashed lines for temporary/hourly staff. Must include the PI and an Advisory Committee.
*PERT: Network diagram of tasks showing their dependencies and timing; the longest dependency path sets the earliest project completion date (see the sketch below).
*Gantt: Bar chart of tasks laid out against a calendar timeline.
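
To make the PERT idea concrete, here is a minimal sketch of the critical-path computation a PERT chart encodes; the task names, durations, and dependencies are invented for illustration, not taken from the card.

```python
from functools import lru_cache

# Hypothetical task list: (duration in days, prerequisite tasks).
tasks = {
    "hire staff":      (10, []),
    "train staff":     (5,  ["hire staff"]),
    "buy supplies":    (7,  []),
    "deliver program": (20, ["train staff", "buy supplies"]),
}

@lru_cache(maxsize=None)
def earliest_finish(name):
    """Longest cumulative duration from project start through this task."""
    duration, prereqs = tasks[name]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

# The earliest the whole project can finish is the longest (critical) path.
print(max(earliest_finish(t) for t in tasks))  # 35 days
```
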
Types of Budget
Budget line items include:
-Personnel
-Fringe Benefits
-Travel
-Equipment
-Supplies
-Consultants
-Subcontractors
-Incentives
-Facility Costs
-Communications
-Staff Development
-Client Costs
-Program Income
-Indirect Costs
Direct/Indirect Costs
Direct Costs: Costs attributable to the project or program itself, including program materials.

Indirect Costs: Administrative overhead, facilities, utilities.
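
A quick worked example of how the direct/indirect split plays out in a budget total; the dollar figures and the 25% indirect rate below are assumptions for illustration, not values from the card.

```python
# Hypothetical program budget; all figures and the indirect rate are invented.
direct_costs = {
    "personnel": 60_000,
    "fringe_benefits": 15_000,
    "travel": 5_000,
    "supplies": 8_000,
    "program_materials": 12_000,  # program materials count as direct costs
}
indirect_rate = 0.25  # covers administration, facilities, utilities

total_direct = sum(direct_costs.values())      # 100,000
total_indirect = total_direct * indirect_rate  # 25,000
print(f"direct={total_direct:,}  indirect={total_indirect:,.0f}  "
      f"total={total_direct + total_indirect:,.0f}")  # total = 125,000
```
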
Per Diem Rates
Daily rates ("per diem" is Latin for "per day"). Maximum rates apply for transportation.
Fringe Benefits
Extra benefits associated with salaried staff: insurance, laptops, mileage, etc.
Process Evaluation
Evaluates a program's condition, status, and quality while the program is in its implementation stage.
Standards of Acceptability (Process Evaluation)
*5 types
-Arbitrary (50% improvement)
-Scientific (state-of-the-art results)
-Historic (local past performance)
-Normative (state or national averages)
-Compromise (budget and capability compromise)
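
As a hypothetical illustration of applying standards of acceptability, the sketch below checks one observed result against a few of the standard types; every number in it is invented, not from the card.

```python
# Hypothetical process-evaluation result; all figures are invented.
baseline, observed = 40.0, 65.0                 # e.g., % of target clients reached
improvement = (observed - baseline) / baseline  # 62.5% relative improvement

standards = {
    "arbitrary (>=50% improvement)":     improvement >= 0.50,
    "historic (local past best: 55%)":   observed >= 55.0,
    "normative (national average: 70%)": observed >= 70.0,
}
for name, met in standards.items():
    print(f"{name}: {'met' if met else 'not met'}")
```
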
Types of Data (Process Evaluation)
Quantitative:
-Record of # of sessions conducted
-# of school-aged children reached
-Times and types of mass media played
-Recipient program satisfaction (rating scores)

Qualitative:
-Description of providers' professional qualifications
-Recipient program satisfaction (open-ended feedback)
-Observation of program delivery
Some examples of Process Evaluation Data
-Client records
-Program records
-Periodic monitoring reports
-Special-purpose assessments
Outcome Evaluation
Evaluation of the health outcomes, such as morbidity and mortality, that result from the program.
Validity vs Reliability
Validity: Accuracy of measuring device in reflecting the concept it's supposed to measure.

Reliability: Consistency of the measuring instrument
External Validity
-Random sampling design
-Matching
Internal Validity
Threats include:
-Participation or participant maturation
-Testing or observation
-Instrumentation
-Statistical regression and artifacts
-Selection
-Participant attrition
-Interactive effects
O = T + Ec + Er
About the validity and reliability of your instrument:

O: Observed value
T: True value
Ec: Constant error (systematic; a validity problem)
Er: Random error (a reliability problem)
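
A small simulation makes the two error terms concrete; the true value, bias, and noise level below are invented for illustration.

```python
import random

# O = T + Ec + Er: each observation is the true value plus a constant
# (systematic) error and a fresh random error. All numbers are invented.
T = 100.0   # true value
Ec = 3.0    # constant error: e.g., a miscalibrated instrument (validity)
random.seed(0)

observations = [T + Ec + random.gauss(0, 1.5) for _ in range(1000)]  # Er ~ N(0, 1.5)
mean_O = sum(observations) / len(observations)

# Random error (reliability) averages out over many observations,
# so the mean lands near T + Ec; the leftover gap from T is the bias.
print(f"mean observed ~ {mean_O:.1f}  true value = {T}  bias ~ {mean_O - T:.1f}")
```
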
R E: O1 X O2
R C: O1 O2
A visual depiction of the experimental design (a pretest/post-test control group design):

R: Random sampling
E: Experimental group
C: Control group
O1: Pretest
X: Intervention
O2: Post-test
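
Below is a minimal simulation of this design under invented numbers (group size, pretest mean, effect size, and noise are all assumptions): participants are measured, the experimental group receives the intervention, and the effect is estimated by comparing each group's pretest-to-post-test change.

```python
import random

random.seed(1)
TRUE_EFFECT = 5.0  # invented intervention effect

def mean_change(gets_intervention):
    o1 = [50 + random.gauss(0, 4) for _ in range(200)]   # O1: pretest
    o2 = [x + (TRUE_EFFECT if gets_intervention else 0)  # X applied to E only
          + random.gauss(0, 4) for x in o1]              # O2: post-test
    return sum(o2) / len(o2) - sum(o1) / len(o1)

# E receives the intervention; C does not. Their difference in change
# estimates the intervention's effect, netting out shared trends.
effect = mean_change(True) - mean_change(False)
print(f"estimated effect ~ {effect:.1f}")  # close to 5.0
```
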
Administrative Diagnosis
Analysis of the resources and circumstances in your community or organization that could facilitate or hinder the health program required to affect the priority predisposing, enabling and reinforcing factors identified earlier.
Best practices
Interventions recommended on the basis of systematic reviews of evidence from controlled studies that substantiate their efficacy in the situations in which they were done, but not necessarily their effectiveness in other populations and circumstances.
Best processes
Methods such as those of PRECEDE and the matching, mapping, pooling, and patching of interventions to align and adapt them to the needs of a particular population and setting.
Aligning evidence with needs (Administrative Diagnosis)
-Matching ecological levels of a system or community with evidence of efficacy for interventions at those levels
-Mapping theory to the causal chain to fill gaps in the evidence for effectiveness of interventions
-Pooling experience to blend interventions to fill gaps in evidence for the effectiveness of programs in similar situations
-Patching pooled interventions with indigenous wisdom and professional judgment about plausible interventions to fill gaps in the program for the specific population
Assessing Cultural Competence
1. Experience, track record of involvement with target population
2. Training and staffing
3. Language
4. Materials
5. Evaluation
6. Community Representation
7. Implementation
Cultural Competence Evaluation Steps
1. Engage stakeholders
2. Describe the program
3. Focus the evaluation design
4. Gather credible evidence
5. Justify conclusions
6. Ensure use & share lessons learned