89 Cards in this Set

  • Front
  • Back
  • 3rd side (hint)
Bozeman’s commentary on the mandatory question from the Program Evaluation Midterm 2009.

Most critical piece of evaluation process…
• Proposal
o Contract between evaluator and client.
Bozeman’s commentary on the mandatory question from the Program Evaluation Midterm 2009.

• Advice for answering this question:
• Start with the definition of formative evaluations: the purpose is to improve the program or product; the evaluation is conducted while the program is still forming.
• Like a cook tasting the soup
• Evaluation is determining worth
Bozeman’s commentary on the mandatory question from the Program Evaluation Midterm 2009.

• Preempt the reader as you write this “mock proposal”
• The reader may not know as much as you do
• DON’T ASSUME that client knows as much as they think they do
Bozeman’s commentary on the mandatory question from the Program Evaluation Midterm 2009.

• 10 Components he was looking for in this answer
– these are not in serial / pipeline order but rather parallel concepts, more like a blueprint than a sequence
1. Quick / dirty literature review “Gotta know the territory”

2. Obtain a description of the program as set forth by whoever set it up – does a written plan exist? Get it.

3. Throughout the evaluation you need to be concerned with the values, issues, and interests of various stakeholders – cast a wide net: parents, teachers, administrators, students, community members (you don't have to interview community members, but consider them; for example, how does a 4-period day affect businesses that hire teenagers after school?)

4. Sampling – ways to get a sense of concerns, values, interests

5. Develop preliminary specifications of evaluation plan – what are you going to try to wrap your hands around

6. Timeline – usually negotiable; it has to be on the table or it doesn't get talked about. There should be clear reporting expectations

7. Data – climate, culture, attitudinal, achievement, absentee, tardy, drop-out: determine
• Who will collect data? / Who will pay for it?
• What resources will be available – for example, a memo up front from the superintendent telling schools to cooperate with data collection by the evaluator (“or else”) is very powerful. “Fore-arm” yourself so that data collection is not difficult.
• What will be on the survey – what is the purpose of the survey.
• There are three types of data – real data, attitudinal data and perceptual data.

8. What is the expected product of the formative evaluation? Have a dialogue in your mind about what the expectations will be. Don't presume what is in the mind of the person requesting the evaluation – get them to spell it out for you.

9. A part of the proposal should be that a written plan for the evaluation will be submitted – the question does not ask you to write that plan here, but include the need for it

10. Some type of sign-off: now a quasi-contract; it probably won't hold up in court, but it does review expectations – sets the stage
Bozeman’s commentary on the mandatory question from the Program Evaluation Midterm 2009.

• 10 Components he was looking for in this answer – these are not in serial / pipeline order but rather parallel concepts, more like a blueprint than a sequence

Which Style is Correct?
There was no one correct choice for the type of evaluation method used. Improvement-focused was fine, as was operations monitoring, or even a combination.
Planning – Research and Evaluation…

Purpose of an Evaluation
• To provide feedback from program activities and outcomes to those who can make decisions.
Planning – Research and Evaluation…

Why Plan an Evaluation?
• To determine the major goals and strategies to create a desired future.
• Starts with a vision or mission statement
Planning – Research and Evaluation…

Ethics in Evaluation reporting.
• Utility: Effective and Useful

• Feasibility: realistic expectations, delivering what is promised

• Propriety: fair, honest and objective

• Accuracy… In Reporting and analysis
Planning – Research and Evaluation…

Types of Evaluations
• Need : ID and measure unmet needs
• Process: how much has been implemented, who is being served, is it operating as expected, are they documenting implementation
• Outcome: well implemented, getting to the right people
• Efficiency: cost effective, enough resources
• N.O.P.E
Planning – Research and Evaluation…

Types of Evaluation Assessments.
Formative and Summative Assessment
Planning – Research and Evaluation…

Formative Assessments…
• Help Develop Program
• Flexible: Input, feedback, comments
• Research oriented
• Like having someone over your shoulder during work.
• Best Evaluator: Internal
• Conduct during development phase
• “Cook tasting the soup”
Planning – Research and Evaluation…

Summative Assessments…
• Summary: to start, to continue, to stop
• May result in major change
• Action oriented
• Best Evaluator: External
• Quantitative: Using numeric scores, letter grades to assess achievement
• “Guests tasting the soup”
Planning – Research and Evaluation…

Types of Evaluations
• Internal and External Evaluations
Planning – Research and Evaluation…

Pro and Con of Internal Evaluations…
• Pro
o Cheaper, know the culture/political lines, minimal training, self-evaluation/reflect, on-going monitoring
• Cons
o Biased
o Won’t touch the “sacred cows”
Planning – Research and Evaluation…

Pro and Con of External Evaluations…
• Pros
o Non-biased, objective on “Sacred cows”, thinks outside box, outside opinions as comparison, focused, more time spent on project
• Cons
o Expensive, not in culture, training, confidentiality
Planning – Research and Evaluation…

Favorite Evaluation Model in Education
• Improvement Focused
o Familiarize yourself with program
o ID/meet with stakeholders
 Biases and needs, provide knowledge and perspectives
o What info is needed, what to ask, what needs to be known, and who wants to know
o What are the true motives
o Stick with the questions: how can the program improve; avoid “would be nice” questions
o Remain objective
o What are realistic time limits, resources and budget
o Prepare a plan; agree on data reporting method and when. Watch for dysfunctional attitudes to evaluations.
Planning – Research and Evaluation…

Breakthrough Thinking (Who?)
• Nadler & Hibino (1996)
Planning – Research and Evaluation…

Breakthrough Thinking (Key Features)
• Synthesis of Information
• Understanding need for continual change
• Best solution for everyone involved
• S.U.B (go under the emotions)

• Nadler & Hibino (1996)
Planning – Research and Evaluation…

Breakthrough Thinking (6 Characteristics)
• Uniqueness: each situation is different, and may require a unique solution
• Purposes: explore and expand to work on the right problem
• Solution-After-Next: make changes today based on possible solutions for tomorrow
• Limited Information Collection: avoid unnecessary data collection; study solutions, not problems
• People Design: everyone who is affected is involved
• Betterment Techniques: continuing change and improvement

o Nadler & Hibino (1996)
Planning – Research and Evaluation…

Breakthrough Thinking
(5 Stages of Problem Solving)
1. Purpose Determination
o Purpose: What is it that needs to be accomplished; what is the purpose?

2. Ideal “Purposeful” Solution Development
o Solution: Generate ideas to achieve the purpose

3. Target Plan Development
o Plan: targeted ideas to achieve purpose

4. Detail the Recommended Plan
o Recommend: Develop details of the plan; realize obstacles, consequences and outcomes.

5. Implementation and Evaluation
o Implement and Evaluate: Test the plan, examine performance, evaluate outcomes

• Nadler & Hibino (1996)
Planning – Research and Evaluation…

Breakthrough Thinking
Prior to implementing nominal group or system matrix techniques…
• Purpose needs to be clearly defined
o From Purpose… all things are developed
o What is the question, what needs to be planned or evaluated?
o What (exactly) is being discussed?

o Nadler & Hibino (1996)
Planning – Research and Evaluation…

Breakthrough Thinking
Nominal Group Techniques…
• Pose the question as accurately as possible
• Generate ideas to answer the question silently
• Record all ideas (non judgmentally)
• In turn, discuss ideas for clarification
• Vote (preliminary) on best/most relevant ideas
• Discuss Vote
• Vote (Final) for best, most relevant and applicable idea.
Planning – Research and Evaluation…
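The two voting rounds above boil down to a simple rank tally. Below is a minimal, hypothetical sketch (the ideas, ballots, and 3-2-1 point scheme are made up for illustration) of how a facilitator might tally the preliminary vote before the group discusses it:

# Illustrative sketch of the voting steps in a nominal group session.
# Each participant silently ranks their top ideas; ranks are tallied
# so the group can discuss the preliminary result before the final vote.
# (Idea texts and rankings here are hypothetical.)
from collections import Counter

ideas = [
    "Add an after-school tutoring block",
    "Survey parents on the 4-period day",
    "Pilot the schedule in one grade first",
]

# Each ballot lists idea indices in order of preference (best first).
preliminary_ballots = [[2, 0, 1], [2, 1, 0], [0, 2, 1]]

def tally(ballots, points_for_first=3):
    """Give 3 points to a 1st choice, 2 to a 2nd, 1 to a 3rd."""
    scores = Counter()
    for ballot in ballots:
        for rank, idea_index in enumerate(ballot):
            scores[idea_index] += points_for_first - rank
    return scores

for idea_index, score in tally(preliminary_ballots).most_common():
    print(f"{score:2d} points  {ideas[idea_index]}")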

Breakthrough Thinking
System Matrix: Elements …
• Purpose: The Mission, needs, primary concern
• Inputs: What is put forward to be changed: People, things, information
• Outputs: The result of the change: desired and undesired outcomes
• Sequence: The process that is followed from input to output
• Environment: Physical, sociological, psychological
• People (Human Agent): Who aids in the change process, what are their responsibilities
• Physical Components: What equipment or facilities needed
• Information Aids: What aids (textbooks, guidelines…) are needed
o Nadler & Hibino (1996)
• PIOS EPPI
Planning – Research and Evaluation…

Breakthrough Thinking
System Matrix: Dimensions …
• Fundamental: Definitional; What, how, where, who
• Values/Goals: as the elements relate to community, professional organization, ethics, etc
• Rates/Measures: how many, how much, levels of success
• Controls/Evaluations: How is it measured / inspected
• Interface: how does it inter-relate: general summary
• Future: how should it look later (a year from now); what are planning changes
o Nadler & Hibino (1996)
- FVRCIF
Planning – Research and Evaluation…
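One way to picture how the eight elements and six dimensions fit together is as an 8 x 6 planning grid with one cell per element/dimension pair. The sketch below only illustrates that structure; the cell entries are hypothetical, and the field names simply follow the labels listed on the two cards above:

# Illustrative sketch: Nadler & Hibino's system matrix as an 8x6 grid.
# Elements (rows) and dimensions (columns) follow the lists on the cards above;
# the sample cell entries are hypothetical.

ELEMENTS = ["purpose", "inputs", "outputs", "sequence",
            "environment", "people", "physical", "information_aids"]
DIMENSIONS = ["fundamental", "values_goals", "rates_measures",
              "controls_evaluations", "interface", "future"]

# Start with an empty cell for every element/dimension pair.
matrix = {e: {d: "" for d in DIMENSIONS} for e in ELEMENTS}

# Fill in a few cells for a hypothetical four-period-day evaluation.
matrix["purpose"]["fundamental"] = "Assess whether the 4-period day improves learning"
matrix["people"]["fundamental"] = "Evaluator, teachers, students, parents"
matrix["outputs"]["rates_measures"] = "Achievement, absentee, and drop-out rates"

# Print the filled-in cells as a simple planning worksheet.
for element, cells in matrix.items():
    for dimension, note in cells.items():
        if note:
            print(f"{element} x {dimension}: {note}")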

Spring 2001 Question
You have been directed by your super to plan an evaluation for your institution's information and technology resources and systems. Discuss the advantages and disadvantages of having program evaluation as an integral component of any educational delivery system. In your discussion, address the following four components:
a. Discuss how you would proceed with your planning activity.
b. Who (stakeholders) might be involved in the planning and why?
c. What dysfunctional attitudes of the stakeholders should be addressed and why?
d. What models of evaluation might be employed?
Planning – Research and Evaluation…

Summer 2001 Question
You have been directed by your superintendent to produce a faculty development plan for your school (elementary, middle, or high school). This plan should be reasonably comprehensive and embrace both present and contemplated needs. Use the concepts of systems, systems theory, and systems design to develop this plan. Though completeness is desirable, it is understood that you may not be able to supply certain specific details. Your plan, however, should offer as much specificity as possible, and provide for information gathering where you are unable to supply detailed specifications. Limit your plan to 2–3 pages. Format the document as if you were submitting it to the superintendent. In addition, discuss in about 50–100 words how you feel the use of systems may enhance the product of your planning efforts.
Planning – Research and Evaluation…

Summer 2001 Question
Define and discuss formative and summative evaluations. When would it be appropriate to use the respective models of evaluation? What models of evaluation might be appropriate in these modes of evaluation?
Planning – Research and Evaluation…

Summer 2001 Question
In recent years we have witnessed a proliferation of on-line degree programs. What process would you use to determine whether these courses are meeting critical need or eroding the quality of higher education?
Planning – Research and Evaluation…

Fall 2001 Question
Political and psychological factors can undermine any evaluation effort. Careful planning by the evaluator, however, can enhance the likelihood of success of the evaluation. Identify and discuss three factors that might adversely affect a program evaluation. Include in your discussion an example of each factor and how one could address the factor during the planning process.
Planning – Research and Evaluation…

Spring 2002 Question
Successful leaders consider evaluation, accountability, and continuing change and improvement as positive forces, not ‘necessary evils.’ Careful planning can enhance the likelihood of success of the evaluation and accountability efforts. Identify and discuss three approaches to systemic planning and evaluation that will enhance your effectiveness as an educational leader. Be specific in your discussions and support your approaches with references to appropriate and relevant literature.
Planning – Research and Evaluation…

Summer 2002 Question
West Churchman, the renowned systems theorist, mathematician, and statistician, once remarked, “Politics are the enemy of the systems approach.” Likewise, political forces and factors can undermine any planning, research, and evaluation effort. Careful attention to politics and planning by the evaluator, however, can enhance the likelihood of success of the evaluation. Identify and discuss three factors that might adversely affect a program evaluation. Include in your discussion an example of each factor and how one could address the factor during the planning process.
Planning – Research and Evaluation…

Fall 2002 Question
a) Discuss the advantages and disadvantages of internal and external program evaluators.
b) Distinguish between formative and summative program evaluations. How do these differences affect negotiations with program staff and other stakeholders in the planning of the evaluation?
Planning – Research and Evaluation…

Spring 2003 Question
Evaluation is defined as the assessment of an object's merit and worth. Given the political realities of most evaluation efforts, it is clear that effective school leaders must carefully approach the evaluation process in order to realize legitimate outcomes.
Discuss the advantages and disadvantages of having program evaluation as an integral component of any educational delivery system. How can this be an asset in a leader’s efforts to fulfill the above definition of evaluation (and address political realities)? Include examples where appropriate.
Planning – Research and Evaluation…

Spring 2003 Question
Evaluation is defined as the assessment of an object's merit and worth. Given the political realities of most evaluation efforts, it is clear that effective school leaders must carefully approach the evaluation process in order to realize legitimate outcomes.
You have been directed by your super to plan an evaluation of your institution’s information technology (IT) resources and systems. Discuss how you would proceed with this planning activity. Who (stakeholders) might be involved in the planning and why? What dysfunctional attitudes of the stakeholders should be addressed and why? What models of evaluation might be most appropriate? Include examples.
Planning – Research and Evaluation…
Summer 2003 Question
Evaluation is essentially the generation of information that can assist others in making judgments and recommendations about a program, service, policy, or organizational unit. Given the realities of most evaluation efforts, it is clear that effective school leaders must carefully approach the evaluation process in order to realize legitimate outcomes. Leaders must also, however, consider certain ethical issues. The American Evaluation Association, for example, has adopted five ethical principles related to systematic inquiry, competence, honesty and integrity, respect for people, and responsibility for general welfare.
As an educational leader, it may be your responsibility to engage in research and evaluation about a program or service in your school, college, or general educational community. Select an example of a program, environment, etc. Discuss how ethics in program evaluation may shape the approach and procedures in your evaluation.
Planning – Research and Evaluation…

Fall 2003 Question
a) Discuss the advantages, disadvantages, problems and assets of internal vs external program evaluation.
b) Distinguish between formative and summative program evaluation. How do these differences affect negotiations with program staff and other stakeholders in the planning of an evaluation?
c) Often researchers elect to conduct research in their own organizations. Identify two possible problems with such an investigation. What are some possible assets?
Planning – Research and Evaluation…
Spring 2005 Question
The central purpose of evaluation is to generate information that can assist decision makers in rendering judgments and recommendations about a program, service, policy, or organizational unit. Stated another way, evaluation should assist decision makers in reaching valid and systematic judgments about a program, service, structure, progress, and outcomes (Program merit or worth). Given the political realities of most evaluation efforts, it is clear that effective school leaders must carefully approach the evaluation process in order to realize legitimate outcomes.
Your institution is facing what has been termed a “leadership vacuum” or an insufficient number of acceptable/qualified individuals for leadership vacancies. Therefore, you have been directed by your super to plan an evaluation of your institution’s human resource / management development (HRMD) assets and resources.
a) Briefly describe the setting (short paragraph)
b) Discuss how you would proceed with planning and implementing the evaluation. Who (stakeholder) might be involved in the evaluation and why? What dysfunctional attitudes of the stakeholders should be addressed and why?
c) What model of evaluation might be appropriate and why (cite specific evaluation models from the literature in your rationale) Include examples where appropriate. How might the evaluation be used?
Planning – Research and Evaluation…
Test Question Central Themes…
Define and discuss formative and summative assessment
List advantages and disadvantages of internal and external program evaluation.
Discuss the assets of having a program evaluation as an integral component of an educational system.
Multiple: Discuss how you would proceed with this planning activity. Who might be involved in the planning and why? What dysfunctional attitudes of the stakeholders should be addressed and why? What models of evaluation might be most appropriate? Include examples.

Ethics

Stakeholders
Planning – Research and Evaluation…

Book Name, Author and Date.
Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Traditional Model
• No formal model
• Used by human services (hospitals)
• Evaluations tend to serve the interests of the organization providing the service and do little to challenge the program directors and staff
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Social Science Research Model
• Experimental group and control group
• Must use large enough sample size for accurate evaluation.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Industrial Inspection Model
• Sometimes used in manufacturing
o When Problems are found, the item is fixed before it is delivered to the customers
o Inefficient and leads to high costs
 Do it right the first time.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Black Box Evaluation
• Examines the output without examining its internal operations.
• Does not serve well for evaluating social problems.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Objectives-Based Evaluation
• Most Prevalent model

• Evaluators work with clearly stated goals and objectives and then measure the degree to which such goals and objectives are achieved.

• Some evaluators are so focused on goals and objectives that they neglect to examine why programs fail or succeed.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Goal-Free Evaluation
• Similar to an anthropologist – studies the program, staff, clients, and settings without knowing the stated goals of the program.
• Its expense and open-ended nature can make it threatening to staff members.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Fiscal Evaluation
• Self explanatory
• Many decisions about selecting the services to offer cannot be made by the bottom line alone.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…
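A worked example of why the bottom line alone can mislead: comparing cost per unit of outcome (a basic efficiency calculation) rather than total cost. The two programs and their figures below are hypothetical:

# Illustrative arithmetic for a fiscal/efficiency question: cost per unit of outcome.
# The two hypothetical tutoring programs below show why the "bottom line"
# alone can mislead: the cheaper program is less cost-effective per student helped.
programs = {
    "Program A": {"total_cost": 40_000, "students_reaching_goal": 80},
    "Program B": {"total_cost": 25_000, "students_reaching_goal": 40},
}

for name, p in programs.items():
    cost_per_success = p["total_cost"] / p["students_reaching_goal"]
    print(f"{name}: ${cost_per_success:,.0f} per student reaching the goal")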

Book: 13 Evaluation Models
Accountability Model
• Developed from fiscal evaluation
• Publicly funded programs must devote resources to the activities that were mandated when the programs were approved.
• Because this model focuses so much on compliance with regulations, it cannot be applied universally
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Expert Opinion Model
• Attempt to remove biases
• Have expert examine the program
• Often used when entity being evaluated is large, complex, and unique
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Naturalistic Model
• Evaluator becomes the data gathering instrument, not surveys or records.
• Personal observations to obtain rich understanding of the program
• Often quite lengthy
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Empowerment Evaluation
• Some argue that this approach can be easily compromised
• Invite clients to participate actively
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
Theory-Driven Evaluation
• Analysis consists of calculating the correlations among the services and the characteristics of participants, among the services and immediate changes, and among immediate changes and outcome variables.
• Critics argue this model does not provide the same level of assurance that evaluations involving random assignment to program and control groups can give
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…
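The correlation step described above can be illustrated with a small, self-contained sketch. The pearson helper and the three variables (tutoring_hours as the service, homework_done as the immediate change, test_score_gain as the outcome) are hypothetical stand-ins, not data from the text:

# Illustrative sketch of the correlation step in a theory-driven evaluation:
# relate amount of service received to an immediate change and to an outcome.
# Data values are hypothetical.
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

tutoring_hours  = [2, 5, 8, 1, 6, 4, 7, 3]   # service received
homework_done   = [3, 6, 9, 2, 7, 5, 8, 4]   # immediate change
test_score_gain = [1, 4, 7, 0, 6, 3, 6, 2]   # outcome variable

print("service vs immediate change:", round(pearson(tutoring_hours, homework_done), 2))
print("immediate change vs outcome:", round(pearson(homework_done, test_score_gain), 2))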

Book: 13 Evaluation Models
Improvement Focused Model
• Author Recommended
• Program improvement is the focus rather than particular methodology.
• Best meets the criteria necessary for effective evaluation
• Serving the needs of stakeholders, providing valid information, and offering an alternative point of view to those doing the really hard work of serving the program participants
• To carry this off without threatening staff is the most challenging aspect of program evaluation
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
When to pick a model?
• Evaluators and stakeholders must pick evaluation model only after analyzing the setting and the questions that are to be answered by the evaluation.
• Most evaluations use features from several of the models.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: 13 Evaluation Models
What does Bozeman Like?
Objectives-Based Evaluation
Improvement Focused Model
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Book: Types of measures
• Surveys
o Carefully determine purpose of survey
o Fixed responses over open ended
o Questions clean and direct
• Interviews
• Checklists – Observation Records
• Tests
• Records
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Sample Test Question:
The purpose of program evaluation stated in the text and in class discussions is to…
• Provide feedback from program activities and outcomes to those who can make decisions
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Sample Test Question:
Which type of program evaluation is often overemphasized in discussion about program evaluation?
• Evaluation of Outcomes
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Sample Test Question:
Define Planning
• Determine major goals and strategies to create a desired future.
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Sample Test Question:
Advantage that an external evaluator has over internal evaluator
• Greater Objectivity
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Sample Test Question:
Define Vision or Mission Statement
• Strategic planning document
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Sample Test Question:
Define Vision Statement
• Short statement that describes the future you want to create
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Sample Test Question:
Benefits of identifying stakeholders
• Knowledge of various people, understanding the information needs of various groups, understanding the perspectives of various groups
• Program Evaluation: Methods and Case Studies, Posavac & Carey (2007)
Planning – Research and Evaluation…

Method to avoid evaluation problems
Use Logic
Establish a Rationale
Define a Process
Avoid Pitfalls
Stufflebeam

LERDPAP
Planning – Research and Evaluation…

Need Contract with advanced agreements...
Criteria to be used
Audience
Purpose/job to be done
Data sources, collection, analysis
Assistance from
Timelines
Agree on roles and responsibilities, money
CAP DATA:RR.$
Planning – Research and Evaluation…

Problems and issues implementing effective evaluations
Agreements and Expectations
Program
People
Threatening atmosphere
Surprises
Data
Results
Incompetent Evaluator
Validity/Reliability
Ethical Violations
Purpose/Focus
Agreements
Cooperation
Time
Stakeholder Needs
APPS DRIVE PACTS
Planning – Research and Evaluation…

Stufflebeam program evaluation model
CIPP

Context: What needs are being met? (What needs to be done?)
Input: How is the need being met? (How should it be done?)
Process: Is the program being implemented as planned? (Is it being done?)
Product: Is the program successful? Evaluation of worth/merit (Did it meet the needs?)
SITE

Sustainable:
Impact: needs met?
Transportable: replicable?
Effective: cost/se
Planning – Research and Evaluation…

Major ethical Issues
Protect People (minimize potential harm to participants)
- confidentiality, informed consent, compensate for ineffective or novel treatment, respect, dignity, worth

Role Conflicts: Danger of role conflicts

Validity: Avoiding threats to the validity of the evaluation

Stakeholder needs: identify and meet needs of various stakeholder groups

Side Effects: staying alert to negative side effects that may be related to the program or the evaluation itself (due diligence)
PERVES
Planning – Research and Evaluation…

Purpose of Program Evaluation
Unmet Needs
Implementation as planned
Compare Program: Best results
Quality
Unplanned Side Effects
UNICorn QUES
Planning – Research and Evaluation…

Improvement Focused Evaluations assess gaps between...
Needs and Objectives
Plans and Implementations
Expectations and Services
Projected and Achieved outcomes
IF NO PIES PA
Planning – Research and Evaluation…

An Evaluator HELPS
Organize
Focus on Needs
Plan effectively
Monitor carefully
Assess Quality accurately and justly
Nurture improved practices
Detect unwanted side effects
Planning – Research and Evaluation…

Criteria for effective evaluation
Provide Valid Info
Provide Alternative Viewpoint
Serve Needs of Stakeholder
VANS
Planning – Research and Evaluation…

Nadler's System Matrix and Breakthrough Thinking
Purpose, inputs, outputs, sequence, environment, people, physical, information aids

Fundamental, values, rates/measures, control, interface, future
POIS EPPI against Fundamental Values R Controlling my Future
Planning – Research and Evaluation…

Principles of BT
Uniqueness, purpose, ideal solutions, systems, limited info, people, betterment timeline
UPS sends little pink boxes
Toprol XL
Metoprolol Succinate
anti-hypertensive
Program Evaluation...

Types of evaluator Error
Type 1 errors: reporting the program was effective when it was not

Type 2 errors: reporting the program was not effective when it was.
Program Evaluation...
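A quick simulation makes the Type 2 error concrete and shows why a later card recommends large samples: with small groups, a real program effect is often missed. This is a rough illustrative sketch; the effect size, cutoff, and trial counts are arbitrary assumptions:

# Illustrative sketch: why small samples raise the risk of a Type 2 error
# (reporting "no effect" when the program actually works).
# Standard-library-only simulation; the effect size and cutoff are hypothetical.
import random
import statistics

def type2_error_rate(n_per_group, true_effect=0.5, trials=2000):
    """Estimate how often a two-group comparison misses a real effect."""
    misses = 0
    for _ in range(trials):
        control = [random.gauss(0.0, 1.0) for _ in range(n_per_group)]
        program = [random.gauss(true_effect, 1.0) for _ in range(n_per_group)]
        diff = statistics.mean(program) - statistics.mean(control)
        se = (statistics.pvariance(control) / n_per_group
              + statistics.pvariance(program) / n_per_group) ** 0.5
        if diff / se < 1.96:   # below the cutoff: evaluator reports "no effect"
            misses += 1
    return misses / trials

for n in (10, 30, 100):
    print(f"n = {n:3d} per group: Type 2 error rate is about {type2_error_rate(n):.2f}")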

Dysfunctional Attitudes toward program evaluation
1) Assume program is perfect
2) Fear that evaluation will offend staff
3) Fear that evaluation will inhibit innovation
4) Fear that program will be terminated
5) Fear that the information will be misused
6) Fear that qualitative understanding may be supplanted
7) Fear that evaluation drains program resources
8) Fear of losing control of the program
9) Fear that evaluation has little impact.
Program Evaluation

Evaluation Problems
- Unrealistic expectations
- Pressure from client
- resistance from clients
- limiting innovations
- misuse of findings
- resources requirements
- loss of control
- evaluation impact
Program Evaluation

General Evaluation Questions
1) program - values match
2) Needs served
3) Program plan vs. operation
4) Outcomes vs. goals
5) Program theory/support
6) Acceptance
7) Resource utilization
Program Evaluation

Evaluation Criteria Limitations
1) Budget
2) Time
3) Acceptance of criteria (note in writing)
4) Criteria are the lens through which users of the evaluation will view the assessment of the program. It is the evaluator's responsibility to strive for clear and undistorted views.
Program Evaluation

Intended beneficiaries
Program participants
Artifacts
Community Indexes
Providers of Services
Program Staff
Program Records
Expert Observers
Trained Observers
Evaluation Staff
Program Evaluation

Which measures to use
Multiple measures from more than one source
Multiple variables
Nonreactive measures
valid measures
reliable measures
measures sensitive to change
cost effective measures
Program Evaluation

Ethical Principles of evaluation
Systematic Inquiry
Competency
Integrity and Honesty
Respect for People
Responsibility for general public welfare
Program Evaluation

Ethical issues involved in the treatment of people
1) Compensating for ineffective, novel treatments
2) obtaining informed consent
3) maintaining confidentiality
Program Evaluation

Role conflicts facing evaluators
Providing honest information to stakeholders is a must.
Outside evaluations bring more honest evaluations compared to internal evaluations, and most focus should be placed on the expected outcomes in the evaluation.
Program Evaluation

Recognizing the needs of different stakeholders
Program Managers: efficiency of the organization
Staff Members: seek assistance in direction
Clients: want effective and appropriate services
Community Members: want cost effective program
Program Evaluation

Avoiding negative side effects of evaluation process
1) Can someone be hurt by inaccurate findings?
2) Type 2 errors: use large samples and outcomes with high reliability
3) Pay attention to unplanned effects
4) Analyze the impact of values held by the evaluator.
Program Evaluation

5 common ethical problems
1) changing evaluation focus after examining data
2) promising confidentiality when it could not be guaranteed
3) Making decisions about the evaluation without consulting the clients
4) carrying out an evaluation without sufficient training
5) Allowing partisan groups to delete references to embarrassing program weaknesses from report.
Program Evaluation

Eight System Elements
1) Purpose: mission, aim, need
2) Inputs: people, things, information
3) Outputs: people, things, info
4) Operating Steps: process and conversion tasks
5) environment: physical and organizational
6) Human enablers: people, responsibilities, skills to help in the sequence
7) Physical enablers: equipment, facilities, materials to use in the sequence
8) Information enablers: knowledge, instructions (not output)
Program Evaluation

Six System Dimensions
1) Fundamentals: observable properties, physical characteristics
2) Values: organizational desires, beliefs, consensus
3) Measures: effectiveness measures, rates, capacities, performance
4) Control: methods used to measure what occurs, specified performance levels, confidence limits, degree of precision
5) Interface: relationship of system elements with other elements, operations in the organization
6) Future: planned changes, research needs for all dimensions