Module 7 G&M


TEAM 4

MODULE 7: CHAPTER 17 G&M

What is evaluation to you? Our text lists five program issues for evaluation:
● Quality
● Suitability
● Effectiveness
● Efficiency
● Importance

How do these components relate to your place of work? Can you provide examples?

EVALUATION PRE-DISCUSSION

● Evaluation: estimating value

● Two steps:
1. Compare results and objectives
2. Appraise or judge the value of the differences assessed

What is the difference between “measurement” and “evaluation”?

EVALUATION

A systematic process, reliant upon multiple skills:

● Collecting information
● Interpreting data
● Drawing conclusions
● Communicating outcomes

The authors list two major purposes of evaluations. What are they?

EVALUATION

ADDIE MODEL

1. Identify All Clients and Stakeholders and Clarify Their Needs
2. Identify the Performance Improvement Initiative to Be Evaluated
3. Identify and Clarify the Purposes for the Evaluation
4. Determine the Critical Research Questions That the Evaluation Must Address
5. Develop an Evaluation Design
6. Analyze Resources and Constraints
7. Determine the Best Data Collection Methods
8. Plan Reporting and Communications Actions

Can one person explain a step of the process from their own experience at work?

THE PROCESS

KIRKPATRICK’S FOUR LEVELS OF EVALUATION MODEL

Most of us said that our organizations use the Kirkpatrick model of evaluation, but only in a limited capacity (reaction and learning).

What do you think would have been an effective method of incorporating Levels 3-4 (behavior and impact)?

KIRKPATRICK

● Helps training professionals to understand evaluation in a systematic way

● A straightforward system for discussing training outcomes

● Recognizes that single outcome measures cannot adequately reflect the complexity of organizational training programs (Bates, 2004)

Are there any other advantages of Kirkpatrick’s model that you would like to add?

KIRKPATRICK ADVANTAGES

● Model is incomplete: oversimplified view of training effectiveness. Other factors influence training outcomes:
  o Learning culture of the organization
  o Organizational or work unit goals and values
  o Interpersonal support
  o Climate for learning transfer
  o Adequacy of material resources

(Bates, 2004)

KIRKPATRICK LIMITATIONS

● Assumption of causal linkages: the model assumes that the criteria represent a causal relationship between the levels of evaluation
  o Research has failed to confirm causal linkages
  o “If training is going to be effective, it is important that trainees react favorably” and “without learning, no change in behavior will occur” (Kirkpatrick, 1994)

(Bates, 2004)

KIRKPATRICK LIMITATIONS

● Incremental importance of information: the model assumes that each level of evaluation provides data that is more informative than the last
  o Perception that establishing Level 4 results provides the most useful information
  o “In practice, however, the weak conceptual linkages inherent in the model and resulting data it generates do not provide an adequate basis for this assumption” (Bates, 2004)

KIRKPATRICK LIMITATIONS

● “Evaluation is the systematic process of delineating, obtaining, reporting, and applying descriptive and judgmental information about some object’s merit, worth, probity [moral correctness], feasibility, safety, significance, or equity” (Stufflebeam & Shinkfield, 2007)

EVALUATION

Context-Input-Process-Product

STUFFLEBEAM’S CIPP MODEL (1983)

What needs to be done? Context
How should it be done? Input
Is it being done? Process
Did it succeed? Product

STUFFLEBEAM’S CIPP MODEL

Uses for the CIPP model:
● Conduct a needs analysis
● Evaluate alternatives for addressing needs
● Monitor the design/implementation of interventions
● Examine the outcomes of an intervention regarding its impact on the organization

STUFFLEBEAM’S CIPP MODEL

● Addresses decision-makers’ concerns about justifying the investment in interventions/initiatives

● Provides a framework for comparing alternatives for future investments

COST BENEFIT MODEL (KEARSLEY, 1986)

● Provides the expected benefit or return on investment

● Expressed as a percentage or in actual dollars
● Identify the benefits of the intervention ($), then divide by the costs (to get a percentage) or subtract the costs (to get a dollar figure); see the worked example below

RETURN ON INVESTMENT
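A minimal worked example of the arithmetic described above, using hypothetical figures (the $120,000 in benefits and $80,000 in costs are illustrative assumptions, not values from the text):

\[
\text{Net benefit} = \text{benefits} - \text{costs} = \$120{,}000 - \$80{,}000 = \$40{,}000
\]

\[
\text{ROI} = \frac{\text{benefits} - \text{costs}}{\text{costs}} \times 100\% = \frac{\$40{,}000}{\$80{,}000} \times 100\% = 50\%
\]

Reading “divide by the cost” literally as benefits divided by costs instead gives a benefit-cost ratio of $120,000 / $80,000 = 1.5, which is one common way the Cost Benefit Model is expressed.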

● Helps to decide how best to allocate resources
● Disadvantage: most interventions or initiatives provide benefits which are hard to quantify

RETURN ON INVESTMENT

1. What do you think is the difference between the Cost Benefit Model and the ROI model?

2. Provide examples of when these models should be used.

QUESTIONS:

Seven-step model:
1. Determine purpose, objectives, participants (who wants this information)
2. Assess information needs
3. Consider proper protocol
4. Describe the population to be studied, select subjects
5. Identify other variables
6. Formulate a study design
7. Formulate a management plan

FORMATIVE EVALUATIONS

● Most useful for evaluating instruction
● May be used for performance improvement and change interventions

FORMATIVE EVALUATIONS

“...evaluative inquiry can not only be a means of accumulating information for decision making and action... but that it also be equally concerned with questioning and debating the value of what we do in organizations” (Preskill & Torres, 1999)

EVALUATIVE INQUIRY

Evaluative inquiry is a way of fostering individual learning and team learning within an organization, about issues that are critical to its purpose and what it values (Parsons, 2009)

EVALUATIVE INQUIRY

● Collaboration
● Organizational learning and change
● Links learning and performance
● Diverse perspectives

EVALUATIVE INQUIRY

● A study is needed to evaluate and redesign an online master’s degree program consisting of 12 courses in informatics

● Educators are concerned about the quality of online education courses

● Meaningful assessment is essential for improving quality of such programs

CASE STUDY FOR EVALUATION

Considering the evaluation models that we have discussed:

1. Which model(s) would you consider appropriate for this case? Why?

2. Design an evaluation program to include Steps 1-5 as described by Rothwell and Kazanas (Gilley & Maycunich, pp. 430-432)

CASE STUDY FOR EVALUATION

Bates, R. (2004). A critical analysis of evaluation practice: The Kirkpatrick model and the principle of beneficence. Evaluation and Program Planning, 27(3), 341-347.

Parsons, B. (2009). Evaluative inquiry for complex times. OD Practitioner, 41(1).

Preskill, H., & Torres, R. T. (1999). Building capacity for organizational learning through evaluative inquiry. Evaluation, 5(1), 42-60.

REFERENCES