Part 2
Last Week: Why evaluate · Results-based approach · Purposes and uses of evaluation · Reviewed Kirkpatrick model
This Week: More on Kirkpatrick model · Types of data collection · Types of data · Evaluation instruments · Tips for surveys/questionnaires
* Will vary from note taking H.O.
Warm Up Exercise
Levels of Evaluations
• Donald Kirkpatrick - 4 Levels
I. Reaction – Were the participants pleased with the intervention?
II. Learning – What did the participants learn?
III. Behavior – Did the participants change their behavior based on what was learned?
IV. Results – Did the change in behavior positively affect the organization?
Exercise # 3
• What are some advantages and limitations in each of the four levels of Kirkpatrick’s evaluation model?
(Use note-taking handout)
Collect Post Intervention Data(Handout 3)
1. Surveys
2. Questionnaires
3. On-the-job observation
4. Post-intervention interviews
5. Focus groups
6. Program assignments
7. Action plans
8. Performance contracts
9. Follow-up sessions
10. Performance monitoring
Exercise # 4
• Using the 10 items in the Post Intervention Data handout (3):
Each group describes (fabricates) a situation where you might gather post-intervention data for a Level 2/3/4 evaluation. Be prepared to explain your rationale.
Hard Data
Hard data can be grouped into four categories:
1. Output – of the work unit
2. Quality – how well produced or serviced
3. Cost – improvement in costs
4. Time – savings
Soft Data
• Work habits – absenteeism, tardiness, violations
• Climate – number of grievances, complaints, job satisfaction
• Satisfaction – Favorable reactions, employee loyalty, increased confidence
Exercise # 5
• List some advantages and disadvantages / limitations when collecting hard and soft data.
• (Use Notetaking H.O. p. 3/4)
Evaluation Instruments
Validity – does it measure what it is supposed to measure
• Content validity – how well does the instrument measure the content/objectives of the program
• Construct validity – how well does it measure the construct (an abstract variable such as KSAs – knowledge, skills, and abilities)?
• Concurrent validity – How well does the instrument measure up against other instruments
• Predictive validity – how well can it predict future behavior
Part 3
Last Week
More on Kirkpatrick model Types of data collection Types of data
This Week
Developing evaluation instruments
The survey process
* New Handouts
Exercise # 5
• List five things you would do to improve the chances of getting a good number of returns for surveys/questionnaires.
Survey Process -- Tips
• Communicate the purpose – in advance
• Signed introductory letter
• Explain who will see the data
• Use anonymous input?
More Tips
• Keep it simple
• Simplify the response process – bubble format, SASE
• Utilize local support
More Tips
• Consider incentives
• Use follow-up reminders
• Send a copy of the results to the participants
Action Planning
• The most common type of follow-up assignment.
• Developed by participants.
• Contains detailed steps to accomplish measurable objectives.
• Shows what is to be done, by whom, when.
• Must be monitored
Action Plans
• Communicate the action plan requirement early and explain its value (avoids resistance)
• Describe the action-planning process at the beginning of the program (outline)
• Teach the action-planning process
• Allow time to develop the plan
• Have the facilitator approve the action plans
• Require participants to assign a monetary value for each improvement (helps ROI later)
Action Plans
• Ask participants to isolate the effects of the program
• Ask participants to provide a confidence level for estimates
• Require action plans to be presented to the groups by participants (peer review) if possible
• Explain the follow up mechanism
• Collect action plans
• Summarize the data and calculate ROI
Converting Data to Monetary Benefits
• Focus on a unit of measure
• Determine the value of each unit
• Calculate the change in performance
• Determine an annual amount for the change
• Calculate the total value of the improvement
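The five conversion steps above can be sketched as a short calculation. The quality measure, unit value, and defect counts below are hypothetical placeholders, not figures from the deck:

```python
# Hypothetical example: a quality measure where each defective unit
# costs $25 to rework (the "value of each unit of measure").
unit_value = 25.00          # step 2: value of one unit of measure
defects_before = 120        # monthly defects before the intervention
defects_after = 90          # monthly defects after the intervention

change_per_month = defects_before - defects_after   # step 3: change in performance
annual_change = change_per_month * 12               # step 4: annualize the change
total_value = annual_change * unit_value            # step 5: total value of improvement
print(total_value)   # 9000.0
```

In practice the unit value would come from one of the valuation sources listed on the next slide (historical costs, expert estimates, external databases, and so on).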
Ways to Put Value on Units
• Cost of quality
• Converting Employee time
• Using Historical Costs
• Using Internal and External Experts
• External Databases
• Estimates from the participants
• Estimates from Supervisors
• Estimates from Senior Managers
• Using HRD staff estimates
Credibility
• Source of the data
• Source for the study
• Motives of the evaluators
• Methodology of the study
• Assumptions made in the analysis
• Realism of the outcome data
• Types of data
• Scope of analysis
Guidelines for Study
• Credible and reliable sources for estimates
• Present material in an unbiased, objective way
• Fully explain methods (step by step)
• Define assumptions and compare to other studies
• Consider factoring or adjusting output values when they appear unrealistic
• Use hard data whenever possible
Identifying Intangible Measures
(Not based upon monetary values)
• Employee satisfaction
• Stress reduction
• Employee turnover
• Customer satisfaction, retention
• Team effectiveness
Determining Costs
• Collect costs on every intervention
• Costs will not be precise (hard to be perfect)
• Be practical – work with the accounting department
• Define which costs to collect, categories, sources
• Computerize
• Cost accumulation (track accounts)
• Cost estimation (formulas – page 227)
• Fully load with all costs possible – be truthful
• Overhead, benefits, peripheral costs, etc.
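As a minimal sketch of "fully loading" an intervention's costs, the category names and amounts below are invented placeholders; real categories and figures would come from your accounting department:

```python
# Hypothetical fully loaded cost tabulation for one intervention.
# Every category name and dollar amount here is illustrative only.
costs = {
    "needs_assessment": 2_000,
    "design_and_development": 8_000,
    "delivery_and_facilitation": 12_000,
    "participant_time": 15_000,   # salaries and benefits while in training
    "facilities_and_materials": 3_000,
    "evaluation": 2_500,
    "overhead_allocation": 1_500,
}

fully_loaded_cost = sum(costs.values())
print(fully_loaded_cost)  # 44000
```

Keeping the categories in one structure makes it easy to be truthful about what was and was not included when the ROI is later challenged.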
Data Analysis
• Statistics (use a professional)
• Use terms appropriately (e.g., "significant difference")
• Statistical deception (erroneous conclusions)
Return on Investment
• Compares costs to benefits
• Complicated
• Usually annualized
• Business-case specific
• Communicate the formula used
I. Reaction and Planned Action – Measures participants' reactions and plans to change
II. Learning – Measures KSA
III. Job Applications – Measures change of behavior on the job and specific use of the training material
IV. Business results – Measures impact of the program
V. Return on investments – Measures the monetary value of the results and costs for the program, usually expressed as a percentage
Phillips ROI Framework
Level 1 – Reaction – Participants
Level 2 – Learning – Participants
Level 3 – Job Applications – Immediate Managers
Level 4 – Business Impact – Immediate/Senior Managers
Level 5 – Return on Investment – Senior Managers/Executives
Evaluation as a Customer Satisfaction Tool
From Level 4 to Level 5 Requires Three Steps:
1. Level 4 data must be converted to monetary values
2. Cost of the intervention must be tabulated
3. Calculate the formula
ROI Process Model
• Collect data
• Isolate the effects of training
• Convert data to monetary value
• Tabulate program costs
• Calculate the return on investment
• Identify intangible benefits
ROI Formula
ROI (%) = (Net Program Benefits / Program Costs) × 100
Two Methods
1. Cost/Benefit Ratio – An early model that compares the intervention's costs to its benefits in ratio form: for every dollar invested in the intervention, X dollars in benefits were returned.
2. ROI Formula – Uses net program benefits divided by program costs, expressed as a percentage.
Cost / Benefit
CBR = Program Benefits / Program Costs
ROI Formula
ROI (%) = (Net Program Benefits / Program Costs) × 100
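The two methods above can be expressed as small functions. The benefit and cost figures in the usage lines are illustrative only; in practice they come from the converted Level 4 data and the fully loaded program costs:

```python
# Minimal sketch of the two calculation methods, assuming benefits and
# costs have already been converted to annualized monetary values.

def cost_benefit_ratio(program_benefits: float, program_costs: float) -> float:
    """CBR = Program Benefits / Program Costs."""
    return program_benefits / program_costs

def roi_percent(program_benefits: float, program_costs: float) -> float:
    """ROI (%) = (Net Program Benefits / Program Costs) x 100."""
    net_benefits = program_benefits - program_costs
    return net_benefits / program_costs * 100

# Illustrative figures: $150,000 in benefits against $100,000 in costs.
print(cost_benefit_ratio(150_000, 100_000))  # 1.5  -> $1.50 returned per $1 invested
print(roi_percent(150_000, 100_000))         # 50.0 -> 50% return
```

Note that the same numbers give a CBR of 1.5 but an ROI of only 50%, because ROI is computed on *net* benefits; communicating which formula was used avoids exactly this confusion.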
Cautions With Using ROI
• Make sure needs assessment has been completed
• Include one or more strategies for isolating the effects of training
• Use reliable, credible sources when making estimates
• Be conservative when developing benefits and costs
• Use caution when comparing the ROI in training and development with other financial returns
• Involve management in developing the return
• Approach sensitive and controversial issues carefully
• Do not boast about a high return (internal politics)
Implementation Issues
• Identify an internal champion (cheerleader)
• Develop an implementation leader
• Assign responsibilities so everyone will know their assigned tasks and outcomes
• Set targets (annual)
• Develop a project plan, timetable
• Revise/Develop Policies and Procedures (Page 367)
• Assess the climate – gap analysis, SWOT, barriers
Preparing Your Staff
• Involve the staff in the process
• Using Evaluation Data as a Learning Tool
• Identify and remove obstacles(complex, time, motivation, correct use of results)
ROI Administration
Which programs to select?
• Large target audiences
• Important to corporate strategies
• Expensive
• High visibility
• Comprehensive needs assessment
ROI Administration
• Reporting progress – status meetings (facilitated by an expert), report progress, add evaluation areas
• Establish Discussion Groups
• Train the management team
Timing of Evaluation
1. During the program
2. Time series – multiple measures
3. Post tests – timing
Questionnaire Content Issues
• Progress with objectives
• Action plan status
• Relevance of intervention
• Use of program materials
• Knowledge/skill application
• Skill frequency
• Changes in the work unit
• Measurable improvements/accomplishments
• Monetary impact
• Confidence level
• Improvement linked with the intervention
• Investment perception
• Linkage with output measures
• Barriers
• Enablers
• Management support
• Other solutions
• Target audience recommendations
• Suggestions for improvement