Beginning Farmer and Rancher Development Program
Best Practices Webinar #2
Evaluating Your BFRDP Project
Welcome – one of the recommendations of the 2011 Project Directors' meeting was to hold webinars to share best practices and discuss common issues.
This presentation by Stephanie Ritchie, USDA – National Agricultural Library, covers evaluation of participants and annual reporting of outcomes for all BFRDP grants.
Feel free to ask any questions.
This meeting is being recorded, and the link will be posted on the BFRDP website.
Introduction and Welcome
Evaluating Your BFRDP Project
Stephanie Ritchie, USDA – National Agricultural Library
Start2Farm.gov
Goals for This Session
1. Understand why program evaluation is important
2. Learn current BFRDP evaluation requirements to improve the data collection and reporting process
3. Discover best practices and tools for program evaluation that will help you be more successful
4. Explore rules for data collection from individuals
5. Investigate how to develop instruments
Outcomes – What Are They and Why Should You Care?
Outcomes are the changes that occur as a result of our activities and investments.
Government is often asked to account for how the resources it provides are used and to demonstrate the outcomes of the projects it funds.
This information is used for legislative and funding appropriation decisions and to demonstrate to the public that taxpayer money has been put to good use.
NIFA Outcome Reporting Requirements
• Acknowledge funding in all materials
• Participate in the annual Project Directors' meeting
• Provide any information that is freely available to the public to the BFRDP Clearinghouse
• Provide requested data through CRIS annual reports
The format of the NIFA system can be challenging for BFRDP outcome data; additional information may be sent by e-mail.
BFRDP outcome data will be used to:
• Respond to questions from Congress and the public
• Produce two annual reports – overall BFRDP program results for the public, and detailed project data for NIFA internal program management
• Achieve the overall goals: Improve Performance – Enhance Visibility – Increase Accountability
Evaluation to Show BFRDP Successes – CRIS Report
Use the activity and outcome measures recommended by BFRDP:
• They can be compiled across projects to create an overall picture of BFRDP activities and outcomes
• They measure the impacts of projects on participating individuals, families, and farms
Evaluation for Project Improvement
Projects may wish to include questions in their evaluation instruments about the techniques, methods, and practices specific to their work.
Examples of what might be evaluated:
• How did you hear about our program?
• Instructor performance
• Satisfaction with services received
Please submit your internal program evaluation questions/instruments to Start2Farm Extranet - http://www.s2fdata.org/
Get an Extranet login at: https://docs.google.com/spreadsheet/viewform?pli=1&formkey=dHBxR0dQRmRLdmNTZzMtb1lCaGtGZ2c6MQ#gid=0
Meta Changes / System Changes
• Evaluation to show changes beyond the BFRDP program to the (beginning) farmer community at large
• There is no plan yet to do this type of evaluation
• The ability to quantify significant social/economic/environmental change is beyond the scope of BFRDP goals and individual projects
• An Education Enhancement Project might be an appropriate entity to study beginning farmer training in general – or a group of BFR training organizations could
Goals for This Session
1. Understand why program evaluation is important
2. Learn current BFRDP evaluation requirements to improve the data collection and reporting process
3. Discover best practices and tools for program evaluation that will help you be more successful
4. Explore rules for data collection from individuals
5. Investigate how to develop instruments
Current BFRDP Outcome Reporting
Activity Measures:
• Report the amount of materials produced to publicize/recruit
• Report the number of workshops/training programs offered
• Provide total numbers and demographics of participants attending workshops and training programs

Outcome Measures:
After the training program (at varied intervals), record the number and percent of participants with:
• a change in knowledge, attitudes, skills, or intention (KASI)
• a change in behavior/approach (it is okay to measure both planned (intention) and actual behavior changes)
Mandated outcome measures:
• How many people are trained?
• How many of the target audience are trained?
• How many new farms have been created?
Outcomes not required, but of interest:
• What are the best practices for beginning farmer training?
• What lessons have been learned by grant projects?
LONG-TERM GOAL: More Successful New Farmers!
• Increased productivity
• Increased profitability/efficiency
• Increased environmental sustainability
• Increased quality of life

Assess right after training:
• Increased knowledge
• Changed attitude
• Increased skills
• Intention to adopt new practices

Assess 1+ years after training:
• Application of new practices/skills
EVALUATION THEORY – Levels of Evidence
7. End Results
6. Behavior (practice change)
5. KASI change (knowledge, attitude, skills, intentions)
4. Reactions
3. People involvement
2. Activities
1. Resources
(The levels group from Impacts at the top, through Outputs, down to Inputs at the base.)
New DRAFT Outcome Measurements
Of those who complete any part of a training program (baseline), immediately record the number and percent of participants who:
• plan to start farming
• are farming
• plan to continue farming
• plan to stop farming
• changed knowledge
• changed attitudes
• changed skills
• plan a change in behavior/approach
• plan to continue participating in training
• Other – explain
One-year after the initial training program, record the number and percent of participants who as a result of your training:
• graduated
• started farming
• did not start farming
• continued farming
• stopped farming
• changed farming/land management practices
• changed marketing practices
• changed business practices
• developed a farm plan
• continue to participate in your training programs
• Other – explain
New DRAFT Outcome Measurements
http://www.s2fdata.org/sites/default/files/outcome_standard_grants_2012draft2.doc
Summary of Data Needed for BFRDP Outcomes Reporting
How many changed?
• # of participants, % of participants of total # of participants
• # of target audience versus # of total participants
How much change?
• Knowledge, attitude, skill, intention change
• This may be difficult to measure uniformly across programs – but you should report the #/% of people, NOT the % of change (i.e., a 30% score increased to a 90% score)
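As a sketch (not part of the webinar materials), this is one way the #/% figure above might be tabulated from pre/post quiz scores. All participant IDs and scores below are invented for illustration:

```python
# Hypothetical sketch: turning pre/post quiz scores into the number and
# percent of PARTICIPANTS who changed -- not the average change in scores,
# which is the figure the slide warns against reporting.

pre_scores = {"p1": 40, "p2": 60, "p3": 55, "p4": 80}   # score before course
post_scores = {"p1": 75, "p2": 85, "p3": 55, "p4": 90}  # score after course

# Count each participant whose post-test score beat their pre-test score.
gained = [p for p in pre_scores if post_scores[p] > pre_scores[p]]
n_total = len(pre_scores)
n_gained = len(gained)
pct_gained = 100 * n_gained / n_total

# Report a count of people, e.g. "3 of 4 participants (75%) increased knowledge".
print(f"{n_gained} of {n_total} participants ({pct_gained:.0f}%) increased knowledge")
```

The same people-counting approach works for the attitude, skill, and intention measures: compare each person's before/after answer and report how many moved.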
Types of impact:
• On a key-concepts quiz, participants scored 58% before the course and 82% on average after the course (N=36).
• 100% plan to use a soil test and 9 of 15 plan to use a cover crop (N=15).
• One year after the course, 83% tested their soil (N=5).
• Ben Davies, who took a class on soils through the program, said, "I think how well the season went this year was because of the soils class. I calculated my amendments, applied the right amounts and the plants grew really well."
Introduction to Soils – 56 participants
Students see the benefits of soil aggregates.
Overall project impact
• 406 new farmers participated in 19 courses in 7 counties
• 51% currently farming (n=186)
• 99% increased knowledge
  – 63% increased knowledge "a great deal" (n=255)
  – 34% increased knowledge "a moderate amount" (n=255)
  – Average increase in real knowledge of 36% (3 grades) (n=115)
• 65% plan to adopt 1+ new practices/make a change (n=235)
• 40% plan to start farming (n=151)
• 49% plan to continue farming (n=151)
Goals for This Session
1. Understand why program evaluation is important
2. Learn current BFRDP evaluation requirements to improve the data collection and reporting process
3. Discover best practices and tools for program evaluation that will help you be more successful
4. Explore rules for data collection from individuals
5. Investigate how to develop instruments
So how do we collect the data?
Knowledge gain – perceived
1. Overall, after completing the course, do you think your knowledge of sheep management has increased:
   a. a great deal   b. a moderate amount   c. a little   d. not at all
BFRDP outcome measure: % change in knowledge
So how do we collect the data?
BFRDP outcome measure: % change in knowledge
Knowledge Gain – Real: Pre–Post Test
2010 SHEEP MANAGEMENT SHORT COURSE (Please circle your answers)
1. Which breed of sheep has the finest (highest quality) wool?
   Suffolk   Romney   Merino   Cheviot
2. Which breed will commonly reproduce in the spring as well as in the fall?
   Dorset   Hampshire   Suffolk   Shropshire
3. The average length of the estrous, or reproductive, cycle in the ewe is:
   10-12 days   20-21 days   16-17 days   28 days
4. The average length of gestation (pregnancy) in the ewe is:
   135-140 days   154-159 days   144-150 days   160-165 days
BFRDP outcome measure: % change in knowledge
Change in Behavior
4. Listed below are some topics from this class. Please circle the best answer(s) for each item.

   Techniques              | Did before class | Plan to do within 6 months | No plans | Not apply
   Take a soil test        | BEFORE           | PLAN                       | NO PLAN  | NA
   Calibrate your sprayer  | BEFORE           | PLAN                       | NO PLAN  | NA

5. What else did you learn that you plan to use this year?
BFRDP outcome measure: % planned change in behavior
Change in Attitude
2. Please think back to your knowledge before this class and what it is now, at the end of the class. For each topic listed below, place a B at the point where your knowledge was before the class and an N for where your knowledge is now, after the class.
   The importance of figuring true costs into estimates:
   NOT IMPORTANT |---------------------------| VERY IMPORTANT
BFRDP outcome measure: % change in attitude
Compiling Data across multiple classes
Using the same question format across classes makes the data easier to compile. Flexible question formats let you retain detail at the class level while keeping the ability to compile the data program-wide.
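A minimal sketch of what compiling across classes can look like when every class reports the same counts: program-level totals become a simple sum, and per-class detail survives. All class names and numbers below are invented:

```python
# Hypothetical sketch: compiling one shared outcome question
# ("did your knowledge increase?") across several classes.
classes = {
    "Intro to Soils":   {"responded": 36, "increased_knowledge": 34},
    "Sheep Management": {"responded": 20, "increased_knowledge": 19},
    "Farm Business":    {"responded": 28, "increased_knowledge": 25},
}

# Per-class detail is retained...
for name, c in classes.items():
    print(f"{name}: {c['increased_knowledge']}/{c['responded']}")

# ...and the program-wide figure is just the sum of the class counts.
total_n = sum(c["responded"] for c in classes.values())
total_gain = sum(c["increased_knowledge"] for c in classes.values())
pct = 100 * total_gain / total_n
print(f"Overall: {total_gain} of {total_n} ({pct:.0f}%) increased knowledge")
```

If each class had worded or scaled the question differently, this sum would not be meaningful – which is the point the slide makes about using a shared format.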
ONE YEAR AFTER – BFRDP outcome measure: % changed farming/land management practice
Our Current Template – when we can't do pre-post testing
Summarizing the Data
Best Practices/Evaluation Tools
Share under “Evaluation Tools and Best Practices” at http://www.s2fdata.org/ - Forum
WANTED
1. Land Stewardship Project's Animal and Vegetable Farming Skills Evaluations
2. Core competencies and skills checklist developed at the Northeast Beginning Farmers Coalition meeting
3. YOUR EVALUATION TEMPLATES!!!
Goals for This Session
1. Understand why program evaluation is important
2. Learn current BFRDP evaluation requirements to improve the data collection and reporting process
3. Discover best practices and tools for program evaluation that will help you be more successful
4. Explore rules for data collection from individuals
5. Investigate how to develop instruments
Data Collection from Individuals
Research projects that involve human subjects should:
1) not place subjects at undue risk; and
2) obtain the uncoerced, informed consent of subjects for their participation.
Guidance for USDA at 7 CFR Part 1c Title 7—Agriculture: PART 1c--PROTECTION OF HUMAN SUBJECTS
http://www.access.gpo.gov/nara/cfr/waisidx_05/7cfr1c_05.html
The Belmont Report
Based on The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research – April 18, 1979
http://videocast.nih.gov/pdf/ohrp_belmont_report.pdf
The three fundamental ethical principles for using any human subjects for research are (as found on http://ohsr.od.nih.gov/guidelines/belmont.html):
1. Respect for persons: protecting the autonomy of all people and treating them with courtesy and respect and allowing for informed consent. Researchers must be truthful and conduct no deception;
2. Beneficence: The philosophy of "Do no harm" while maximizing benefits for the research project and minimizing risks to the research subjects; and
3. Justice: ensuring reasonable, non-exploitative, and well-considered procedures are administered fairly — the fair distribution of costs and benefits to potential research participants — and equally.
Data Collection from Individuals
7 CFR Part 1c applies to all research involving human subjects conducted, supported, or otherwise subject to regulation by any federal department or agency. Research activities exempt from this policy:
(1) Research conducted in established or commonly accepted educational settings, involving normal educational practices, such as (i) Research on regular and special education instructional strategies, or (ii) research on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods.
(2) Research involving the use of educational tests (cognitive, diagnostic, aptitude, achievement), survey procedures, interview procedures or observation of public behavior, unless:
(i) Information obtained is recorded in such a manner that human subjects can be identified, directly or through identifiers linked to the subjects; and
(ii) Any disclosure of the human subjects' responses outside the research could reasonably place the subjects at risk of criminal or civil liability or be damaging to the subjects' financial standing, employability, or reputation.
Institutional Review Boards
• Most universities and some other organizations have Institutional Review Boards (IRBs) that provide oversight for research that involves human subjects
• NIFA does NOT require that BFRDP grantees have their research reviewed by such a board
• These reviews are generally in place for medical research, but some behavioral science research is also covered by university/federal guidelines
• Some BFRDP grantees may be subject to this process through their organization
Goals for This Session
1. Understand why program evaluation is important
2. Learn current BFRDP evaluation requirements to improve the data collection and reporting process
3. Discover best practices and tools for program evaluation that will help you be more successful
4. Explore rules for data collection from individuals
5. Investigate how to develop instruments
Developing Survey Instruments
Survey instruments should be both:
• Reliable – "Does our question yield the same result on repeated trials?" Reliability might be measured over time or with a group of experts.
• Valid – "Do the questions measure the outcomes stated?" Establishing validity usually involves review by experts, comparison to recognized valid measurements, or the appearance of correctly measuring.
Example BFRDP Survey Instrument Design
"Content validity was assured through a review of the instrument by this panel of experts consisting of approximately 47 project coalition members from agriculture, higher education, non-profit organizations, and state agencies (Pedhazur & Schmelkin, 1991). Approximately fourteen individuals were asked to complete a pilot test of the instrument to ensure the online survey instrument functioned correctly. The Virginia Tech Institutional Review Board reviewed the instrument and gave approval for administration."
From Virginia Beginning Farm & Rancher Coalition Project Survey Report, October 2011, prepared by Matt Benson, PhD Student, Agriculture & Extension Education
http://www.vabeginningfarmer.aee.vt.edu/survey-finalreport.pdf and http://s2fdata.org
Adapted from the Northeast Beginning Farmers Project survey instrument titled "Beginning Farmer Barrier ID Ranking: Ranking the needs of the beginning farmer..."
CLAIMING CREDIT
• You want to show that changes are a result of your training.
• To show your influence:
  – Use the same observation item, survey question, activity (e.g., diagnose plant diseases), or other data point at baseline, before the program, and after; or
  – Use a retrospective survey question that measures both before and after at the same time, after the program.
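A sketch (not from the webinar) of scoring the retrospective approach: each respondent marks Before and Now on the same scale in a single post-program survey, so change can be computed without a separate pre-test. The scale and responses below are invented:

```python
# Hypothetical sketch: scoring a retrospective ("B"/"N") survey question.
# Each tuple is one respondent's (before, now) rating on a 1-5 scale,
# both collected AFTER the program in the same question.
responses = [
    (2, 4), (3, 3), (1, 4), (2, 5), (4, 4),
]

# Count respondents whose "now" rating exceeds their "before" rating.
changed = sum(1 for before, now in responses if now > before)
pct = 100 * changed / len(responses)

# Again reported as #/% of people, matching the BFRDP outcome format.
print(f"{changed} of {len(responses)} ({pct:.0f}%) reported increased knowledge")
```

The trade-off versus a true pre/post test is that retrospective ratings rely on recall, but they avoid the logistics of matching baseline and follow-up forms for each participant.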
ONE YEAR AFTER – BFRDP outcome measure: % changed farming/land management practice
Thank You
Contact for more information
Stephanie Ritchie, 301-504-6153