Making Sense of the “Eval”: RFP language, lingo and assessment strategies

Matt Rearick, Ph.D., Research & Evaluation, The RLI Group; Associate Professor, Human Performance, Roanoke College; [email protected]
Transcript
Page 1

Matt Rearick, Ph.D.
Research & Evaluation, The RLI Group

Associate Professor, Human Performance
Roanoke College

[email protected]

Making Sense of the “Eval”: RFP language, lingo and assessment strategies

Page 2

Overview

① Cultural Shift in Assessment and Evaluation

② Grant Requirements and Lingo

③ Logic Model to Evaluation Design (and a little more lingo)

a.) design variations

b.) tricks of the trade

- scalability of program assessment

④ Give it a whirl …

Page 3

Cultural shift …

① Data-driven, Value-added … Post-program effects(?)

② Development, Scale-up … State & National Models

③ From an Emphasis on Programs to an Emphasis on Assessment of Programs

(Grant world and beyond)

④ Fewer funds, more requests (typical supply and demand)

⑤ Data is EVERYWHERE now (ugh!)

Can you do what you say you are going to do, can you do it well, and can you “prove” it?

Page 4

Our Needs and Funder’s Needs

Page 5

Grant Requirements and Lingo …

Step 1: Know your granting agency

1. Does the agency have an outcome that is key to its current mission?

2. What standard of evaluation is expected?

3. Identify key vocabulary: project objectives, evaluation and outcomes

4. Plan an integrated, seamless evaluation …

5. If possible, model the proposed program first (pilot?)

Page 6

Enter the RFP (and more lingo)

Step 2: Dissect the Program Description – NSF, Research Experiences for Undergraduates

Research experience is one of the most effective avenues for attracting students to and retaining them in science and engineering, and for preparing them for careers in these fields. The REU program, through both Sites and Supplements, aims to provide appropriate and valuable educational experiences for undergraduate students through participation in research. REU projects involve students in meaningful ways in ongoing programs or in research projects specifically designed for the REU program. REU projects feature high-quality interaction of students with faculty and/or other research mentors and access to appropriate facilities and professional development opportunities.

Page 7

More RFP lingo

Step 3: Dissect the Program Evaluation – NSF, Research Experiences for Undergraduates

Describe the plan to measure qualitatively and quantitatively the success of the project in achieving its goals, particularly the degree to which students have learned and their perspectives on science, engineering, or education research related to these disciplines have been expanded. Evaluation may involve periodic measures throughout the project to ensure that it is progressing satisfactorily according to the project plan, and may involve pre-project and post-project measures aimed at determining the degree of student learning that has been achieved. In addition, it is highly desirable to have a structured means of tracking participating students beyond graduation, with the aim of gauging the degree to which the REU Site experience has been a lasting influence in the students’ career path. Proposers may wish to consult The 2010 User-Friendly Handbook for Project Evaluation for guidance on the elements in a good evaluation plan. Although not required, REU Site PIs may wish to engage specialists in education research in planning and implementing the project evaluation.

Page 8

Our Goals and Funder’s Goals

Page 9

Logic Model to Evaluation Design

Page 10

Deep Breath

Logic Model to Evaluation Design

Page 11

Design & Assessment Strategies

A. Random Assignment, Field-“Experiment” Design

- Intervention and Control Groups

(ethical and practical considerations/limitations)

- Willingness to take the long view

B. Quasi-Experimental Design

- Intervention and Control Groups

* self-selected, matching (composition, predispositions and experiences are relatively close) … a more immediate view and easier to sell, but less confidence in program effects (see the sketch after this list)
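To make the distinction concrete, here is a minimal sketch of random assignment in Python; the participant IDs and even split are hypothetical placeholders, and a real evaluation would draw from actual enrollment records:

```python
import random

# Hypothetical participant roster; in practice this comes from enrollment data.
participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

random.seed(42)  # fixed seed so the assignment is reproducible and auditable
random.shuffle(participants)

# Split the shuffled roster evenly into intervention and control groups.
midpoint = len(participants) // 2
intervention = participants[:midpoint]
control = participants[midpoint:]

print("Intervention:", intervention)
print("Control:", control)
```

A quasi-experimental version would replace the shuffle with self-selection or matching, which is exactly why confidence in program effects drops.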

Page 12

Design & Assessment Strategies

C. Units of Analysis

- Individuals vs. Groups vs. Communities/Schools/Centers, etc. (Sample size? Effect sizes? See the sample-size sketch after this list.)

D. Assessment Approach (Think MIXED METHODS!)

Qualitative

- Survey (Short Answer)

- Interviews (Individual)

- Structured vs. Semi-structured

- Focus Groups

- Observation (rubric)
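Since item C raises sample size and effect sizes, here is a hedged sketch of an a priori power calculation with statsmodels; the “medium” effect size of 0.5, the alpha, and the power target are all assumed values, not recommendations from the slides:

```python
from statsmodels.stats.power import TTestIndPower

# Solve for per-group sample size: assumed "medium" effect (Cohen's d = 0.5),
# two-sided alpha of 0.05, 80% power. All three values are assumptions.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Participants needed per group: {n_per_group:.0f}")  # roughly 64
```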

Page 13

Design & Assessment Strategies

D. Assessment Approach, cont. … (Think MIXED METHODS!)

Quantitative

- Measurements, Instruments …

- Valid and Reliable

E. Data Collection Strategies

- Single Measurement, often post-intervention

- Pre/Post Measurement (see the sketch after this list)

- Periodic Measurement (which could include pre/post too)

- Time series, which gives better estimates of program effects
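As a sketch of the pre/post strategy (all scores below are made-up placeholders), a paired comparison in Python with scipy might look like:

```python
from scipy import stats

# Hypothetical pre- and post-program scores for the same eight participants.
pre = [52, 60, 45, 70, 58, 63, 49, 55]
post = [58, 64, 50, 72, 65, 66, 54, 61]

# Paired t-test: did the same individuals change from pre to post?
t_stat, p_value = stats.ttest_rel(post, pre)
mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)

print(f"Mean gain: {mean_gain:.1f} points, t = {t_stat:.2f}, p = {p_value:.3f}")
```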

Page 14

Design & Assessment Strategies

F. Data Analysis Strategies

- Difference on the original measurement scale

a.) Intervention vs. Control

- Comparison with test norms, Normative Population

a.) Absolute vs. Relative Change

- Proportion over a success threshold

a.) Diagnostic

b.) Arbitrary

- Comparison of effects with similar programs (see the sketch after this list)
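A minimal sketch, on invented scores, of three of the strategies above: the raw-scale difference, a standardized effect size for comparison with similar programs, and a proportion over an (arbitrary, as noted above) success threshold:

```python
import math
import statistics

# Hypothetical post-program scores for the two groups.
intervention = [74, 81, 68, 90, 77, 85, 72, 79]
control = [65, 70, 62, 75, 68, 71, 60, 66]

# 1. Difference on the original measurement scale (intervention vs. control).
mean_diff = statistics.mean(intervention) - statistics.mean(control)

# 2. Standardized effect size (Cohen's d with a pooled SD), which makes the
#    result comparable with effects reported by similar programs.
n1, n2 = len(intervention), len(control)
s1, s2 = statistics.stdev(intervention), statistics.stdev(control)
pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
cohens_d = mean_diff / pooled_sd

# 3. Proportion over a success threshold (70 is an arbitrary placeholder).
prop_over = sum(score >= 70 for score in intervention) / n1

print(f"Mean difference: {mean_diff:.1f} points")
print(f"Cohen's d: {cohens_d:.2f}")
print(f"Share of intervention group over threshold: {prop_over:.0%}")
```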

Page 15

Assessment Strategies – “Tricks of the trade” (before you call in the cavalry …)

1. What is your organization already doing in assessment?

a.) Think Past, Present and Future

2. Process vs. Product Assessment

a.) Continuous Improvement Model (Formative/Summative)

3. What instruments and tools can you find on your own?

a.) Expertise issues

4. NO KITCHEN SINKS ALLOWED! Be judicious and realistic.

a.) Capacity issues (you can’t do it all, and won’t be able to)

Page 16

Assessment Strategies (call in the cavalry)

1. Evaluation Design

a.) Program and Eval intimately linked

b.) Put a team together early!

- Provides confidence and avoids pitfalls

2. Statistics

a.) Confidence in your descriptive findings (see the sketch below)

b.) NVivo, SPSS, etc. expertise
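As one concrete reading of “confidence in your descriptive findings,” here is a hedged sketch of a 95% confidence interval around a mean rating; the scores are placeholders, and scipy stands in for the SPSS-style tooling mentioned above:

```python
import math
import statistics
from scipy import stats

# Hypothetical survey ratings from ten program participants.
scores = [3.8, 4.2, 3.5, 4.6, 4.0, 3.9, 4.4, 3.7, 4.1, 4.3]

mean = statistics.mean(scores)
sem = statistics.stdev(scores) / math.sqrt(len(scores))  # standard error of the mean

# 95% confidence interval from the t distribution (df = n - 1), which is
# appropriate for small samples like this one.
low, high = stats.t.interval(0.95, len(scores) - 1, loc=mean, scale=sem)
print(f"Mean rating: {mean:.2f} (95% CI: {low:.2f} to {high:.2f})")
```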

Page 17

Always keep in mind…

A solid program with continuous improvement and measurable findings can (and often does) lead to the NEXT GRANT

Page 18

Give it a whirl … RFP Lingo … language you can use to develop the proposal and the evaluation

Mini Logic Model (sketched after this list)

- Activity (Strategy), Output(s), Outcome(s)

Design? (Random assignment? Quasi-experimental?)

Think Mixed Methods

Data Collection …

Data Analysis …
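To close the loop on the exercise, a mini logic model can be drafted as a plain data structure before it goes into a proposal; everything in this sketch (the mentoring activity, outputs, and outcomes) is a hypothetical placeholder, not content from the slides:

```python
# Hypothetical mini logic model; every entry below is a made-up placeholder.
logic_model = {
    "activity (strategy)": "Weekly undergraduate research mentoring sessions",
    "outputs": [
        "20 students complete 10 mentoring sessions",
        "Each student drafts a research poster",
    ],
    "outcomes": [
        "Increased student confidence in research skills (pre/post survey)",
        "More students applying to graduate programs (one-year follow-up)",
    ],
}

# Print each component of the model as a labeled bullet list.
for component, entries in logic_model.items():
    print(component.upper())
    for entry in (entries if isinstance(entries, list) else [entries]):
        print(f"  - {entry}")
```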

