“Mixed Methods in Program Evaluation”
Presented by Tom Chapel
Thomas J. Chapel, MA, MBA | Tchapel@cdc.gov | Chief Evaluation Officer | 404-639-2116 | Centers for Disease Control and Prevention
Transcript
Page 1

“Mixed Methods in Program Evaluation”

Presented by Tom Chapel

Thomas J. Chapel, MA, MBA | Tchapel@cdc.gov | Chief Evaluation Officer | 404-639-2116 | Centers for Disease Control and Prevention

Page 2

Agenda

1. The why and how of mixed methods:

• Rationale

• Options

• Challenges

• Criteria for making choices

2. Apply points to some simple examples

Page 3

CDC’s Evaluation Framework

[Framework diagram: six steps (Engage stakeholders; Describe the program; Focus the evaluation design; Gather credible evidence; Justify conclusions; Ensure use and share lessons learned) arranged around the four standards (Utility, Feasibility, Propriety, Accuracy)]

The Standards apply especially when we’re trying to make data collection choices.

Page 4

CDC’s Evaluation Standards

The Standards provide a quick and easy way to identify the 2 or 3 best data collection choices for this evaluation.

Standards: Utility, Feasibility, Propriety, Accuracy

Page 5

CDC’s Evaluation Framework

Not “Collect data”

Not “Analyze data”

Rather…

“Gather credible evidence”

[CDC Evaluation Framework diagram as on Page 3]

Page 6

Steps 1-3 Help You Focus Design And Data Collection Choices

After the first 3 steps of the Evaluation Framework, we know which evidence will work for these stakeholders in this situation.

Qualitative data?

Quantitative data?

Randomized control trials?

Performance measures?

Page 7

CDC’s Evaluation Standards

The Evaluation Standards help us narrow down our data collection choices to the handful of methods that will work for this evaluation at this time.

Standards: Utility, Feasibility, Propriety, Accuracy

Page 8

Mixed Methods

Sometimes the data collection method that will work for this evaluation at this time is a single method, such as a survey or a focus group.

But sometimes there is no one best way.

In those cases, the best choice may be a combination of methods, or “mixed methods.”

Page 9

Six (Most) Common Ways to Collect Data

● Surveys

● Interviews

● Focus Groups

● Document Review

● Observation

● Secondary Data

Page 10

How Standards Inform the Choice of Methods

Consider the context:

• How soon do I need the results?

• What resources can I use?

• Are there any ethical issues to consider?


Page 11

How Standards Inform the Choice of Methods

Also consider the content:

• Sensitivity of the issue

Page 12

How Standards Inform the Choice of Methods

Also consider the content:

• “The Hawthorne Effect”

Will the act of being observed cause someone to distort their response?


Page 13

How Standards Inform the Choice of Methods

Also consider the content:

• Validity

• Reliability


Page 14

Mixed Methods Address Concerns

Key Concept:

Regardless of the method, when there are validity and reliability concerns, using more than one method (i.e., mixed methods) will often help.

Page 15

Mixed Methods: Definition

“The combination of at least one qualitative and at least one quantitative component in a single research project or program.”

(Bergman 2008)

Page 16

Use Complementary Methods

Mixed methods is:

A combination of methods that has complementary strengths and non-overlapping weaknesses.

The purpose is to supplement or complement the validity and reliability of the information.

Page 17

Strengths of Quantitative Methods

Strengths of quantitative methods:

• Require less time than qualitative methods

• Cost less
• Permit researcher control
• Quantitative data is considered to be “scientific”
• Easier to explain validity and reliability
• Easily amenable to statistical analyses

Page 18

Strengths of Qualitative Methods

Choose qualitative methods when you are trying to:

• Explore or describe a phenomenon

Page 19

Strengths of Qualitative Methods

Choose qualitative methods when you are trying to:

• Look for induction (i.e., “surprise”)

Page 20

Strengths of Qualitative Methods

Choose qualitative methods when you are trying to:

• Identify patterns

Page 21

Strengths of Qualitative Methods

Qualitative data can help you understand

not just “what” but “WHY”.

Page 22

When to Use Mixed Methods

1. Corroboration
• better understanding; more credibility
• “triangulation”: measuring the same thing from several different viewpoints

2. Clarification
• trying to understand why we got this result

Page 23

When to Use Mixed Methods

Mixed methods are most commonly used for:

3. Explanation (similar to clarification)
• want to know the “why” or “what” behind the situation

4. Exploration (similar to explanation)
• charting new territory
• trying to observe patterns
• examining different situations and varying results to induce patterns

Page 24

Number of Project Facets Reported via Each Data Collection Method

Source: Gregory Guest, PhD

Page 25

Number of Project Facets Reported via Each Data Collection Method

This is an example of using a qualitative method (site visits) to corroborate a quantitative method (surveys).

The result was increased validity of the data.

Source: Gregory Guest, PhD

Page 26

Which to Choose?

How do you choose which methods to use?

Which method comes first, the quantitative or the qualitative?

You have a lot of flexibility in these decisions.

Page 27

Parallel or Concurrent Mixed Methods

For “parallel” or “concurrent” mixed methods, quantitative and qualitative data collection happen at the same time.

QUANTITATIVE

QUALITATIVE

Page 28

Sequential Mixed Methods

For “sequential” mixed methods, either quantitative or qualitative data collection can happen first.

QUANTITATIVE → QUALITATIVE

OR

QUALITATIVE → QUANTITATIVE

Page 29

Example of Sequential Mixed Methods to Corroborate Data

In this case, the qualitative method (site visits) was used to corroborate the quantitative (survey) method and the results were different.

QUANTITATIVE → QUALITATIVE

Page 30

Mixed Methods Is Your Choice

You are never required to use mixed methods.

However, you may choose to use mixed methods when:

• you have some indication that a single method may give you incorrect data.

• a single method may give you an incorrect perception of reality.

Page 31

Mixing Methods During Data Analysis

Qualitative data (focus groups, observations, secondary data, etc.) can be converted to “numbers” via quantitative techniques like content analysis.

This is also a mixed method design approach.

Page 32

Mixing Methods During Data Analysis

Qualitative data can be very complex.

Examining qualitative data with quantitative techniques helps to identify or validate patterns or themes.
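“Content analysis” here can be as simple as coding text and counting the codes. The short Python sketch below is a minimal, hypothetical illustration of that idea; the themes, keywords, and transcript snippets are invented, not taken from the webinar. It tallies how many transcripts touch each theme, turning qualitative text into numbers.

# Minimal sketch: "quantitizing" qualitative data with simple keyword-based coding.
# All themes, keywords, and transcript snippets are hypothetical examples.
from collections import Counter

themes = {
    "access": ["transportation", "distance", "hours", "appointment"],
    "cost": ["afford", "expensive", "copay", "insurance"],
    "trust": ["trust", "respect", "listened", "comfortable"],
}

transcripts = [
    "The clinic hours never fit my work schedule and the copay is expensive.",
    "I felt the nurse listened and I trust the staff there.",
    "Transportation is the biggest barrier; the distance is too far.",
]

counts = Counter()
for text in transcripts:
    lowered = text.lower()
    for theme, keywords in themes.items():
        # Count a theme at most once per transcript if any of its keywords appear.
        if any(word in lowered for word in keywords):
            counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: mentioned in {n} of {len(transcripts)} transcripts")

In practice the coding would be done by trained coders against a codebook; the point of the sketch is only that coded qualitative data can be counted and analyzed like any other numbers.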

Page 33

Deciding When To UseMixed Methods and How

Key Concept:

Using mixed methods is a deliberate design decision. You use it when you don’t trust the data from any single method.

The reason for your uncertainty determines the methods you choose to mix and the order in which you use them.

Page 34

Example 1

Problem or Purpose: Validity

• Do people give similar responses on surveys and in focus groups?

Example

• Survey (quantitative) and focus groups (qualitative) are conducted concurrently with similar participants.

Concurrent Design
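A hypothetical sketch of what the corroboration check in a concurrent design might look like once both data sources are in hand: survey scores and focus-group codes for comparable participants are reduced to the same positive/negative judgment and a simple agreement rate is computed. All values below are invented for illustration.

# Hypothetical concurrent-design corroboration check; all values are invented.
# Survey item (quantitative): "Program staff were responsive," rated 1-5.
survey_scores = [5, 4, 2, 5, 3, 4, 1, 5]

# Focus-group coding (qualitative): did each comparable participant express
# a positive view of staff responsiveness?
focus_group_positive = [True, True, False, True, True, True, False, True]

# Treat a survey rating of 4 or 5 as "positive," then compute simple agreement.
survey_positive = [score >= 4 for score in survey_scores]
matches = sum(s == f for s, f in zip(survey_positive, focus_group_positive))
print(f"Agreement: {matches} of {len(survey_scores)} participants "
      f"({100 * matches / len(survey_scores):.0f}%)")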

Page 35

Example 2

Problem or Purpose: Explain unexpected results
• Use a qualitative method to explain “blindside” results from a quantitative method.

Example
• Survey (quantitative) followed by focus groups (qualitative) to explain or to better understand what’s going on.

Explanatory Sequential Design
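As a hypothetical illustration of the explanatory sequential flow, the sketch below starts from an unexpected drop in a survey measure and then tallies the reason codes assigned to follow-up focus-group comments. Every number and code is invented.

# Hypothetical explanatory sequential sketch; all numbers and codes are invented.
# Quantitative step: the survey shows an unexpected drop in mean satisfaction (1-5 scale).
wave_1_mean, wave_2_mean = 4.3, 3.1
print(f"Mean satisfaction dropped from {wave_1_mean} to {wave_2_mean}")

# Qualitative step: reason codes assigned to follow-up focus-group comments.
reason_tallies = {"longer wait times": 11, "staff turnover": 6, "new paperwork": 3}
total = sum(reason_tallies.values())
for reason, n in sorted(reason_tallies.items(), key=lambda item: -item[1]):
    print(f"{reason}: {n} of {total} coded comments ({100 * n / total:.0f}%)")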

Page 36

Example 3

Problem or Purpose: Verify suspected patterns

• Explore potential patterns with a qualitative method and then verify the patterns with a quantitative follow-up.

Example

• Focus groups (qualitative) first, identify potential patterns, then do a survey (quantitative) to validate any patterns.

Exploratory Sequential Design
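A hypothetical sketch of the quantitative follow-up in an exploratory sequential design: a pattern surfaced in focus groups (say, a preference for evening sessions) becomes a survey item, and the survey proportion plus a rough confidence interval show whether the pattern holds in the larger group. The counts are invented.

# Hypothetical exploratory sequential sketch; the counts are invented.
import math

endorsed = 132   # survey respondents who preferred evening sessions
n = 240          # total survey respondents

p = endorsed / n
# Rough 95% confidence interval using a normal approximation (fine for large n).
margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"{p:.0%} preferred evening sessions "
      f"(95% CI roughly {p - margin:.0%} to {p + margin:.0%})")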

Page 37

Page 38

Design Options Summary

You mix quantitative and qualitative methods in a different order depending on the presenting problem:

Validate results

Page 39

Design Options Summary

You mix quantitative and qualitative methods in a different order depending on the presenting problem:

Validate results
Explain the unexpected

Page 40

Design Options Summary

You mix quantitative and qualitative methods in a different order depending on the presenting problem:

Validate results
Explain the unexpected
Explore new themes

Page 41

Selected Resources (Page 1 of 2)

Caracelli, V. and J. Greene (eds.). 1997. Advances in Mixed-Method Evaluation: The Challenges and Benefits of Integrating Diverse Paradigms. San Francisco, CA: Jossey-Bass.

Creswell, J. and V. Plano Clark. 2010. Designing and Conducting Mixed Methods Research, 2nd edition. Thousand Oaks, CA: Sage Publications.

Morse, J. and L. Niehaus. 2009. Mixed Method Design: Principles and Procedures. Walnut Creek, CA: Left Coast Press.

Page 42

Selected Resources (Page 2 of 2)

Johnson, R. Burke, and L. Christensen. 2008. Evaluation Methods. www.southalabama.edu/coe/bset/johnson/

Plano Clark, V. and J. Creswell. 2008. The Mixed Methods Reader. Thousand Oaks, CA: Sage Publications.

Teddlie, C. and A. Tashakkori. 2009. Foundations of Mixed Methods Research: Integrating Quantitative and Qualitative Approaches in the Social and Behavioral Sciences. Thousand Oaks, CA: Sage Publications.

Page 43

Recommended Resource

Creswell, J. and V. Plano Clark. 2010. Designing and Conducting Mixed Methods Research, 2nd edition. Thousand Oaks, CA: Sage Publications.

Page 44

The Community Tool Box

Community Tool Box http://ctb.ku.edu

Chapter 37, Section 5. Collecting and Analyzing Data

Page 45

End “Mixed Methods”

Return to Webinar 4: Gathering Data, Developing Conclusions, and Putting Your Findings to Use

Return to Evaluation Webinars home page

