
Module 1

Overview of Evaluation

NBCCEDP: Enhancing Program Performance through Evaluation

Module 1: Overview of Evaluation

Welcome to Module 1: Overview of Evaluation. This module consists of the following sections:

Choose from the menu above to go to a specific section, or choose the Next button below to view the objectives, overview, and organization of this module.

Objectives

Description of Evaluation

Types of Evaluation

Program Outcomes

CDC Framework for Program Evaluation

Planning an Evaluation

My Evaluation Plan

References

Module 1 Quiz

Module 1 Objectives

At the conclusion of Module 1, you should be able to:

Identify and describe the six steps in the Centers for Disease Control and Prevention (CDC) Evaluation Framework

Explain the benefits of program evaluation to your program

Module 1 Overview

This module will provide an introduction to evaluation and describe the importance of evaluation to BCCEDPs.

It will describe a straightforward process for conducting a program evaluation, the CDC Program Evaluation Framework, and offer examples for evaluating components of BCCEDPs.

Organization of Module 1

In this module, you will visit the following sections:

1. Description of evaluation
2. Presentation of the CDC Evaluation Framework and an evaluation case study
3. Development of your own evaluation plan

At the end of the module, you can take a short quiz to assess the knowledge you have gained about evaluation.

Definition of Program Evaluation

Evaluation is one of the essential functions of the NBCCEDP that serves to support all of the activities of the main components of the program:

Recruitment

Screening and Diagnostic Services

Data Management

Professional Development

Partnerships

Program Management

Quality Assurance and Improvement

Definition of Program Evaluation

Evaluation is defined as the systematic documentation of the operations and outcomes of a program, compared to a set of explicit standards, objectives, or expectations.1

Systematic implies that evaluation is carefully planned and implemented to ensure that its results are credible, useful, and used.

Information represents all the evaluation data that are collected about the program to help make judgments or decisions about program activities.

Activities and outcomes identify what BCCEDPs do (the actions of the program) and the effects of the program.

Programs are required to have an evaluation plan for all essential program components in order to determine whether the components are reaching the desired outcomes.

Why is Evaluation Important?

The purpose of program evaluation is to assess program implementation (process), outcomes (effectiveness), and costs (efficiency). It gathers useful information to aid in planning, decision-making, and improvement.

Evaluation aims to better serve BCCEDP participants, program partners, and your own program by:

maximizing your ability to improve the health status of women in your community through the program’s activities.

demonstrating accountability and sound management of resources.

Evaluation tells you: Is our program working?

In short, evaluation helps you show that you are achieving outcomes.

It helps your communities, funders, and partners see that your BCCEDP gets results.

Best of all, it helps you demonstrate that aspects of your program work.

Evaluating BCCEDP components helps you to:

• Monitor compliance with CDC guidance.

• Identify what your BCCEDP has done.

• Learn about your program’s strengths and successes.

• Identify program needs and weaknesses.

• Identify effective and ineffective activities.

• Improve the quality, effectiveness, and efficiency of your program.

• Improve program operations and outcomes.

• Recognize gaps in the overall program.

• Demonstrate program effectiveness to stakeholders.

• Use findings for program planning, monitoring, and decision-making.


Program Reflection

Now, take a moment to think about why evaluation is important to your program.

Consider 2-3 reasons why evaluation might be critical.

Click here to go to a Program Reflections Word document to work through as you go through this module.

Types of Evaluation

There are three main types of evaluation that you can conduct:

1) Process (evaluation of activities)
2) Impact (evaluation of direct effects of the program)
3) Outcome (evaluation of longer-lasting benefits of the program)

Ideally, a program will conduct a combination of all three types of evaluation in order to get a well-rounded picture of what is actually going on.

The use of a specific type of evaluation will depend on the purpose of the evaluation, your program’s stage of development, and available resources. More established activities will have less need for basic process evaluation and may focus more on impact and outcome evaluation. The following section describes the three main types of evaluation. Click here for a summary of the Types of Evaluation.

Types of Evaluation: Process Evaluation


A process evaluation focuses on:
• how a program works to attain specific goals and objectives,
• how well implementation is going, and
• what barriers to implementation exist.

Process evaluation is known as the “counting” or “documenting” of activities and should answer the question: Are we doing what we said we would do? It can also help identify problems with program implementation early.

Process evaluation is most appropriate when your program is already being implemented or maintained, and you want to measure how well the program process is being conducted. These process data often provide insight into why outcomes are not reached.

Sample process evaluation questions include:

How many women were screened by the time of your progress report or annual report?

Where were the program-sponsored professional development events held in the last year compared to the location of your clinics?

How many women at clinic A were advised by a provider to be screened?

Are providers satisfied with the program?

Types of Evaluation: Impact Evaluation


An impact evaluation is conducted to determine what direct effects the program has on those directly and indirectly experiencing it: not only participants, but also program partners and the community.

The impact evaluation also provides information about whether the program has been able to meet its short-term goals and objectives.

Impact measures often focus on changes in participants’ awareness of breast cancer and the need for screening, attitudes toward preventive screening, and screening behaviors.

Sample impact evaluation questions include:

To what extent has your program met its screening targets?

To what extent has your program delivered appropriate and timely screening and diagnostic services?

Have your partnership activities helped to increase screening among priority populations?

To what extent has your program employed evidence-based strategies for recruitment and guidelines for screening of women?

Types of Evaluation: Outcome Evaluation


An outcome evaluation is conducted to determine whether a program has been able to meet its long-term goals and objectives.

Outcome evaluation can track the maintenance of program effects (e.g., screening over time). It also documents longer-term effects on cancer morbidity and mortality.

Sample outcome evaluation questions include:

Has the number of mammograms and Pap smears provided increased over time?

Has the program maintained enrollment of women in priority populations over time?

Has the program continued to detect breast and cervical cancers in earlier stages?

Review of Types of Evaluation

Process Evaluation
When to use: As soon as a program begins; during the conduct of component activities.
What it assesses: The extent to which the program is being implemented as designed; how well the program is working related to defined objectives; the accessibility and acceptability of the program to its target audience.
Why it is useful: Provides early warning for any problems; allows for program monitoring of component activities.

Impact Evaluation
When to use: After component activities have begun.
What it assesses: The direct or immediate effects of the program components.
Why it is useful: Describes whether the program is being effective in meeting its objectives.

Outcome Evaluation
When to use: At appropriate intervals during a program; at the end of the program.
What it assesses: The intermediate or long-term effects of the program; the degree to which the program decreased breast and cervical cancer morbidity and mortality and reduced cancer disparities.
Why it is useful: Provides evidence of long-term effectiveness; allows for policy and funding decisions.

Using Program Outcomes in the Evaluation Process

For the BCCEDP, there are major outcome measures for every program component. It is important to keep your program’s outcome measures in mind throughout the evaluation process, including the development stage of your plan, during data collection, and especially when interpreting your findings.

Your program activities lead to outcomes that help reach NBCCEDP goals. If you want to assess whether outcomes were met, your evaluation will focus on how they were achieved. From the evaluation results, you may learn that activities need to be revisited, revised, or added to meet program goals and objectives.

[Diagram: An NBCCEDP component’s program activities lead to program outcome measures; the evaluation of a program component uses these measures to track progress toward short-term, intermediate, and long-term goals.]

Program Outcome Measures

The boxes below provide examples of outcomes for each program component. You may have already incorporated these within your evaluation, or you may want to consider how these outcomes may be helpful to you. Outcomes can be process-oriented or health-centered, focusing on priority populations or on knowledge, attitudes, or skills development.

Click on the program component below to review examples of outcomes for that component. Click here for a summary of suggested Program Outcome Measures.

Program Management Outcomes

Quality and characteristics of annual workplan.

Correspondence of program’s budget to its workplan.

Allocation of resources to implement program components.

Staff’s comprehension of NBCCEDP components.

Use of program data for program planning and decision-making.

Screening & Diagnostic Services Outcomes

Access to program services for eligible women.

Access to cervical cancer screening, diagnostic and treatment services.

Access to breast cancer screening, diagnostic and treatment services.

Provision of services according to clinical guidelines approved by the grantee medical advisory board or consultants.

Use of current data for effective case management.

Data Management Outcomes

Existence of data systems to collect, edit, manage, and continuously improve tracking of services provided.

Reduction or elimination of program data errors.

Existence of mechanisms for reviewing and assessing data quality.

Quality Assurance & Quality Improvement Outcomes

Providers’ use of current standards, accepted clinical guidelines, and program policies as assessed by program staff.

Existence and maintenance of a continuous quality improvement committee.

Monitoring, assessing, and improving clinical services to meet or exceed CDC performance benchmarks for quality.

Program-eligible women’s satisfaction with program services provided.

Program providers’ satisfaction with the program.

Evaluation Outcomes

Quality of evaluation plans for each program component.

Availability and quality of program data.

Conduct of evaluation activities in order to assess program effectiveness.

Use of evaluation results to inform program decision-making.

Recruitment Outcomes

Implementation of evidence-based strategies to increase recruitment.

Recruitment of program-eligible women in priority populations.

Conduct of program activities to increase awareness about breast and cervical cancer.

Program-eligible women’s attitudes toward screening.

General public’s knowledge about the need for breast and cervical cancer screening.

Partnerships Outcomes

Use of partnerships to recruit and retain providers.

Use of partnerships to educate and increase awareness of breast and cervical cancer.

Use of partnerships to promote and facilitate breast and cervical cancer screening.

Use of partnerships to promote professional development activities.

Engagement of community partners in activities that promote breast and cervical cancer screening services for NBCCEDP priority populations.

Professional Development Outcomes

Development of partnerships with academic and professional organizations in order to build resources for professional development activities.

Local BCCEDP staff’s knowledge about breast and cervical cancer screening, diagnosis, and treatment.

Adoption of evidence-based clinical practices by providers to improve services.

Assessment of provider needs for improving screening, diagnosis, and treatment.

Integration of cultural sensitivity into professional development activities.

More Information on Evaluation

If you would like more information about evaluation, you can review the Evaluation Chapter of the NBCCEDP Program Guidance Manual.*

Click here to link to the Evaluation Chapter. WORD | PDF

*This chapter was made available as of April 2007. Please check with your program consultant as to potential updates to the chapter.

Getting Expert Assistance with Evaluation

As you are planning for the evaluation, you should consider if you need a person with expertise in planning and carrying out the evaluation. Some of the types of people who may be helpful are:

People from cancer registries

Surveillance staff

Data management staff

Program evaluators

Economists for cost studies

You may find them within your own organization or within collaborating organizations. These experts may be particularly helpful to your evaluation team.

CDC Framework for Evaluation

In the next section, we will review the CDC Framework for Evaluation. It is a process that you can use to conduct an evaluation project.

To learn more about the Framework, click CDC Framework for Evaluation.

CDC Framework for Evaluation

The CDC Framework for Program Evaluation in Public Health is the recommended process for conducting evaluations. It outlines six steps for program evaluation. Click here to see a reference handout on the Framework for Evaluation.

The following section will use the Framework for Program Evaluation as a guide to the evaluation process for breast and cervical cancer early detection programs. These steps will be described as they relate to BCCEDP evaluation.

Modules 2 and 3 will walk you through the process of conducting an evaluation of recruitment and partnership development strategies.

Goals of the CDC Framework

The CDC Framework for Evaluation is a process to help you answer these questions:

1. Who is the evaluation for?
2. What program are we evaluating?
3. What methods will we use in conducting our evaluation?
4. How will we gather and analyze information that is credible, and in what forms?
5. How will we justify our conclusions?
6. How can we be assured that what we learn will be used?

Ensuring a Quality Evaluation

The CDC Framework also addresses the quality of an evaluation.

To judge how good an evaluation is, the Framework suggests that you consider the following standards:

Utility: Is the evaluation useful?

Feasibility: Is the evaluation practical and viable?

Propriety: Is the evaluation ethical?

Accuracy: Are the evaluation measures and results correct?

CDC Framework for Evaluation and Evaluation Case Study

As we introduce the CDC Framework for evaluation, you will also be presented with an example for each step.

The example will focus on a screening evaluation question.

Now, you will be taken through each step of the CDC Evaluation Framework.

The CDC Framework for Evaluation Step 1: Engage Stakeholders

The first step in the program evaluation process is to engage stakeholders.

Stakeholders are those persons or organizations that have an investment in what will be learned from an evaluation and what will be done with the knowledge. It is important to involve stakeholders in the planning and implementation stages of evaluation to ensure that their perspectives are understood and that the evaluation reflects their areas of interest.

Involving stakeholders increases awareness of different perspectives, integrates knowledge of diverse groups, increases the likelihood of utilization of findings, and reduces suspicion and concerns related to evaluation.

The CDC Framework for Evaluation Step 1: Engage Stakeholders

Here are different types of BCCEDP stakeholders who are important to engage in all parts of the evaluation.

BCCEDP stakeholders can fall into different categories:

• Decision-makers
• Implementers
• Program Partners
• Participants

The table provides examples of each category of stakeholders.

Stakeholder Type: Examples

Decision-makers: Program directors, NBCCEDP staff, evaluation coordinators, the Centers for Disease Control and Prevention, chronic disease directors

Implementers: Program staff, clinical staff, medical care providers/provider networks, community-based organizations, license boards

Program Partners: American Cancer Society, National Cancer Institute, Susan G. Komen Foundation, professional associations, local health departments; local, state, and regional coalitions; advocacy groups, medical societies, local universities and colleges, local businesses, local media, medical care providers

Participants: Women eligible for the program or enrolled in the program

Program Reflection

Identify potential stakeholders who are important to your program.

Identify specific people or groups of people who will use the evaluation findings to make decisions about your program.

Click here to go to a Program Reflections Word document to work through as you go through this module.


The CDC Framework for Evaluation Step 1: Engage Stakeholders

Some of the roles of stakeholders may include:

Serving on an advisory committee for the evaluation.

Prioritizing aspects of the program to evaluate.

Developing questions or surveys.

Providing resources (e.g., American Cancer Society, Centers for Disease Control and Prevention).

Offering data sources (e.g., state cancer registry, National Cancer Institute, Centers for Disease Control and Prevention).

Analyzing data.

Communicating evaluation results.

Engage Stakeholders Case Study

You are interested in assessing how your program is doing with its screening activities.

For your BCCEDP, screening activities include: 1) ensuring that women with abnormal screening test results receive timely diagnostic examinations, and 2) ensuring that women receive a final diagnosis after an abnormal screening result.

Stakeholders who may be interested in this question are:

Program director, screening coordinator, case management coordinator, or quality assurance coordinator

Provider networks that screen and provide services for the women

CDC

An Evaluation Workplan

An evaluation plan is a program management tool that provides direction and guidance for the overall evaluation as well as for each evaluation component. It is designed to be used for evaluation planning, implementation, and monitoring progress. Designing an evaluation plan is intended to make the job of managing your program more efficient and effective.

Here is an example of a simple evaluation plan that takes you from your questions to how to use the data. Click here for a Word version of the Evaluation Plan.

Evaluation Question: Questions that ask about aspects of the program that you want to examine. You can begin with evaluation questions related directly to your objectives and/or measures of success described in the workplan. You may also want to consider other bigger-picture questions of interest to your program that go beyond the workplan.

Uses of Evaluation Findings:
Users. A subset of stakeholders can clarify intended uses, help prioritize questions and methods, and prevent the evaluation from becoming misguided or irrelevant. The users are the participants who will use the evaluation findings.
Uses. To be useful to the program, evaluation results must be applied. The results could, for example, identify an ineffective activity, suggest a way to adjust resources, document a positive outcome, or mobilize support for the program.

Data Collection Processes: The events or things that will need to be in place to answer the evaluation question.

Data Analysis: Reviewing data or comparing data to set targets or measures of success.

Completing the Evaluation Workplan

We will be using the Evaluation Workplan to help you prepare for an evaluation through the course of this training. When you are completing the evaluation workplan, you should enter the following information, related to the CDC Framework for Program Evaluation, in each column.

Evaluation Question column (Step 3: Focus the Evaluation Design): Enter the evaluation question.

Uses of Evaluation Findings column (Step 1: Engage Stakeholders): Enter the key stakeholders and their interest in the evaluation.

Data Collection Processes column (Step 4: Gather Credible Evidence): Enter the data sources.

Data Analysis column (Step 5: Justify Conclusions): Enter how you will analyze or make judgments about the evaluation results.

Step 2: Describe the Program is not included in the plan. Your activities are probably already described in your workplan. You can also put more information about the activities to be evaluated under the Evaluation Question column.

Engage Stakeholders Case Study

Now that you have identified stakeholders, here is how your evaluation plan would appear. At this point, only the Uses of Evaluation Findings column is filled in:

Uses of Evaluation Findings:
Program Manager. Improve how well the program meets the timeliness standard.
Screening and Case Management Coordinators. Data on how well service delivery is being conducted.
CDC. Data on meeting the NBCCEDP standard of timeliness.

The CDC Framework for Evaluation Step 2: Describe the Program

The second step in the program evaluation process is to describe the program.

Before you can develop an overall evaluation plan, it is important to have a description of your program and its context. Without an agreed-upon program definition and purpose, it will be difficult to focus the evaluation efforts, and the results will be of limited use.

Important aspects of the program description are: 1) need for the program, 2) resources, 3) component activities, and 4) expected outcomes.

This information is often found in your workplan. Therefore, it is valuable to review the annual workplan with the objectives for each component area.

In order to understand how all of these aspects of the program work together, it is valuable to develop a picture of your program. This can be done in a number of formats, including a flow chart. The program flow chart shows the larger picture of the program, including the relationships between individual activities and the expected results.

A program flowchart describes the program: resources or inputs, activities, products or outputs, and outcomes.

It is a graphic representation of all aspects of the program and how they work together to achieve the program’s long-term outcomes. Some people call this picture a logic model.

The CDC Framework for Evaluation Step 2: Describe the Program

NBCCEDP Flowchart

This flowchart presents the relationships between your program component activities and their potential outcomes.

By clicking on each component under the Activities, you will be able to view examples that can help you begin to think about how your program corresponds to the flow chart.

Click here to review a printable version of the NBCCEDP flowchart.

[Flowchart: Inputs (federal programs and NBCCEDP staff; the grantee breast and cervical cancer program; state and community partners; program participants and the public; the workplan) feed the component Activities (recruitment, screening & diagnostic services, professional development, quality assurance, partnerships, data management, evaluation, management). These yield Outputs (e.g., staff hired, number of staff meetings, existence of an evaluation workplan) and Short-term and Intermediate Outcomes (e.g., evidence-based recruitment strategies implemented; services provided according to clinical guidelines approved by the medical advisory committee; accessible screening, diagnostic, and treatment services for eligible women; breast and cervical cancer screening among program-eligible women, with an emphasis on rarely or never screened women and women aged 50-64; timely service delivery and case management for women with abnormal results; use of program data for planning and decision-making; reduced program data errors; sustained and effective partnerships; providers knowledgeable about screening; improved rescreening at appropriate intervals; early-stage diagnoses). The Long-term Outcome is a reduction in breast and cervical cancer related morbidity and mortality.]

The CDC Framework for Evaluation Step 3: Focus the Evaluation

Once the key players are involved and there is a clear understanding of the program, it is important to focus the evaluation design: determine the purpose, users, uses, and evaluation questions, and develop an evaluation plan.

Focusing the evaluation design is important because it helps identify useful questions that you want to ask about your program. Click here for a handout of Examples of Evaluation Questions for BCCEDPs.

The CDC Framework for Evaluation Step 3: Focus the Evaluation

Because of the program’s large size, complexity, and limited finances, it is not possible to evaluate every aspect of the program.

You should first examine the DQIGs, which provide valuable monitoring data to help identify problem areas for further investigation, and the program performance noted in progress reports. Then, focus evaluation efforts on the areas of the program that are not working optimally.

It is, however, important to look at the program as a whole and think about how you would evaluate each aspect of the program.


You should first begin addressing:

• Any new initiative with resources allocated to it.

• Any activity that consumes a high amount of resources.

• Activities that are not successful at meeting their measures of success.

• Program inconsistencies to explore why they exist.

• Any unevaluated activity (e.g., recruitment strategies, screening) that is employed frequently by the program.

Program Reflection

What are key evaluation questions that you want answered?

Identify:
• the stakeholders of the evaluation findings (e.g., program managers, CDC, providers),
• what they need to know, and
• how they will use that information.

Click here to go to a Program Reflections Word document to work through as you go through this module.

Think about what your outcome measure would be for the provision of timely and appropriate diagnostic services to women receiving abnormal breast or cervical cancer screening results (follow-up).

Here are activities that your program does to help track this outcome:

1. Reviewing weekly all abnormal breast and cervical screening results to determine appropriate referral and follow-up status.

2. Logging the name and total number of patient records for women with abnormal results.

3. Assigning case management for every client with an abnormal screening.

4. Case managing women to ensure appropriate tracking, referral and follow-up.

Focus the Evaluation Case Study

Because screening and diagnostic services are a large component of your program, you may want to focus part of your evaluation plan on how to improve them.

From your workplan, your goal was to ensure that all of your clients with abnormal screening test results receive timely diagnostic examinations.

The measures of effectiveness that you indicated were that:
• At least 75% of participating women will receive a final diagnosis within 60 days after an abnormal Pap smear.
• At least 75% of participating women will receive a final diagnosis within 60 days after an abnormal mammography result.

Let’s say you are interested in whether you are meeting your screening targets that were set in your workplan. You ask the question:

To what extent has your program met NBCCEDP standards for timeliness of follow-up?

You would then need to determine how and what information needs to be collected to answer these questions. We will cover this in the next section.

Evaluation Plan Case Study

Now that you have identified the evaluation question, here is how your evaluation plan would appear.

Evaluation Question: To what extent has your program met NBCCEDP standards for timeliness of diagnostic follow-up?

Uses of Evaluation Findings:
Program Manager. Improve how well the program meets the timeliness standard.
Screening and Case Management Coordinators. Data on how well service delivery is being conducted.
CDC. Data on meeting the NBCCEDP standard of timeliness.

The CDC Framework for Evaluation Step 4: Gather Credible Evidence

Gathering credible and relevant data is important because it strengthens evaluation findings and the recommendations that follow from them, thereby enhancing the evaluation’s overall utility and accuracy.

When selecting which processes or outcomes to measure, keep the evaluation purpose and uses of the data findings in mind. An evaluation’s overall credibility and relevancy can be improved by: 1) using multiple purposeful or systematic procedures for gathering, analyzing, and interpreting data, and 2) encouraging participation by stakeholders.

The CDC Framework for Evaluation Step 4: Gather Credible Evidence

In gathering data, you should start with the evaluation questions listed in the evaluation plan. The questions should identify the data indicator of interest. The data collection process should identify the source(s) to collect information on the indicator.

The sources could include persons, documents, or observation. If evaluation was included as part of the planning process, your program will already include data collection activities. If existing data systems cannot answer your evaluation questions, you might consider developing your own system that will monitor or track what you are interested in evaluating.
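As an illustration of how small such a tracking system can be, here is a minimal sketch in Python. The file name, fields, and client ID are hypothetical examples, not part of any NBCCEDP or CaST system.

```python
import csv
from datetime import date

# Hypothetical tracking log: one row per abnormal screening result,
# updated when the final diagnosis comes in.
FIELDS = ["client_id", "test", "abnormal_date", "final_dx_date"]

def append_event(path, client_id, test, abnormal_date, final_dx_date=""):
    """Append one tracking row; final_dx_date stays empty until known."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # new file: write the header row first
            writer.writeheader()
        writer.writerow({
            "client_id": client_id,
            "test": test,
            "abnormal_date": abnormal_date.isoformat(),
            "final_dx_date": final_dx_date,
        })

# Example: log an abnormal mammogram awaiting a final diagnosis.
append_event("followup_log.csv", "A-0042", "mammogram", date(2007, 5, 2))
```

Even a simple log like this can later be tallied to answer an evaluation question, because each row carries the dates needed to judge timeliness.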

Examples of indicators and their data sources:

Process Data
• Barriers to screening services: focus groups or interviews with women
• Patient satisfaction with services: patient satisfaction survey

Outcome Data
• Women who receive screening: intake survey at program clinics
• Clients with cancer diagnosis starting treatment within 60 days: clinical charts and follow-up; case management written plans
• Collaborative activities with program partners: meeting minutes

The CDC Framework for Evaluation Step 4: Gather Credible Evidence

Qualitative Versus Quantitative Data

Quantitative data are information expressed as numbers. Surveys, patient logs, and the Minimum Data Elements (MDEs) database are examples of quantitative data collection methods.

However, not all data collection methods are quantitative. Some issues are better addressed through qualitative data, or information described in words or narratives. Qualitative data collection methods include observations, interviews, focus groups and written program documents.

Often a combination of the two methods provides the most accurate representation of the program. For example, you may find that interviews with providers in your provider network or site visits to clinics may give you more information about service delivery than surveys can.

Here are some examples of possible sources for both quantitative and qualitative data collection methods. Original information collected directly by you or your program is considered primary data.

Primary Data

Quantitative Data Collection:
• Enrollment, screening, and diagnosis service forms
• Minimum Data Elements (MDEs)
• Staff surveys and interviews
• Site visit reports
• Tracking of media
• Provider surveys or interviews
• Surveys of intended audience
• Logs
• Chart audits
• Progress reports
• Needs assessments

Qualitative Data Collection:
• Exit interviews of participants
• Focus groups
• Topic expert interviews
• Staff meeting minutes
• Site visits
• Observations of activities, staff, and clients

The CDC Framework for Evaluation Step 4: Gather Credible Evidence

Other options for collecting data to use in evaluating your program include sources of information produced by others, also known as secondary data.

Secondary Data

Quantitative Data Collection:
• Cancer registries
• Behavioral Risk Factor Surveillance System (BRFSS)
• Medical records
• Medical claims data
• Census data
• Vital statistics
• National Health Interview Survey
• Vital records

Qualitative Data Collection:
• Administrative reports
• Publications and journals

Click here for a summary of BCCEDP Data Indicators and Collection Methods.

The CDC Framework for Evaluation Step 4: Gather Credible Evidence

Program Reflection

Name three types of data you collect for your program.

For each one, think about how you can ensure that the data collected are credible and accurate.

Click here to go to a Program Reflections Word document to work through as you go through this module.

Gather Credible Evidence Case Study

Going back to our screening example of tracking timeliness of follow-up after an abnormal screening result for women in your program:

What data source would you use to locate this information?

You would most likely use program records for participants, weekly screening records of women with abnormal results, and your case management data file to locate this information.

Evaluation Plan Case Study

Now that you have decided on the data indicator of timeliness of follow-up and its source, here is how your evaluation plan would appear.

Evaluation Question: To what extent has your program met NBCCEDP standards for timeliness of diagnostic follow-up?

Uses of Evaluation Findings:
Program Manager. Improve how well the program meets the timeliness standard.
Screening and Case Management Coordinators. Data on how well service delivery is being conducted.
CDC. Data on meeting the NBCCEDP standard of timeliness.

Data Collection Processes:
Review of patient screening information data/CaST to determine the number of women with abnormal results.
Review of case management files.

The CDC Framework for Evaluation Step 5: Justify Conclusions

The evaluation conclusions are justified when they are linked to the data gathered and judged against agreed-upon values or standards set by the stakeholders. Stakeholders must agree that conclusions are justified before they will use the evaluation results with confidence.

After you have read through the evaluation findings, you should interpret and discuss preliminary evaluation findings with program staff and stakeholders. You can then develop action steps based on the evaluation findings.

The CDC Framework for Evaluation Step 5: Justify Conclusions

The major activities in justifying conclusions are to:

1. Analyze the evaluation data.

2. Interpret the results to discover what the data say about the program. Decide what the numbers, averages, and statistical tests tell you about the program activity.

3. Make judgments about the program data. Decide on the meaning of the data with a group of stakeholders by comparing evaluation results with CDC standards or your measures of success described in the workplan. Then, make a recommendation to address the evaluation results.

The CDC Framework for Evaluation Step 5: Justify Conclusions

Once data (evidence) are analyzed and summarized, conclusions can be drawn about the component activities.

To begin to make sense of your evaluation data, you should re-examine defined program standards, or what is defined as successful, adequate, or not successful. These may be objectives or benchmarks set in the work plan, performance criteria set by CDC, baseline data, or previous performance of your own agency or similar programs.

Interpretation is attaching meaning to the evaluation data. It is important to gather stakeholders and discuss the data.

The conclusions should lead to further recommendations for program action. For example, recommendations may include continuing, redesigning, expanding, or stopping a program activity.

The CDC Framework for Evaluation Step 5: Justify Conclusions

Ways to justify conclusions for BCCEDPs:

• Compare the data to previously set standards or performance indicators (e.g., CDC/NBCCEDP):
  - Women will receive a final diagnosis within 60 days after an abnormal Pap smear.
  - A set percentage of women enrolled will be those who have never or rarely been screened for cervical cancer.

• Compare data to your BCCEDP objectives (e.g., target enrollments).

• Compare the data with those from previous years.

• Make subjective or qualitative judgments about the results in terms of meeting the objective, needing improvement, or not meeting the objective.

Justify Conclusions

When analyzing your data, it is helpful to examine how you have met your program outcomes, or how you justify your conclusions. But it is also important to critically think about what can be improved upon if outcomes are not met so that during the next evaluation you will be on target.

For example, let’s say you analyzed screening data for this past fiscal year. Only half of the eligible women in a priority population were screened for breast cancer, and only one quarter of the eligible women were screened for cervical cancer.

Are the results similar to what you expected? If not, why might that be?

How would you go about assessing why and how outcomes were not met?

Justify Conclusions

The following action steps and questions may be useful to consider when you are ready to analyze and interpret your findings. Remember to go back and review your evaluation question to make sure it is answered.

On what standards are you evaluating your findings? Nationally set standards for screening? Past program results? National averages?

Review your recruitment strategies with program staff.

Strengthen relationships with community organizations to build consensus for program involvement if need be.

Examine data quality. Did you find any incomplete or missing data that could push your enrollment numbers down? What could you do to remedy that?

Are your recommendations based on these findings? Why or why not?

What are your findings’ limitations? Did you include various stakeholder viewpoints to analyze and interpret the findings? How can this help you or others improve evaluation plans in the future?

Justify Conclusions Case Study

Based on our timeliness of screening example, you now want to compare the percentage of women who received a final diagnosis within 60 days after an abnormal screening test result to the target set in your workplan: at least 75% of participating women receiving a final diagnosis within 60 days.

Target Objective [from workplan]: By the end of the fiscal year, 75% of participating women will receive a final diagnosis within 60 days after an abnormal Pap smear or mammography result.

Evaluation Findings: In this fiscal year, only 60% of women in your program had received a final diagnosis after an abnormal mammogram within 60 days, while 78% of patients who had an abnormal Pap smear received a final diagnosis within 60 days.

Therefore, you can compare the annual percentage of women who received a final diagnosis within 60 days after an abnormal result (60% for mammography) to your target objective (75%). This year you did not reach your timeliness goal for mammography follow-up, although the goal was met for Pap smears. The next step is to think about how to better achieve this target for timeliness of diagnostic work-up.
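To make the arithmetic of this comparison concrete, here is a minimal sketch of how such timeliness percentages could be computed. The records and field names are hypothetical stand-ins for data pulled from program records or CaST, not an actual data layout.

```python
from datetime import date

# Hypothetical case records: test type, date of the abnormal result,
# and date of final diagnosis (None if still pending).
cases = [
    {"test": "mammogram", "abnormal": date(2007, 3, 1), "final_dx": date(2007, 4, 15)},
    {"test": "mammogram", "abnormal": date(2007, 5, 2), "final_dx": date(2007, 8, 20)},
    {"test": "pap",       "abnormal": date(2007, 6, 9), "final_dx": date(2007, 7, 1)},
    {"test": "pap",       "abnormal": date(2007, 2, 5), "final_dx": None},
]

def pct_timely(cases, test, window_days=60):
    """Percent of cases of a given test type with a final diagnosis
    within window_days of the abnormal result; pending cases count
    as not timely."""
    relevant = [c for c in cases if c["test"] == test]
    timely = [
        c for c in relevant
        if c["final_dx"] is not None
        and (c["final_dx"] - c["abnormal"]).days <= window_days
    ]
    return 100.0 * len(timely) / len(relevant) if relevant else 0.0

TARGET = 75.0  # workplan target: 75% within 60 days
for test in ("mammogram", "pap"):
    pct = pct_timely(cases, test)
    status = "meets" if pct >= TARGET else "below"
    print(f"{test}: {pct:.0f}% timely ({status} the {TARGET:.0f}% target)")
```

The comparison in the case study is exactly this kind of tally: percent timely per test type, judged against the 75% workplan standard.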

Justify Conclusions Case Study

It is important to reflect on the screening results and what reasons may have contributed to the measure of effectiveness not being met.

You start talking to your case managers and regional contacts from provider agencies. You may find that some providers did not see the urgency in scheduling women immediately, or that there were issues with access to certain diagnostic equipment in rural areas. The next step is to figure out ways to address these issues to better achieve your goal.

Thinking about those issues may lead to solutions for improving your chances of reaching your target objective in the future.

Evaluation Plan Case Study

Now that you have decided on how to evaluate your results, here is how your evaluation plan would appear.

Evaluation Question: To what extent has your program met NBCCEDP standards for timeliness of diagnostic follow-up?

Uses of Evaluation Findings:
Program Manager. Improve how well the program meets the timeliness standard.
Screening and Case Management Coordinators. Data on how well service delivery is being conducted.
CDC. Data on meeting the NBCCEDP standard of timeliness.

Data Collection Processes:
Review of patient screening information data/CaST to determine the number of women with abnormal results.
Review of case management files.

Data Analysis:
Compare the percentage of women who receive a final diagnosis within 60 days after an abnormal screening test result to the standard of 75% set in your workplan.

The CDC Framework for Evaluation Step 6: Use of Evaluation Findings

Decision-making

Now that the findings are in hand, program directors and staff can use those data to inform programmatic decisions. Examine the suggested recommendations and decide on a course of action.

Some actions may be easy to implement, such as expanding, continuing, or stopping an activity. Others may lead to changes in program operations, such as changing recruitment efforts, charging case managers to follow up quickly with patients, or recontacting providers for missing data on forms. Therefore, it is important to use the data for improvements and future program planning.

Data as a Monitoring Tool

Furthermore, the data may be kept for progress reports or for ongoing monitoring and evaluation as comparison points. For example, you may track the number of women screened semi-annually and use that information to track performance over time.

The CDC Framework for Evaluation Step 6: Use of Evaluation Findings

Once the findings are established, there needs to be a concerted effort to share the lessons learned with stakeholders in a timely manner. This step involves the preparation of tangible products of the evaluation, including an overall evaluation report and specific and targeted recommendations.

Early in evaluation planning, a strategy should be developed for disseminating evaluation findings to the users of the evaluation data. Sharing information with stakeholders is key to maintaining their buy-in to your program. In addition to sharing the results and recommendations with program stakeholders, disseminate your findings to others in the field who may be able to learn from your efforts and results.

Results can be shared formally or informally. Deliberate effort is needed to ensure that the evaluation processes and findings are used and disseminated appropriately.

The CDC Framework for Evaluation Step 6: Use of Evaluation Findings

Use of Findings

Here are some examples of ways to share evaluation results.

• Writing a formal report to CDC, other funders, and stakeholders, modifying the report for each audience.

• Making an oral report to staff, funders, and local partners.

• Making presentations to health care providers and provider networks to show them how they have contributed to the program results.

• Using evaluation results to help document program data for new or continued grant funding.

The CDC Framework for Evaluation Step 6: Use of Evaluation Findings

Use of Findings

• Sharing evaluation highlights through program newsletters and health channels in the community.

• Publishing evaluation efforts and findings in a research or practice journal.

• Developing a presentation of the project evaluation at a regional or national cancer conference.

• Giving presentations at women’s clubs, libraries, or partner organizations.

• Posting a message on the NBCCEDP Web Forum.

The CDC Framework for Evaluation Step 6: Use of Evaluation Findings

Writing an Evaluation Report

Writing evaluation results and reports is an important part of sharing information within the program and with program stakeholders.

In many cases, there will be different versions of reports developed for specific groups, including information relevant to the portion of the program with which they work. For example, since you have to submit progress and annual reports (e.g., interim progress reports describe progress made toward objectives) to CDC, your program could include parts of your evaluation findings in those reports.

Sections of an evaluation report include:

Executive Summary
Introduction
Methods
Results
Charts and Tables
Discussion
Conclusion
Recommendations/Action Plan
Appendices

Take a look at a more detailed outline.

Evaluation Report Outline

• Executive Summary
• Introduction
  – General Description of the Program
  – Description of the Program Component
  – Description of the Objectives/Measures of Success for the Component
  – Statement of the Evaluation Questions
• Methods
  – Description of Data Indicators for each Evaluation Question
  – Description of the Data Collection Methods for each Question
  – Procedures for Data Analysis
  – Description of Limitations of Data Collection
• Results
  – Charts and Tables
  – Summary of Results
  – Discussion of Results
• Conclusions based on the Results
• Recommendations to Program Administrators or Staff
• Action Plan to address the Recommendations
• Appendices
  – Supporting documentation
  – Copies of instruments
  – Detailed tables

Click here for an Evaluation Report Checklist.

Now that we have findings from our evaluation question of meeting NBCCEDP standards of timeliness of diagnostic follow-up, what can you do with the information?

Program Decision-making

You realize that the results demonstrate that your program has poor timeliness from the receipt of an abnormal mammography result to diagnosis. The program director meets with staff, and they have some data to indicate that the problem may be due to case managers not following up with patients quickly to help them access diagnostic services. Therefore, lead case managers are contacted to encourage their staff to connect with patients with abnormal results more rapidly.

You also decide to conduct more process evaluation to follow-up on those individual cases that were not timely to determine the reasons. This information could result in other programmatic changes to improve this area.


Use of Evaluation Findings Case Study

Sharing the data

• Share findings of not meeting the target with your quality assurance coordinator and case management coordinator. Have them consider ways to improve the percentage of women with abnormal screening tests who have timely diagnostic examinations.

• Use the information to encourage discussion with stakeholders, like contracted providers, to improve timeliness.

• Report percentages to CDC in annual reports.

Can you think of possible recommendations for this finding? Some potential solutions that may be raised are to:

• Work with quality assurance and case management coordinators to ensure that women with abnormal results are tracked and receive follow-up.

• Dialogue with contracted providers to address issues with timeliness of diagnostic follow-up.

Evaluation Plan Case Study

Now that you have walked through an evaluation question, here is how your evaluation plan would appear with all of the steps completed.

Evaluation Question: To what extent has your program met NBCCEDP standards for timeliness of diagnostic follow-up?

Uses of Evaluation Findings:
Program Manager. Improve how well the program meets the timeliness standard.
Screening and Case Management Coordinators. Data on how well service delivery is being conducted.
CDC. Data on meeting the NBCCEDP standard of timeliness.

Data Collection Processes:
Review of patient screening information data/CaST to determine the number of women with abnormal results.
Review of case management files.

Data Analysis:
Compare the percentage of women who receive a final diagnosis within 60 days after an abnormal screening test result to the standard of 75% set in your workplan.

Planning an Evaluation

For an evaluation project, it may be helpful to refer to the following critical components of planning and managing an evaluation project. You should consider each of these areas carefully for a successful project outcome.

Staff: Equip an evaluation project with qualified personnel. Include coordinators and other staff most knowledgeable about the BCCEDP component areas related to the evaluation questions.

Orientation and training: Acquaint all personnel who will work on the evaluation project with their responsibilities by drafting an evaluation workplan or other project management tool.

Scheduling: Develop and maintain up-to-date projections of all relevant evaluation activities with a detailed timeline and project management tool such as an evaluation workplan.

Control: Plan, monitor, and maintain a sense of control over all evaluation activities to make sure the proper work is completed.

Economy: Monitor time and resources to ensure that all factors related to the evaluation operate as smoothly and efficiently as possible.

Click here to see a checklist for Evaluation Planning and Management.

Barriers to Evaluation and Solutions

Now that you have been oriented to the CDC Framework for Evaluation and have walked through an example that applied the framework, it may seem like a very straightforward process. That is true, but it is also wise to be prepared for challenges that may arise during the process.

Listed in the table below are possible barriers to evaluation and suggested solutions to keep your plan on track.

Barrier: Evaluation is “done to” rather than “for” your program.
Solution: Program directors and CDC/NBCCEDP staff should educate program staff and program partners on how evaluation helps to improve their program first and foremost. It tells you what is working and not working in the overall BCCEDP and in individual program components. Evaluation discussion should be on the agenda at program staff meetings to help everyone recognize its importance. All staff and program partners should be involved in evaluation planning, devising questions, reviewing instruments, reviewing results, and deciding action steps. That way, evaluation is seen as a core function of the program, and your program sets the direction for the evaluation.

Barrier: Fear of consequences.
Solution: The program should engage the stakeholders as an important part of the evaluation process and provide support for the evaluation and uses for its findings. An evaluation may reveal program successes or problems. Both types of information are significant. The potential discovery of problems should be viewed as an opportunity to learn and improve the program.

Barrier: Lack of knowledge or expertise.
Solution: Key program staff should read information on evaluation and attend trainings on the topic to help educate other staff about the process. The Getting Help with Evaluation section at the end of this module discusses other resources for learning about the theoretical foundations and methods of program evaluation.

Barrier: Lack of resources.
Solution: Often, program resources are allocated more toward program component activities than toward evaluation. Funding must be allocated toward evaluation in the program budget. You may also seek assistance on certain aspects of evaluation from other program directors, program partners, or CDC program consultants. Finally, you can hire an outside contractor to conduct specific evaluation activities, if necessary.

Getting Help with Evaluation

Program evaluation draws on a variety of theories and processes, and at some point it may be helpful to seek out evaluation experts. It is important to prioritize while evaluating your program and to be aware of those evaluation areas for which you may need help.

There are several sources of assistance that your program may seek. The CDC Evaluation Workgroup and the American Evaluation Association websites provide advice on how to conduct evaluation, as well as resources to help you identify evaluation consultants. Other options for obtaining help include:
• Employing a program partner
• Working with faculty and students from local universities as a free resource
• Hiring an evaluation consultant or contractor

There are many types of people who may serve as consultants or contractors. Click here to become familiar with important characteristics and skills to keep in mind when selecting a consultant.

What to look for in a Professional Evaluator?

When considering hiring an outside expert to help you conduct your evaluation, it is important to keep in mind the following as you make your decision.

The consultant:
• Has experience in the type of evaluation needed
• Is comfortable with qualitative and quantitative data sources and analysis
• Is able to work with a wide variety of BCCEDP stakeholders, including representatives of target populations
• Incorporates evaluation into all program component activities
• Educates program personnel about designing and conducting evaluation
• Will give staff the full findings
• Has strong coordination, communication, and organization skills
• Explains material clearly and patiently
• Respects all levels of personnel

Generally, you should consider the consultant’s level of training, evaluation philosophy, experience, and ability to meet your needs. The consultant’s background and approach should match your evaluation goals and needs.

My Evaluation Plan

Click here to see an example of an evaluation plan for recruitment.

Now it is your turn to plan an evaluation for your program. An Evaluation Plan is a good tool to help you brainstorm an evaluation project. Click here for a printable Word version of the Evaluation Plan. It will help you get started on evaluating a part of your program.

Remember to start thinking about evaluation of a BCCEDP area with:

• components or areas that have not been successful,
• any activity that consumes a large amount of resources,
• any new initiative, or
• any unevaluated activity that the program employs frequently.

The evaluation plan helps you to:

• Identify key evaluation questions
• Identify how you will use the evaluation findings and who will use them
• Describe how the evaluation data will be collected
• Describe how the data will be analyzed

Evaluation Plan

Below is the evaluation plan template. Each column helps you work through an evaluation question and how you will use the evaluation findings.

• Evaluation Question: List an evaluation question about a program component here.
• Uses of Evaluation Findings: Identify the users (stakeholders who may be interested in the evaluation results or who will use them) and describe how each stakeholder will use the results.
• Data Collection Processes: List the method(s) that you will use to collect the data for the evaluation.
• Data Analysis: Describe how you will analyze and interpret the evaluation data.

Example of a Recruitment Evaluation Plan

• Evaluation Question: To what extent did Program X reach its target of 75% of mammograms funded through the program being provided to women aged 50-64?
• Uses of Evaluation Findings: The users are the Program Director and Recruitment Coordinator, program staff, and CDC NBCCEDP. The findings will be used to determine whether recruitment efforts are working as intended, allow staff to change recruitment efforts if necessary, and inform program staff, program service providers, and CDC whether progress toward the performance indicator is being made.
• Data Collection Processes: Review patient demographic and screening information data (CaST) to determine the number of women screened who are aged 50-64.
• Data Analysis: Compare the evaluation results with the set program or CDC standard: 75% or more of Program X's screened women will be aged 50-64. Assess, report, and discuss this percentage on a quarterly basis.
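To make the data analysis step concrete, below is a minimal sketch of the quarterly indicator calculation in Python. The record fields (age, service, quarter), the toy data, and the output format are illustrative assumptions for this example, not an actual CaST export or a prescribed NBCCEDP workflow.

```python
# Minimal sketch: quarterly check of the performance indicator described above
# (75% or more of program-funded mammograms provided to women aged 50-64).
# The record layout is a hypothetical stand-in for a CaST data export.
from collections import defaultdict

TARGET = 0.75  # program/CDC standard used in the example plan

records = [  # toy screening records; fields are illustrative
    {"age": 56, "service": "mammogram", "quarter": "2024Q1"},
    {"age": 44, "service": "mammogram", "quarter": "2024Q1"},
    {"age": 61, "service": "mammogram", "quarter": "2024Q1"},
    {"age": 52, "service": "pap_test", "quarter": "2024Q1"},
    {"age": 63, "service": "mammogram", "quarter": "2024Q2"},
    {"age": 49, "service": "mammogram", "quarter": "2024Q2"},
]

totals = defaultdict(int)    # funded mammograms per quarter
in_range = defaultdict(int)  # of those, provided to women aged 50-64

for r in records:
    if r["service"] != "mammogram":
        continue  # the indicator counts mammograms only
    totals[r["quarter"]] += 1
    if 50 <= r["age"] <= 64:
        in_range[r["quarter"]] += 1

for quarter in sorted(totals):
    pct = in_range[quarter] / totals[quarter]
    status = "meets" if pct >= TARGET else "below"
    print(f"{quarter}: {in_range[quarter]}/{totals[quarter]} = {pct:.0%} ({status} the 75% standard)")
```

Running the sketch prints one line per quarter, which matches the plan's call to assess, report, and discuss the percentage on a quarterly basis.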

Evaluation Never Ends

Congratulations on completing the module on the basics of evaluation.

Evaluation is an ongoing process. As long as you are conducting the BCCEDP program, you will need to continuously evaluate parts of the program.

You do not need to start at the beginning each time you conduct an evaluation. You may continue to ask some critical evaluation questions over time. However, you should also consider adding new questions as program activities change.

Module 1 Summary

In this module, you have learned what evaluation is, the benefits of evaluation, and a process for conducting an evaluation project through the CDC Evaluation Framework. If you follow the steps in the Framework to answer key evaluation questions, you can demonstrate what works in your program and identify areas for improvement.

We hope that you discover that evaluation can be well worth the effort. When you learn about the effects of your program, you not only help your program but also assist other BCCEDPs. They can benefit from knowing which strategies or approaches work to improve cancer screening and service delivery to women in their communities.

References

1. Patton, M. Q. (1997). Utilization-focused Evaluation. Thousand Oaks, CA: Sage.

2. MacDonald, G., Starr, G., Schooley, M., Yee, S.L., Klimowski, K., & Turner, K. (2001). Introduction to program evaluation for comprehensive tobacco control programs. Atlanta, GA: Centers for Disease Control and Prevention.

3. U.S. Department of Health and Human Services. (2002). Physical activity evaluation handbook. Atlanta, GA: USDHHS, Centers for Disease Control and Prevention.

4. Centers for Disease Control and Prevention. (1998). Practical Evaluation of Public Health Programs, PHTN course VC-1007 Workbook. Atlanta, GA: Centers for Disease Control and Prevention.

5. Centers for Disease Control and Prevention. (1999). Framework for program evaluation in public health. MMWR, 48 (No. RR-11), i-40.

More Information about Evaluation

If you would like to learn more about evaluation, you can review the Evaluation Chapter from the Program Guidance Manual.

Click here to link to the Evaluation Chapter. WORD | PDF

Congratulations! You have completed Module 1, which provided an overview of evaluation and a process to evaluate your program, the CDC Evaluation Framework.

To assess what you learned in this module or to reinforce the main points, click below.

Click here to test your evaluation knowledge.

Or

Click here to proceed to Module 2.

Module 1 Quiz

What are Stakeholders?

A. The funders and managers of the program.

B. People who have an investment in what will be learned from an evaluation and what will be done with the knowledge.

C. Individuals that are aware of the program.

Module 1 Quiz Stakeholder Answer

Correct: stakeholders are people who have an investment in what will be learned from an evaluation and what will be done with the knowledge.

OR

That is incorrect. Stakeholders are people who have an investment in what will be learned from an evaluation and what will be done with the knowledge.

Module 1 Quiz

Match the level of evaluation with the right definition.

• Used to discern if a program has met its long-term goals and objectives.

• Implemented to determine how well a program is "doing what it's supposed to do" and to count the activities conducted.

• Helps to understand what effect the program is having on the participants, the partners, and the community.

Process

Impact

Outcome

Module 1 Quiz Level of Evaluation Answer

Correct: a process evaluation is implemented to determine how well a program or intervention is "doing what it's supposed to do" and to count the activities conducted. An impact evaluation helps to understand what effect the program is having on the participants, partners, and the community. An outcome evaluation is used to discern if a program has met its long-term goals and objectives.

OR

That is incorrect. A process evaluation is implemented to determine how well a program is "doing what it's supposed to do" and to count the activities conducted. An impact evaluation helps to understand what effect the program is having on the participants, partners, and the community. An outcome evaluation is used to discern if a program has met its long-term goals and objectives.

Module 1 Quiz

We described the CDC Evaluation Framework in this module. Please put the steps of the framework in the correct order. You must place all of the items before checking your answer.

Justify Conclusions

Gather Credible Evidence

Focus the Evaluation Design

Ensure Use and Share Lessons Learned

Describe the Program

Engage Stakeholders

1.
2.
3.
4.
5.
6.

Module 1 Quiz CDC Framework for Evaluation Answer

The steps of the CDC Evaluation Framework in the correct order are:

1. Engage Stakeholders
2. Describe the Program
3. Focus the Evaluation Design
4. Gather Credible Evidence
5. Justify Conclusions
6. Ensure Use and Share Lessons Learned

