Ain’t No Mountain High Enough
Climbing the Peaks of Program Excellence
Facilitators: Christina Borbely
Kerrilyn Scott-Nakai
Produced and Conducted by the Center for Applied Research Solutions, Inc. for the California Department of Alcohol and Drug Programs
SDFSC Workshop-by-Request
March 22, 2006 Ventura County
Authored by Christina J. Borbely, Ph.D.
Safe and Drug Free Schools and Communities Technical Assistance Project
Trail Map
• Why Are We Doing This?
  – Value
  – Opportunities
• Opportunity for Recognition
• Advanced Program Essentials
  – Program Essentials
  – Key Considerations
• Advancing Programs Through Evaluation
  – Methodology: Design & Instrumentation
  – Data Plan & Analysis
  – Reporting
Why Are We Doing This?
• The Value of Advancing Programs
• Opportunities for Advancing Programs
Value
• Replicating innovative strategies
  – Fill in gaps
  – Integrate the latest science and/or practice
• Making a contribution through dissemination
  – Participate in the science-service dialog
  – Advance the field
  – Provide an effective program to others
Opportunities
• Expansion
  – Demonstrate need/value of new or additional funding
  – Bolster capacity to sustain programming
• Recognition
  – Validation from the field
  – Potential for supplemental support/resources
  – Publications
Opportunity for Recognition
Validation from the Prevention Field:
• Service to Science
• NREPP
• Exemplary Programs
National Registry of Effective Prevention Programs (NREPP)
NREPP is coordinated by the Center for Substance Abuse Prevention (CSAP) under the federal Substance Abuse and Mental Health Services Administration (SAMHSA)
NREPP is “a system designed to support informed decision making and to disseminate timely and reliable information about interventions that prevent and/or treat mental and substance use disorders.”
http://modelprograms.samhsa.gov/template.cfm?page=nreppover
Original NREPP Designations
• A program is considered “Model” if the NREPP review team has designated it as an effective program and an agency agrees to participate in CSAP’s dissemination efforts. Model programs also provide training and technical assistance to practitioners who wish to adopt the program, to ensure that it is implemented with fidelity.
• A program is considered “Effective” if it is science-based and produces consistently positive patterns of results. Only programs positively affecting the majority of intended recipients or targets are considered effective.
• A program is considered “Promising” if it provides useful and scientifically defensible information about what works in prevention but has yet to gather sufficient scientific support to meet the standards set for effective/model programs. Promising programs are sources of guidance for prevention practitioners, although they may not be as prepared as Model programs for large-scale dissemination.
Evidence-Based Programs
● Conceptually Sound and Internally Consistent
● Program Activities Related to Conceptualization
● Reasonably Well Implemented & Evaluated

Promising Programs
● Some Positive Outcomes

Effective Programs
● Consistently Positive Outcomes
● Strongly Implemented & Evaluated

Model Programs
● Available for Dissemination
● Technical Assistance Available from Program Developers
NEW NREPP: Eligibility Criteria
• Open submission; review is based on alignment of the intervention with NREPP priorities.
• SAMHSA's three Centers -- the Center for Mental Health Services, the Center for Substance Abuse Prevention, and the Center for Substance Abuse Treatment -- will establish priorities for the types of interventions to be reviewed and highlighted on NREPP.
• Priorities will be established and provided to the public annually through notices on the NREPP Web site.
• These priorities are based on dialogues with treatment and prevention stakeholders as well as with SAMHSA's Federal partners.
NEW NREPP: Review Criteria
“the sole requirement for potential inclusion in the NREPP review process is for an intervention to have demonstrated one or more significant behavioral change outcomes.”
NEW NREPP: Review Process
• A trained Ph.D.-level evaluation specialist works with applicants to assure that adequate materials have been submitted before initiating an NREPP review.
• The evaluation specialist serves as collaborator in the application process and liaison to the reviewers.
• A scientific review of the intervention is conducted by two independent Ph.D.-level reviewers.
• Completed review summaries, including descriptive components, reviewer ratings, and explanations are provided to the applicant for approval before they are posted on the NREPP Web site.
NEW NREPP: Application Process
Application materials include one or more of the following types of documents:
• formal evaluation reports,
• published and unpublished research articles,
• narrative sections of grant applications,
• training materials,
• implementation or procedural manuals, and
• a concise summary of the intervention that includes the intervention name, a description of its main components, the population(s) targeted, and the behavioral outcomes targeted.
The Exemplary Program Awards
• The Exemplary Program Award is designated by CSAP.
• The Exemplary Awards program recognizes prevention programs in two tracks: Promising Programs—those that have positive initial results but have yet to verify outcomes scientifically, and Model Programs—those that are implemented under scientifically rigorous conditions and demonstrate consistently positive results.
• The Exemplary Awards recognize prevention programs that are innovative and effective and that successfully respond to the needs of their target populations, both as Promising Programs and Model Programs.
Exemplary Program Award: Review Process
• A multifaceted procedure is used to identify and select Promising Programs to receive an Exemplary Substance Abuse Prevention Program Award annually. All nominated programs submit to a three-level review process.
• First, state agency personnel and national organizations submit their formal nominations.
• Applications are then reviewed by experts in the field of substance abuse prevention and former Exemplary Substance Abuse Prevention Program Award winners.
• Finally, the National Review Committee reviews and scores the top applications according to eight criteria and recommends those that merit an Exemplary Substance Abuse Prevention Program Award. Final selections are made jointly by NASADAD, CADCA, and SAMHSA/CSAP.
Exemplary Program Award: Application Process
• Applications for the Innovative Programs may be obtained from State Alcohol and Drug Agencies and from the NASADAD/NPN Web page (www.nasadad.org) or office.
• Applicants must submit their application to their national nominating organization (see application appendix) for sign-off. Applicants should then return the original signed, completed application (including cover sheet) and three copies to the NASADAD/NPN central office in Washington, D.C. For more information about the application process, call or write:
NASADAD/NPN
808 17th Street, NW, Suite 410
Washington, DC 20006
Attention: Exemplary Programs
Web page: www.nasadad.org
E-mail: [email protected]
Phone: (202) 293-0090, Fax: (202) 293-1250
Exemplary Program Award: 8 Review Criteria
• Philosophy
• Background and need (program planning)
• Goals and objectives
• Population(s) to be served
• Activities and strategies
• Community coordination
• Evaluation
• Program management
Service to Science
• Service to Science is a national initiative supported by SAMHSA/CSAP to enhance the evaluation capacity of innovative programs and practices that address critical substance abuse prevention or mental health needs.
http://captus.samhsa.gov/northeast/special_projects/service_to_science/main.cfm
Service to Science Academy
• Designed to enhance the capacity of community-based prevention strategies, programs, or practices to demonstrate effectiveness.
• Each Academy is customized to support the needs of the groups/organizations and programs accepted to attend.
• Emphasis is on the development of a strong evaluation and/or research design.
• Participants receive training and technical assistance that helps them move along the evidence-based continuum.
Service to Science Academy: Eligibility Criteria
1. Primarily focused on ATOD prevention, but may also address the prevention of violence, HIV/AIDS, STDs, etc. Expected outcomes or areas of focus include, but are not limited to, efforts to decrease high-risk behaviors by children or adults; eliminate use of illicit drugs; reduce underage use of alcohol, tobacco, and other drugs; and decrease DUI/DWI rates.
2. Nominated for recognition by a State Alcohol and Drug Agency, by the Community Anti-Drug Coalitions of America (CADCA), or by other national organizations or their affiliates.
3. Able to document and demonstrate success in the form of quantifiable outcome data.
4. In operation for a minimum of two (2) years.
Service to Science Academy: Review Criteria
• Philosophy
• Needs Assessment
• Population Served
• Goals & Objectives
• Activities & Strategies
• Evaluation
• Program Management
• Community Coordination
Service to Science Academy: Application Process
• The application to attend a Service to Science Academy is a modified National Association of State Alcohol and Drug Abuse Directors (NASADAD) application for Innovative/Exemplary Programs.
• Applications are reviewed by a panel that makes recommendations for acceptance to the Academy.
Application Criteria as Program Practice
Live it!
• SDFSC Santa Cruz County: Service to Science Academy
Santa Cruz County submitted an application and was awarded a program slot in the current Service to Science Academy cohort. The Santa Cruz team will receive a series of trainings and technical assistance to help move their program toward recognition as a model or promising program.
• SDFSC Butte County: NPN Exemplary Program Award
Butte County submitted three of their prevention programs for review: Friday Night Live Mentoring, Friday Night Live, and Youth Nexus. Two of these programs are being recognized nationally by the National Prevention Network, with only six programs receiving this national recognition.
• Andrea Taylor, Ph.D.: NREPP Model Program Status
Andrea Taylor developed Across Ages, a local intergenerational mentoring program that promotes positive youth development and helps prevent school failure, substance abuse, and teen pregnancy, into an NREPP Model Program that is implemented nationwide. The process spanned 1991-1998.
Advanced Program Essentials
Put Your Finger On It…
• Logic Model
• Core Components
• Documented Need and Value
• Defining Population
• Defining Need for Service within the Community
Logic Model
“A logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.”
(W.K. Kellogg Foundation, Logic Model Development Guide, 2004)
Value of a Logic Model
A Picture is Worth a 1000 Words
• Builds understanding about what the program is, what it’s expected to do and what measures of success will be used.
• Provides a research-based theory behind your strategies
• Promotes communication and a common understanding amongst staff and funders
Core Program Components
What are the “active ingredients” in the formula for program success?
• In theory, core components must be implemented precisely as intended in order to achieve demonstrated outcomes.
• Core components cannot be adapted.
Define Core Components
Core components might be:
• program structure (e.g. the sequence of sessions or context of delivery),
• program content (e.g. specific concepts or skill sets), or
• method of delivery (e.g. “homework” assignments, classroom infusion, or youth-led group activities).
Define Population
• Institute of Medicine (IOM) Classifications
  – Universal preventive interventions are activities targeted to the general public or a whole population group that has not been identified on the basis of individual risk.
  – Selective preventive interventions are activities targeted to individuals or a subgroup of the population whose risk of developing a disorder is significantly higher than average.
  – Indicated preventive interventions are activities targeted to individuals in high-risk environments, identified as having minimal but detectable signs or symptoms foreshadowing disorder or having biological markers indicating predisposition for disorder but not yet meeting diagnostic levels.
Defining Need for Service
– Integrating key stakeholders in the process
  • Bonus points for youth
  • Representative of community
– Strategic Prevention Framework
  • Needs/Resource Assessment
  • Strategic Planning
  • Evidence-based Implementation
Strategic Prevention Framework

Assessment:
• Assemble data collection review team; define substance abuse problem
• Identify and define target population for reduction/prevention
• Identify risk and protective factors
• Develop tentative theory of, or pathway to, change
• Identify existing prevention resources that target problem & risk/protective factors
• Perform gap analysis of needs and resources

Capacity:
• Examine internal resources, skills, readiness
• Build collaboration through teaming and networking
• Examine community resources and readiness: external capacity

Planning:
• Determine domain(s) of concentration and prioritize risk and protective factors
• Examine program/intervention options
• Address cultural relevancy
• Explore fidelity/adaptation balance
• Select "best-fit" program/intervention
• Choose to innovate
• Develop logic models for overall program and components
• Develop action plans for documentation

Implementation:
• Document, review, improve quality
• Revisit fidelity and adaptation issues as necessary
• Report immediate & intermediate outcomes

Evaluation:
• Outline process evaluation from action plans
• Assess long-term outcomes/general impact
• Communicate outcomes to key stakeholders to build support
• Re-measure outcomes & supplement final report if necessary
Key Considerations
Advancing Programming
• What’s the yardstick?
• How do I measure up?
• Where do I want to go from here?
Considerations: Participation
• Recruitment
  – Are we meeting target #s consistently?
  – Are we using strategic recruitment methods?
• Retention
  – Do we have sufficient completion rates?
  – Have we defined a program graduate/drop-out?
  – What do we do to encourage retention?
Considerations: Fidelity
• Fidelity
  – To what degree are we consistently implementing core components? Is this sufficient?
  – What system do we use to reflect on areas of challenge? How does that inform our process?
  – What method do we use to monitor implementation across sites? Are we vigilant enough? Does feedback get incorporated?
Considerations: Innovation
• Degree to which the program is novel, cutting edge, innovative
  – How is this different from what’s already available?
  – What aspects of the program are unique?
• Grounded but innovative: program alignment with already-proven models of service
  – What proven methods are incorporated in what we do?
  – Did we take an evidence-based strategy to the “next level” or use it in a novel way?
Considerations: Population
• How culturally appropriate are services to the identified population?
  – Program content
  – Program materials (e.g. translation)
  – Staff (training and protocol)
  – Tested across ethnic/cultural groups
  – Link to evidence-based strategies demonstrated with specific populations
Considerations: Marketing
• Have materials/curriculum been “packaged”?
  – Sequencing
  – Branding
  – Training protocol tested and established/documented
Considerations: Replication
• Protocol
  – Program curriculum
  – Training process
  – Evaluation
• Packaged program materials
  – Curriculum
  – Evaluation
• Strategic replication
  – Varied populations
  – Varied contexts
Advancing Programs through Evaluation
• Rigor
• Methodology
• Data Plan & Analysis
• Reporting
Increasing Evaluation Rigor Across the Board
Methodology/Design
Instrumentation to Analysis
Reporting
Tips for Optimal Evaluation Rigor
• Use an external evaluator to lend credibility
  – Especially valuable for publishing findings
• Conduct evaluation of replication sites
  – Evidence of impact in varied settings and populations
• Evaluate program effect and the sustainability of effect
  – Pre/post demonstrates immediate effects
  – Follow-up (longitudinal) shows whether those effects are sustained
Advancing Methodology
• Process & Outcome
• Evaluation Design
• Tips for Optimal Design
Role of Process and Outcome Methods
Process
• Allows for continuous learning about how the program is working as it is implemented
• Focuses on clearly describing and assessing program design and implementation.
• Makes it possible to answer questions concerning “why” and “how” programs operate the way they do and what can be done to improve them.
Outcome
• The outcome evaluation focuses on producing clear evidence concerning the degree of program impact on program participants.
• Assesses the immediate or direct effects of program activities (as compared to long-term impact).
Level of Rigor: Outcome Evaluation Design
NREPP Source of Evidence Hierarchy (from most to least rigorous):
1. Pre and Post with Control (Random Assignment)
2. Pre and Post with Comparison
3. Pre and Post with Follow-Up
4. Pre and Post Only
5. Post-Test Only (Hindsight Comparison)
Tip for Optimal Design: Matched Data
Making a Match
– Requires tracking of individuals
– Allows for analysis of individual-level impact, not just aggregate level (see the sketch below)
– Can control for “dosage” or other factors
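To make matched data concrete, here is a minimal Python/pandas sketch (not part of the workshop materials) that pairs pre and post surveys by a confidential participant ID; the file names and column names are hypothetical.

```python
import pandas as pd

# Hypothetical survey files keyed by a confidential participant ID.
pre = pd.read_csv("pre_survey.csv")    # columns: participant_id, attitude_score
post = pd.read_csv("post_survey.csv")  # same columns, collected at program exit

# An inner join keeps only participants with BOTH a pre and a post record,
# which is exactly what "matched data" requires.
matched = pre.merge(post, on="participant_id", suffixes=("_pre", "_post"))

# Individual-level change scores, rather than comparing aggregate means.
matched["change"] = matched["attitude_score_post"] - matched["attitude_score_pre"]

print(f"{len(matched)} matched pairs out of {len(pre)} pre-surveys")
print(matched["change"].describe())
```

Matching at the individual level also makes it possible to attach a "dosage" column and control for it in later analyses.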
Tip for Optimal Design: Longitudinal Data

Looking at the long run…
• The majority of programs use a pre/post assessment schedule.
  – Follow-up points are recommended, timed to the length of the program.
    • Consider a follow-up point at 1, 3, 6, 9, or 12 months after completion.
  – Programs with continuous enrollment (vs. cohorts of youth) need:
    • strong tracking systems
    • a continuous evaluation schedule (e.g. every 3 or 6 months)
Tip for Optimal Design: Comparison Groups

Shall I compare thee to a summer’s day…
Comparison groups can sometimes be fairly easy to develop:
• School data
• Low-dosage service groups can sometimes be utilized—make the distinction between program drop-out and evaluation drop-out.
• Use standardized measures and compare program groups to school, district, and state results.
Tip for Optimal Design: Control Groups

Control freak!
– Control groups require resources and may deter participants due to randomization.
  • The trick is in the approach and the ability to provide services at a later date (a wait-list design).
Advancing Instrumentation
• Standardized v. Locally Developed
• Tips for Optimal Instrumentation
Survey Options
Standardized Surveys
Pros:
• Already developed, lots of choices
• Psychometrics established
• Allows for comparison of results at national, state, and district levels
• Scoring and analysis sometimes available
Cons:
• Cost
• May not be specific to your population
• May not capture novel aspects of the program

Locally Developed Surveys
Pros:
• Can tap into novel program aspects/impact
• Can be tailored to the population
• No cost
Cons:
• Reliability/validity unknown
• Doesn’t allow for comparison
Tip for Optimal Instrumentation
• Next level of locally developed measures
  – Establish the “performance”/psychometrics of instruments: reliability & validity (done by your evaluator)
• Track at the individual level (see the sketch below)
  – Confidential IDs
  – Develop a comprehensive database
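One common way to create confidential IDs is to derive them from identifying fields with a salted one-way hash, so the evaluation database never stores names. This is an illustrative sketch, not a method prescribed by the workshop; the chosen fields and the salt handling are assumptions.

```python
import hashlib

SALT = "replace-with-a-secret-project-salt"  # store securely, not in the database

def confidential_id(first_name: str, last_name: str, birthdate: str) -> str:
    """Derive a stable, non-reversible participant ID from identifying fields."""
    raw = f"{first_name.strip().lower()}|{last_name.strip().lower()}|{birthdate}|{SALT}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:10]

# The same inputs always yield the same ID, so pre and post records
# can be matched without names ever entering the evaluation database.
print(confidential_id("Ana", "Reyes", "1992-03-14"))
```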
Advancing Data Management & Processing
• Data Plan
• Sample Size
• Data Analysis
Data Plan
• Develop a plan for analyzing data based on proposed outcomes (logic model)
  – What questions to ask of the data?
  – What piece of the data answers each question?
  – Potential sub-group comparisons (e.g. by gender, dosage, site); a minimal sketch follows.
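As a toy illustration of planned sub-group comparisons, the sketch below groups hypothetical change scores by site and by gender; all column names and values are fabricated.

```python
import pandas as pd

# Hypothetical matched outcome data with sub-group columns.
df = pd.DataFrame({
    "site":   ["North", "North", "South", "South", "South"],
    "gender": ["F", "M", "F", "M", "F"],
    "change": [0.8, 0.3, 0.5, 0.6, 0.9],  # post minus pre score
})

# One groupby per comparison named in the data plan.
print(df.groupby("site")["change"].agg(["mean", "std", "count"]))
print(df.groupby("gender")["change"].agg(["mean", "std", "count"]))
```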
Tips for Optimal Data Plan
• Specify cutoff points for baseline assessment (defined per program).
  e.g. Baseline assessments are defined as those completed prior to session 2 of the curriculum.
• Define completers vs. dropouts (see the sketch below).
  e.g. Parents attending 85% of sessions are defined as program “completers”; those attending less than 10% are defined as “dropouts”.
• Ensure matched pre/post data
  – Individual vs. aggregate level findings
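Here is a minimal sketch of applying such attendance cutoffs in code; the 85%/10% thresholds come from the example above, while the data and column names are invented.

```python
import pandas as pd

# Hypothetical attendance log: one row per participant.
attendance = pd.DataFrame({
    "participant_id": ["a1", "b2", "c3"],
    "sessions_attended": [11, 1, 6],
})
TOTAL_SESSIONS = 12

pct = attendance["sessions_attended"] / TOTAL_SESSIONS

# Cutoffs from the slide: >= 85% is a completer, < 10% is a dropout;
# anything in between is "partial" and handled per the data plan.
attendance["status"] = "partial"
attendance.loc[pct >= 0.85, "status"] = "completer"
attendance.loc[pct < 0.10, "status"] = "dropout"
print(attendance)
```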
Planning a Sample Size
How Much Wood Should a Woodchuck Chuck?
• Sample size (N): the number of participants or units being measured (e.g. # of participants; # of sites)
• Power: the probability of detecting a true effect
  – Type I error: stating a finding when there isn’t one (a false positive)
  – Type II error: stating no finding when there is one (a false negative)
• Sample size & power
  – Influence the types and sensitivity of analyses
  – Larger sample size increases power
Tips for Optimal Data Plan: Strategic Sample Size
• Calculate the necessary sample size for appropriate statistical power:
  http://www.surveysystem.com/sscalc.htm
  http://www.macorr.com/ss_calculator.htm
• Resource limitations: consider using a strategic sub-sample
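If you would rather compute this locally than use the calculators above, the sketch below uses statsmodels' power analysis; the effect size, alpha, and power values are illustrative assumptions, not workshop requirements.

```python
from statsmodels.stats.power import TTestIndPower

# Per-group N needed to detect a medium effect (Cohen's d = 0.5)
# in a two-group comparison, with alpha = .05 and 80% power.
n_per_group = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"~{n_per_group:.0f} participants per group")  # roughly 64
```

Raising the target power or shrinking the detectable effect size increases the required N, which is why under-enrolled programs often cannot detect real effects.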
Data Analysis: Beyond Percentage Reporting
• Means with standard deviation
  – SD reflects the variability of values
• Tests of significance (comparative analysis; sketch below)
  – Correlations
    e.g. As participation level increases, attendance rate significantly increases.
  – Chi-square analysis
    e.g. Youth demonstrated statistically significant improvements in communication skills over time.
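The sketch below runs each of these analyses on fabricated toy data with scipy, purely to show the mechanics; none of the numbers are program findings.

```python
import numpy as np
from scipy import stats

# Toy matched pre/post scores for the same eight youth.
pre = np.array([2.1, 3.0, 2.5, 3.2, 2.8, 2.0, 3.5, 2.7])
post = np.array([2.9, 3.4, 2.6, 3.8, 3.1, 2.8, 3.9, 3.0])

# Means with standard deviation, not just percentages.
print(f"pre  M={pre.mean():.2f}, SD={pre.std(ddof=1):.2f}")
print(f"post M={post.mean():.2f}, SD={post.std(ddof=1):.2f}")

# Correlation: does dosage (sessions attended) track improvement?
sessions = np.array([4, 10, 6, 12, 8, 3, 11, 7])
r, p = stats.pearsonr(sessions, post - pre)
print(f"correlation r={r:.2f}, p={p:.3f}")

# Chi-square: completion status vs. improved / not improved counts.
table = np.array([[18, 4],    # completers: improved, not improved
                  [9, 11]])   # dropouts:   improved, not improved
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square={chi2:.2f}, p={p:.3f}")
```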
Tips for Optimal Data Analysis: Techniques & Strategies
• Leverage variability in data/dosage to the program’s advantage.
  e.g. Youth who completed the program were more likely to have negative attitudes toward use than youth who did not complete the program.
• Identify potential comparison data sets (e.g. school records).
  e.g. School records show that participating youth had significantly fewer discipline referrals than the general student population.
Advancing Reporting Methods
• Venues for Dissemination
• Cater to the Crowd
• Tips for Optimal Reporting
Where to Disseminate
• Evaluation Reports
• Summary Reports
• Applications
• Grants
• Press Release
• Professional Publications
• Academic/research Publications
Cater to the Crowd

– What information is relevant to your audience?
  • Note preferred models/frameworks and rhetoric
  • Highlight information that is of value to them
– To what extent is detail or brevity important to your audience?
– Are pieces of program information weighted differently (e.g. a reviewer point system)?
– Work with your evaluator to develop a brief findings report as well as a full evaluation report.
Tips for Optimal Reporting: Frame It
• The evaluator is responsible for providing the full and objective picture
• Program Director may choose to highlight the most positive findings when reporting to funders or stakeholders—if appropriate
• Wording can make a difference! The same findings can be written in a variety of ways—be conscious of the wording.
Tips for Optimal Reporting: The Message

• Say It in Pictures
  – The appropriate use of charts and graphs can be a powerful tool in conveying findings.
• Bring It Home
  – Personal quotes and case examples can be powerful when used to supplement key quantitative findings.
  – Personal experiences make the impact real to the reader.
  – However, when misused they can make the evaluation seem less credible.
Climbing the Mountain: What’s Your Next Step?
• Action Planning Exercise
  – Defining short-term, intermediate, and long-term goals (e.g. 1-yr, 3-yr, and 5-yr goals)
    • Programmatic Goals
    • Evaluation Goals
    • Opportunity Goals
  – How can we support you in your climb to the top?
    • Customized TA and Training plans