Morehouse School of Medicine Prevention Research Center 2013
Tabia Henry Akintobi, PhD, MPH
Director, Prevention Research Center
Director, Evaluation and Institutional Assessment Research
Associate Professor, Department of Community Health and Preventive Medicine, Morehouse School of Medicine
Slide 2
Objectives
- Detail the elements of evaluation specific to community needs assessment and program development
- Describe qualitative and quantitative data collection and analysis considerations in community contexts
- Discuss the application of evaluation frameworks in community-responsive needs assessment
- Discuss lessons learned, issues, and recommendations in community-engaged practice
Slide 3
Prevention Research Centers
The MSM PRC functions within the only Historically Black College and University funded among the Centers for Disease Control and Prevention's 37 Health Promotion and Disease Prevention Research Centers.
Slide 4
Morehouse School of Medicine Prevention Research Center (MSM PRC)
Theme: Risk Reduction and Early Detection in African American and Other Minority Communities: Coalition for Prevention Research
Slide 5
Where the MSM PRC Works
Slide 6
MSM PRC Community Coalition Board (CCB)
The MSM PRC CCB comprises:
- Community residents
- Academic institution representatives
- Agencies within the City of Atlanta Neighborhood Planning Units V, X, Y, and Z
Slide 7
Evaluation and Institutional Assessment Unit: Technical Capacity
Assessments conducted through direct or collaborative acquisition of federally funded, private, and local grants and contracts
Slide 8
Evaluation and Institutional Assessment Unit: Guiding Principles
Evaluations should be participatory
- Partnership between evaluators and stakeholders
- Sustained ownership and involvement
Build evaluation capacities
- Demystify evaluation
- Treat evaluation as essential to program planning, implementation, and measurement
Both formative and summative evaluation is critical
- Identifying best or promising practices
- Defining success and how it is achieved
Evaluations should lead to decision-making
- Formulation of recommendations for programmatic improvements, practice/policy changes, or subsequent research to address identified needs
Slide 9
To guide the process and encourage collaboration
Slide 10
The Participatory Framework for Designing the Program and Its Evaluation
[Diagram: stakeholders at the center of a cycle linking Community Assessment, Public Health Initiative, Design the Evaluation, Collect the Data, Analyze and Interpret the Data, and Report the Findings]
Slide 11
Engaging Stakeholders
Stakeholders: individuals, groups, or organizations having a significant interest in how well a program functions and/or in the health topic. For instance, those with decision-making authority over the program, funders and sponsors, administrators and personnel, and clients or intended beneficiaries.
Rossi, Lipsey & Freeman (2004)
Slide 12
Engaging Stakeholders

Type of Stakeholder | What they do | Who they are
Primary | Daily contact, request reports, major decision-makers, current program participants | Sponsors, collaborators, coalition partners, funding officials, administrators, managers, and staff
Secondary | Little or no daily contact, potential program participants, current partners | Clients, family members, neighborhood organizations, academic institutions, elected officials, advocacy groups, professional associations
Tertiary | Potential partners/funders, target population | Staff, board members, administrators, volunteers
Slide 13
Engaging Stakeholders
- Identify leaders first
- Use snowballing
- Understand the role and importance of potential stakeholders
- To the extent possible, consider all stakeholder perspectives
- Be inclusive
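Snowballing can be sketched as a breadth-first walk over referrals: each stakeholder you interview names others, and you follow those names for a bounded number of rounds. A minimal illustration in Python, with an entirely hypothetical referral list (in practice referrals come from interviews, not a dictionary):

```python
from collections import deque

# Hypothetical referral data: each interviewed stakeholder names others.
referrals = {
    "neighborhood leader": ["pastor", "school principal"],
    "pastor": ["youth mentor", "neighborhood leader"],
    "school principal": ["PTA chair"],
    "youth mentor": [],
    "PTA chair": ["pastor"],
}

def snowball(seeds, referrals, max_rounds=3):
    """Breadth-first 'snowball' over referrals, starting from known leaders."""
    found = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        person, depth = frontier.popleft()
        if depth >= max_rounds:
            continue
        for named in referrals.get(person, []):
            if named not in found:
                found.add(named)
                frontier.append((named, depth + 1))
    return found

stakeholders = snowball(["neighborhood leader"], referrals)
print(sorted(stakeholders))
```

Capping the number of rounds keeps the sample from drifting far from the community of interest, and the choice of seeds ("identify leaders first") strongly shapes who is ultimately reached.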
Slide 14
Why Needs Assessment?
- Provides a systematic process to guide decisions
- Provides justification for decisions before they are made
- Is scalable to any size project, time frame, or budget
- Facilitates community engagement in defining needs, assets, and priorities
www.worldbank.org/ieg/training/TNA.ppt
Slide 15
Needs Assessments Help Us Avoid
- "What we really need is training on XYZ."
- "But that is the way we have always done it here."
- Programs that are not aligned with community/funder priorities
- Answers that are simple, straightforward, acceptable, understandable, and yet wrong
www.worldbank.org/ieg/training/TNA.ppt
Slide 16
Needs Assessment Process
Plan: coalition building; consultation
Implement: data collection
Analyze: describe needs/assets; assess data strengths/weaknesses; enabling and predisposing factors
Prioritize: prioritize needs; develop strategies
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf
Slide 17
Planning Considerations
- What other needs assessments have been done in this area or with the demographic group?
- What questions remain to be answered?
- What form of data collection is appropriate to answer these questions?
- What resources are available (money, skills, etc.)?
- What steering/reference groups exist or need to be established to guide the assessment?
- Who else can/should we involve?
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf
Slide 18
Needs Assessment Method and Plan Development
- Driven by objective(s)
- Developed through review of the literature
- Shaped by identification of the problem or issue at the local level
- May be driven by a theoretical framework
- Developed with input from stakeholders and experts
Slide 19
What Do You Want to Understand, Change, or Measure?
- Attitudes
- Perceptions
- Preferences
- Knowledge or skills
- Behavior
- Behavioral intentions
Slide 20
What Do You Want to Know? Examples of Evaluation Questions:
- Who needs this program?
- What is the magnitude of the need?
- What should be done to meet the need?
- What are the existing resources or capacity to meet the need?
Slide 21
Needs Assessment Process
Plan: coalition building; consultation
Implement: data collection
Analyze: describe needs/assets; determine data strengths/weaknesses; enabling and predisposing factors
Prioritize: prioritize needs; develop strategies
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf
Slide 22
Needs Assessment Process
Plan: coalition building; consultation
Implement: data collection
Analyze: describe needs/assets; assess data strengths/weaknesses; enabling and predisposing factors
Prioritize: prioritize needs; develop strategies
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf
Slide 23
Analysis & Next Steps: Considerations
Analysis
- Comprehensive picture
- Identify needs
- Identify contributing factors (predisposing, enabling, etc.)
Prioritization
- What are the most critical unmet needs? (for whom, etc.)
- What can be changed?
- What will it take to address the needs? (resources, cost, strategies)
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf
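One common way to move from analysis to prioritization is a weighted scoring matrix: stakeholders agree on criteria (for example magnitude, changeability, and cost feasibility), weight them, and score each need against them. A minimal sketch, with entirely hypothetical needs, weights, and scores:

```python
# Hypothetical criteria weights agreed on by stakeholders (sum to 1.0).
criteria_weights = {"magnitude": 0.4, "changeability": 0.35, "cost_feasibility": 0.25}

# Hypothetical 1-5 scores for each identified need.
needs = {
    "youth violence":  {"magnitude": 5, "changeability": 3, "cost_feasibility": 2},
    "substance abuse": {"magnitude": 4, "changeability": 4, "cost_feasibility": 3},
    "truancy":         {"magnitude": 3, "changeability": 5, "cost_feasibility": 4},
}

def priority_score(scores, weights):
    """Weighted sum of criterion scores for one need."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank needs from highest to lowest priority score.
ranked = sorted(needs, key=lambda n: priority_score(needs[n], criteria_weights), reverse=True)
for need in ranked:
    print(f"{need}: {priority_score(needs[need], criteria_weights):.2f}")
```

The numbers here are illustrative only; the value of the exercise is that the coalition debates the weights and scores openly rather than ranking needs by intuition.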
Slide 24
Slide 25
Evaluation in Context: Planning, Implementation, and Effect of the Initiative
Evaluation continuum:
Planning - Formative Evaluation
Implementation - Process Monitoring and Evaluation
Effect - Outcome and Impact Evaluation
Slide 26
What is Evaluation?
- A systematic process
- Involves data collection
- A process for enhancing knowledge and decision-making
Adapted from Russ-Eft and Preskill (2001)
Slide 27
Role and Importance of Evaluation
- Guides planning of proposed programs and activities prior to full implementation of the program
- Monitors and documents implementation of programmatic activities
- Assesses and documents whether activities and interventions achieve desired outcomes
- Informs programmatic decisions
Slide 28
Evaluation Planning
Evaluation planning includes:
- Clarifying the purpose of the evaluation
- Deciding what to evaluate
- Understanding what is involved
- Choosing the evaluation type and design
- Gathering necessary resources
- Determining how to use the results
Slide 29
Choosing the Evaluation Design
Selection of the design should be based on:
- Results of the needs assessment
- Key program questions and performance indicators
- Resource (program and evaluation) availability
- How the data will be used
- Timeline
Adapted from Sharma, Lanum, & Suarez-Balcazar (2000). A Community Needs Assessment Guide. Loyola University Chicago.
Slide 30
Choosing the Evaluation Design
Consider stakeholder needs:
- The information needs of key stakeholders and primary users
- How the information will be used
- Who will use it
- What kind of information will have the most credibility for the intended users
Slide 31
Other Evaluation Design Considerations
Consider participant characteristics:
- Literacy
- Geographic dispersion
- Cultural issues (including language)
- Accessibility
Also consider:
- Logistic and contextual constraints
- Alternative sources of information
- Time and resource constraints
Slide 32
Slide 33
The "evidence" in evidence-based interventions is the data: collected, analyzed, and interpreted.
May be primary and/or secondary:
- Primary data: data you collect as a result of your activities or research
- Secondary data: data already collected, independent of your activities or research
May be qualitative or quantitative
Should be systematically collected/reviewed, analyzed, and interpreted
Slide 34
Data Collection
Data collection includes:
- Identifying existing data sources (secondary)
- Determining the best data collection method (focus group, survey, interview)
- Selecting or creating data collection instruments
- Deciding on the most appropriate procedures for collecting data
Slide 35
Frequently Used Qualitative Data Collection Tools
In-Depth Interviews
- Complex subject matter and expert respondents
- Highly sensitive subject matter
- Geographically dispersed participants
- Aim to diminish peer pressure or minimize influence on responses
Focus Groups
- Group interaction to stimulate richer responses
- Observation of behaviors, attitudes, and language
- Idea generation
- Pre-testing
- Evaluation of message concepts
- Problem identification
Slide 36
Considerations
Slide 37
Frequently Used Quantitative Data Collection Tools: Surveys
- Self-administered: surveys independently completed by the participant
- Face-to-face survey interviews: participant completion is facilitated by a program staff person
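Once closed-ended surveys are collected (by either mode), the first analysis step is usually a frequency table per item. A minimal sketch with hypothetical responses and item names:

```python
from collections import Counter

# Hypothetical survey records; "mode" distinguishes the two administration types.
responses = [
    {"id": 1, "mode": "self-administered", "q1": "agree"},
    {"id": 2, "mode": "face-to-face",      "q1": "strongly agree"},
    {"id": 3, "mode": "self-administered", "q1": "agree"},
    {"id": 4, "mode": "face-to-face",      "q1": "disagree"},
]

def tabulate(responses, item):
    """Frequency and percentage table for one survey item."""
    counts = Counter(r[item] for r in responses)
    total = sum(counts.values())
    return {answer: (n, round(100 * n / total, 1)) for answer, n in counts.items()}

print(tabulate(responses, "q1"))
```

Keeping the administration mode on each record also lets you check later whether self-administered and facilitated responses differ systematically.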
Slide 38
Considerations
Slide 39
Slide 40
Frequently Used Qualitative Data Collection Tools
In-Depth Interviews
- Complex subject matter and expert respondents
- Highly sensitive subject matter
- Geographically dispersed participants
- Aim to diminish peer pressure or minimize influence on responses
Focus Groups
- Group interaction to stimulate richer responses
- Observation of behaviors, attitudes, and language
- Idea generation
- Pre-testing
- Evaluation of message concepts
- Problem identification
Slide 41
Slide 42
Empowerment Evaluation
- Provides stakeholders with tools and skills to evaluate their program
- Ensures that the evaluation is part of the planning and management of the program (Fetterman, 2008)
Slide 43
Empowerment Evaluation Characteristics
- Values improvement in people, programs, and organizations to help them achieve results
- Community ownership of the design and conduct of the evaluation and implementation of the findings
- Inclusion of appropriate participants from all levels of the program, funders, and community
- Democratic participation and clear and open evaluation plans and methods
- Commitment to social justice and a fair allocation of resources, opportunities, obligations, and bargaining power
Citations: Fetterman, 2008; Wandersman, 2005; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, White-Cooper, 2011
Slide 44
Empowerment Evaluation Characteristics
- Use of community knowledge to understand the local context and to interpret results
- Use of evidence-based strategies with adaptations to the local environment and culture
- Building the capacity of program staff and participants to improve their ability to conduct their own evaluations
- Organizational learning, ensuring that programs are responsive to changes and challenges
Citations: Fetterman, 2008; Wandersman, 2005; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, White-Cooper, 2011
Slide 45
Case Study 1: PAATH-II Community Coalition
- PAATH-II Community Coalition Board stakeholders: parents, education professionals, civic leaders, and representatives from government agencies
- Funded to develop community-led approaches to address youth substance abuse and violence
- Target community: Metropolitan Atlanta zip code 30318
Slide 46
Case Study 1: PAATH-II Community Coalition
PAATH-II partnered with the MSM PRC to evaluate the coalition through a demographic profile.
The needs assessment consisted of:
- Secondary data
- Primary data
Slide 47
Case Study 1: PAATH-II Community Coalition: Secondary Data
The MSM PRC reviewed zip code 30318 statistics (general demographics and other risk factors associated with violence and substance abuse), including racial background, household make-up, income level, vacant housing, public housing, surrounding prison and addiction facilities, truancy, etc.
Slide 48
Case Study 1: PAATH-II Community Coalition: Primary Data
Key informant interviews
- To gather insights, opinions, and best thinking from the 30318 community on substance abuse and violence prevention
- Target audience: adult stakeholders representing community organizations, healthcare, neighborhood businesses, private practice, public housing, law enforcement, and the school system; youth ages 11-16
Recommendations led to prioritization of Youth Mentoring and Alternative Education goals, evaluation activities, and outcomes.
Slide 49
Case Study 1: PAATH-II Community Coalition: Alternative Education Goal Activities
- Collaboration between the MSM PRC and the PAATH-II Coalition Board to develop and administer an Alternative Education Survey (AES) to capture demographics, perceptions, and recommendations among students, parents, and stakeholders familiar with alternative education/non-traditional schools in the Atlanta Public School System (APS)
- Analysis of evaluation data collected through the AES among 24 students, 22 parents, and 12 stakeholders
- Submission and presentation of the AES Report to the PAATH-II Coalition Board
- Recommendations on key findings to include in the final presentation to the Atlanta Board of Education
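With small respondent groups like these, side-by-side percentages per group are often the most useful summary. A sketch using the case study's group sizes (24 students, 22 parents, 12 stakeholders); the survey item and the agreement counts are hypothetical:

```python
# Group sizes taken from the AES case study.
group_sizes = {"students": 24, "parents": 22, "stakeholders": 12}

# Hypothetical counts of "agree" responses to one recommendation item.
agree_counts = {"students": 18, "parents": 15, "stakeholders": 10}

def agreement_by_group(agree, sizes):
    """Percent agreement per respondent group, for side-by-side reporting."""
    return {g: round(100 * agree[g] / sizes[g], 1) for g in sizes}

print(agreement_by_group(agree_counts, group_sizes))
```

Reporting counts alongside percentages matters at these sample sizes, since a single stakeholder response shifts that group's percentage by more than eight points.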
Slide 50
Participatory Evaluation Involves key stakeholders in evaluation
design and decision-making Acknowledges and addresses inequities in
power and voice among stakeholders
Slide 51
Participatory Evaluation Characteristics
- The focus is on participant ownership; the evaluation is oriented to the needs of the program stakeholders rather than the funding agency
- Participants meet to communicate and negotiate to reach a consensus on evaluation results, solve problems, and make plans to improve the program
- Input is sought and recognized from all participants
- The emphasis is on identifying lessons learned to help improve program implementation and determine whether targets were met
Citations: Patton, 2008; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, White-Cooper, 2011
Slide 52
Participatory Evaluation Characteristics
- The evaluation design is flexible and determined (to the extent possible) during the group processes
- The evaluation is based on empirical data to determine what happened and why
- Stakeholders may conduct the evaluation with an outside expert serving as a facilitator
Citations: Patton, 2008; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, White-Cooper, 2011
Slide 53
Case Study 2: 2 HYPE Abstinence Education Club
The 2 HYPE Abstinence Education Club (2 HYPE "A" Club) is a co-educational intervention targeting African American youth ages 12-18 in Fulton, DeKalb, and Clayton counties within Metropolitan Atlanta.
The program serves youth in community-based settings, schools, and juvenile facilities, including probationary and long-term detention centers.
Slide 54
Case Study 2: 2 HYPE Abstinence Education Club
Comprehensive approach, including:
- Promotion of delayed sexual activity
- Violence prevention
- Stress reduction and understanding of abstinence benefits
Creative arts reinforce the abstinence curriculum:
- Club activities, e.g., Hip Hop Café (hip-hop, rap, poetry, and dance performances)
- Peer educator training
- Parent workshops
Slide 55
Case Study 2: 2 HYPE Abstinence Education Club
[Diagram: partnership linking the Wholistic Stress Control Institute, the Morehouse School of Medicine Prevention Research Center, and youth participants]
Slide 56
Case Study 2: 2 HYPE Abstinence Education Club Needs Assessment
Survey development
- Literature review: comparable programs; tools and instruments; validity, reliability, cultural context
- Identified trends in program data
- Process evaluation measures
- Evaluation Advisory Group
Survey pilot testing
- Four focus groups
- 30 African American youth ages 12 to 19
Slide 57
Case Study 2: 2 HYPE Abstinence Education Club: Results
Expanded understanding of how African American youth conceptualize marriage, family, and their futures
Evaluation implications:
- Findings signal the need for an expanded approach to implementing and assessing programs designed to reduce adverse sexual health outcomes among African American youth in urban settings
- Survey revision
- Context for interpretation of results
Slide 58
Evaluation Plan Development Basics
Consider reporting/dissemination:
- Who will you share results with?
- Will you share results with your community or selected stakeholders? Who? How? When?
- Why will this reporting/dissemination activity be important?
Slide 59
Questions to Consider When Evaluating Community Engagement
- Are the right community members at the table? This question needs to be reassessed throughout the program or intervention, because the right community members might change over time.
- Do the process and structure of meetings allow all voices to be heard and equally valued? For example, where do meetings take place, at what time of day or night, and who leads the meetings?
- What is the mechanism for decision-making or coming to consensus? How are conflicts handled?
- How are community members involved in developing the program or intervention? Did they help conceptualize the project, establish project goals, and develop or plan the project?
- How did community members help assure that the program or intervention is culturally sensitive?
Citations: CDC, 2009; Green et al., 1995; Israel et al., 1998; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, White-Cooper, 2011.
Slide 60
Questions to Consider When Evaluating Community Engagement
- How are community members involved in implementing the program or intervention? Did they assist with the development of study materials or the implementation of project activities, or provide space?
- How are community members involved in program evaluation or data analysis? Did they help interpret or synthesize conclusions? Did they help develop or disseminate materials? Are they coauthors on all publications or products?
- What kind of learning has occurred, for both the community and the academics? Have community members learned about evaluation or research methods? Have academics learned about the community health issues? Are there examples of co-learning?
Citations: CDC, 2009; Green et al., 1995; Israel et al., 1998; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, White-Cooper, 2011.
Slide 61
Challenges in Community Needs Assessment Partnership Development, Relationship Formation, and Maintenance
- Different individual and organizational cultures and values
- Practical vs. statistical significance
- Varying understanding of the evaluation or research process
Slide 62
Key Steps in Community Partnership Development, Relationship Formation, and Maintenance
- Clearly define the target population
- Provide detailed descriptions of process and outcomes
- Specify the measures and instruments for data collection
- Clarify the timeline for conducting all evaluation activities
Slide 63
Benefits in Community Evaluation Partnership Development, Relationship Formation, and Maintenance
- Community credibility
- Increased recruitment
- Sustained retention
- Expanded funding and human resources
- Collaboration in dissemination of emerging models, best practices, and outcomes of community engagement and associated research
Slide 64
Community Engagement & Evaluation
Blumenthal, D. How do you start working with a community? Section 4a of Challenges in Improving Community Engagement in Research.
Henry Akintobi, T., Goodin, L., Trammel, E., Collins, D., & Blumenthal, D. How do you set up and maintain a community advisory board? Section 4b of Challenges in Improving Community Engagement in Research.
Sufian, M., Grunbaum, J., Akintobi, T., Dozier, A., Eder, M., Jones, S., Mullan, P., Weir, C.R., & White-Cooper, S. Program Evaluation and Evaluating Community Engagement.
http://www.atsdr.cdc.gov/communityengagement
Slide 65
Benefits in Community Evaluation Partnership Development, Relationship Formation, and Maintenance
Akintobi, T.H., Trotter, J.C., Evans, D., Johnson, T., Laster, N., Jacobs, D., & King, T. (2011). Applications in bridging the gap: A community-campus partnership to address sexual health disparities among African-American youth in the South. Journal of Community Health, 36, 486-494. PMID: 21107895
Mayberry, R., Daniels, P., Yancey, E., Henry Akintobi, T., Berry, J., & Clark, N. (2009). Enhancing community-based organizations' capacity for HIV/AIDS education and prevention. Evaluation and Program Planning, 32(6), 213-220. PMID: 19376579
Mayberry, R., Daniels, P., Henry Akintobi, T., Yancey, E., Berry, J., & Clark, N. (2008). Community-based organizations' capacity to plan, implement, and evaluate success. Journal of Community Health, 33(5). PMID: 18500451
Henry Akintobi, T., Goodin, L., Trammel, E., Collins, D., & Blumenthal, D. (2011). How do you set up and maintain a community advisory board? Section 4b of Challenges in Improving Community Engaged Research. In Principles of Community Engagement, 2nd edition (Chapter 5, Section 4b). Clinical and Translational Science Award Community Engagement Key Function Committee Task Force on the Principles of Community Engagement. Washington, DC: U.S. Department of Health and Human Services.
Sufian, M., Grunbaum, J., Akintobi, T., Dozier, A., Eder, M., Jones, S., Mullan, P., Weir, C.R., & White-Cooper, S. (2011). Evaluating community engagement. Chapter 7 of Principles of Community Engagement, 2nd edition. Clinical and Translational Science Award Community Engagement Key Function Committee Task Force on the Principles of Community Engagement. Washington, DC: U.S. Department of Health and Human Services.
Slide 66
Lessons Learned
- Document progress
- Watch and learn
- Communication
- Patience
- Remembering your role
- Finding mutual benefit
- Community as a partner
Slide 67
Contact Information
Morehouse School of Medicine Prevention Research Center
720 Westview Dr., SW, Atlanta, GA 30310
Phone: 404-752-1022
Email: [email protected]
Web: www.msm.edu/prc