Fix it or Flee it? Proven approaches for dealing with
failing, flagging and floundering association programs
Greg Melia, CAE @gmeliacae
What We’ll Do Today
• Identify 5 steps essential to program assessment
• Use real world examples to highlight pitfalls to avoid and key points to remember
• Talk about real world issues and challenges
Photo credit: hebedesign on flickr
Program Assessment is a lot of GRIEF
• Goals
• Research
• Impact
• Efficiency
• Finances
Credit: Angie Torres on Flickr
Goals
Credit: dkwonsh on flickr
Defining the Goal of the Review
–Work with key stakeholders and decision-makers
– Solicit their key questions:
• What do they want to know?
• What information will help their decision-making?
• What do they think needs to be evaluated?
–Refine questions to be SMART
Original Program Aspects
Investigate the Key “Whys”:
• Why was the program originally established?
• Why has it continued to be offered?
• Why has it changed over time?
Review the Original Intended Approach:
• Who, what, when, where, why and how?
• What was implemented?
• What was not?
Lessons Learned
• Verbalizing the goals of the review is an important part of preparing for change
• Seek superordinate goals
• Some agreements are easier reached upfront
Research
3 kinds of data:
Corpus Inscriptionum
Opinions and Descriptions
Imponderabilia of behavior
Bronislaw Malinowski
Credit: Photographie et Ethnologie: Les photographes français au XXème siècle devant d’autres formes de cultures.
Lessons Learned
• Historical minutes and documents can be VERY informative
• Seek to understand program mutations
• Good decisions based on bad data usually give bad results
• Remember that the people you consult may have been involved in the program
Impact Measures
• Outputs = Immediate
I attended
• Outcomes = Short-term
I gained knowledge
• Impacts = Long-term
I served more members as a result
Lessons Learned
• Associations tend to over-focus on outputs and outcomes.
• It is hard to argue with “happy & full,” but sometimes it is necessary.
• Is the program impact large and broad enough to displace alternatives?
Efficiency
Good, Better, Best
Never let it rest
Until your Good is Better
And your Better is Best!
Credit: eatwell.in on Flickr
Efficiency
Staff and Volunteer Time
Complaints and Comments
Marketing/Communication
Registration/Orders
Technology
Photo credit: mansikka on Flickr
Lessons Learned
• How do people feel about it? Do they have suggestions to increase efficiency?
• What are the trends in terms of level of effort?
• What is cost/benefit of doing it the same way versus trying a new approach?
• Can improved efficiency save it?
Finances
• Direct fixed expenses
– Incurred regardless of how many people participate, buy, or are served
• Variable expenses
– Incurred per each additional unit or person that is served
• Mixed expenses
– Incurred per each additional set of units or persons served
Types of Costs
• Direct fixed expenses
– Staff, Rent
– Purchased equipment
– Per event contracts
• Variable expenses
– Per person contracts
– Attendee gifts, handouts, etc.
• Mixed expenses
– Temporary help
– Office supplies
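As a rough illustration (not from the slides), the three expense types can be combined into a single cost estimate. The function name and figures below are hypothetical, assuming mixed expenses are incurred per set of people served:

```python
import math

def total_cost(people_served, fixed, variable_per_person, mixed_per_set, set_size):
    """Combine the three expense types into one estimate.

    fixed               -- incurred regardless of how many are served (staff, rent, purchased equipment)
    variable_per_person -- incurred per additional person served (per person contracts, gifts, handouts)
    mixed_per_set       -- incurred per additional set of people served (temporary help, office supplies)
    set_size            -- how many people one "set" covers
    """
    sets_needed = math.ceil(people_served / set_size) if people_served else 0
    return fixed + variable_per_person * people_served + mixed_per_set * sets_needed

# Hypothetical figures for illustration only
print(total_cost(people_served=250, fixed=8_000, variable_per_person=35,
                 mixed_per_set=500, set_size=100))   # 8,000 + 8,750 + 1,500 = 18,250
```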
Key Calculations
• Revenue – Expense = Margin
– Margin per attendee/unit
– Margin per staff hour
– Margin per impact
Allocating Overhead Costs
• Equal allotment
• Proportional allotment
– To budget
– To number served
– To staff hours
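A minimal sketch of the two allotment methods, assuming shared overhead is split across hypothetical programs by staff hours (the function name and figures are ours, not from the slides):

```python
def allocate_overhead(overhead, weights, equal=False):
    """Split a shared overhead cost across programs.

    weights -- dict of program -> allocation basis (budget, number served, or staff hours)
    equal   -- if True, ignore the weights and allot equally
    """
    if equal:
        share = overhead / len(weights)
        return {name: round(share, 2) for name in weights}
    total = sum(weights.values())
    return {name: round(overhead * w / total, 2) for name, w in weights.items()}

# Hypothetical staff-hour figures for three programs
staff_hours = {"Annual meeting": 600, "Certification": 300, "Webinars": 100}
print(allocate_overhead(30_000, staff_hours))              # proportional: 18,000 / 9,000 / 3,000
print(allocate_overhead(30_000, staff_hours, equal=True))  # equal allotment: 10,000 each
```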
Example
Annual Giving Campaign
Revenue: $12,000
Expenses: $2,000
# Solicited: 1,000
Staff Hours: 200
# of contributors: 100
Example
• Revenue – Expenses = Margin: $12,000 - $2,000 = $10,000
• Margin per unit (solicitation): $10,000/1,000 = $10
• Margin per staff hour: $10,000/200 = $50
• Margin per impact (contributor): $10,000/100 = $100
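The same arithmetic as a short Python sketch, using the figures from the Annual Giving Campaign example above (the variable names are ours):

```python
# Figures from the Annual Giving Campaign example above
revenue, expenses = 12_000, 2_000
solicited, staff_hours, contributors = 1_000, 200, 100

margin = revenue - expenses          # $10,000
print(margin / solicited)            # margin per unit (solicitation): $10
print(margin / staff_hours)          # margin per staff hour: $50
print(margin / contributors)         # margin per impact (contributor): $100
```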
Lessons Learned
• One full flight is more profitable than two half-full ones.
• Most do not realize the true full cost.
• Understanding costs helps determine price.
• Price increases (and closing programs) take time.
Tips on Implementation
• Internal Evaluators
– Program knowledge & skills
– May create staff buy-in
– Less $$$ outlay
• External Evaluators
– Usually seen as more objective
– Assessment expertise
– Comparative ideas
– More $$$ outlay
Photo credit: Matthew Burpee on Flickr
Implementation
• Who to involve
– Stakeholders
– Staff
– Non-users
• Who to interview
– Former volunteers
– Former staff
– Vendors
Photo credit: kierenmccarthy.co.uk
Communication of Results
• Restate the purpose of the review
• Review what was done
• Arrange key findings in logical order, highlighting interpretations where appropriate
• Close with recommendations or issues to be addressed
Credit: EE.UTD.Events on Flickr
Lessons Learned
• Equivocal recommendations get equivocal results
• Decision-makers need synthesized data, not the opportunity for data analysis
• Survey data is important, but the proof is in historical performance
• Plan and communicate transitions
Questions?