Using results frameworks to shift the focus of evaluation to a strategic level
Emerging research on the principles underpinning results frameworks
Kate Averill
AES Conference
Sydney 2011
Building futures
Presentation
Why frameworks?
Shifting evaluation focus – to strategic and sector levels
Architecture and use of results frameworks
What are results frameworks?
Principles underpinning results frameworks
Implications for practitioners
Why frameworks? Frameworks are promoted as providing the structural, strategic and performance links between the different actors in development cooperation (Aid Effectiveness Review, AusAID 2011)
Since 2002 there has been a move towards a more systematic approach to planning, monitoring and evaluating for results: outputs, outcomes and impacts (Aucoin & Jarvis, 2004)
Monitoring and evaluation systems/frameworks: (a) identify intended results (outputs, outcomes, impacts);
(b) measure results (monitoring and evaluation);
(c) use evidence-based information to improve programs (Leonard & Bayley, 2008; Kusek & Rist, 2004)
To enhance development and aid effectiveness
How ‘evaluation’ fits within this agenda is under debate
Personal experience and literature
Research problem/need - starting point
The Paris Declaration on Aid Effectiveness includes five partnership commitments (ownership, alignment, harmonisation, managing for results, and mutual accountability) that need to be interpreted in light of the specific situation of each partner country
(OECD-DAC, 2005)
12 indicators of progress measured nationally and monitored internationally (OECD-DAC, 2006)
Indicator 11: Sound Frameworks to Monitor Results - number of countries with transparent and monitorable performance assessment frameworks to assess progress against:
(a) national development strategies
(b) sector programmes
The 2008 survey monitoring the Paris Declaration showed that use of sound frameworks increased from 7% of countries in 2005 to 9% in 2008. The OECD target for 2010 was to reduce the gap by one-third. A further survey took place in 2011
Context – development and evaluation
[Diagram: interactions between monitoring and evaluation activities; management (national, NGO, private sector and donor); country development (national goals and outcomes); and data, analysis and evidence]
Evaluation - changed paradigm: stakeholder dialogue, results framework, needs analysis, sector outcomes frameworks, baselines
Organizational activities: program design, program logic, program measurement, methodology, contribution and attribution, activity and program baselines, measurement methods, data collection, activity and program analysis, feedback into program planning, reporting
Aggregation: sector analysis, country analysis, feedback into management decision-making, reporting
Reconceptualised relationships between country and sector development and evaluation (Averill, 2010)
Shifting evaluation focus to a strategic level
New paradigms in development cooperation are emerging: a shift of focus to developing countries becoming the driving force of their own development, using country systems. A country system comprises the national arrangements and procedures for public financial management, procurement, audit, monitoring and evaluation, and social and environmental procedures (OECD, 2009)
Increasingly, emphasis is now being placed on strengthening national-level monitoring and evaluation systems led by countries (Ba Tall, K. 2009)
Focus is now at country and sector levels, but program-level activities and their alignment to sector goals remain important:
agencies and donors align activities and programs to sector/national goals
capacity and design considerations
What are results frameworks?
Key terms and components of frameworks are not commonly defined in the literature or in practice:
Results framework (and diagram) - links between country strategic goals, higher-level sector outcomes, country organisational structures, key stakeholder relationships and development partners (adapted from Binnendijk, 2001)
Outcomes framework - shows the hierarchy of key outcomes for a sector or overarching multi-program (adapted from Duignan, 2004). May include multiple outcome layers: sector, region, agency, individual
Program logic - links at program level between inputs, outputs and outcomes/impacts, plus theory of change, context and assumptions. ('Results chain' is used interchangeably with 'logic model' or 'program logic')
Literature: draws from evaluation, development, management and governance knowledge areas
Results Framework (Averill 2011) Three components:
1. Stakeholder / process map
2. Results/outcomes models – strategic and program levels
3. Associated measurement frameworks
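The three components above are conceptual rather than computational, but the nesting they describe (strategic outcomes aligned with program-level outcomes, each carrying its own indicators) can be sketched as a simple data structure. This is only an illustration: the class names, fields and sample outcome text below are invented for the sketch, not drawn from Averill's framework.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """A measure attached to an outcome (component 3: measurement framework)."""
    description: str
    unit: str

@dataclass
class Outcome:
    """A node in the results/outcomes model (component 2)."""
    statement: str
    indicators: list = field(default_factory=list)
    children: list = field(default_factory=list)  # lower-level outcomes aligned to this one

def collect_indicators(outcome):
    """Walk the outcome hierarchy and gather every indicator, so the
    measurement framework can be checked for coverage at all levels."""
    found = list(outcome.indicators)
    for child in outcome.children:
        found.extend(collect_indicators(child))
    return found

# Hypothetical sector-level outcome with one aligned program-level outcome
sector = Outcome(
    "Improved visitor infrastructure at cruise sites",
    indicators=[Indicator("Sites with completed improvements", "count")],
    children=[Outcome("Jetties repaired at three sites",
                      indicators=[Indicator("Jetties repaired", "count")])],
)

print(len(collect_indicators(sector)))  # 2 indicators across both levels
```

A walk like `collect_indicators` mirrors how a measurement framework aggregates program-level measures up to sector level; the point of the sketch is the alignment of layers, not the particular fields chosen.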
Purpose of frameworks – planning, on-going monitoring, evaluation or ‘evaluative’ monitoring
Principles underpinning architecture and use of frameworks are important to enhance development and aid effectiveness (research)
Reposition evaluation as management strategy – linking strategy, learning and performance
Three phases: cyclical and iterative (1/3 principle)
1. Plan
2. Monitor, research and assess
3. Report and change
Example: Uganda - PEAP Key Strategic Results within Greater Accountability Framework (Tumusiime-Mutebile, 2002)
[Diagram: governance relationships between policy makers, central government executive, citizens and service providers, linked to the PEAP Key Strategic Results: 1. Reduced income poverty and inequality; 2. Improved human development; 3. Improved GDP growth]
Results/outcomes model
[Diagram: generic hierarchy of outcomes, with outcomes at each level linked to indicators, quantitative indicators and mechanisms/examples]
Aligning results at different levels: using strategic and program models to align results at different levels
[Diagram: two program logic models (Programme one and Programme two), each running from context and need through inputs, outputs and outcomes/impacts, with their outcomes, quantitative indicators and mechanisms/examples feeding into the strategic-level outcomes model]
Measurement Framework (example)

Model box M1: Carnival improves infrastructure at sites 2010/11
Indicator: repair work on jetties at three sites, installation of IALA navigation beacons, and a shuttle boat at Mystery Island
Measure/Unit: improvements completed - new navigation beacons at all sites, wharfs improved, boat bought for Mystery Island

| Site | Projection 2012 | 2009 | 2010 |
|---|---|---|---|
| All sites | Improvements completed | Nil | Work commenced |
| Mystery Is | Improvements completed | Study and basic work on wharf; beacons installed | All key elements completed: upgraded to make more secure, floating pontoon added, nav guides installed, boat provided |
| Wala | Improvements completed | 0 | Nil |
| Cham Beach | Improvements completed | 0 | All key elements completed; toilet block commenced |

Indicator: toilet block meets specifications
Measure/Unit: toilet blocks completed to specs; meets environmental standards

| Site | Projection 2012 | 2009 | 2010 |
|---|---|---|---|
| Mystery Is | Toilet blocks completed to specs | Bush toilets | Built and waiting for tanks to fill before use; completed and used for all ship visits |
| Wala | Toilet blocks completed to specs | 0 | Small number of flushing toilets; completed and used for all ship visits |
Learning and implications
Values - collaborative, ownership, participatory, cultural competency, credible, 'fit for purpose', ethical
Reposition evaluation as management strategy linking strategy and ‘evaluation’
Results Framework - foundation tool - 3 components
Purpose and design of results frameworks (planning, monitoring and ‘evaluation’ or ‘evaluative monitoring’)
Approach - on-going, point in time knowledge
Internal/external – ownership and building capacity for sustainability
Learning and implications Three phases - cyclical and systematic (1/3 principle)
1. Plan - needs, context, scope, theory of change – framework
2. Monitor, research & assess – measurement, data collection, recording, analysis and assessment
3. Report and change – timely, credible, cyclical, iterative.
Building capacity for iterative learning with support
Challenges – ‘tunnelling’, indicators, scope, use of information
Value – reporting on progress and enabling timely response to emerging evidence
Enhancing development and aid effectiveness
Next steps - research and practice
Research process:
key informant interviews
case study fieldwork: New Zealand, Papua New Guinea, Samoa, Laos
Examining:
approaches to and use of results frameworks in the public sector
different perspectives of stakeholders
Aim - identify principles underpinning the architecture and use of results frameworks to enhance development and aid effectiveness and the implications for practitioners
Continue working with and supporting people and organisations in their planning and evaluation to build capacity and enhance decision making
Share learning from research and practice, contributing to the knowledge base