Government Wide Performance Management Lessons from across the pond
Page 1

Government Wide Performance Management
Lessons from across the pond

Page 2

© 2009 Deloitte MCS Limited. Private and confidential

Some Context

Page 3

The Start

Pre 1998 Comprehensive Spending Review

• Annual budgeting – though the intent to move to RAB (Resource Accounting and Budgeting) had been stated

• Little connection to outcomes

The First Comprehensive Spending Review in 1998

• Move to RAB

• Move to three year budgeting

• Introduction of Public Service Agreements – 600 performance targets for 35 areas of Government

Page 4

The Evolution

“We were about to announce significant increases in spending and I felt it important that we explained to the public what they would get in return. The US had recently adopted GPRA and the UK system of allocating funds to government departments without any sense of what they would deliver seemed behind the times. The first set of PSAs were a little rough and ready but they did establish the principle.”

Suma Chakrabarti, then head of General Expenditure Policy in HM Treasury

Spending Reviews every 2 years until 2004

• 2000 – PSAs reduced to 160 and the introduction of joint targets and Technical Notes and Service Delivery Agreements

• 2001 – Creation of the Prime Minister’s Delivery Unit (PMDU)

• 2002 – Move towards longer term targets and PSAs reduced to 130, more joint targets

• 2004 – Number of targets reduced to 110, full reconciliation between 2000 - 2004 targets published. Targets developed in consultation with delivery chains

The First Comprehensive Spending Review in 1998

• Move to RAB

• Introduction of Public Service Agreements

Page 5

The Evolution

Page 6

Measurement and Targets - What did we learn?

Page 7

When are targets and measurement good?

- To create focus – e.g.: Kennedy

- To drive performance – Jobcentre Plus

- To create accountability – NYPD; funding agencies

- To understand what works – Annual Pupil Level School Census

- To focus efforts when the outcome is immeasurable – More Later

Page 8

When is Measurement Bad?

- Process focussed – e.g. Housing

- Perverse incentives – e.g. School exclusions

- Too many

- No clear line of sight

- Too late/ infrequent

- Poor Data

Page 9

Some Examples

Page 10

Key lessons

Performance goals should be:

• As outcome focussed as possible – to ensure innovation and flexibility in delivery and reduce the risks of perverse incentives through chasing outputs

• For example:
‒ Life expectancy
‒ Even for ‘Miss World’ targets

Page 11

Key lessons

Performance goals should be:

• Few in number – to ensure focus and communicate priorities right down the delivery chain, including agencies and other key stakeholders.

• Aligned all the way down the delivery chain – reflected in the priorities, objectives and incentive structures for agencies, teams and individuals.

• For example:
‒ UK DfID
‒ DCSF

Page 12

Key lessons

Performance goals should be:

• Stretching – accepting that not all targets will be met, but that is better than setting aspirations too low.

• For example:
‒ Crime Reduction
‒ Literacy and Numeracy

Page 13

Key lessons

Performance goals should be:

• Tested with the delivery chain – to increase buy-in and reduce the risk of perverse incentives in the delivery of outputs (particularly at agency level).

• For example:
‒ School admissions/exclusions
‒ ER
‒ Waiting Lists

Page 14

Key lessons

Performance goals should be:

• Designed with a full appreciation of how outcomes need to be distributed, as well as the likely implications for delivery.

• For example:
‒ Customer Satisfaction
‒ Fear of Crime
‒ School Performance
‒ Smoking Ban

Page 15

Making it a reality – What did we learn?

Page 16

Lessons from PMDU

[2×2 matrix: quality of execution on one axis, with quadrants ranging from status quo and controversy without impact through improved outcomes to transformation]

Page 17

PMDU’s Deliverology

• Ambition – believe in step change; get it done as well as possible

• Focus – clear, sustained priorities; avoid distractions

• Clarity – “confront the brutal facts”; know what’s happening now; understand stakeholders

• Urgency – people are impatient; “if everything seems under control, you’re not going fast enough”

• Irreversibility – structure, culture, results; avoid celebrating success too soon

Page 18

The PMDU’s Approach

• Targets, which set measurable goals

• Plans, which are used to manage delivery and set out the key milestones and trajectories

• Monthly reports on key themes

• Stocktakes, which the Prime Minister holds every two to three months

• Priority reviews, to check the reality of delivery at the frontline

• Problem-solving/Corrective action, where necessary

• Delivery reports, summarizing the government's progress on delivery every six months


Page 19

Key Questions to answer in a Delivery Plan

• What is the service delivery chain?

• Who is accountable at the top . . . and all along the delivery chain?

• What are the key actions (milestones)?

• What is the timetable?

• Who are the key stakeholders? How will they be brought on board?

• What are the major risks? How will they be managed?

• What impact will the actions have on the key outcomes (trajectories)?

• What data do you need? Will it be early enough to act if progress is off track?

Page 20

Illustrative performance delivery trajectories

The illustrative trajectory aims to show that once a target is established, credible plans and policies need to underpin its delivery, supported by a monitoring system that allows the trajectory to be plotted, assessed and influenced, so that underperformance is addressed as soon as possible.

[Chart: performance (0–60) plotted over 2005–2020, showing historical performance, a long-term strategic goal, a medium-term contract goal, and immediate progress indicators or milestones. Plans A–C and Policies A–C are mapped onto the curve. A high trajectory shows a policy with an immediate impact; a low trajectory, a policy with a lagged impact. Annotation: given the plans and the policies, predict the outcomes; then compare with reality and investigate any differences. Ensure plans and policies are in place to influence the trajectory.]

Source: PMDU
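The monitoring loop described here – predict the outcome from the planned trajectory, compare with reality, and investigate differences – can be sketched in a few lines. This is a minimal illustration, not PMDU tooling; the straight-line trajectory, the figures and the 10% tolerance are all invented for the example.

```python
def build_trajectory(baseline: float, goal: float, years: int) -> list[float]:
    """Straight-line planned trajectory from the baseline to the long-term goal."""
    step = (goal - baseline) / years
    return [baseline + step * y for y in range(years + 1)]


def flag_off_track(planned: list[float], actual: list[float],
                   tolerance: float = 0.10) -> list[int]:
    """Return the years in which actual performance deviates from the plan
    by more than the tolerance, relative to the planned value."""
    flagged = []
    for year, (plan, real) in enumerate(zip(planned, actual)):
        if plan != 0 and abs(real - plan) / abs(plan) > tolerance:
            flagged.append(year)
    return flagged


planned = build_trajectory(baseline=20.0, goal=60.0, years=4)  # [20.0, 30.0, 40.0, 50.0, 60.0]
actual = [20.0, 29.0, 33.0, 41.0, 52.0]                        # a lagged (low) trajectory
print(flag_off_track(planned, actual))                         # [2, 3, 4]
```

Flagging deviations against the plan, rather than waiting for the end-of-period figure, is the point of the trajectory: it leaves time to act while the target can still be met.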

Page 21

Annex

Page 22

Resources, Inputs, Outputs and Outcomes

The value-for-money chain runs: Resources → Inputs → Outputs → Outcomes (with other external influences also acting on outcomes; funding enters at the resources end, the PSA sits at the outcomes end).

• Economy – how well money is transferred into inputs

• Efficiency – the relationship between outputs and inputs

• Effectiveness – the extent to which outputs achieve the desired outcomes

Value for money represents the entire relationship between how resources are consumed and the outcomes achieved:

• How well have the costs of government actions been translated into desired outcomes?

• Which set of interventions is best able to achieve the desired outcomes at the lowest cost?

Definitions:

• Resources: the level of grant funding offered to the department or agency, in terms of revenue and capital

• Inputs: the resources used to aid delivery – for example, labour, physical assets, IT systems

• Outputs: the final products of the organisation, such as the issue of licences

• Outcomes: the final impacts and consequences of government activity – ultimately, what is trying to be achieved. Examples include longer life expectancy and better-educated citizens.
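The economy–efficiency–effectiveness chain reduces to three ratios. The sketch below is a minimal illustration under that reading; the function names and all figures (staff-years, licences, outcome shares) are invented for the example.

```python
def economy(inputs_acquired: float, money_spent: float) -> float:
    """How well money is transferred into inputs (inputs per pound)."""
    return inputs_acquired / money_spent


def efficiency(outputs: float, inputs_used: float) -> float:
    """The relationship between outputs and inputs (outputs per unit of input)."""
    return outputs / inputs_used


def effectiveness(outcome_achieved: float, outcome_desired: float) -> float:
    """The extent to which outputs achieve the desired outcomes."""
    return outcome_achieved / outcome_desired


# Invented example: £10m buys 200 staff-years of input; those inputs
# produce 50,000 licences; that delivers 80% of the desired outcome.
print(economy(200, 10_000_000))   # inputs per pound spent
print(efficiency(50_000, 200))    # 250.0 licences per staff-year
print(effectiveness(0.8, 1.0))    # 0.8
```

Value for money is the whole chain taken together: a programme can be economical and efficient yet still poor value for money if its outputs do not move the desired outcome.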

Page 23

Target and trajectory setting checklist: for each issue, the characteristics of an ideal trajectory.

Performance indicator

• What is it that you are trying to improve?

• Well-defined and easy to understand by ministers, officials and the public – will it be clear whether the target has been met?

• Analyse ‘region’, ‘category’ and ‘initiative/policy’ (see below) BEFORE setting the performance indicator and target

What’s the target?

• Should be SMART (specific, measurable, achievable, realistic and time-limited) – you need to be clear about timing, i.e. financial, calendar or academic year, beginning or end, rolling average for a year or a snapshot

• What is the baseline (value and date) to show relative change?

• When will it be known (data lag) – how much time between provisional and final figures?

• Agreed standard – essential when collecting data from several separate units

• Percentage versus absolute numbers – understand the impact of the denominator (% target could be met with change in the denominator) and think carefully about 100% targets, which can be difficult to guarantee and cost prohibitive for the last few percentage points (marginal diminishing returns)

• Level of accuracy, e.g. 10% or 10.1% or 10,000 or 10,301 extra

• National average versus floor targets – an average can rise without all units improving; floor targets ensure a basic standard of minimum performance

Data collection mechanism

• Regular and frequent (although cost is an obvious limitation) – to monitor trends

• Robust – sufficiently accurate for measurement

• Consistent – different questions can disrupt time series

• Survey vs. census – cost vs. sampling errors

• Independent – to avoid perception of results bias where unit being monitored also collects the data

• Contextual data, regional breakdown – additional information helps data interpretation and cheaper to collect at the same time

Historical data run

• At least as long as the projection period – helpful to understand past performance (alternative related indicators might be sufficient)

• Seasonality – end of financial year, holidays, winters?

• Previous peaks and troughs – causes?

• ‘Policy-off’ performance – what happens if you do nothing?

Future estimates

• Intermediate points (as many as possible) – internal, not public

• Expected impact of initiatives/policies and by when – how much will each initiative contribute to improvement, and when will it have an impact (immediate or delayed)?

• When will you know if policy/initiative x is working (is it early enough to do something about it)? – test prediction vs. reality and, where they differ, understand why (don’t wait for figures to become available and then defend a position)

• Evidence of what works needs to be used to identify actions and assessment of initiatives’ impact – is the system working?

Breakdown by ‘region’ or ‘locality’

• Breakdowns should be quantified and be consistent with the national target with available data for performance management – helps understanding of what works

• Range of performance – understand that it is acceptable to have a wide range of performance but be clear about what is an unacceptable range

• Characteristics of good (or poor) performance – why do some units outperform others?

Breakdown by ‘category’

• For example, by gender, eligibility, region

Breakdown by ‘policy/initiative’

• Which are the key policies/initiatives?

• Which are the low-impact policies/initiatives?

• When will individual policies/initiatives impact – monitor the delivery of the plans (collect data which allows evaluation of individual policies/initiatives)?
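The checklist’s warning about denominators can be made concrete with a few lines of arithmetic: a percentage target can appear to be met purely because the denominator changes, with no improvement in the absolute numbers. The pupil figures below are invented for illustration.

```python
def pct(numerator: float, denominator: float) -> float:
    """Share of the denominator represented by the numerator, in percent."""
    return 100.0 * numerator / denominator


# Baseline: 4,000 of 10,000 pupils reach the standard -> 40%.
baseline = pct(4_000, 10_000)

# Later: still exactly 4,000 pupils reach the standard, but the cohort
# shrinks to 8,000. The percentage rises to 50% although not one extra
# pupil has improved - the "gain" is entirely a denominator effect.
later = pct(4_000, 8_000)

print(baseline, later)  # 40.0 50.0
```

This is why the checklist asks you to understand the impact of the denominator before committing to a percentage target, and to consider absolute numbers alongside it.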

Page 24

Target and trajectory setting

The types of targets selected by agencies usually fall within the following broad categories:

• Volume (e.g. handle x enquiries per day)

• Quality of service (e.g. improve customer satisfaction by x% year on year)

• Efficiency (e.g. reduce the time/cost/labour required to do x)

• Financial performance (e.g. reduce the unit cost of x by y% )

Once a target is established, organisations need to build a trajectory for how it will be met. Why?

• To predict, monitor and manage performance (at local and national level) rather than react – to know whether you are on track

• To consider the impact of policies on performance using evidence

• To understand performance, what works and does not work as well as where it works and does not

• To know when an initiative/policy is working

• To know early enough when something is not working to have time to do something about it

Page 25

The characteristics of a good performance measure/target

A performance measure should be:

• Relevant to what the organisation is aiming to achieve;

• Able to avoid perverse incentives – not encouraging unwanted or wasteful behaviour;

• Attributable - the activity measured must be capable of being influenced by actions which can be attributed to the organisation, and it should be clear where accountability lies;

• Well-defined - with a clear, unambiguous definition so that data will be collected consistently, and the measure is easy to understand and use;

• Timely – producing data regularly enough to track progress, and quickly enough for the data to still be useful;

• Reliable - accurate enough for its intended use, and responsive to change;

• Comparable with either past periods or similar programmes elsewhere; and

• Verifiable, with clear documentation behind it, so that the processes which produce the measure can be validated.

Page 26

The basics of target setting and performance measurement

The purpose of targets

• A clear statement of what you are trying to achieve

• A clear sense of direction

• A focus on delivering results

• A basis for monitoring what is and is not working

• Better accountability

Getting the best out of targets

• Not too many…

• Followed through consistently…

• Real measure of success…

• Which are owned by deliverers

Put the target in context

• A target is often a proxy for a desired result. This should be a constant consideration to avoid perverse incentives.

• E.g. Reducing accident and emergency waiting times to four hours is a target aimed at delivering:

‒ Faster treatment

‒ Better health outcomes

‒ Better patient experience of the service

Use targets to rethink activity by working through chains of causation

• What is the problem to be addressed?

• What is known to work?

• How can what works be progressed by Government intervention?

• Who are the real delivery agents?

