Developing Forward-Looking Metrics and Reporting

Post on 22-Nov-2014


Description

Break your business unit out of the cycle of wanting to be more forward-looking but never actually developing metrics or producing reporting to support that goal. Learn best practices and hear practical advice for developing forward-looking metrics across the complete life cycle of metric development, including: metric creation and validation, building awareness and acceptance among leadership, standardization and refinement, and integration into existing reporting.

Transcript

Developing Forward-Looking Metrics and Reporting

Jeff Horon and Mike Yiu, University of Michigan Medical School

Grant Review and Analysis Office

May 2011

About Us

Both – Business Analysts, U-M Medical School

– Engage in metric and reporting design and econometric and financial modeling; design ad hoc and standardized reporting describing the Medical School’s research enterprise

Jeff – MBA with emphases in Strategy and Finance, Ross School of Business

– Formal training in decision support

Mike – Bachelors in Economics, U-M

Outline

What are metrics anyway?

So, what’s the problem?

Oh great, a problem, what’s the solution?

Case study

What are metrics?

Quantitative values that measure and distill the real world

a.k.a. Key Performance Indicators (KPIs), performance measures, etc.

Carry a connotation of monitoring and control


Why engage in analysis and reporting?

Decision Support!

Ad hoc analysis explicitly supports a decision

Metrics and reporting often implicitly support decisions, specifically:

How are we doing with respect to _____?

Why engage in analysis and reporting?

Metrics and reporting repeatedly draw and refocus managerial attention over time:

How are we doing with respect to _____?

How much attention do I need to pay to _____?

[Chart: time series with regions labeled ‘Good’, ‘Ok’, and ‘Bad’]

So, what’s the problem?

Most metrics and reporting describe the past or present, so you may miss opportunities for ‘course correction’ or you may find yourself in trouble before you detect it!

[Chart: regions labeled ‘Good’, ‘Ok’, and ‘Bad’; the only clear view is backward-looking, with the future a question mark]

Reporting Maturity

Backward-looking reporting asks “How did we do?” and might imply corrective action in situations where ‘it pays to correct your mistakes.’

Better reporting provides context about the present and assists in decision support.

The best reporting includes forward-looking views that enable proactive decision-making (Decision Support! Decision Support! D-e-c-i-s-i-o-n S-u-p-p-o-r-t!)

Problem

Every business unit wants to be forward-looking

Most reporting only provides a backward-looking view; we only have data about events that have already happened.

How do you break out of this cycle?

Solution

Breaking out requires creating and developing forward-looking views, using available data to create expectations about the future.

[Timeline: Past | Today | Future]

Solution

Forward-looking views may allow for ‘course correction’

[Chart: forward-looking view with regions labeled ‘Good’, ‘Ok’, and ‘Bad’; question marks over the future suggest a possible ‘course correction’]

Implementation – How?

Unfortunately, it’s not easy, but we hope to make it tractable. It will require sustained commitment across several critical steps in the development life cycle:

Create good metrics

Diagnose ‘bad’ metrics

Build awareness and acceptance

Standardize and refine

Integrate with existing reporting

Ask: What do we want to achieve / reward?

(Not: What data do we have available?)

Set good goals (a topic unto itself – SMART Framework: Specific, Measurable, Achievable, Relevant, Time-Bound)

Case Study: The Grant Awards ‘Pipeline’

Create good metrics

Diagnose ‘bad’ metrics

Build awareness and acceptance

Standardize and refine

Integrate with existing reporting

Case Study: Grant Awards ‘Pipeline’

Designed to be a forward-looking view of financial sustainability

Original construction: all future award commitments for the next fiscal year or multiple years, divided by historical commitments

[Timeline: Historical | Future commitments over time]
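The original construction can be sketched in a few lines of Python. This is an illustrative sketch with made-up dollar figures, not the authors’ actual implementation:

```python
# Sketch of the original pipeline metric: committed future award dollars
# divided by historical commitments. Figures below are hypothetical.

def pipeline_ratio(future_commitments, historical_commitments):
    """Sum of future award commitments over sum of historical commitments."""
    return sum(future_commitments) / sum(historical_commitments)

# Hypothetical department: $9M committed for the next fiscal year,
# $10M committed over the comparable historical period.
ratio = pipeline_ratio([9_000_000], [10_000_000])
print(round(ratio, 2))  # 0.9
```

A ratio near or above 1.0 would suggest the department’s committed future funding is keeping pace with its history; the case study below shows why this simple construction misbehaves.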

What do we want to achieve / reward?

Create good metrics

Diagnose ‘bad’ metrics

Build awareness and acceptance

Standardize and refine

Integrate with existing reporting

Test, test, test!

Do the metrics reward desired outcomes?

Are the metrics stable? Balanced?

Upon testing, the metric proved to be idiosyncratic – departments tended to meet the goal in alternating ‘flip-flop’ patterns

Case Study: ‘Flip-Flop’ Pattern

[Scatter plot: FY2008 Pipeline Metric vs. FY2007 Pipeline Metric – Pipeline Ratio, Current Fiscal Year vs Prior Fiscal Year]


Diagnosis

By stating the goal relative to the prior year, the Pipeline Ratio could reward poor performance in the prior year and punish good performance in the prior year

Mechanics

When a department performs well, the goal for next year becomes more difficult

When a department performs poorly, the goal for next year becomes easier

These effects combine to flip-flop likely payout structures each year (confirmed by statistical testing):

Year 1                   Meets Goal?   Year 2                Meets Goal?
Department Above Goal    Yes           More Difficult Goal   Less Likely
Department Below Goal    No            Easier Goal           More Likely

Case Study: ‘Flip-Flop’ Pattern

                 Coefficient   P-value
One Year           -0.6896     0.0099
Multiple Years     -0.6768     0.0014
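The kind of check behind the coefficients above can be sketched as follows. The correlation function is standard Pearson correlation; the department values are synthetic, chosen only to illustrate the inverse year-to-year relationship (they are not the study’s data):

```python
# Sketch of confirming the flip-flop statistically: correlate each
# department's current-year pipeline metric with its prior-year value
# and look for a negative relationship.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic flip-flop: departments high in FY2007 come in low in FY2008,
# and vice versa.
fy2007 = [1.1, 0.4, 0.9, 0.3, 1.0, 0.5]
fy2008 = [0.5, 1.0, 0.4, 1.1, 0.3, 0.9]

print(round(pearson(fy2007, fy2008), 2))  # strongly negative
```

A clearly negative coefficient, as in the table above, is what signals that the goal construction itself is driving alternation rather than measuring sustained performance.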

Case Study: Sawtooth Pattern

And there was an additional idiosyncrasy:

Diagnosis

Underlying event patterns will cause the pipeline metric to move in cyclical patterns

Mechanics

Awards are often funded for multiple years, with abrupt endpoints

Renewal does not occur until an award is expiring

These effects produce sawtooth patterns in the pipeline metric, with rapid increases driven by large new awards or renewals, followed by gradual decline over the life of the awards:

[Chart: pipeline metric over years – jumps at ‘New Award’ and ‘Renewal’, with linear decline in pipeline between them]
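A toy simulation reproduces the sawtooth. The assumptions here are ours, for illustration only: a single multi-year award with a constant annual amount, renewed the moment it expires:

```python
# Toy model of the sawtooth: remaining future commitments jump when an
# award starts (or is renewed), then decline linearly as each funded
# year is consumed.

def remaining_commitments(award_years, annual_amount, horizon):
    """Future commitments at the start of each year for one award,
    renewed for another `award_years` the moment it expires."""
    series = []
    remaining = award_years
    for _ in range(horizon):
        if remaining == 0:          # award expired: renewal kicks in
            remaining = award_years
        series.append(remaining * annual_amount)
        remaining -= 1              # one funded year is consumed
    return series

print(remaining_commitments(award_years=3, annual_amount=1.0, horizon=7))
# [3.0, 2.0, 1.0, 3.0, 2.0, 1.0, 3.0]
```

Any ratio built on these remaining commitments inherits the same cyclical shape, independent of how well the department is actually performing.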

Case Study: Rewarding Desired Outcomes

Problematically, the highest performing department by absolute measures fell below goal with respect to the ratio:

[Scatter plot: FY2008 Pipeline Metric vs. FY2007 Pipeline Metric – Pipeline Ratio, Current Fiscal Year vs Prior Fiscal Year, highlighting departments that outperformed]

What do we want to achieve / reward?

Create good metrics

Diagnose ‘bad’ metrics

Build awareness and acceptance

Standardize and refine

Integrate with existing reporting

Test, test, test!

Do the metrics reward desired outcomes?

Are the metrics stable? Balanced?

‘Sell’ the idea

Case Study: ‘Selling’ a Refined Pipeline

‘Sold’ by presentation and quasi-white paper

Packaged with a solution – the pipeline metric was refined to:

– Utilize one future year of data to mitigate the sawtooth pattern

– Compare this one future year against the same observation one year earlier (to recognize single-year growth)

– Balance it with a longer-term growth measure (to recognize multiple-year growth, mitigating the effect of punishing good historical performance)
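The refined construction can be sketched as below. The helper names, dollar figures, and the choice to annualize the longer-term measure are our assumptions for illustration; the slides describe the measures, not their exact formulas:

```python
# Sketch of the refined pipeline measures (hypothetical figures).

def single_year_growth(future_year_now, future_year_prior):
    """One future year of commitments vs. the same one-year-ahead
    observation taken a year earlier (recognizes single-year growth)."""
    return future_year_now / future_year_prior - 1.0

def multi_year_growth(commitments_now, commitments_years_ago, years):
    """Annualized growth over a longer window, so one strong prior year
    does not turn into a punishing goal."""
    return (commitments_now / commitments_years_ago) ** (1.0 / years) - 1.0

# Hypothetical department: next-year commitments grew from $10M to $11M,
# and total commitments grew from $8M to $12M over four years.
print(round(single_year_growth(11e6, 10e6), 3))   # 0.1
print(round(multi_year_growth(12e6, 8e6, 4), 3))
```

Comparing like-for-like observations a year apart removes the sawtooth from the comparison, and the multi-year measure keeps a strong year from flipping into a penalty.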

How much refinement?

High Value: high impact (high per-unit stakes, high volume), repeated use

Low Value: low impact (low per-unit stakes, low volume), not repeated

What do we want to achieve / reward?

Create good metrics

Diagnose ‘bad’ metrics

Build awareness and acceptance

Standardize and refine

Integrate with existing reporting

Test, test, test!

Do the metrics reward desired outcomes?

Are the metrics stable? Balanced?

‘Sell’ the idea

Into existing reporting

Alongside existing reporting

Practical Implementation

[Diagram: weighing value added against cost and complexity]

Practical Implementation

[Diagram: value added vs. cost for ‘Back of Envelope’, ‘Sketching’, and ‘Low-Fi Prototyping’ approaches]

Case Study: Related Metrics and Reporting

Sustained commitment and insights from metric testing, combined with forecasting techniques, awareness-building, refinement, and standardization, have led to related new metrics and reporting…

[Charts: Historical and Probable Future Commitments over time; Trend Analysis over time]

Case Study

… available ‘on demand’ in a production reporting environment

Historical and Probable Future Commitments with Gap-to-Trend Prescriptive Analysis

Recap

What do we want to achieve / reward?

Create good metrics

Diagnose ‘bad’ metrics

Build awareness and acceptance

Standardize and refine

Integrate with existing reporting

Test, test, test!

Do the metrics reward desired outcomes?

Are the metrics stable? Balanced?

‘Sell’ the idea

Into existing reporting

Alongside existing reporting

Case Study: Future

Continue to advance the reporting maturity of additional metrics

Improve visualization tactics to describe uncertainty

Q&A

Jeff Horon – jhoron@umich.edu – http://jeffhoron.com

Mike Yiu – myiu@umich.edu

Forecasting: http://jeffhoron.com/2011/02/forecasting/