Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890

Sponsored by the U.S. Department of Defense
© 2002 by Carnegie Mellon University

Developing Enterprise-Wide Measures for Tracking Performance of Acquisition Organizations

Wolfhart Goethert
Report Documentation Page (Standard Form 298, OMB No. 0704-0188)
Report date: January 2003. Performing organization: Carnegie Mellon University, Software Engineering Institute, Pittsburgh, PA 15213. Distribution: approved for public release; distribution unlimited. 28 pages; unclassified.
Purpose of Overall Effort
Develop a methodology to define enterprise-wide measures that reflect the "health" of a government organization that supports acquisition.

Apply the methodology to ensure alignment between the enterprise-level goals of an organization and the measures used to characterize that organization's performance.

Use these measures as a guide to the organization's overall performance and performance-improvement effort.
Overview Outline
Methodology
Major components
• BSC
• GQ(I)M

Example use
• Initial measurement areas
• Indicators
Summary
Methodology Overview

Clarify mission & vision statement (Mission, Vision)
→ Develop Strategic Goals
→ Derive Sub-Goals
→ Map sub-goals to each quadrant of the Balanced Scorecard
   (Customer, Financial, Internal Business, Learning & Growth — each with its sub-goals)
→ For each BSC quadrant, apply GQ(I)M to:
   - identify measurement areas
   - develop measurement goals
   - pose relevant questions
   - postulate indicators
   - identify data elements
→ yielding indicators (e.g., trouble reports by module) and data elements
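The flow above can be sketched as a small data model: a sub-goal is assigned to one BSC quadrant, then refined into questions, indicators, and the data elements each indicator needs. This is a hypothetical illustration, not part of the methodology's published artifacts; all class and field names are ours.

```python
# Hypothetical sketch: a strategic sub-goal mapped to a BSC quadrant,
# then refined via GQ(I)M into questions, indicators, and data elements.
from dataclasses import dataclass, field

QUADRANTS = ("Customer", "Financial", "Internal Business", "Learning & Growth")

@dataclass
class Indicator:
    name: str
    data_elements: list  # raw data items the indicator is built from

@dataclass
class SubGoal:
    statement: str
    quadrant: str
    questions: list = field(default_factory=list)
    indicators: list = field(default_factory=list)

    def __post_init__(self):
        # Every sub-goal must land in exactly one BSC quadrant.
        if self.quadrant not in QUADRANTS:
            raise ValueError(f"unknown BSC quadrant: {self.quadrant}")

# Example refinement of one sub-goal into a question and an indicator.
goal = SubGoal("Improve timeliness of project completion", "Internal Business")
goal.questions.append("What fraction of projects finish on schedule?")
goal.indicators.append(
    Indicator("On-time completion rate",
              ["planned end date", "actual end date"]))
```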
Major Components
GQ(I)M
• Align measures with goals; ensure the measures selected will be used

Balanced Scorecard
• Ensure the set of measures covers all elements of performance; avoid hidden trade-offs

Process Model of Performance
• Select measures that are most meaningful with respect to selected areas of performance; prefer outcome, then output, measures over process and input measures
A Balanced Scorecard Perspective on Performance

Source: A Management Guide for the Deployment of Strategic Metrics, Raytheon
[Figure: the four Balanced Scorecard quadrants arranged around Vision and Strategy, each quadrant listing Objectives, Measures, Targets, and Initiatives]

CUSTOMER — How do our customers see us?
FINANCIAL — How do we look to shareholders?
INTERNAL BUSINESS PROCESS — What must we excel at?
LEARNING and GROWTH — Can we continue to improve and create value?
Success vs. Progress Indicators

Goal → Success Criteria → Success Indicators
Strategy to accomplish the goal → Tasks (Task 1, Task 2, Task 3, … Task n) → Progress Indicators
Analysis Indicators

[Figure: example charts of "% Test Cases Complete" — by function and by reporting period, planned vs. actual]
Identifying Potential Measures: A Process Model of Performance

Inputs → Process → Outputs → Outcomes

Potential measures at each stage:
• Inputs — resources consumed
• Process — throughput, tasks
• Outputs — products and services
• Outcomes — impact on customer or user
Identifying Potential Measures: A Process Model of Performance

Goal: Increase Customer Satisfaction

• Inputs — dollars spent on customer service training, dollars spent on quality assurance
• Process — number of work product inspections performed, number of tests performed
• Outputs — number of new features released, resolution time for customer service calls
• Outcomes — trends in customer satisfaction survey data, number of defects reported after release
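The guidance to prefer outcome measures, then output measures, over process and input measures can be sketched as a small ranking routine. The categories and example measures come from the slide; the function and dictionary names are our own illustration.

```python
# Rank candidate measures by where they sit in the input -> process ->
# output -> outcome model, preferring outcome measures first (per the
# "prefer outcome, then output" guidance).
CATEGORY_PREFERENCE = ["outcome", "output", "process", "input"]

MEASURES = {
    "trend in customer satisfaction survey data": "outcome",
    "number of defects reported after release": "outcome",
    "number of new features released": "output",
    "resolution time for customer service calls": "output",
    "number of work product inspections performed": "process",
    "number of tests performed": "process",
    "dollars spent on customer service training": "input",
    "dollars spent on quality assurance": "input",
}

def rank_measures(measures):
    """Sort candidate measures so the most meaningful (outcome) come first."""
    return sorted(measures, key=lambda m: CATEGORY_PREFERENCE.index(MEASURES[m]))

ranked = rank_measures(MEASURES)
```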
Balanced Scorecard Perspective: A Multi-Dimensional View

Vision and Strategy, surrounded by four perspectives:
• Financial Perspective
• Customer Perspective
• Internal Business Perspective
• Innovation and Learning Perspective

Source: Kaplan and Norton, "Putting the Balanced Scorecard to Work," Harvard Business Review, Sept-Oct 1993
Defining Indicators & Measures Based Upon Goals

GOAL(s)
→ Question 1, Question 2, … Question n
→ Measures: SLOC, staff-hours, trouble reports, milestone dates
→ Indicators [e.g., total SLOC planned vs. actual by week; trouble reports by module]
Overview Outline
Methodology
Major components
• BSC
• GQ(I)M

Example use
• Initial measurement areas
• Indicators
Summary
Organization
Example based on an aggregate of several organizations with similar characteristics:
• Government agency consisting of 300 management, administrative, and technical personnel
• Development, maintenance, and enhancement of system components of fielded systems, and acquisition
Use of Methodology - Example

Mission: Develop, acquire, and maintain integrated software-intensive systems

Measurement workshop purpose:
- Clarify Mission and Vision
- Develop Strategic Goals
- Derive Sub-Goals
- Map sub-goals to each quadrant of the BSC

Sub-goals by quadrant:
• Customer — timeliness, responsiveness, communication, relationship, quality of products, etc.
• Financial — funding stability, delivered costs, etc.
• Internal Business — quality deficiencies, available resources, etc.
• Learning and Growth — enhance staff capability, improve quality, etc.
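The sub-goal-to-quadrant mapping from this example can be held in a plain dictionary. The quadrant and sub-goal names come from the slide; the dictionary layout and lookup function are just one convenient representation, not part of the methodology.

```python
# Sub-goals from the example, keyed by BSC quadrant (names from the slide;
# this dictionary layout is an illustrative sketch only).
sub_goals = {
    "Customer": ["Timeliness", "Responsiveness", "Communication",
                 "Relationship", "Quality of products"],
    "Financial": ["Funding stability", "Delivered costs"],
    "Internal Business": ["Quality deficiencies", "Available resources"],
    "Learning and Growth": ["Enhance staff capability", "Improve quality"],
}

def quadrant_of(sub_goal):
    """Look up which BSC quadrant a sub-goal was mapped to (None if unmapped)."""
    for quadrant, goals in sub_goals.items():
        if sub_goal in goals:
            return quadrant
    return None
```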
Success Criteria

For each Balanced Scorecard quadrant, the strategic sub-goals are paired with success criteria:

BSC Quadrant | Strategic Sub-Goals | Success Criteria
Financial
Internal Business Process
Customer
Learning and Growth
Typical Questions Related to Sub-Goals

Customers' viewpoint:
• What is important to our customer? What are the customers' "hot buttons"?
• How do our customers evaluate timeliness?
• What does the customer consider a quality product? Are there any standards or goals currently set by the customer?
• How, and on what, do our customers currently evaluate our organization?
• Etc.
Initial Measurement Areas

Customer
• Customer satisfaction with delivered product
• Compliance with customer requirements
• On-time delivery

Internal Business
• Availability and capability of resources (staff)
• Status of open deficiencies in delivered projects
• Timeliness of project completion

Innovation & Learning
• CMM level
• Trend in employee satisfaction
• Meeting functional requirements

Financial
• Funding stability
• Trend in expenses
Measurement Areas to Indicators

For each quadrant (Customer, Financial, Internal Business, Learning & Growth), the GQ(I)M methodology turns each measurement area into indicators (e.g., trouble reports by module).
Internal Business: Status of Open Deficiencies in Delivered Projects

Number of deficiencies that have been open x days:

Severity Level | x < 30 | 30 < x ≤ 60 | 60 < x ≤ 90 | x > 90 | Totals
Severity 1     |    2   |      1      |             |        |    3
Severity 2     |    3   |      1      |      1      |        |    5
Severity 3     |    3   |      2      |      1      |    1   |    7
Severity 4     |    4   |      3      |      3      |    2   |   12
Severity 5     |    8   |      6      |      3      |    3   |   20
Totals         |   20   |     13      |      8      |    6   |   47
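The aging table above can be rebuilt from raw records with a few lines of code: each open deficiency carries a severity level and the number of days it has been open. This is our own sketch of the computation; the bin boundaries follow the table headers, and the record format is assumed.

```python
# Count open deficiencies by (severity, age bin), as in the aging table.
# Record format (severity, days_open) is an assumption for illustration.
from collections import Counter

def age_bin(days_open):
    """Map days-open to the table's column bins."""
    if days_open < 30:
        return "x < 30"
    if days_open <= 60:
        return "30 < x <= 60"
    if days_open <= 90:
        return "60 < x <= 90"
    return "x > 90"

def aging_table(deficiencies):
    """Count open deficiencies per (severity, age-bin) cell."""
    table = Counter()
    for severity, days in deficiencies:
        table[(severity, age_bin(days))] += 1
    return table

# Small made-up sample, not the slide's data.
sample = [(1, 10), (1, 12), (1, 45), (4, 100), (4, 95)]
t = aging_table(sample)
```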
Internal Business: Availability and Capability of Resources (Staff)

[Figure: staffing tables by fiscal year (FY 99, FY 00, FY 01) showing count and percentage at Entry Level, Journeyman, and High Grade, each measured against a GOAL; pie chart of staff mix — E&S 45%, Tech 15%, Other 40%]
Internal Business: Timeliness of Project Completion

[Figure: stacked bar chart of the number of completed projects per reporting period (Period 1, Period 2, Period 3), each bar split into three categories — on time or early; exceeded original schedule by less than 10%; exceeded original schedule by more than 10% — with counts and percentages shown for each segment]
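The three categories in the chart's legend amount to a simple schedule-variance rule. The function below is our sketch: the category names come from the slide, the 10% boundary handling (exactly 10% counted as "more than") is an assumption.

```python
# Categorize a completed project the way the stacked-bar chart does.
# Boundary handling at exactly 10% overrun is an assumption.
def schedule_category(planned_days, actual_days):
    overrun = (actual_days - planned_days) / planned_days
    if overrun <= 0:
        return "on time or early"
    if overrun < 0.10:
        return "exceeded schedule by less than 10%"
    return "exceeded schedule by more than 10%"
```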
Summary of Initial Results

Balanced Scorecard quadrants with their initial measures:

• Customer — satisfied with delivered product; compliant with requirements; on-time delivery
• Financial — funding stability; trend in expenses
• Internal Business Process — availability and capability of staff; status of open deficiencies; timeliness of project completion
• Learning and Growth — trend in employee satisfaction; meeting functional requirements; CMM level

[Figures: the open-deficiency aging table; a bar chart of number of organizations by SA-CMM level (1-5); a compliance-with-customer-requirements table (total systems, full compliance, partial compliance — count and percentage); an expense breakdown (personnel, contract services, travel, purchases, misc., training)]
Indicator Documentation

INDICATOR TEMPLATE (Measurement Goal #_____)
• Objective
• Questions
• Visual Display
• Input(s) / Data Elements
• Algorithm
• Form(s)
• Responsibility for Reporting
• Interpretation
• Probing Questions
• Evolution
• Assumptions
• X-reference

Documents the why, what, who, when, where, and how.
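One way to capture the template so each indicator's why/what/who/when/where/how is recorded uniformly is a plain dataclass. The field names mirror the template's sections; the class itself and the filled-in example content are our own sketch, not from the source.

```python
# Sketch of the indicator template as a dataclass; field names follow the
# template's sections, example content below is invented for illustration.
from dataclasses import dataclass, field

@dataclass
class IndicatorTemplate:
    measurement_goal: str
    objective: str
    questions: list
    visual_display: str                 # e.g. "table of deficiencies by severity and age"
    inputs: list                        # data elements feeding the indicator
    algorithm: str                      # how the values are computed
    forms: str = ""                     # data collection form(s)
    responsibility_for_reporting: str = ""
    interpretation: str = ""            # how readers should interpret the display
    probing_questions: list = field(default_factory=list)
    assumptions: str = ""
    x_reference: str = ""               # cross-references to related indicators
    evolution: str = ""                 # expected changes as the indicator matures

# Illustrative instance (content invented for the example).
open_deficiencies = IndicatorTemplate(
    measurement_goal="Track status of open deficiencies",
    objective="Show how long deficiencies stay open, by severity",
    questions=["How many severity-1 deficiencies are open past 90 days?"],
    visual_display="table: severity level vs. days-open bins",
    inputs=["deficiency severity", "date opened", "current date"],
    algorithm="count open deficiencies per (severity, age-bin) cell",
    responsibility_for_reporting="project measurement lead",
)
```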
Overview Outline
Methodology
Major components
• BSC
• GQ(I)M

Example use
• Initial measurement areas
• Indicators
Summary
Summary
The approach, using the BSC and GQ(I)M, provides a systematic way to obtain indicators and measures that reflect the health and performance of the organization.

The approach uses an organization's vision and mission statements to identify and clarify strategic goals and sub-goals.

The sub-goals are mapped to the balanced scorecard.

The GQ(I)M methodology is then used to identify measures and indicators.

Bottom line: We tried it; it worked; we are now maturing the methodology.
[Figure: methodology recap — clarify mission & vision statement → develop strategic goals → derive sub-goals → map sub-goals to each quadrant of the Balanced Scorecard (Customer, Financial, Internal Business, Learning & Growth) → for each BSC quadrant, apply GQ(I)M to identify measurement areas, develop measurement goals, pose relevant questions, postulate indicators, and identify data elements → indicators and data elements]
Back-up Material
Some Definitions

Performance Management
"The use of performance measurement information to help set agreed-upon performance goals, allocate and prioritize resources, inform managers to either confirm or change current policy or program directions to meet those goals, and report on the success in meeting those goals."

Performance Measurement
"A process of assessing progress towards achieving predetermined goals, including information on [efficiency, quality, and] outcomes…."

Source: "Serving the American Public: Best Practices in Performance Measurement," June 1997.