Bootstrapping Process Improvement Metrics:
CMMI Level 4 Process Improvement Metrics in a Level 3 World

Jairus Hihn, Scott Morgan, Scott Lewicki
Jet Propulsion Laboratory, California Institute of Technology

PM Challenge 2011, February 9-10, 2011
Long Beach, CA
Background
The Jet Propulsion Laboratory (JPL) is a Federally Funded Research & Development Center (FFRDC) operated by the California Institute of Technology for the National Aeronautics and Space Administration (NASA).
JPL has around 5,000 employees and an annual budget of approximately $1.7B.
As part of the NASA team, JPL enables the nation to explore space for the benefit of humankind by developing robotic space missions to:
• Explore our own and neighboring planetary systems.
• Search for life beyond the Earth's confines.
• Further our understanding of the origins and evolution of the universe and the laws that govern it.
• Enable a virtual presence throughout the solar system using the Deep Space Network, evolving it into the Interplanetary Network of the future.
Background (cont.)
The Software Quality Improvement (SQI) Project at JPL has been in existence for about 8 years, and with increasingly tight budgets more and more managers want to know:
• Aren't you done yet?
• What are we getting for all of this money?
Textbooks and the Carnegie Mellon Software Engineering Institute (SEI) often promote measuring changes to well-defined baseline indicators, pre and post the process change.
• For example, use of process control charts and measuring changes in the control limits.
• This type of approach works well for CMMI Maturity Level 4 & 5 organizations.
SQI is trying to perform 'like' a CMMI Level 4 organization
• Setting priorities using rigorous statistics where appropriate
• Capturing product knowledge systematically as well as process knowledge
• Using data to guide day-to-day decisions
When in reality JPL is assessed at CMMI Level 3
• We do our best to capture product and process knowledge
• We use data to guide decisions when we can
• There are pockets of Level 1 and 2 behavior
− Tools are not standardized
− A great deal of flexibility is permitted to the projects
− Data is inconsistent
So What Can We Do?
• Put infrastructure in place
• Drive decisions with objective information and data when available
• Use a multi-pronged approach
− We do rely heavily on self-reports
• Communicate results as widely as possible
− State of Software Report
− Noontime seminars
− Management reviews
− Section manager quarterly meetings
Key Questions
• What does our software world look like?
− How much software is there and what are its characteristics?
− How many software engineers are there and what are their characteristics?
• How are we doing?
− How are our projects doing? (Documenting baselines and trends)
− Are we following our processes?
• How can we improve?
− Where should we invest limited resources for process and product improvement? (Identifying weaknesses; deriving impact measures)
• How is SQI doing?
− Are we helping our people do a better job? (Customer feedback; recommendations tracking)
Data Sources
• Metrics collection at key milestones
• Software Inventory (2006, 2007, and 2009)
• Process performance measures (Tailoring Record, Work Product Checklist)
• Defect tracking systems (PRS, ISA, AAMS, local databases)
• JPL Human Resources Information System
• SQI Surveys and Contact Log
• Product and Process Quality Assurance activities
• Customer feedback
Data is gathered from virtually all mission software teams
• We would like to thank the several hundred people who gave their valuable time to make this data available
How are we doing? Are we following our processes?
Establish Policies, Standards and Processes
[Diagram: The SDR, CMMI, the Flight Project Practices (FPP), other JPL & NASA standards, corrective action notices, failure reports, lessons learned, and JPL best practices feed the Software Development Standard Processes (SDSPs) and their tailoring instructions. The SDSPs cover SW management, engineering, support, and assurance processes, and are supported by procedures, classes, seminars, examples, sample text, compliance matrices, and a measurement repository. From these, projects derive local procedures, project requirements, project standards (e.g., SCD), templates, and project SW plans and procedures.]
Software Development Standard Processes (SDSP) Applicability
The SDSPs describe how mission software tasks are expected to perform their software development activities.
• All mission software tasks must start with the SDSPs as a basis for the activities they will perform.
• Software tasks then modify the SDSPs to fit their particular task, based on task characteristics such as size, risk, domain, etc.
• Task-specific modifications of the SDSPs must follow published procedures. The modifications are reviewed by task management, line management, Software Quality Assurance (SQA), and SQI, and are then approved by the Process Owner.
Process Performance
Process Performance questions:
• Are we following our processes?
• How does process performance vary by software characteristics?
• What are the least performed processes?
• What process areas should be targeted for improvement?

Process performance is measured by responses to the:
• Tailoring Record (TR)
• Work Product Checklist (WPC)
• Tailoring Record Review (TRR)
• Software Process Review (SPR)
• Product and Process Quality Audits (PPQA)
Measuring Process Performance
• The WPC provides a quick look at whether the 60 key products identified in the SDSPs are being developed.
• The TR and TRR provide risks, strengths, and recommendations at the planning stage.
• The SPR asks, based on the processes you planned to use, "How are things working for you?" Specifically:
− Are your processes effective? (That is, did you accomplish the process objectives?)
− Have any processes been descoped?
− Are your resources adequate? (That is, were resources adjusted significantly from plans?)
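As a rough illustration of the WPC "quick look" described above, the tally can be sketched as follows. The checklist contents and helper function here are hypothetical stand-ins, not JPL's actual work products or tooling:

```python
# Illustrative sketch (not JPL's actual tooling): tallying Work Product
# Checklist (WPC) responses to get a quick look at how many of the key
# SDSP work products a task reports producing (real WPCs cover 60).

def wpc_summary(responses):
    """responses: dict mapping work-product name -> True/False (produced?)."""
    produced = sum(1 for done in responses.values() if done)
    total = len(responses)
    return produced, total, round(100.0 * produced / total, 1)

# Hypothetical checklist for one task.
checklist = {"SW Development Plan": True,
             "Requirements Document": True,
             "Test Plan": False,
             "Metrics Report": False}

produced, total, pct = wpc_summary(checklist)
print(f"{produced}/{total} key products produced ({pct}%)")
```

Rolling such per-task summaries up across all tasks is what yields the "least performed activities" view discussed on the following slides.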
Summary of Risks/Strengths/Recommendations from TRRs
[Table: risks, strengths, and recommendations identified in TRRs, tallied against 29 common JPL risk areas spanning requirements, architecture, design, coding, testing, delivery, configuration management, quality assurance, formal decisions, estimation, budget and schedule, inherited software, testbeds, measurement, reviews, risk management, contractor acquisition and surveillance, safety-critical software, and task monitoring and control. Totals: 3 risks, 1 strength, 25 recommendations.]
Software Process Review Results
TRR and SPR Results
• The SPR is the heart of the process review activities, identifying the problems tasks are having in implementing the SDSPs.
− The SPR uses the same risk areas as the TRR.
• Half of the risk areas indicate no areas of concern.
• The identified problems are evenly distributed over the remaining risk areas.

Decision: As there is no clear pattern to the identified problems, no action is required. We will continue the SPRs because they engage the projects in constructive conversations related to their processes.
Measuring Process Performance
• Overall performance is used to establish baselines.
• Identifying the least performed activities provides a short list from which stakeholders can be engaged to discuss process improvement opportunities.
• Metrics use is one of the least performed activities.

Decision: We started Metrics Coaching as a focused consulting activity; results are reported later under impact metrics.
How are we doing? Baselines and Trends
Baselines and Trends
How are we doing?
• Trends suggest improvements in process discipline.
• Projects following a disciplined process exhibit below-average budget growth and higher productivity rates.

[Chart: Project Flight Software]
Baselines and Trends (cont.)
• We have done our best to track these for the last 6-7 years.
• They are primarily documented in the State of Software Report.
• The "Effort Growth from PDR" chart has had a significant impact on managers' perceptions.
• However, these types of metrics change slowly and are impacted by many factors.

Decision: We introduced an approach to quick, short-term impact indicators based on Customer Contact and Recommendations Tracking.
What does our software world look like?
How Much Software?
There are approximately 42 million SLOC in development or actively being maintained at JPL (mission SW only).
• 250 software products/tasks
− 73% of tasks are small
− 50% is in maintenance
− 90% of the code is ground software
How Much Software? (cont.)
Our data indicates that the majority of software tasks were small ground maintenance tasks, while we were focusing on large flight development tasks.

Decision: We modified our focus to include addressing the needs of small tasks.
What are Primary Languages?
• The major languages in use are C, Java, FORTRAN, and C++.
• Other data indicates that Fortran is all legacy and that Java use is growing.
What are Primary Languages? (cont.)
So if we want to introduce new language-specific tools to impact software quality, the data enabled us to identify that C and Java tools should be addressed first.
Decision: We defined focused consulting tasks to introduce static code analyzers to the software community:
• Coverity Prevent for C
• FindBugs for Java
The Software Community
How many software engineers are there and what are their characteristics?
• There are approximately 900 people in the software community.
• Less than 50% have formal software degrees.
Decision: JPL is implementing a software certification program for flight software developers to make certain they have the appropriate formal software training.
Is SQI effectively meeting its stakeholders' needs?
Customer Feedback
Customer interaction and feedback is collected via:
• Annual customer survey
• Systematic tracking of contacts and recommendations for process improvement

In addition, feedback is collected through the Tailoring Records, Tailoring Record Reviews, and Software Process Reviews, as previously described.
SQI Annual Survey
• The survey provides important information as to the primary roles being performed in the software community.
• The survey indicates that less than one-third of the software community understands the SDR and SDSPs.

Decision: We set a goal to increase awareness of the process requirements, and of the extensive workaids that are available, by 10%.
Recommendations Tracking
• Indicates whether a recommendation or significant support was useful or not useful.
• Reported in Monthly Management Reviews (MMRs) along with midyear and end-of-year reviews.
• The issue is that surveys are highly subjective reports and do not indicate clear impact on major metrics.
Recommendations Tracking (cont.)
• One of SQI's most important products is the advice and recommendations provided to SW tasks. These originate from many sources.
• SQI has been tracking the outcome of the recommendations it makes to SW tasks for over 2 years.
• The outcome is evaluated by the person receiving the recommendation as:
− Useful
− Somewhat useful
− Not useful (either not value added or too costly)
− Institutional benefit
Recommendation Tracking (cont.)
367 recommendations have been recorded to date.
Recommendation Usefulness based on Source
Recommendations Tracking (cont.)
• Approximately 30% of recommendations are rated Not Useful.
• The main concern is with those seen as not value added, as they potentially represent 'bad' approaches which should be modified or dropped.
• Not surprisingly, the data indicates that when we address a direct customer request, the task almost always finds it useful.
• However, when unsolicited changes in behavior are proposed, they are perceived as not useful 25% to 40% of the time. These will never be zero, but they can be reduced.
Decision: We instructed our people to take the time to better understand a task's environment and to identify changes that will be useful to the project. In other words, do not make suggestions just because the SDSPs or CMMI say so. (We have observed a decrease in not-useful responses over the past year.)
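The by-source rollup behind the usefulness chart above can be sketched as below. The counts are hypothetical stand-ins chosen only to illustrate the pattern (the deck reports 367 recommendations with roughly 30% rated not useful overall), not the actual SQI data:

```python
# Illustrative sketch with hypothetical counts: rolling up recommendation
# outcomes by source to compare not-useful rates, as in the
# "Recommendation Usefulness based on Source" chart.

from collections import Counter

def usefulness_rates(outcomes):
    """outcomes: list of (source, rating) pairs; returns % not useful per source."""
    totals, not_useful = Counter(), Counter()
    for source, rating in outcomes:
        totals[source] += 1
        if rating == "not useful":
            not_useful[source] += 1
    return {s: round(100.0 * not_useful[s] / totals[s], 1) for s in totals}

# Hypothetical outcome log.
outcomes = [("customer request", "useful"),
            ("customer request", "useful"),
            ("customer request", "somewhat useful"),
            ("unsolicited", "useful"),
            ("unsolicited", "not useful"),
            ("unsolicited", "somewhat useful"),
            ("unsolicited", "not useful")]

print(usefulness_rates(outcomes))
```

With these stand-in counts, unsolicited recommendations show a much higher not-useful rate than direct customer requests, which is the pattern the slide describes.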
Impact Measures
Each SQI team selected one primary task for which to define and derive an impact metric. The metric traces from the team's success criteria:
• Measurement Estimation & Analysis (MEsA) Team, focusing on Metrics Coaching
• Assets & Infusion (A&I) Team, focusing on the TRR and SPR
• CMMI Team, focusing on recommendations from appraisal preparation and on improving appraisal efficiency

Results are reported in MMRs along with midyear and end-of-year reviews. The issue is that this is currently less quantitative than originally intended.
Impact Measures (cont.)
Using Metrics Coaching as an example:
• A focused consulting activity where we proactively work with tasks to infuse best practices.
• In this case, our goal was to increase metrics use by tasks, based on each manager's area of concern.
• We assisted tasks with collecting and analyzing the identified metrics.
• 28 tasks were involved initially.
• Metrics plans and actual usage were tracked for 14 tasks.
• 5 of these tasks participated in the CMMI appraisal and had high performance rates.
Impact Measures (cont.)
Metrics Coaching:
• Success criterion: the number of tasks with defined metrics covering all metrics categories, as appropriate to their lifecycle, increases by 50%.
• Recommendations:
− 25 metrics coaching recommendations to date.
− Success criterion: 75% of recommendations are at least somewhat useful.
− 21 recommendations have been closed, with 86% rated useful or somewhat useful (U/SU).
• Results:
− Tasks we worked with increased the metrics identified in their plans and exceeded the goal.
− Actual use of metrics just barely met the goal.
− However, if we remove the assessment tasks, implementation drops to 30%.
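The 86% U/SU figure above is consistent with 18 of the 21 closed recommendations having been rated useful or somewhat useful. A quick check of the success criterion, with that split assumed (the slide itself gives only the closed count and the percentage):

```python
# Back-of-the-envelope check of the recommendation success criterion.
# The 18/21 split is an assumption consistent with the reported 86%.

closed = 21               # recommendations closed to date (from the slide)
useful_or_somewhat = 18   # assumed: 18/21 ~= 86% rated useful or somewhat useful
threshold = 0.75          # success criterion: >= 75% at least somewhat useful

rate = useful_or_somewhat / closed
print(f"U/SU rate: {rate:.0%} (criterion met: {rate >= threshold})")
```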
Impact Measures (cont.)
• Documented metrics use was two-thirds of planned metrics.
− If we remove the tasks involved in the CMMI appraisal, implementation drops to 30%, far short of the success criteria.
− One task even went backward.
• Specific task impacts:
− Project 1: Included more requirements metrics in MMRs than they otherwise would have; agreed to more extensive reuse tracking, but final results are not in.
− Project 2: Added a bull's-eye chart for tracking cost and schedule at their level.
− Project 3: Made significant changes to JIRA to improve tracking, but final results are not in.
• So while this approach worked well for an earlier focused consulting task emphasizing the use of static code analysis tools, it did not work well for this type of activity.

Decision: We terminated the metrics coaching task, as it was expensive with little impact.
Wrap Up
• We are struggling to get hard quantitative indications of the impact of SQI and process improvement, but we are making progress.
• It is better to move forward with 'measurement' as best you can, because:
− Partial results cause customers to ask for more and to be more willing to provide assistance.
− We better understand the barriers.
− We are learning how to do it better.
Acknowledgements and Reports Available
Detailed report is available on the NASA Engineering Network under the Software Engineering/Software Process Asset Library (PAL)
R. Gostelow, J. Hihn, et al., JPL Mission Software: State of Software Report 2008 External Release, JPL D-29114, 2008.
This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration. © 2010 California Institute of Technology. Government sponsorship acknowledged.
This work would not have been possible without the support of John Kelly in the NASA Office of the Chief Engineer and the JPL Project and Engineering Management Council.