March 2004, Chapter 4 – R. S. Pressman, SRIMCA
Software Process and Project Metrics
Outline:
- In the software metrics domain:
  - product metrics
  - project metrics
  - process metrics
- Software measurement:
  - size-oriented metrics
  - function-oriented metrics
- Metrics for software quality
Measure, Metrics, and Indicator
Measure -- Provides a quantitative indication of the extent, amount, dimensions, capacity, or size of some product or process attribute.
Metrics -- A quantitative measure of the degree to which a system, component, or process possesses a given attribute.
Software Metrics -- refers to a broad range of measurements for computer software.
Indicator -- a metric or combination of metrics that provides insight into the software process, a software project, or the product itself.
In the Process and Project Domains
Process Indicators
- enable insight into the efficacy of an existing process
- assess the current work status
- Goal: long-term software process improvement

Project Indicators
- assess the status of an ongoing project
- track potential risks
- uncover problem areas before they go "critical"
- evaluate the project team's ability to control product quality
Process Metrics and Software Process Improvement
(Figure: the software process sits among the factors that determine its efficacy)
- Customer characteristics
- People
- Project
- Business conditions
- Technology
- Process
- Development environment
Measurement
What to measure?
- errors uncovered before release
- defects delivered to and reported by end users
- work products delivered
- human effort expended
- calendar time expended
- schedule conformance

At what level of aggregation? By individual? By team? By project?
Privacy Issues
Should metrics be used for personnel evaluation? Some issues:
- Privacy?
- Is the total assignment being measured?
- Are the items being measured the same as for other individuals being measured?
- Are the conditions of measurement the same across individuals?
However, metrics can be useful for individual improvement.
Use of Software Metrics
- Use common sense and organizational sensitivity.
- Provide regular feedback to individuals and teams.
- Don't use metrics to appraise individuals.
- Set clear goals and metrics.
- Never use metrics to threaten individuals or teams.
- A problem is not "negative" in itself; the data are merely an indicator for process improvement.
- Don't obsess over a single metric to the exclusion of other important metrics.
- Do not rely on metrics alone to solve your problems.
- Beware of people performing to the metrics rather than to product quality or safety.
Statistical Software Process Improvement (SSPI)
1. All errors and defects are categorized by origin.
2. The cost to correct each error and defect is recorded.
3. The number of errors and defects in each category is counted and ranked in descending order.
4. The overall cost of errors and defects in each category is computed.
5. Resultant data are analyzed to uncover the categories that cost the organization the most.
6. Plans are developed to modify the process with the intent of eliminating (or reducing) the class of errors and defects that is most costly.
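The ranking steps above can be sketched in Python. The defect log (category and cost-to-correct pairs) is hypothetical, included only to illustrate the count-and-rank mechanics:

```python
# Sketch of the SSPI ranking step: defects categorized by origin,
# then ranked by total correction cost (descending).
# The log entries are illustrative values, not real project data.

defect_log = [  # (category, cost_to_correct)
    ("logic", 210), ("specification", 480), ("logic", 150),
    ("interface", 90), ("specification", 520), ("data handling", 130),
]

# Accumulate the overall cost of errors and defects in each category.
totals = {}
for category, cost in defect_log:
    totals[category] = totals.get(category, 0) + cost

# Rank categories by overall cost, highest first, to target the most
# costly class of defects for process change.
ranked = sorted(totals.items(), key=lambda item: item[1], reverse=True)
for category, cost in ranked:
    print(f"{category}: ${cost}")
```

The category at the head of the ranked list is the one a process-modification plan would target first.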
Typical Causes of Product Defects
Project Metrics
Software project measures are tactical:
- used by a project manager and a software team to adapt project work flow and technical activities

The intent of project metrics is twofold:
- to minimize the development schedule: avoid delays and mitigate potential problems and risks
- to assess product quality on an ongoing basis and modify the technical approach to improve quality

Production rates:
- pages of documentation
- review hours
- function points
- delivered source lines
- errors uncovered during software engineering tasks
Software Metrics
Direct measures:
- cost and effort applied (in the software engineering process)
- lines of code (LOC) produced
- execution speed
- CPU utilization
- memory size
- defects reported over a certain period of time

Indirect measures:
- functionality, quality, complexity, efficiency, reliability, maintainability
Software Measurement
Size-oriented metrics are derived by normalizing quality and/or productivity measures by the "size" of the software produced, with lines of code often serving as the normalization value.

project   LOC      effort  $(000)  pp. doc  errors  defects  people
alpha     12,100     24      168      365     134      29       3
beta      27,200     62      440     1224     321      86       5
gamma     20,200     43      314     1050     256      64       6
...
Typical Size-Oriented Metrics
Errors per KLOC Defects per KLOC Dollars per KLOC Pages of documentation per KLOC Errors per person month LOC per person month Dollars per page of documentation
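These metrics follow directly from the project table above; here they are computed for project "alpha" (12,100 LOC, 24 person-months, $168K, 365 pages of documentation, 134 errors, 29 defects):

```python
# Size-oriented metrics for project "alpha", using the row from the
# table above. KLOC (thousands of lines of code) is the normalizer.
loc = 12_100
effort_pm = 24        # person-months
cost_k = 168          # $(000)
doc_pages = 365
errors = 134
defects = 29

kloc = loc / 1000

print(f"errors per KLOC:       {errors / kloc:.2f}")
print(f"defects per KLOC:      {defects / kloc:.2f}")
print(f"$(000) per KLOC:       {cost_k / kloc:.2f}")
print(f"doc pages per KLOC:    {doc_pages / kloc:.2f}")
print(f"LOC per person-month:  {loc / effort_pm:.1f}")
```

Computing the same ratios for beta and gamma lets projects of different sizes be compared on a common per-KLOC scale.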
Software Measurement
Function-oriented metrics use "functionality" as the measure, derived from the "function point": an empirical relationship based on countable (direct) measures of the software's information domain and assessments of software complexity.

Uses of function-oriented metrics:
- measuring the scale of a project
- normalizing other metrics, e.g., $/FP, errors/FP
Function Point Calculation
                                          Weighting Factor
measurement parameter           count    simple  average  complex
number of user inputs            __   *     3       4        6     = __
number of user outputs           __   *     4       5        7     = __
number of user inquiries         __   *     3       4        6     = __
number of files                  __   *     7      10       15     = __
number of external interfaces    __   *     5       7       10     = __
                                                      count_total  = __
Function Point Calculation
Computing function points: rate each of the 14 adjustment factors (Fi) on a scale of 0 to 5:

  0 = no influence, 1 = incidental, 2 = moderate, 3 = average, 4 = significant, 5 = essential

1. Does the system require reliable backup and recovery?
2. Are data communications required?
3. Are there distributed processing functions?
4. Is performance critical?
...
14. Is the application designed to facilitate change and ease of use by the user?
Function-Oriented Metrics
FP = count_total * [0.65 + 0.01 * sum(Fi)]

Outcomes:
- errors per FP
- defects per FP
- $ per FP
- pages of documentation per FP
- FP per person-month
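The weighting table and the formula above can be combined into a small calculator. The parameter counts and Fi ratings below are hypothetical, chosen only to show the arithmetic:

```python
# Function point calculation sketch. Weights are the simple/average/
# complex columns from the weighting table; counts and adjustment
# ratings here are hypothetical example values.

WEIGHTS = {  # parameter -> (simple, average, complex)
    "user inputs": (3, 4, 6),
    "user outputs": (4, 5, 7),
    "user inquiries": (3, 4, 6),
    "files": (7, 10, 15),
    "external interfaces": (5, 7, 10),
}

def function_points(counts, complexity, adjustment_ratings):
    """counts: parameter -> count; complexity: parameter -> 0/1/2
    (index into the weight triple); adjustment_ratings: the 14 Fi
    values, each rated 0..5."""
    count_total = sum(
        counts[p] * WEIGHTS[p][complexity[p]] for p in counts
    )
    return count_total * (0.65 + 0.01 * sum(adjustment_ratings))

# Example: every parameter rated "average" complexity, every Fi = 3.
counts = {"user inputs": 10, "user outputs": 8, "user inquiries": 5,
          "files": 4, "external interfaces": 2}
complexity = {p: 1 for p in counts}   # index 1 = average weight
fp = function_points(counts, complexity, [3] * 14)
print(round(fp, 2))  # count_total = 154, adjustment = 1.07 -> 164.78
```

The FP value can then normalize the other measures (errors/FP, $/FP, FP per person-month) exactly as KLOC does for size-oriented metrics.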
Function Point Extensions
- Function points emphasize the "data dimension"
- Transformations were added to capture the "functional dimension"
- Transitions were added to capture the "control dimension"
3-D Function Point Calculation
Reconciling Different Metrics
Average LOC delivered per function point varies by language, e.g.:
  C++           64 LOC/FP
  Visual Basic  32 LOC/FP
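Assuming the figures above are average LOC delivered per function point, they can be used to put projects written in different languages on a common FP scale, a minimal sketch:

```python
# Reconciling LOC counts across languages via average LOC per FP.
# The per-language figures come from the table above; the project
# sizes are hypothetical examples.
LOC_PER_FP = {"C++": 64, "Visual Basic": 32}

def estimated_fp(loc, language):
    """Estimate delivered function points from a LOC count."""
    return loc / LOC_PER_FP[language]

# 32,000 LOC of C++ and 16,000 LOC of Visual Basic deliver roughly the
# same functionality (~500 FP each), despite the 2x difference in size.
print(estimated_fp(32_000, "C++"))           # 500.0
print(estimated_fp(16_000, "Visual Basic"))  # 500.0
```

This is why raw LOC comparisons across languages are misleading: the denser language looks "less productive" in LOC terms while delivering the same functionality.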
Metrics for Software Productivity
LOC and FP measures are often used to derive productivity metrics.

Five important factors that influence software productivity:
- people factors
- problem factors
- process factors
- product factors
- resource factors
Measures of Software Quality
Correctness
- the degree to which the software performs its required function
- the most common measure: defects per KLOC (per year)

Maintainability
- the ease with which a program can be corrected, adapted if the environment changes, or enhanced if the customer desires changes in requirements
- based on the time-oriented measure mean time to change (MTTC)
- spoilage: a cost-oriented metric for maintainability
Measures of Software Quality (Cont’d)
Integrity
- measures a system's ability to withstand attacks (both accidental and intentional) on its security
- threat and security are defined for each attack type, and

  integrity = sum [ 1 - threat * (1 - security) ]

Usability - an attempt to quantify "user friendliness"
- physical/intellectual effort required to learn the system
- time required to become moderately efficient in its use
- the net increase in productivity
- user attitudes toward the system
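The integrity formula above can be sketched directly, assuming threat is the probability that an attack of a given type occurs and security is the probability that such an attack is repelled (the probability values below are hypothetical):

```python
# Integrity per the formula above: sum over attack types of
# 1 - threat * (1 - security), where threat * (1 - security) is the
# probability of a *successful* attack of that type.
# The probabilities used here are hypothetical example values.

def integrity(attack_types):
    """attack_types: iterable of (threat, security) probability pairs."""
    return sum(1 - threat * (1 - security)
               for threat, security in attack_types)

# One attack type: 25% chance of attack, 95% chance it is repelled.
print(round(integrity([(0.25, 0.95)]), 4))  # 0.9875
```

A perfectly secure system (security = 1.0) contributes exactly 1 per attack type, so lower totals signal exposure.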
Defect Removal Efficiency
A Quality Metric That Provides Benefit at Both the Project and Process Level
DRE = E / ( E + D )
E = # of errors found before delivery of the software to the end user
D = # of defects found after delivery
More generally,
DREi = Ei / ( Ei + Ei+1 )
Ei = # of errors found during SE activity i
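The two definitions above translate directly into code; the example uses the alpha-project figures from the size-oriented metrics table (134 errors found before release, 29 defects found after):

```python
# Defect removal efficiency, per the definitions above.

def dre(errors_before, defects_after):
    """Project-level DRE = E / (E + D)."""
    return errors_before / (errors_before + defects_after)

def dre_activity(e_i, e_next):
    """Activity-level DREi = Ei / (Ei + Ei+1): errors found during
    activity i versus those that slipped through and were only found
    in the following activity."""
    return e_i / (e_i + e_next)

# Alpha project, from the size-oriented metrics table:
print(round(dre(134, 29), 3))  # 0.822
```

A DRE approaching 1.0 means filtering activities (reviews, testing) are catching errors before the software reaches the end user.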
Integrating Metrics within the Processes
Arguments for software metrics:
- Measurement is used to establish a process baseline from which improvements can be assessed.
- Developers are anxious to answer, after design:
  - Which user requirements are most likely to change?
  - Which components in this system are most error prone?
  - How much testing should be planned for each component?
  - How many errors can I expect when testing commences?
- Answers to these questions can be found if metrics are collected and used as a technical guide.
Integrating Metrics within the Processes
Establishing a baseline:
- Benefits can be obtained at the process, project, and product levels
- Consists of data collected from past software development projects
- Baseline data should have the following attributes:
  - data must be reasonably accurate
  - data should be collected for as many projects as possible
  - measures must be consistent
  - applications should be similar to the work to be done

Metrics collection, computation, and evaluation
Summary View
Summary
- Metrics are a tool that can be used to improve the productivity and quality of a software system.
- Process metrics take a strategic view of the effectiveness of a software process.
- Project metrics are tactical, focusing on project work flow and technical approach.
- Size-oriented metrics use lines of code as a normalizing factor.
- Function-oriented metrics use function points.
- Four quality metrics were discussed: correctness, integrity, maintainability, and usability.