Page 1

MEASURING SOFTWARE QUALITY

AND CUSTOMER SATISFACTION

Copyright © 2011 by Capers Jones. All Rights Reserved.

Capers Jones & Associates LLC

Web: http://www.spr.com Email: [email protected]

Capers Jones, President

Quality Seminar: talk 2

June 11, 2011

Page 2

FUNDAMENTAL SOFTWARE BUSINESS LAWS

LAW 1: Enterprises that master computers and software will succeed; enterprises that fall behind will fail!

LAW 2: Quality control is the key to mastering computing and software. Enterprises that control quality will succeed. Enterprises that do not control quality will fail.

LAW 3: Quality cannot be controlled unless it can be measured.

Page 3

PERCENTAGE OF SOFTWARE EFFORT BY TASK

Size in            Mgt./     Defect
Function Points    Support   Removal   Paperwork   Coding   Total

10,240               18%       35%       35%        12%     100%
 5,120               17%       33%       32%        18%     100%
 2,560               16%       31%       29%        24%     100%
 1,280               15%       29%       26%        30%     100%
   640               14%       27%       23%        36%     100%
   320               13%       25%       20%        42%     100%
   160               12%       23%       17%        48%     100%
    80               11%       21%       14%        54%     100%
    40               10%       19%       11%        60%     100%
    20                9%       17%        8%        66%     100%
    10                8%       15%        5%        72%     100%

Page 4

BASIC DEFINITIONS

SOFTWARE QUALITY: "Software that combines the characteristics of low defect rates and high user satisfaction"

USER SATISFACTION: "Clients that are pleased with a vendor's products, quality levels, ease of use, and support"

Page 5

A PRACTICAL DEFINITION OF SOFTWARE QUALITY (PREDICTABLE AND MEASURABLE)

• Low Defect Potentials (< 3.0 per Function Point)
• High Defect Removal Efficiency (> 95%)
• Unambiguous, Stable Requirements (< 2.5% change)
• Explicit Requirements Achieved (> 97.5% achieved)
• High User Satisfaction Ratings (> 90% "excellent")

  - Installation
  - Ease of learning
  - Ease of use
  - Functionality
  - Compatibility
  - Error handling
  - User information (screens, manuals, tutorials)
  - Customer support
  - Defect repairs

Page 6

CAUTIONS ABOUT HAZARDOUS DEFINITIONS

“Quality means conformance to requirements.”

Requirements themselves contain about 15% of software errors.

Requirements grow at about 2% per month.

Do you conform to requirements errors?

Do you conform to totally new requirements?

Whose requirements are you trying to satisfy?

Page 7

CAUTIONS ABOUT HAZARDOUS METRICS

“Cost per Defect”

• Approaches infinity as defects near zero

• Conceals real economic value of quality

Page 8

COST PER DEFECT PENALIZES QUALITY

                      A          B          C           D
                      Poor       Good       Excellent   Zero
                      Quality    Quality    Quality     Defects

Function Points       100        100        100         100

Bugs Discovered       500        50         5           0

Preparation           $5,000     $5,000     $5,000      $5,000

Removal               $5,000     $2,500     $1,000      $0

Repairs               $25,000    $5,000     $1,000      $0

Total                 $35,000    $12,500    $7,000      $5,000

Cost per Defect       $70        $250       $1,400      --

Cost per
Function Point        $350       $125       $70         $50
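The arithmetic above can be reproduced directly. Below is a minimal Python sketch (illustrative only; the names are mine, not from the deck) showing why cost per defect rises as quality improves while cost per function point falls:

    # Reproduce the table: fixed $5,000 test preparation, 100 function points.
    cases = {
        "A (Poor)":      {"bugs": 500, "removal": 5000, "repairs": 25000},
        "B (Good)":      {"bugs": 50,  "removal": 2500, "repairs": 5000},
        "C (Excellent)": {"bugs": 5,   "removal": 1000, "repairs": 1000},
        "D (Zero)":      {"bugs": 0,   "removal": 0,    "repairs": 0},
    }
    PREPARATION = 5_000
    FUNCTION_POINTS = 100

    for name, c in cases.items():
        total = PREPARATION + c["removal"] + c["repairs"]
        # Cost per defect is undefined when no defects are found.
        per_defect = f"${total / c['bugs']:,.0f}" if c["bugs"] else "undefined"
        print(f"{name}: total ${total:,}, per defect {per_defect}, "
              f"per FP ${total / FUNCTION_POINTS:,.0f}")

The fixed preparation cost drives the paradox: it is spread over fewer and fewer defects as quality improves, so the zero-defect case has the best cost per function point but an undefined cost per defect.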

Page 9

SOFTWARE MEASURES AND REPORTS

(Diagram of the enterprise measurement program; function point uses highlighted.)

Quality Measures
- User Satisfaction Measures: ease of use, functionality, support
- Defect Measures: defect volumes, defect severities, defect origins, defect causes, removal efficiency
- Testing Measures
- Non-Test Measures
- Post-Release Measures
- Monthly Quality Report
- Annual Satisfaction Survey

Operational Measures
- Down Time
- Response Time

On-Going Project Measures
- Milestone Measures
- Cost & Budget Variance Measures
- Monthly Progress Report: completed milestones, plan vs. actuals, red flag items

Productivity Measures
- Completed Project Measures: development, enhancement, conversion, packages, contracts
- Soft Factor Measures
- Production Library & Backlog Measures
- Reuse Measures: designs, code, documents, test cases, estimates
- Enterprise Demographic Measures
- Employee Opinion Survey
- Annual Productivity Report

Page 10

SOFTWARE QUALITY MEASUREMENT VIEWS

Software Quality Measures Need to Meet the Needs of Seven Different Views:

1. Measures Important to Executives

2. Measures Important to Project Managers

3. Measures Important to Software Engineers

4. Measures Important to Quality Assurance

5. Measures Important to Testers

6. Measures Important to Customer Support

7. Measures Important to Customers

Page 11

EXECUTIVE SOFTWARE QUALITY CONCERNS

Risk of Schedule Slips

Risk of Cost Overruns

Risk of Deferred Functions

Risk of Poor User Satisfaction

Risk of Poor Quality

Risk that Competitive Quality will be Better

Page 12

PROJECT MANAGEMENT SOFTWARE QUALITY CONCERNS

Risk of Creeping Requirements

Risk of Excessive Schedule Pressure

Risk of Schedule Slips

Risk of Cost Overruns

Risk of Deferred Functions

Risk of Poor User Satisfaction

Risk of High Defect Levels

Risk of Inadequate Defect Removal

Risk of Staff Burnout

Page 13

SOFTWARE ENGINEERING QUALITY CONCERNS

Risk of Creeping Requirements

Risk of Deferred Functions

Risk of Poor User Satisfaction

Risk of High Defect Levels

Risk of Excessive Schedule Pressure

Risk of Inadequate Defect Removal

Risk of Staff Burnout

Page 14

QUALITY ASSURANCE SOFTWARE CONCERNS

Risk of Inadequate Defect Predictions

Risk of Inadequate Defect Tracking

Risk of Excessive Schedule Pressure

Risk of Excessive Complexity Levels

Risk of Unplanned Changes

Risk of Poor User Satisfaction

Risk of High Defect Levels

Risk of Inadequate Reviews and Inspections

Risk of Inadequate Testing

Page 15

TESTING SOFTWARE QUALITY CONCERNS

Risk of Excessive Schedule Pressure

Risk of Unplanned Changes

Risk of Excessive Complexity Levels

Risk of Premature Testing

Risk of Inadequate Defect Tracking

Risk of Inadequate Test Coverage

Risk of Defective Test Cases

Risk of Duplicate Test Cases

Page 16

CUSTOMER SUPPORT QUALITY CONCERNS

Risk of High Incident Levels

Risk of High Defect Levels

Risk of Overloading Support Personnel

Risk of Slow Turnaround Time on Repairs

Risk of Invalid Defect Reports

Risk of High Volumes of Duplicate Defects

Risk of Software that is Hard to Learn

Risk of Software that is Hard to Use

Page 17

CLIENT SOFTWARE QUALITY CONCERNS

Risk of Schedule Slips

Risk of Cost Overruns

Risk of Deferred Functions

Risk of High Defect Levels

Risk of Software that is Hard to Learn

Risk of Software that is Hard to Use

Risk of Poor Customer Support

Risk of Poor Data Quality

Page 18

RISKS VERSUS DOMAINS MATRIX

(Matrix marking, with an X, which of the seven domains — Executives, Project Managers, Software Engineers, Quality Assurance, Testers, Customer Support, and Customers — face each of the following risks:)

Schedule Slips
Cost Overruns
Deferred Functions
Poor User Satisfaction
Poor Quality
Better Quality by Competition
Creeping Requirements
Excessive Schedule Pressure
High Defect Levels
Inadequate Defect Removal
Staff Burnout
Unplanned Changes
Inadequate Defect Predictions
Inadequate Defect Tracking
Excessive Complexity Levels
Inadequate Reviews & Inspections
Inadequate Testing
Premature Testing
Inadequate Test Coverage
Defective Test Cases
Duplicate Test Cases
Invalid Defect Reports
High Incident Levels
Overloading Support Personnel
Slow Repair Turnaround Time
High Duplicate Defect Volumes
Hard to Learn Software
Hard to Use Software
Poor Customer Support
Poor Data Quality

Page 19

SOFTWARE QUALITY MEASUREMENT ESSENTIALS

“Hard Data” (objective, quantifiable)

Defect detection efficiency (DDE)

Defect removal efficiency (DRE)

Delivered defects

Defect repair intervals

Number of test cases

Test coverage

Code complexity levels

Bad-fix injections

Page 20

SOFTWARE QUALITY MEASUREMENT ESSENTIALS (cont.)

“Soft Data” (subjective, somewhat ambiguous)

Requirements creep

Defect potentials

Defect severity levels

Defect or enhancement determination

User satisfaction levels

Customer support levels

Schedule pressure

Page 21

SOFTWARE QUALITY MEASUREMENT ESSENTIALS (cont.)

Normalization Metrics (for comparisons)

Function points

Feature points

Data points

Natural metrics

lines of code (LOC)

pages

Page 22

KIVIAT GRAPH OF MAJOR SOFTWARE RISKS

(Kiviat graph with eight axes: Schedule Slippage, Unplanned Changes, Inadequate Defect Removal, High Defect Levels, Poor Customer Support, Poor User Satisfaction, Deferred Functions, and Cost Overrun.)

Page 23

WATCH FOR MEASUREMENT “CULTURE SHOCK”

• Resistance from Managers

• Resistance from Technical Staff

Page 24

PROBABLE MANAGEMENT REACTIONS TO SOFTWARE MEASUREMENT PROGRAMS

Management Level Response to Measurements

CEO/President Eager to get useful data

Senior Vice Presidents Good source of measurement sponsors

Vice Presidents Somewhat apprehensive but often good sponsors

Directors Apprehensive

Third-line managers Very apprehensive

Second-line managers Very apprehensive

First-line managers Varies with self-confidence

Page 25

NORMAL TIMING & SEQUENCE FOR SOFTWARE MEASUREMENT PROGRAM CREATION

Activity                              Time from Start of Program    Duration

1) Select metrics to be used Within 2 weeks 1 or 2 weeks

2) Define activities to be measured Within 2 weeks 1 or 2 weeks

3) Select “soft” influential factors Within 3 weeks 1 or 2 weeks

4) Build a mock-up report Within 4 weeks 1 week or less

5) Select projects to be measured Within 5 weeks 1 week or less

6) Kick-off meeting Within 6 weeks A few hours

7) Collect initial project data Within 7 weeks 2 or 3 weeks

8) Produce pilot report Within 8 weeks 1 week or less

Page 26

METRIC CHOICES

Logical source code statement

Physical lines of code (LOC)

Natural metrics such as pages, test cases, and screens produced

McCabe’s cyclomatic and essential complexity metrics

Halstead or “software science” metrics

IFPUG Function Points

COSMIC Function Points

Story Points

Use Case Points

The British Mark II Function Point metric

The Kemerer object-oriented metrics

Basili Goal-Question metrics

Page 27

SIX CRITERIA FOR SOFTWARE METRICS SELECTION

1) It is best if the metric has a standard definition and is not too ambiguous.

2) Metrics which have a formal user group make it easier to get comparative data.

3) It is most convenient if the metric is supported by tools and automation.

4) It is helpful to have conversion rules between the metric and other metrics.

5) The metric should be able to deal with non-coding activities.

6) The metric should not be severely biased or unsuited for statistical analysis.

Page 28

HOW METRICS MEET THE SIX CRITERIA

Criteria               Natural    Logical                Function    O-O
                       Metrics    Statements    LOC      Points      Metrics

1) Lack of ambiguity     No         Yes          No        No          No

2) User group            No         No           No        Yes         No

3) Automation            No         Yes          Yes       Yes         Yes

4) Conversion rules      No         Yes          No        Yes         No

5) Non-code uses         Yes        No           No        Yes         No

6) Lack of bias          No         No           No        Yes         No

Page 29

PHYSICAL LINES OF CODE VERSUS LOGICAL STATEMENTS

Physical Lines of Code

Advantages:
- Easy to count

Disadvantages:
- Varies by programmer more than 10 to 1
- Varies by language more than 10 to 1
- Impossible for modern languages
- Worthless for economic studies

Logical Statements

Advantages:
- Easy to count
- Conversion to function points
- Consistent for many languages

Disadvantages:
- Varies by programmer more than 5 to 1
- Varies by language more than 3 to 1
- Hard for modern languages
- Difficult for economic studies

Page 30

PROBLEMS WITH PHYSICAL “LINES OF CODE”

• Standardization Problems

• Economic Problems

Page 31

STANDARDIZATION PROBLEMS FOR LOC

1. No international standards group like IFPUG

2. No standard definition for any language

3. No general standards for 500 current languages

4. No rules for icons, generators, and modern quasi-languages

5. Fluctuates from language to language

6. Fluctuates from programmer to programmer

Page 32

ECONOMIC PROBLEMS WITH LOC

1. Conversion to logical statements is difficult

2. Conversion to function points is difficult

3. Source code is less than 20% of total software expense

4. Source code is uncertain during requirements and design

5. Source code can’t measure paperwork

6. Source code metrics are decoupled from standard economics

7. Source code is of no interest to users

8. Source code has no business value

Page 33

FOUR LANGUAGE COMPARISON OF SOFTWARE DEFECT POTENTIALS

Defect Origin        Assembly    Ada    Objective C    Full Reuse

Requirements             35        35        35             15
Design                   75        75        50              6
Code                    165        25        10              2
Documents                50        50        50             10
Bad Fixes                25        15         5              2

TOTAL DEFECTS           300       200       150             35

Defects per KLOC         30       100       120            140

Defects per
Function Point            6         4         2.4            0.7
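The inversion in the last two rows is the key point. A small Python sketch (figures taken directly from the table above) shows that the same defect totals rank the languages in opposite orders under the two normalizations:

    # (language, total defects, defects per KLOC, defects per function point)
    rows = [
        ("Assembly",    300,  30, 6.0),
        ("Ada",         200, 100, 4.0),
        ("Objective C", 150, 120, 2.4),
        ("Full Reuse",   35, 140, 0.7),
    ]
    print("Ranked by defects/KLOC:", [r[0] for r in sorted(rows, key=lambda r: r[2])])
    print("Ranked by defects/FP:  ", [r[0] for r in sorted(rows, key=lambda r: r[3])])
    # Assembly looks "best" per KLOC only because it needs far more code
    # to deliver the same function points.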

Page 34

LOC VERSUS FUNCTION POINT QUALITY LEVELS

(Chart plotting quality levels for Assembly, C, Fortran, COBOL, PL/I, Ada 83, Ada 9X, C++, generators, and Smalltalk; the horizontal axis is defects per KLOC from 0 to 70, the vertical axis runs from 0 to 5.)

Page 35

BASIS OF THE “LINES OF CODE” QUALITY PARADOX

When defects are found in multiple components, it is invalid to assign all defects to a single component.

Software defects are found in:

- requirements
- design
- source code
- user documents
- bad fixes (secondary defects)

Requirements and design defects outnumber code defects.

“Defects per KLOC” makes major sources of software defects invisible.

Page 36

CONSISTENTLY GOOD QUALITY RESULTS

• Formal Inspections and static analysis

• Joint Application Design (JAD)

• Quality Metrics

• Removal Efficiency Measurements

• Functional Metrics

• Active Quality Assurance

• User Satisfaction Surveys

• Formal Test Planning

• Quality Estimation Tools

• Complexity Metrics

• Quality Function Deployment (QFD)

• Six-Sigma for software

Page 37

MIXED QUALITY RESULTS

• Total Quality Management (TQM)

• SEI Assessments by uncertified assessors

• SEI Maturity Levels as only goal for improvement

• Baldrige Awards

• IEEE Quality Standards

• Testing by Developers

• DOD 2167A

• Reliability Models

Page 38

QUESTIONABLE QUALITY RESULTS

• ISO 9000-9004 Quality Standards

• Informal Testing

• Passive Quality Assurance

• LOC metrics

• Cost per defect metrics

Page 39

GAPS IN ISO QUALITY PROCESSES

                                   SPR         ISO

Defect Potential Estimation        Yes         Missing

Defect Removal Efficiency          Yes         Missing
Estimation and Measurement

Delivered Defect Estimation        Yes         Yes
and Measurement

User Satisfaction Measurement      Yes         Yes

Inspections and Reviews            Rigorous    Informal

Testing                            Rigorous    Rigorous

Process Analysis                   Rigorous    Informal

Page 40

OVERVIEW OF ISO 9000-9004 STANDARDS TOPICS

ISO assessment accuracy              Emerging topic; needs more work
ISO assessment validity              Emerging topic; needs more work
ISO costs/investments required       No current literature at all; costs may be high
ISO impact on software quality       Many claims, but little empirical data
ISO impact on software productivity  No current literature at all
ISO impact on software schedules     No current literature; longer schedules likely
ISO value to domestic companies      No current literature at all; low value likely
ISO value to internationals          ISO certification now mandatory
ISO certification automation         No current literature; no known tools
ISO impact on paperwork              May increase paperwork volumes

Page 41

U.S. SOFTWARE QUALITY AVERAGES

(Defects per Function Point)

                    System     Commercial   Information   Military   Overall
                    Software   Software     Software      Software   Average

Defect
Potentials            6.0         5.0          4.5           7.0       5.6

Defect Removal
Efficiency            94%         90%          73%           96%       88%

Delivered
Defects               0.4         0.5          1.2           0.3       0.65

First Year
Discovery Rate        65%         70%          30%           75%       60%

First Year
Reported Defects      0.26        0.35         0.36          0.23      0.30
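The rows are tied together by two relations: delivered defects ≈ defect potential × (1 − removal efficiency), and first-year reported defects ≈ delivered defects × first-year discovery rate. A quick Python check (small differences from the table are rounding):

    # potential per FP, defect removal efficiency, first-year discovery rate
    classes = {
        "System":      (6.0, 0.94, 0.65),
        "Commercial":  (5.0, 0.90, 0.70),
        "Information": (4.5, 0.73, 0.30),
        "Military":    (7.0, 0.96, 0.75),
    }
    for name, (potential, dre, discovery) in classes.items():
        delivered = potential * (1 - dre)
        print(f"{name}: delivered {delivered:.2f}/FP, "
              f"first-year reported {delivered * discovery:.2f}/FP")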

Page 42

SOFTWARE QUALITY ZONES

(Chart of defect potentials per function point, 0 to 10, against defect removal efficiency, 50% to 100%, with zones marked for Malpractice, the U.S. Average, and Best in Class.)

Page 43

CURRENT U.S. AVERAGES FOR SOFTWARE QUALITY

(Data expressed in terms of defects per function point)

                   Defect       Removal      Delivered
Defect Origins     Potential    Efficiency   Defects

Requirements         1.00         77%          0.23
Design               1.25         85%          0.19
Coding               1.75         95%          0.09
Documents            0.60         80%          0.12
Bad Fixes            0.40         70%          0.12

TOTAL                5.00         85%          0.75

CONCLUSIONS

Projects with large volumes of coding defects have the highest removal efficiencies.

High-level and O-O languages have low volumes of coding defects.

Page 44

U.S. BEST IN CLASS RESULTS FOR DEFECT POTENTIALS AND REMOVAL EFFICIENCY

(Data Expressed in Terms of Defects per Function Point)

Defect Origins     Defect        Removal       Delivered
                   Potentials    Efficiency    Defects

Requirements         0.40           85%          0.080

Design               0.60           97%          0.020

Coding               1.00           99%          0.010

Document             0.40           98%          0.010

Bad Fixes            0.10           95%          0.010

Total                2.50           96%          0.130

Page 45

SOFTWARE DEFECTS FOR LEVELS OF THE SEI CMMI

(Data Expressed in Terms of Defects per Function Point)

SEI CMM Levels     Defect        Removal       Delivered
                   Potentials    Efficiency    Defects

SEI CMM 1 5.00 85% 0.75

SEI CMM 2 4.50 89% 0.50

SEI CMM 3 3.50 91% 0.32

SEI CMM 4 2.50 93% 0.18

SEI CMM 5 2.00 95% 0.10

Page 46

INTERNATIONAL DEFECT POTENTIALS AND REMOVAL EFFICIENCY

Country           Defect Potential      Defect Removal    Delivered Defects
                  per Function Point    Efficiency        per Function Point

Japan 4.50 93% 0.32

Canada 4.55 86% 0.64

United States 5.00 85% 0.75

Norway 4.95 84% 0.79

Sweden 5.00 84% 0.80

France 4.75 83% 0.82

Italy 4.85 83% 0.82

India 5.10 84% 0.82

Germany 4.95 83% 0.84

England 4.85 82% 0.87

South Korea 5.20 83% 0.88

Russia 5.50 80% 1.10

Page 47

U.S. INDUSTRIES EXCEEDING 95% IN CUMULATIVE DEFECT REMOVAL EFFICIENCY

                                              Year 95% Exceeded
                                              (Approximate)

1. Telecommunications Manufacturing 1975

2. Computer Manufacturing 1977

3. Aerospace Manufacturing 1979

4. Military and Defense Manufacturing 1980

5. Medical Instrument Manufacturing 1983

6. Commercial Software Producers 1992

Page 48

U.S. INDUSTRIES MAINTAINING MARKET SHARE INTERNATIONALLY

1. Telecommunications Manufacturing

2. Computer Manufacturing

3. Military and Defense Manufacturing

4. Commercial Software Producers

5. Aerospace Manufacturing

6. Medical Instrument Manufacturing

Page 49

SPR QUALITY PERFORMANCE LEVELS: DEFECT POTENTIALS

(Sum of Requirements, Design, Code, Document, and Bad Fix Defects)

SPR Performance Level      Total Potential Defects per Function Point

1. Excellent < 2.00

2. Good 3.00

3. Average 5.00

4. Marginal 6.00

5. Poor > 7.00

Page 50

SPR QUALITY PERFORMANCE LEVELS: CUMULATIVE DEFECT REMOVAL EFFICIENCY

(Development Defects + 1 Year of User Defect Reports)

SPR Performance Level      Efficiency Measured at One Year of Usage

1. Excellent > 98%

2. Good 95%

3. Average 85%

4. Marginal 80%

5. Poor < 75%

Page 51

OPTIMIZING QUALITY AND PRODUCTIVITY

Projects that achieve 95% cumulative Defect Removal Efficiency will find:

1) Minimum schedules

2) Maximum productivity

3) High levels of user satisfaction

4) Low levels of delivered defects

Page 52

ORIGIN OF SOFTWARE DEFECTS

• Because defect removal is such a major cost element, studying defect origins is a valuable undertaking.

IBM Corporation (MVS)               SPR Corporation (client studies)

45% Design errors                   20% Requirements errors
25% Coding errors                   30% Design errors
20% Bad fixes                       35% Coding errors
 5% Documentation errors            10% Bad fixes
 5% Administrative errors            5% Documentation errors
100%                                100%

TRW Corporation        Mitre Corporation      Nippon Electric Corp.

60% Design errors      64% Design errors      60% Design errors
40% Coding errors      36% Coding errors      40% Coding errors
100%                   100%                   100%

Page 53

FUNCTION POINTS AND DEFECT POTENTIALS

Function points raised to the 1.25 power can predict the probable number of bugs in requirements, design, code, user documents, and bad fixes.

Function Points     Potential Defects

      1                         1
     10                        18
    100                       316
  1,000                     5,623
 10,000                   100,000
100,000                 1,778,279
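The rule is a one-liner; an illustrative Python sketch reproducing the table:

    # Rule of thumb from the slide: potential defects ~ function points ** 1.25
    for fp in (1, 10, 100, 1_000, 10_000, 100_000):
        print(f"{fp:>7,} FP -> {fp ** 1.25:>12,.0f} potential defects")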

Page 54

FUNCTION POINTS AND DEFECT REMOVAL

Function Points raised to the 0.3 power can predict the optimal number of defect removal stages.

FUNCTION POINTS     DEFECT REMOVAL STAGES

1 1

10 2

100 4

1,000 8

10,000 16

100,000 32
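The same pattern applies; a sketch of the 0.3-power rule:

    # Rule of thumb from the slide: removal stages ~ function points ** 0.3
    for fp in (1, 10, 100, 1_000, 10_000, 100_000):
        print(f"{fp:>7,} FP -> {round(fp ** 0.3):>3} defect removal stages")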

Page 55

FUNCTION POINTS AND TEST CASES

Function Points raised to the 1.2 power can predict the optimal number of test cases. Mathematical test case design lowers the power to 1.1.

FUNCTION POINTS     TEST CASES (1.2)     TEST CASES (1.1)

1 1 1

10 16 13

100 250 160

1,000 4,000 2,000

10,000 63,000 25,000

100,000 1,000,000 316,000
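And the test-case rule (the slide's figures are rounded):

    # Rules of thumb from the slide: test cases ~ fp ** 1.2,
    # or fp ** 1.1 with mathematical test case design.
    for fp in (1, 10, 100, 1_000, 10_000, 100_000):
        print(f"{fp:>7,} FP -> {fp ** 1.2:>12,.0f} (power 1.2), "
              f"{fp ** 1.1:>12,.0f} (power 1.1)")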

Page 56

QUALITY PERFORMANCE LEVELS: TEST CASES CONSTRUCTED PER FUNCTION POINT

Test Step Lowest Median Highest

Unit Test 0.2 0.3 0.5

New Function Test 0.2 0.35 0.6

Regression Test 0.2 0.3 0.5

Integration Test 0.25 0.4 0.75

Stress Test 0.05 0.1 0.2

System Test 0.1 0.25 0.4

Field Test 0.1 0.15 0.25

Acceptance Test 0.1 0.15 0.25

Total 1.2 2.0 3.0

Page 57

RELATIONSHIP OF SOFTWARE QUALITY AND PRODUCTIVITY

• The most effective way of improving software productivity and shortening project schedules is to reduce defect levels.

• Defect reduction can occur through:

1. Defect prevention technologies
   - Structured design
   - Structured code
   - High-level languages
   - Etc.

2. Defect removal technologies
   - Design reviews
   - Code inspections
   - Tests
   - Correctness proofs

Page 58

DEFECT PREVENTION METHODS

                    Requirements   Design      Code         Document     Performance
                    Defects        Defects     Defects      Defects      Defects

JADs                Excellent      Good        Not          Fair         Poor
                                               Applicable

Prototypes          Excellent      Excellent   Fair         Not          Excellent
                                                            Applicable

Structured
Methods             Fair           Good        Excellent    Fair         Fair

Six-Sigma           Good           Excellent   Excellent    Fair         Good

Blueprints &
Reusable Code       Excellent      Excellent   Excellent    Excellent    Good

QFD                 Good           Excellent   Fair         Poor         Good

Page 59

DEFECT REMOVAL METHODS

                       Requirements   Design      Code        Document     Performance
                       Defects        Defects     Defects     Defects      Defects

Reviews/Inspections,
Static analysis        Good           Excellent   Excellent   Good         Fair

Prototypes             Good           Fair        Fair        Not          Good
                                                              Applicable

Testing (all forms)    Poor           Poor        Good        Fair         Excellent

Correctness Proofs     Poor           Poor        Good        Fair         Poor

Page 60

DEFECT REMOVAL ASSUMPTIONS

                Methods    Training   Experience    Enthusiasm   Management
                                                                 Support

1. Excellent Formal Formal Substantial Good Good

2. Good Formal Formal Mixed Good Moderate

3. Average Informal Informal Mixed Mixed Mixed

4. Marginal Informal Informal Little Minimal Minimal

5. Poor Informal Informal None Negative Minimal

Page 61

QUALITY MEASUREMENT EXCELLENCE

                Defect       Defect     Usability   Complexity   Test Coverage   Removal    Maintenance
                Estimation   Tracking   Measures    Measures     Measures        Measures   Measures

1. Excellent Yes Yes Yes Yes Yes Yes Yes

2. Good Yes Yes Yes No Yes No Yes

3. Average No Yes Yes No Yes No Yes

4. Marginal No No Yes No Yes No Yes

5. Poor No No No No No No No

Page 62

INADEQUATE DEFECT REMOVAL IS THE LEADING CAUSE OF POOR SOFTWARE QUALITY

• Individual programmers are only 35% efficient in finding bugs in their own software

• The sum of all normal test steps is often less than 75% effective (1 of 4 bugs remains)

• Design reviews and code inspections, however, are often 65% effective and can top 85%

• Static analysis is often 65% effective and can top 85%

• Reviews and inspections can lower costs and schedules by as much as 30%

Page 63

DEFECT REMOVAL EFFICIENCY (cont.)

(Diagram of 10 defects: the first operation removes 6 of 10, or 60% efficiency; the second operation removes 2 of the remaining 4, or 50% efficiency; cumulative efficiency is 8 of 10, or 80%.)

Defect removal efficiency = percentage of defects removed by a single level of review, inspection, or test

Cumulative defect removal efficiency = percentage of defects removed by a series of reviews, inspections, or tests
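The worked example translates directly into code; a minimal Python sketch of both definitions:

    total_defects = 10
    removed_per_stage = [6, 2]   # first operation finds 6 of 10, second finds 2 of 4

    remaining = total_defects
    for i, removed in enumerate(removed_per_stage, start=1):
        # Single-stage efficiency: removed / defects present at that stage.
        print(f"Operation {i}: {removed}/{remaining} = {removed / remaining:.0%}")
        remaining -= removed

    # Cumulative efficiency: everything removed by the series / original total.
    print(f"Cumulative: {(total_defects - remaining) / total_defects:.0%}")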

Page 64

RANGES OF DEFECT REMOVAL EFFICIENCY

Lowest Median Highest

1. Requirements review 20% 30% 50%

2. Top-level design reviews 30% 40% 60%

3. Detailed functional design reviews 30% 45% 65%

4. Detailed logic design reviews 35% 55% 75%

5. Code inspections, static analysis 35% 60% 85%

6. Unit tests 10% 25% 50%

7. Function tests 20% 35% 55%

8. Integration tests 25% 45% 60%

9. System Test 20% 35% 50%

10. Site/installation tests 25% 50% 65%

Cumulative efficiency 90% 99% 99%
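If the stages were statistically independent, cumulative efficiency would be 1 − ∏(1 − eᵢ). Applying that simplifying assumption to the median column roughly reproduces the 99% cumulative figure (real stages overlap in the defect types they catch, so this is only an approximation):

    import math

    # Median single-stage efficiencies from the table above.
    medians = [0.30, 0.40, 0.45, 0.55, 0.60, 0.25, 0.35, 0.45, 0.35, 0.50]

    escape = math.prod(1 - e for e in medians)   # share of defects surviving every stage
    print(f"Cumulative removal efficiency ~ {1 - escape:.1%}")   # ~99.6%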

Page 65

NORMAL DEFECT ORIGIN/DISCOVERY GAPS

(Matrix of defect origins — requirements, design, coding, documentation, testing, maintenance — against defect discovery points, with the late-discovery region labeled the “Zone of Chaos”.)

Page 66

DEFECTS WITH INSPECTIONS, STATIC ANALYSIS

(The same defect origin/discovery matrix when inspections and static analysis are used.)

Page 67

SOFTWARE DEFECT REMOVAL RANGES

WORST CASE RANGE

TECHNOLOGY COMBINATIONS                          DEFECT REMOVAL EFFICIENCY

                                                 Lowest    Median    Highest

1. No design inspections                           30%       40%       50%
   No code inspections or static analysis
   No quality assurance
   No formal testing

Page 68

SOFTWARE DEFECT REMOVAL RANGES (cont.)

SINGLE TECHNOLOGY CHANGES

TECHNOLOGY COMBINATIONS                          DEFECT REMOVAL EFFICIENCY

                                                 Lowest    Median    Highest

2. No design inspections                           32%       45%       55%
   No code inspections or static analysis
   FORMAL QUALITY ASSURANCE
   No formal testing

3. No design inspections                           37%       53%       60%
   No code inspections or static analysis
   No quality assurance
   FORMAL TESTING

4. No design inspections                           43%       57%       65%
   CODE INSPECTIONS, STATIC ANALYSIS
   No quality assurance
   No formal testing

5. FORMAL DESIGN INSPECTIONS                       45%       60%       68%
   No code inspections or static analysis
   No quality assurance
   No formal testing

Page 69

SOFTWARE DEFECT REMOVAL RANGES (cont.)

TWO TECHNOLOGY CHANGES

TECHNOLOGY COMBINATIONS                          DEFECT REMOVAL EFFICIENCY

                                                 Lowest    Median    Highest

6. No design inspections                           50%       65%       75%
   No code inspections or static analysis
   FORMAL QUALITY ASSURANCE
   FORMAL TESTING

7. No design inspections                           53%       68%       78%
   CODE INSPECTIONS, STATIC ANALYSIS
   FORMAL QUALITY ASSURANCE
   No formal testing

8. No design inspections                           55%       70%       80%
   CODE INSPECTIONS, STATIC ANALYSIS
   No quality assurance
   FORMAL TESTING

Page 70

SOFTWARE DEFECT REMOVAL RANGES (cont.)

TWO TECHNOLOGY CHANGES (cont.)

TECHNOLOGY COMBINATIONS                          DEFECT REMOVAL EFFICIENCY

                                                 Lowest    Median    Highest

9. FORMAL DESIGN INSPECTIONS                       60%       75%       85%
   No code inspections or static analysis
   FORMAL QUALITY ASSURANCE
   No formal testing

10. FORMAL DESIGN INSPECTIONS                      65%       80%       87%
    No code inspections or static analysis
    No quality assurance
    FORMAL TESTING

11. FORMAL DESIGN INSPECTIONS                      70%       85%       90%
    CODE INSPECTIONS, STATIC ANALYSIS
    No quality assurance
    No formal testing

Page 71

SOFTWARE DEFECT REMOVAL RANGES (cont.)

THREE TECHNOLOGY CHANGES

TECHNOLOGY COMBINATIONS                          DEFECT REMOVAL EFFICIENCY

                                                 Lowest    Median    Highest

12. No design inspections                          75%       87%       93%
    CODE INSPECTIONS, STATIC ANALYSIS
    FORMAL QUALITY ASSURANCE
    FORMAL TESTING

13. FORMAL DESIGN INSPECTIONS                      77%       90%       95%
    No code inspections or static analysis
    FORMAL QUALITY ASSURANCE
    FORMAL TESTING

14. FORMAL DESIGN INSPECTIONS                      83%       95%       97%
    CODE INSPECTIONS, STATIC ANALYSIS
    FORMAL QUALITY ASSURANCE
    No formal testing

15. FORMAL DESIGN INSPECTIONS                      85%       97%       99%
    CODE INSPECTIONS, STATIC ANALYSIS
    No quality assurance
    FORMAL TESTING

Page 72

SOFTWARE DEFECT REMOVAL RANGES (cont.)

BEST CASE RANGE

TECHNOLOGY COMBINATIONS                          DEFECT REMOVAL EFFICIENCY

                                                 Lowest    Median    Highest

16. FORMAL DESIGN INSPECTIONS                      95%       99%       99%
    CODE INSPECTIONS, STATIC ANALYSIS
    FORMAL QUALITY ASSURANCE
    FORMAL TESTING

Page 73

MEASURING SOFTWARE QUALITY

1. Find an executive sponsor

2. Select the measurement manager

3. Select the measurement staff

4. Define the set of removal steps to be measured

5. Define the defect origins

6. Define the defect severities

7. Define the format of the output report

8. Select the projects to be measured

9. Collect the initial data

10. Produce the initial report

Page 74

KEY MEASUREMENT FOR SOFTWARE QUALITY

There are four key software quality measures:

• Defect origins

• Defect severities

• Defect causes

• Defect removal efficiency

Page 75

DEFECT ORIGINS

1. Requirements

2. Design

3. Code

4. Documentation

5. Bad fixes

Page 76

DEFECT SEVERITIES

Severity 1: Software cannot be used at all

Severity 2: Major functions disabled

Severity 3: Minor functions disabled

Severity 4: Cosmetic problems

Page 77

DEFECT CAUSES

1. Errors of omission

2. Errors of commission

3. Errors of clarity and ambiguity

4. Errors of speed or capacity

5. Errors of unknown cause

Page 78

CONCLUSIONS/OBSERVATIONS ON DEFECT REMOVAL

• No single method is adequate.

• Testing alone is insufficient.

• Reviews, inspections, and tests combined give high efficiency, lowest costs, and shortest schedules.

• Reviews, inspections, tests, static analysis and prototypes give highest cumulative efficiency.

• Six-Sigma, CMMI, formal inspections, static analysis, and formal testing are the best overall quality approaches.

• ISO 9000-9004 standards are expensive and yield little improvement.

Page 79

REFERENCES ON SOFTWARE QUALITY

Jones, Capers & Subramanyam, Jitendra; The Economics of Software Quality; Addison-Wesley, July 2011.

Jones, Capers; Software Engineering Best Practices; McGraw-Hill, 2010.

Jones, Capers; Estimating Software Costs; McGraw-Hill, 2007.

Jones, Capers; Assessments, Benchmarks, and Best Practices; Addison-Wesley, 2000.

Jones, Capers; Applied Software Measurement; McGraw-Hill, 2008.

Kan, Steve; Metrics and Models in Software Quality Engineering; Addison-Wesley, 2003.

Radice, Ron; High-quality, Low-cost Software Inspections; Paradoxicon Publishing, 2002.

Wiegers, Karl; Peer Reviews in Software; Addison-Wesley, 2002.

