
Computer Science and Software Engineering

University of Wisconsin - Platteville

Note 6. Software Metrics

Yan Shi

Lecture Notes for SE 3730 / CS 5730

Part of these slides are adapted from Pearson's slides

Outline

What is a software metric?

Software size metrics
— KLOC
— Function Points

Software quality metrics
— process metrics
— product metrics

Software complexity metrics
— Halstead's metrics

What is a Metric?

A metric is a measurable indication of some quantitative aspect of a system, with the following characteristics:
— Measurable
— Independent of human influence
— Accountable: save the raw data
— Precise

E.g., number of errors found per person-hour.

A metric can be a "result" or a "predictor":
— a result determines the quality
— a predictor predicts and improves quality

Software Size Metrics

KLOC – COCOMO (Constructive Cost Model):
— Real-time embedded systems: 40-160 LOC/man-month
— System programs: 150-400 LOC/man-month
— Commercial applications: 200-800 LOC/man-month
— KLOC depends on the programming language or development tool.

Function Point:
— measures the project size by the functionality specified for that system.
— result or predictor?

COCOMO Overview

COCOMO consists of a hierarchy of three increasingly detailed and accurate forms:
— Basic COCOMO Model
— Intermediate COCOMO Model
— Detailed COCOMO Model

All of these models can be applied to a variety of projects, whose characteristics determine the values of the constants used in subsequent calculations:
— Organic: well-understood problem; small team with past experience
— Semi-detached: somewhere in between
— Embedded: highest level of complexity; large team with experience and creativity

Basic COCOMO Model

Effort Applied: E = a_b * (KLOC)^b_b  [man-months]

Development Time: D = c_b * E^d_b  [months]

People Required: P = E / D  [count]

Software Project   a_b   b_b    c_b   d_b
Organic            2.4   1.05   2.5   0.38
Semi-detached      3.0   1.12   2.5   0.35
Embedded           3.6   1.20   2.5   0.32
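The three formulas and the coefficient table above can be sketched in a few lines of Python (the 32-KLOC input is an illustrative value, not from the slides):

```python
# Basic COCOMO sketch using the coefficient table above.
COEFFICIENTS = {
    # project type: (a_b, b_b, c_b, d_b)
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, project_type="organic"):
    """Return (effort in man-months, development time in months, people)."""
    a, b, c, d = COEFFICIENTS[project_type]
    effort = a * kloc ** b       # E = a_b * KLOC^b_b
    dev_time = c * effort ** d   # D = c_b * E^d_b
    people = effort / dev_time   # P = E / D
    return effort, dev_time, people

e, t, p = basic_cocomo(32, "organic")
print(f"Effort: {e:.1f} man-months, time: {t:.1f} months, staff: {p:.1f}")
```

For a 32-KLOC organic project this yields roughly 91 man-months over about 14 months, i.e. a team of 6-7 people.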

Function Point Method

Step 1: Compute crude function points (CFP).

Step 2: Compute the relative complexity adjustment factor (RCAF) for the project.
— RCAF varies between 0 and 70.

Step 3: Compute the number of function points (FP):

FP = CFP x (0.65 + 0.01 x RCAF)

Function Point Method: Step 1

Step 1: Compute crude function points (CFP)
— identify functional components
— evaluate each component as simple, average, or complex
— apply weighting factors to the components
— CFP is the sum of the weighted values


FP Example: Attend Master System

Data Flow Diagram

© Pearson Education Limited 2004 & cah, UoN 2008

FP Example: Attend Master System

Software system      Simple                  Average                 Complex                 Total
components           Count  Weight  Points   Count  Weight  Points   Count  Weight  Points   CFP
                     A      B       C=AxB    D      E       F=DxE    G      H       I=GxH
User inputs          1      3       3        ---    4       ---      1      6       6        9
User outputs         ---    4       ---      2      5       10       1      7       7        17
User online queries  1      3       3        1      4       4        1      6       6        13
Logical files        1      7       7        ---    10      ---      1      15      15       22
External interfaces  ---    5       ---      ---    7       ---      2      10      20       20
Total CFP                                                                                    81


Function Point Method: Step 2

Step 2: Compute the relative complexity adjustment factor (RCAF) for the project.
— assign grades (0-5) to the 14 subjects that substantially affect the development effort
— RCAF = sum of all grades


FP Example: Attend Master System

No Subject Grade

1 Requirement for reliable backup and recovery 0 1 2 3 4 5

2 Requirement for data communication 0 1 2 3 4 5

3 Extent of distributed processing 0 1 2 3 4 5

4 Performance requirements 0 1 2 3 4 5

5 Expected operational environment 0 1 2 3 4 5

6 Extent of online data entries 0 1 2 3 4 5

7 Extent of multi-screen or multi-operation online data input 0 1 2 3 4 5

8 Extent of online updating of master files 0 1 2 3 4 5

9 Extent of complex inputs, outputs, online queries and files 0 1 2 3 4 5

10 Extent of complex data processing 0 1 2 3 4 5

11 Extent that currently developed code can be designed for reuse 0 1 2 3 4 5

12 Extent of conversion and installation included in the design 0 1 2 3 4 5

13 Extent of multiple installations in an organization and variety of customer organizations 0 1 2 3 4 5

14 Extent of change and focus on ease of use 0 1 2 3 4 5

Total = RCAF = 41


FP Example: Attend Master System


FP = CFP x (0.65 + 0.01 x RCAF)

FP = 81 x (0.65 + 0.01 x 41) = 85.86
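The whole Attend Master calculation can be sketched in Python, feeding the counts and weights from the CFP table into the FP formula:

```python
# Function Point sketch reproducing the Attend Master example above.
def crude_function_points(components):
    """components: list of (count, weight) pairs; CFP is the weighted sum."""
    return sum(count * weight for count, weight in components)

def function_points(cfp, rcaf):
    """FP = CFP x (0.65 + 0.01 x RCAF), with RCAF in [0, 70]."""
    return cfp * (0.65 + 0.01 * rcaf)

# (count, weight) pairs taken from the CFP table:
attend_master = [
    (1, 3), (1, 6),          # user inputs: 1 simple, 1 complex
    (2, 5), (1, 7),          # user outputs: 2 average, 1 complex
    (1, 3), (1, 4), (1, 6),  # online queries: 1 of each level
    (1, 7), (1, 15),         # logical files: 1 simple, 1 complex
    (2, 10),                 # external interfaces: 2 complex
]

cfp = crude_function_points(attend_master)   # 81
fp = function_points(cfp, rcaf=41)           # 81 x 1.06 = 85.86
print(cfp, fp)
```

The same two functions work for any project once its components have been classified and its RCAF graded.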

FP: Benefits and Drawbacks

FP can be used as a predictor to estimate the project size and resources needed during project planning, as long as we have a requirement specification!

However, it is very subjective:
— it depends on the estimators' expert knowledge
— it cannot be counted automatically

Not universally applicable: best for data processing systems.

A detailed requirement specification may not be available during the planning phase!

Estimating LOC based on FP and language: QSM Function Point Programming Language Table

Software Quality Metrics

Quality metrics

— Process metrics (development): quality, timetable, effectiveness, productivity

— Product metrics (maintenance)

Software Process Quality Metrics

Error density and severity metrics need:

Errors counted:
— NCE: number of code errors
— WCE: weighted number of code errors
— NDE: number of development errors (design + code)
— WDE: weighted number of development errors

How to decide the weights?
— Critical: blocks other tests and the alpha release: 9
— Severe: blocks other tests and the beta release: 6-8
— Moderate: a testing workaround is possible, but blocks the final release: ~3
— Very minor: fix before the "sun burns out": 1

Product size: KLOC or FP

Error Density metrics

Code   Name                                            Calculation formula
CED    Code Error Density                              CED = NCE / KLOC
DED    Development Error Density                       DED = NDE / KLOC
WCED   Weighted Code Error Density                     WCED = WCE / KLOC
WDED   Weighted Development Error Density              WDED = WDE / KLOC
WCEF   Weighted Code Errors per Function Point         WCEF = WCE / NFP
WDEF   Weighted Development Errors per Function Point  WDEF = WDE / NFP

CED > 2 and WCED > 4: unacceptable software quality

Error Severity Metrics

Code   Name                                    Calculation formula
ASCE   Average Severity of Code Errors         ASCE = WCE / NCE
ASDE   Average Severity of Development Errors  ASDE = WDE / NDE
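The density and severity definitions above are simple ratios, so they collapse into one small function. The input counts here (NCE, WCE, etc.) are illustrative values, not from the slides:

```python
# Error density and severity metrics sketch, using the formulas above.
def error_metrics(nce, wce, nde, wde, kloc, nfp):
    return {
        "CED":  nce / kloc,   # code error density
        "DED":  nde / kloc,   # development error density
        "WCED": wce / kloc,   # weighted code error density
        "WDED": wde / kloc,   # weighted development error density
        "WCEF": wce / nfp,    # weighted code errors per function point
        "WDEF": wde / nfp,    # weighted development errors per function point
        "ASCE": wce / nce,    # average severity of code errors
        "ASDE": wde / nde,    # average severity of development errors
    }

m = error_metrics(nce=42, wce=120, nde=60, wde=180, kloc=30, nfp=85)
# Flag unacceptable quality per the thresholds above:
unacceptable = m["CED"] > 2 and m["WCED"] > 4
print(m["CED"], m["WCED"], unacceptable)
```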

Software Process Timetable Metrics

Code   Name                                   Calculation formula
TTO    Time Table Observance                  TTO = MSOT / MS
ADMC   Average Delay of Milestone Completion  ADMC = TCDAM / MS

MSOT = milestones completed on time.
MS = total number of milestones.
TCDAM = total completion delays (days, weeks, etc.) for all milestones.
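As a quick sketch of the two timetable ratios, with made-up milestone data:

```python
# Timetable metrics sketch: TTO and ADMC as defined above.
def timetable_metrics(msot, ms, tcdam):
    tto = msot / ms    # fraction of milestones completed on time
    admc = tcdam / ms  # average delay (here in days) per milestone
    return tto, admc

# Illustrative: 8 of 10 milestones on time; 14 days of delay in total.
tto, admc = timetable_metrics(msot=8, ms=10, tcdam=14)
print(tto, admc)
```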

Error Removal Effectiveness Metrics

Code    Name                                               Calculation formula
DERE    Development Errors Removal Effectiveness           DERE = NDE / (NDE + NYF)
DWERE   Development Weighted Errors Removal Effectiveness  DWERE = WDE / (WDE + WYF)

NYF = number of software failures detected during a year of maintenance service.
WYF = weighted number of software failures detected during a year of maintenance service.
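The two effectiveness ratios above can be sketched directly; the counts are illustrative assumptions:

```python
# Error-removal effectiveness sketch from the formulas above.
def dere(nde, nyf):
    """Fraction of all known errors caught during development."""
    return nde / (nde + nyf)

def dwere(wde, wyf):
    """Same ratio, weighted by error severity."""
    return wde / (wde + wyf)

# 90 errors found in development, 10 failures in a year of maintenance:
print(dere(nde=90, nyf=10))
# Weighted counts tell a different story if the escaped failures are severe:
print(dwere(wde=300, wyf=100))
```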

Software Process Productivity Metrics

Code    Name                                     Calculation formula
DevP    Development Productivity                 DevP = DevH / KLOC
FDevP   Function Point Development Productivity  FDevP = DevH / NFP
CRe     Code Reuse                               CRe = ReKLOC / KLOC
DocRe   Documentation Reuse                      DocRe = ReDoc / NDoc

DevH = total working hours invested in the development of the software system.
ReKLOC = number of thousands of reused lines of code.
ReDoc = number of reused pages of documentation.
NDoc = number of pages of documentation.
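The productivity and reuse ratios above fit in one function; every input figure here is invented for illustration:

```python
# Productivity and reuse metrics sketch per the definitions above.
def productivity_metrics(dev_h, kloc, nfp, re_kloc, re_doc, n_doc):
    return {
        "DevP":  dev_h / kloc,    # working hours per KLOC
        "FDevP": dev_h / nfp,     # working hours per function point
        "CRe":   re_kloc / kloc,  # fraction of code that is reused
        "DocRe": re_doc / n_doc,  # fraction of documentation that is reused
    }

m = productivity_metrics(dev_h=5000, kloc=25, nfp=200,
                         re_kloc=5, re_doc=80, n_doc=400)
print(m)
```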

Software Product Metrics

HD (help desk) quality metrics:
— HD calls density metrics: measured by the number of calls
— HD calls severity metrics: the severity of the HD issues raised
— HD success metrics: the level of success in responding to HD calls

HD productivity metrics; HD effectiveness metrics.

Corrective maintenance quality metrics:
— software system failures density metrics
— software system failures severity metrics
— failures of maintenance services metrics
— software system availability metrics

Corrective maintenance productivity and effectiveness metrics.

Define New Software Quality Metrics

Complexity Metrics

Halstead’s metrics

Complexity of a piece of code depends on:

— n1: number of unique operators

— n2: number of unique operands

— N1: total number of occurrences of operators

— N2: total number of occurrences of operands

McCabe’s metrics

— graph complexity based on control flow graphs

Halstead's Metrics [1977]

Program length: N = N1 + N2

Program vocabulary: n = n1 + n2

Estimated length: N^ = n1 log2 n1 + n2 log2 n2
— a close estimate of the length for well-structured programs

Purity ratio: PR = N^ / N
— code optimization: the higher the ratio above 1.0, the more optimized the code.

Program volume: V = N log2 n
— the number of bits needed to provide a unique designator for each of the n items in the program vocabulary.

Difficulty: D = (n1 / 2) x (N2 / n2)

Program effort: E = D x V
— a good measure of program understandability

Exercise: Halstead’s Metrics

Calculate Halstead's metrics for:

if (k < 5) {
    if (k > 2)
        x = x * k;
}

Distinct operators: if ( ) { } > < = * ;
Distinct operands: k 5 2 x

n1 (number of unique operators) = 10
n2 (number of unique operands) = 4
N1 (total occurrences of operators) = 13
N2 (total occurrences of operands) = 7

Exercise: Halstead’s Metrics

Program length: N = N1 + N2 = 20

Program vocabulary: n = n1 + n2 = 14

Estimated length: N^ = n1 log2 n1 + n2 log2 n2 = 41.2

Purity ratio: PR = N^ / N = 2.1

Program volume: V = N log2 n = 76.1

Difficulty: D = (n1 / 2) x (N2 / n2) = 8.75

Program effort: E = D x V = 665.9
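The exercise above can be reproduced directly from the four token counts:

```python
# Halstead's metrics computed from token counts, as in the exercise above.
import math

def halstead(n1, n2, N1, N2):
    N = N1 + N2                                       # program length
    n = n1 + n2                                       # program vocabulary
    N_hat = n1 * math.log2(n1) + n2 * math.log2(n2)   # estimated length
    PR = N_hat / N                                    # purity ratio
    V = N * math.log2(n)                              # program volume
    D = (n1 / 2) * (N2 / n2)                          # difficulty
    E = D * V                                         # effort
    return N, n, N_hat, PR, V, D, E

# Counts from the if/if/assignment snippet in the exercise:
N, n, N_hat, PR, V, D, E = halstead(n1=10, n2=4, N1=13, N2=7)
print(N, n, round(N_hat, 1), round(PR, 1), round(V, 1), D, round(E, 1))
```

Note that computing E from the unrounded volume gives about 666.3; the slide's 665.9 comes from rounding V to 76.1 first.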

Object Oriented Metrics

A way to characterize the "object-orientedness" of a design (Shyam Chidamber and Chris Kemerer, 1994):

— weighted methods per class

— depth of inheritance tree

— number of children

— coupling between object classes

— response for a class

— lack of cohesion in methods

Defect Cost Analysis

Defect injection point:
— In what stage of the development cycle was the defect put into the system?

Defect detection point:
— In what stage of the development cycle was the defect discovered?

The latency between the injection and detection points of a defect:
— the longer the latency, the more expensive the fix.

This analysis helps evolve a process that prevents defects and reduces the latency.
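One simple way to operationalize latency is to count the stages between injection and detection. The stage names and defect data below are illustrative assumptions, not from the slides:

```python
# Defect latency sketch: latency = number of lifecycle stages between
# injection and detection of a defect.
STAGES = ["requirements", "design", "coding", "testing", "operation"]

def latency(injected, detected):
    """Stages elapsed between a defect's injection and its detection."""
    return STAGES.index(detected) - STAGES.index(injected)

# (injection stage, detection stage) for three hypothetical defects:
defects = [("design", "testing"), ("requirements", "operation"), ("coding", "coding")]
latencies = [latency(i, d) for i, d in defects]
print(latencies)                       # longer latency => more expensive fix
print(sum(latencies) / len(defects))   # average latency across defects
```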

Summary

What is a software metric?

Software size metrics
— KLOC
— Function Points

Software quality metrics
— process metrics: development quality, timetable, effectiveness, productivity
— product metrics: maintenance

Software complexity metrics
— Halstead's metrics: n1, n2, N1, N2

Defect Cost Analysis

