Post on 14-Dec-2014
1
© 2006, Raytheon Co., All Rights Reserved
DFSS for Software-Intensive Systems
Neal Mackertich & Trindy LeForge, Raytheon Company
2
Software Project Challenges
• The Voice of the Customer
• Right Architecture?
• Resource Availability
• Defect Management
• Product Cost
• What to Test?
• Schedule Pressure
• Technical Performance
3
Process Integration
• IPDS provides an integrated set of best practices for the entire product development life cycle through a just-in-time tailoring process.
• R6s is a business strategy for process improvement. It guides us to use CMMI and IPDS as tools to deliver value to customers and integrate industry best practices.
• CMMI provides guidance for creating, measuring, managing, and improving specific processes.
Each plays an integral role in the success of programs, projects, and organizations. Programs integrate R6Sigma, IPDS, and CMMI into their plans.
4
Design for Six Sigma
DFSS: Producibility, Affordability, Robust Performance
Design for Six Sigma is a methodology used within IPDP to predict, manage, and improve Producibility, Affordability, and Robust Performance for the benefit of our customers.
In essence, DFSS is the application of R6s in product optimization.
5
DFSS Enablement of Software Project Performance
[Figure: IPDS gate chart, Gates 1-11]
• Business Decision Gates 1-4: 1 - Capture/Proposal Development; 2 - Project Planning, Management and Control; 3 - Requirements and Architecture Development; 4 - Product Design and Development
• Program Execution Gates 5-11: 5 - System Integration, Verification and Validation; 6 - Production and Deployment Planning; 7 - Operations and Support Planning
• Internal reviews along the way: Bid/Proposal Review, Start-Up Review, Internal System Functional Review, Internal Preliminary Design Review, Internal Critical Design Review, Internal Test/Ship Readiness Review, Internal Production Readiness Review
• DFSS techniques mapped onto the life cycle: VOC Modeling & Analysis, Architectural Evaluation, Performance Modeling, Defect Prediction & Prevention, Critical Chain Project Management, Test Optimization
DFSS is the application of Six Sigma in product optimization
6
Usage Based Modeling
[Figure: Markov-chain usage model with states S1-S6 and an Exit state; arcs carry transition probabilities (30%, 70%, 100%, 10%, 25%, 75%, 90%).]
• States - represent pertinent usage history, i.e., the state of the software from an external user's perspective.
• Arcs - represent state transitions caused by applying stimuli.
• Transition probabilities - simulate expected user behavior.
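Because the usage model is a Markov chain, expected user behavior can be simulated directly to see which states get exercised most (and therefore deserve the most test effort). The arc wiring below is an illustrative assumption; the slide shows the states and percentages but not which probability belongs to which arc.

```python
import random

# Hypothetical transition table for states S1-S6 plus Exit. The probabilities
# (30/70, 10/90, 25/75, 100) come from the slide; their assignment to
# specific arcs is an assumption made for illustration.
USAGE_MODEL = {
    "S1": [("S2", 0.70), ("S3", 0.30)],
    "S2": [("S4", 1.00)],
    "S3": [("S5", 0.90), ("S1", 0.10)],
    "S4": [("S5", 0.75), ("S6", 0.25)],
    "S5": [("Exit", 1.00)],
    "S6": [("Exit", 1.00)],
}

def step(state, rng):
    """Pick the next state according to the transition probabilities."""
    r, acc = rng.random(), 0.0
    for nxt, p in USAGE_MODEL[state]:
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against float round-off: fall back to the last arc

def run_session(rng):
    """Walk the chain from S1 until Exit, returning the visited path."""
    path, state = ["S1"], "S1"
    while state != "Exit":
        state = step(state, rng)
        path.append(state)
    return path

def visit_frequencies(n_sessions=10_000, seed=1):
    """Estimate how often each state is exercised across simulated sessions."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_sessions):
        for s in run_session(rng):
            counts[s] = counts.get(s, 0) + 1
    return counts

counts = visit_frequencies()
```

With this wiring, S2 is visited far more often than S3, so usage-based testing would weight its stimuli accordingly.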
7
Design elements (QFD matrix columns): CWMI (Style Guide), CWMI (Framework), JTM (SNS), JTM (Non-SNS), CDLIM, IDD (AMDWS), EOM, AMDWS, MSCT, Solipsys Pub/Sub, TDF, Adaptive Layer, DLIM (Current Design), ASC (Toaster), VME Racks, SLAMRAAM Pub/Sub, IFCS Software Modules, IFCS Software Architecture
Customer priorities (rows) with weight and direction of improvement (+1 = increase, -1 = decrease), scored against the 18 design elements (columns group into System-of-System, JLENS Spiral 2, and SLAMRAAM design elements):
Cost (weight 5, direction -1): 0, 0, 0.3, 0.3, 0.3, 0.3, 0.3, 0.9, 0.9, 0.3, 0.3, 0.3, 0.3, 0.5, 0.5, 0.3, 0.3, 0
Schedule (weight 5, direction -1): 0, 0, 0.9, 0.9, 0.9, 0.3, 0.3, 0.3, 0.3, 0, 0, 0, 0, 0, 0, 0, 0, 0
Interoperability (weight 4, direction +1): 0.3, 0.3, 0.9, 0.9, 0.9, 0.3, 0.3, 0.3, 0.3, 0, 0, 0.9, 0.9, 0, 0, 0.3, 0.3, 0.3
Performance (weight 4, direction +1): 0, 0, 0.7, 0.7, 0.7, 0.3, 0, 0.9, 0.9, 0.3, 0.3, 0.9, 0.9, 0, 0, 0.3, 0.7, 0.7
System-of-systems (weight 3, direction +1): 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.3, 0.3, 0, 0, 0, 0, 0.3, 0.3, 0, 0.3, 0.3
Reuse (weight 2, direction +1): 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.3, 0.3, 0.3, 0.3, 0.3, 0.3, 0.9, 0.9, 0.9, 0.9, 0.9
Proprietary (weight 1, direction -1): 0, 0, 0.3, 0, 0, 0, 0, 0, 0.3, 0.3, 0.3, 0, 0, 0, 0, 0.3, 0, 0
Weighted rankings: 5.7, 5.7, 4.6, 4.9, 4.9, 3.9, 2.7, 0.3, 0, 0, 0, 6.3, 6.3, 0.2, 0.2, 2.4, 5.2, 6.7
VOC Analysis using QFD
Customer Priorities:
• Reduce Cost & Schedule
• Increase Interoperability & Performance
• Increase System-of-Systems Interoperability
• Increase Reuse
• Reduce Proprietary Software
Baseline Design Alternatives:
• System-of-Systems Components
• JLENS Spiral 2 Components
• SLAMRAAM Components
Observations from the matrix:
• SLAMRAAM provides low-cost design alternatives
• Availability of SoS elements has significant impacts on schedule
• Baseline elements with limited reuse & interoperability
• Baseline elements that improve interoperability
[Figure: House of Quality layout - Voice of the Customer and Priorities on the left, Competitive Evaluation on the right, HOWs and Direction of Improvement across the top, the Relationship Matrix in the center, the Correlation Matrix as the roof, and HOW MUCHes with Rankings/Priorities along the bottom.]
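The rankings row of the matrix is reproducible as a weighted sum: each design element scores the sum, over all customer priorities, of weight x direction x relationship strength. A sketch using the first three matrix columns:

```python
# QFD ranking arithmetic: score(element) = sum(weight * direction * relationship).
# Relationship values below are the first three columns of the matrix above.
PRIORITIES = [
    # (name, weight, direction, relationships for the first 3 design elements)
    ("Cost",              5, -1, [0.0, 0.0, 0.3]),
    ("Schedule",          5, -1, [0.0, 0.0, 0.9]),
    ("Interoperability",  4, +1, [0.3, 0.3, 0.9]),
    ("Performance",       4, +1, [0.0, 0.0, 0.7]),
    ("System-of-systems", 3, +1, [0.9, 0.9, 0.9]),
    ("Reuse",             2, +1, [0.9, 0.9, 0.9]),
    ("Proprietary",       1, -1, [0.0, 0.0, 0.3]),
]

def qfd_scores(priorities, n_elements=3):
    """Weighted-sum ranking for each design element column."""
    scores = []
    for col in range(n_elements):
        s = sum(w * d * rel[col] for _, w, d, rel in priorities)
        scores.append(round(s, 1))
    return scores

print(qfd_scores(PRIORITIES))  # [5.7, 5.7, 4.6], matching the slide's rankings
```

Negative-direction priorities (Cost, Schedule, Proprietary) subtract from an element's score, which is why the third element, strongly coupled to cost and schedule, ranks below the first two despite stronger technical relationships.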
8
“New & Nifty” Pen – QFD Example
Design a Pen for Students Ages 5-18
9
Architectural Evaluation
“If the system is going to be built by more than one person - and these days, what system isn’t - it is the architecture that lets them communicate and negotiate work assignments. If the requirements include goals for performance, security, reliability, or maintainability, then architecture is the design artifact that first expresses how the system will be built to achieve those goals.”
- “Evaluating Software Architectures: Methods & Case Studies”, SEI Series in Software Engineering (Clements, Kazman & Klein)
“… it is possible to evaluate an architecture and its design decisions, to analyze decisions, in the context of the goals and requirements that are levied on systems that will be built from it.”
- Paul Clements, “Evaluating Software Architectures”
10
Architectural Tradeoff Analysis Method
ATAM is a structured process in which the right questions are asked early to:
• Discover risks -- alternatives that might create future problems
• Discover sensitivity points -- alternatives for which a slight change makes a significant difference
• Discover tradeoffs -- decisions affecting more than one attribute
DD(X) customer observation: “The ATAM report represents a very good application of the ATAM, given the relative newness of the methodology and without assistance from ATAM expertise. Many new risks were identified and documented and many more existing risks in the risk register were corroborated as a result. A variety of scenarios and threads were generated during the ATAM and over 130 sensitivity points were identified and mapped to risks. As a result of the ATAM tradeoff and sensitivity analyses, (there were) nine identified significant findings.”
11
Architectural FMECA
Architectural FMECA is a systematic technique utilized to:
• Identify architectural risks and opportunities
• Prioritize according to: impact, criticality, probability of occurrence, and probability of early detection
• Identify alternatives / hybrids to leverage opportunities and mitigate risk
• Document and track actions taken to reduce risk
• Improve the probability of success
Description of FMECA Worksheet (Process Failure Mode, Effects and Criticality Analysis):
Header fields: Product/Process, Subprocess, Function, Design Lead, Core Team, FMECA Number, Prepared By, FMECA Date, Key Date, Revision Date, Page __ of __
Columns: Process/Subprocess | Failure Mode(s) | Failure Effect(s) | SEV | Cause(s) of Failure | OCC | Current Process Controls | DET | RPN | Recommended Action(s) | Responsibility & Target Completion Date | Action Results / Actions Taken
Example row: Vanishing Vendor Item (VVI) process | VVI program execution error | Inability to produce quarterly deliverable report | SEV 8 | SQL execution error with new version of Oracle | OCC 3 | no controls in place, error is detected after program execution only | DET 8 | RPN 192 | Add process controls; receive automatic notifications by IS when Oracle is updated | B. Lazatin, 02/11/02
Definitions:
• Severity - On a scale of 1-10, rate the severity of each failure (10 = most severe). See Severity sheet.
• Occurrence - Write down the potential cause(s), and on a scale of 1-10, rate the frequency that the cause will occur (10 = most likely). See Occurrence sheet.
• Detectability - Examine the current controls, then, on a scale of 1-10, rate the likelihood the defect is detected (10 = least detectable). See Detectability sheet.
• Risk Priority Number - The combined weighting of Severity, Occurrence, and Detectability: RPN = SEV x OCC x DET
Write down each failure mode and the potential consequence(s) of that failure; develop response plans and tracking.
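The worksheet's risk arithmetic is simple enough to script: compute RPN = SEV x OCC x DET per failure mode, then rank by descending RPN. The VVI row comes from the worksheet above; the second failure mode is an invented placeholder added for contrast.

```python
# FMECA prioritization: RPN = SEV x OCC x DET, ranked highest-risk first.
failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detectability 1-10)
    ("VVI program execution error", 8, 3, 8),        # row from the worksheet
    ("Hypothetical: stale interface spec", 6, 4, 3), # invented for contrast
]

def rpn(sev, occ, det):
    """Risk Priority Number: the combined weighting of the three ratings."""
    return sev * occ * det

ranked = sorted(
    ((desc, rpn(s, o, d)) for desc, s, o, d in failure_modes),
    key=lambda x: -x[1],
)
# The VVI row reproduces the worksheet's RPN of 8 x 3 x 8 = 192.
```

Ranking by RPN is what turns the worksheet into a prioritized action list: recommended actions and tracking effort go to the top rows first.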
12
Critical Chain Project Management
• Critical Chain Multi-Project Management was first developed by Dr. Eliyahu M. Goldratt, founder of the Theory of Constraints
• An established and practical project management technique that companies have successfully used to reduce development cycle time
• Based on statistical methodologies
• Being used at Raytheon and across industry
• Increasingly accepted by our customers
13
Multi-Tasking is an Insidious Generator of… Waste
Why don’t we meet our schedule commitments?!? When (if) we look at our “Resource Allocation”, we typically discover high degrees of multi-tasking…
[Figure: timelines for Resource 1, Resource 2, and Resource 3, each fragmented across tasks over time.]
14
Effect of Multi-Tasking
Multi-tasking: assigning more than one task to a resource.
• Expectations: Task A, Task B, Task C, each completed in turn
• What happens: A B C A B C - lost productivity due to context switching
• Should happen: Task B, Task C, Task A - priorities are established and followed
15
Protecting Our Customers
[Figure: tasks 1-9 in sequence, followed by a single project buffer sized for 95% confidence.]
• A single “project buffer” could be defined to cover variation that occurs on all individual tasks (the project buffer is not the same thing as management reserve)
• The project buffer PROTECTS OUR CUSTOMERS from the variability and uncertainty within each task
• The PM can now manage the uncertainty as a whole, not as a compilation of the individual tasks
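One common buffer-sizing heuristic from the critical chain literature (the slide does not spell out its formula, so treat this as an assumption) is the two-point "square root of the sum of squares" method: each task contributes the gap between its safe (high-confidence) estimate and its aggressive (50%) estimate, and the shared buffer is the root-sum-square of those gaps. The task estimates below are invented placeholders.

```python
import math

# Hypothetical task estimates in days: (aggressive 50% estimate, safe 95% estimate).
tasks = [(4, 8), (6, 10), (3, 7), (5, 11), (2, 4)]

def project_buffer(tasks):
    """Root-sum-square of each task's safety margin (safe - aggressive)."""
    return math.sqrt(sum((safe - agg) ** 2 for agg, safe in tasks))

chain_length = sum(agg for agg, _ in tasks)     # schedule built on 50% estimates
buffer = project_buffer(tasks)                  # shared protection at the end
padded_length = sum(safe for _, safe in tasks)  # old-style per-task padding
```

Here the chain is 20 days plus a buffer of about 9.4 days, versus 40 days if every task carried its own safety; aggregating the variation is what lets the PM manage uncertainty as a whole.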
16
It all starts with your model…
An Equation, Model, or Simulation of the system function is usually available:
Input → y = f(x) → Output
[Figure: deterministic spreadsheet models - a radiometric NETD equation and a resistor-network transfer function Vout = f(Vin, R1-R4).]
Need to transform this deterministic approach to a probabilistic approach.
17
DFSS Statistical Requirements Analysis
Y = f(A, B, C, D, E, F, …, M)
[Figure: distributions of design variables A through M feed the response distribution Y; requirements are allocated / flowed down from the response to the design variables.]
Design Variables
Allocation/Flow Down
18
DFSS Performance Analysis results in…
• A prediction of the Response Statistical Properties
• A prediction of the Probability of Non-Compliance
• An assessment of the Contribution of Parameter Variation to the Response Variation
[Figure: parameters A-M feed the function Y = f(A, B, C, D, E, F, …, M) to produce the response.]
Contribution to response variation, measured by rank correlation:
G-Sys. Losses -.45, A-Pavg .35, D-Ant. Eff. .35, F-Integ. Eff. .34, J-Rec. BW -.34, B-Ant. Gain .29, H-Tgt RCS .23, C-Ant. Aperture .21, K-Pulse Width -.19, M-Rec. Out SNR -.15, I-Noise Figure -.12, L-Rep. Freq. -.03
PDF(Y) and Prob(LL<Y<UL): certainty is 95.12% from 4.00E+1 to 5.30E+1
Results from Crystal Ball Monte Carlo SW
19
Defect Density Predictive Model Motivation
• Defect containment is a lagging indicator. Within a single development program it is difficult at best to:
› Analyze the data
› Determine root cause
› Take action
› Measure results
• Data does not give the complete picture until the development is over.
• An effective defect prediction model provides the ability to mix actual and predicted performance to show what the picture will look like before you get there. This enables us to make better decisions and take more effective preventive action.
20
Defect Model - Previous Program/Release
Standard Defect Containment Chart: Model Raw Defect Counts
STAGE ORIGINATED (columns): From Baseline Capture, System Requirements Analysis, Software Requirements Analysis, Preliminary Design, Detailed Design, Comb Sw Design And Code, Comb Sw Detail Design and Code, Code, Unit Test, Software Integration, Software Qualification Test, System Integration, System Test, Post System Test, Warranty, Post-Warranty, New Defect Totals, Baseline Defect Totals, Program Defect Totals, Baseline Impact
STAGE DETECTED (rows):
At Baseline Capture 6 1 6 6 100%
System Requirements Analysis 5 5 0 5 0%
Software Requirements Analysis 1 5 164 2 171 1 172 1%
Preliminary Design 4 31 475 1 507 4 511 1%
Detailed Design 7 2 35 12 187 2 1 239 7 246 3%
Comb Sw Design And Code 0 0 0
Comb Sw Detail Design and Code 0 0 0
Code 25 4 15 8 26 657 7 717 25 742 3%
Unit Test 10 2 2 9 127 29 1 170 10 180 6%
Software Integration 18 6 5 22 188 3 44 7 1 1 277 18 295 6%
Software Qualification Test 14 15 38 6 35 124 6 7 716 2 8 957 14 971 1%
System Integration 23 1 5 16 102 1 8 5 13 151 23 174 13%
System Test 12 7 1 18 71 4 5 4 55 165 12 177 7%
Post System Test 1 1 4 1 2 9 0 9 0%
Warranty 0 0 0
Post-Warranty 0 0 0
16% 54% 93% 60% 51% 63% 69% 97% 59% 86% 100% 3368 120 3488 3%
16% 51% 75% 49% 0% 0% 47% 21% 36% 79% 57% 96% 100%
1% 9% 15% 9% 0% 0% 38% 1% 2% 22% 1% 2% 0% 0%
Standard Deviation Of Underlying Product Data For % Containment 38.33% 24.02% 47.12% 38.61% 32.88% 42.75% 39.28% 25.04% 37.30% 0.00%
Row footers: Baseline Impact Totals; % Defects Originated In This Phase Out Of All Defects; % Defects Originated in This Phase That Were Contained By This Phase; % Defects Originated in This Phase Plus Defects That Escaped From Earlier Phases That Were Contained By This Phase
Prorate the raw defect counts by the ratio: SLOC_New / SLOC_Model
Projected Raw Defect Counts for New Program
New Program Defect Containment: Current Raw Defect Counts
Stage % complete inputs: System Requirements Analysis (A), Software Requirements Analysis (B), Preliminary Design (C), Detailed Design (D), Code (E), Unit Test (F), Software Integration (G), Software Qualification Test (H), System Integration (I), System Test (J), Post System Test (K)
Current SLOC Estimate = XXXXXXX
Percent complete is supplied on an as-needed basis for analysis (minimum monthly); the % should match EVMS. The current SLOC estimate is required for model calibration and should match the tracking book.
The model assumption is that if a stage is XX% complete, then we have detected XX% of the defects “normally” found in that stage. Percent complete is used as a multiplier (1 - % complete) on the raw defect counts for each stage detected, to calculate counts for defects that are not yet detected.
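The proration and percent-complete mechanics can be sketched as follows; all SLOC figures, per-stage counts, and completion percentages are invented placeholders, not any program's actual data.

```python
# Step 1: prorate the model's raw defect counts by relative size.
SLOC_MODEL = 250_000   # size of the previous program/release (placeholder)
SLOC_NEW = 175_000     # current SLOC estimate for the new program (placeholder)
scale = SLOC_NEW / SLOC_MODEL

# Model raw defect counts by stage detected (placeholder values).
model_counts = {"Design": 500, "Code": 700, "Unit Test": 180, "Integration": 300}
projected = {stage: n * scale for stage, n in model_counts.items()}

# Step 2: mix actuals with the remaining prediction using stage % complete.
# If a stage is p complete, assume a fraction p of its "normal" defects are
# already found, so the not-yet-detected share is (1 - p) of the projection.
percent_complete = {"Design": 1.0, "Code": 0.6, "Unit Test": 0.2, "Integration": 0.0}
actual_counts = {"Design": 380, "Code": 260, "Unit Test": 30, "Integration": 0}

containment_projection = {
    stage: actual_counts[stage] + (1 - percent_complete[stage]) * projected[stage]
    for stage in model_counts
}
```

Completed stages contribute pure actuals, unstarted stages pure prediction, and in-progress stages a blend, which is exactly the "mix actual and predicted performance" picture the motivation slide describes.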
New Program - Projected Defect Containment: sum of current raw defect counts plus raw defects not yet detected
Model charts normalized by size
Defect Density Predictive Model Approach
21
Defect Density by Stage Detected and Originated
Defect density by stage uses total defect counts and source lines of code counts to assess the efficiency of catching and containing software defects in the various lifecycle stages.
[Chart: Predicted Defect Detection Profile - Defects/KSLOC (0 to 35) across Requirements, Design, Code & Unit Test, SQT, Sys Int & Test, Post System Test; series: Actual Defects/KSLOC, Remain Defects/KSLOC, Mean, LCL, UCL]
[Chart: Predicted Defect Origination Profile - Defects/KSLOC (0 to 45) across the same stages, with the same series]
22
Defect Density – Code Inspection Analysis Tips
Tips: The first level of understanding in analyzing defect density at code inspection is analysis of each data point resulting from code peer review, using the Code Inspection Calculator.
[u chart: (defects detected) / (KSLOC inspected) plotted per inspection (inspection numbers 156-196), with UCL = 17.012, center line = 5.778, LCL = 0.0]
• When the number of defects is between the control limits, fix findings and pass to the next stage.
• When the number of defects is below the lower limit, unit(s) may need more inspection.
• When the number of defects is above the upper limit, unit(s) may need rework and a repeated inspection.
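A u chart's limits vary with the size of each inspection. A common formulation (an assumption here; the slide does not show its formula) is UCL/LCL = u-bar ± 3 * sqrt(u-bar / n_i), where u-bar is total defects over total KSLOC and n_i is the KSLOC of inspection i. A sketch with made-up inspection data:

```python
import math

# (defects found, KSLOC inspected) per code inspection -- placeholder data.
inspections = [(4, 1.0), (9, 1.5), (2, 0.5), (12, 2.0), (6, 1.0)]

# Center line: overall defects per KSLOC across all inspections.
u_bar = sum(d for d, _ in inspections) / sum(k for _, k in inspections)

def classify(defects, ksloc):
    """Flag an inspection against its own size-adjusted control limits."""
    u = defects / ksloc
    half_width = 3 * math.sqrt(u_bar / ksloc)
    ucl = u_bar + half_width
    lcl = max(0.0, u_bar - half_width)
    if u > ucl:
        return "above UCL: consider rework and re-inspection"
    if u < lcl:
        return "below LCL: may need more inspection"
    return "in control: fix findings and pass on"

flags = [classify(d, k) for d, k in inspections]
```

Small inspections get wider limits (less evidence per point), which is why the limit is recomputed per inspection rather than drawn as a single flat line.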
23
Test Optimization
Industry studies have shown test to represent between 30 and 50% of product development costs. If this is even close to accurate, test represents fertile ground for optimization.
Towards this end, the Department of Defense and the National Academy of Sciences have cited two industry best practices in the area of statistically based test optimization:
• Usage-based Stochastic Modeling (based on VOC analysis)
• Combinatorial Design Methods (CDM)
24
CDM Advantages / Results
• N-way combinations provide reasonable, statistically balanced coverage to be augmented with domain expertise.
• CDM is more realistic than full/fractional experimental designs:
- Compatible with constraints
- Compatible with factors at different levels
- Can account for previous tests
• Drastically reduces the total number of test cases when compared to all combinations.
• Since generating test cases is very quick and simple, there are no major barriers to using CDM as part of the testing process.
• Can be used in almost all phases of system & subsystem testing.
• An IIS Satellite Ground Systems CDM application resulted in significant schedule (68%) and cost (67%) savings.
25
A Testing Scenario
Definition: Black-box type testing geared to the functional requirements of an application.
Example Application: Graphics manipulation function that converts from one format to another.
Inputs:
• Source Format (GIF, JPG, TIFF, PNG)
• Dest. Format (GIF, JPG, TIFF, PNG)
• Size (Small, Med, Large)
• # colors (4 bit, 8 bit, 16 bit, 24 bit)
• Destination (Local drive, network drive)
• Windows Version (95, 98, NT, 2000, Me)
Outputs: Correct conversion (True or False)
Constraints: If Destination Format is GIF, then # colors cannot be 16 bit or 24 bit.
All Combinations: 1920 test cases
2-Way Combinations: 22 test cases
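A pairwise (2-way) suite for this scenario can be sketched with a greedy heuristic: enumerate the valid cases, then repeatedly pick the case covering the most still-uncovered factor-value pairs. This illustrates the CDM idea; it is not the tool behind the slide's 22-case result, and greedy selection typically lands somewhat above that optimum.

```python
from itertools import combinations, product

# Factors from the scenario above. The full cross product is 4*4*3*4*2*5 = 1920.
FACTORS = {
    "source": ["GIF", "JPG", "TIFF", "PNG"],
    "dest": ["GIF", "JPG", "TIFF", "PNG"],
    "size": ["Small", "Med", "Large"],
    "colors": ["4 bit", "8 bit", "16 bit", "24 bit"],
    "drive": ["Local", "Network"],
    "windows": ["95", "98", "NT", "2000", "Me"],
}
NAMES = list(FACTORS)

def valid(case):
    """Constraint from the slide: a GIF destination cannot be 16- or 24-bit."""
    return not (case["dest"] == "GIF" and case["colors"] in ("16 bit", "24 bit"))

all_cases = [dict(zip(NAMES, vals)) for vals in product(*FACTORS.values())]
candidates = [c for c in all_cases if valid(c)]

def pairs(case):
    """The 2-way (factor, value) combinations a single test case covers."""
    return set(combinations([(n, case[n]) for n in NAMES], 2))

# Greedy pairwise generation: take the valid case covering the most uncovered
# pairs until every feasible pair is covered. Infeasible pairs (GIF + 16/24
# bit) never appear in a valid candidate, so they are excluded automatically.
cand_pairs = [(c, pairs(c)) for c in candidates]
uncovered = set().union(*(p for _, p in cand_pairs))
suite = []
while uncovered:
    best, best_pairs = max(cand_pairs, key=lambda cp: len(cp[1] & uncovered))
    suite.append(best)
    uncovered -= best_pairs
```

Even this naive generator collapses 1920 full combinations (1680 after the constraint) into a few dozen cases while still exercising every feasible value pair, which is the entire economic argument for CDM.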
26
World-Class Software DFSS
• Voice of the Customer modeling and analysis as an integral part of the requirements analysis process
• Up-front software architectural trade space evaluation (vs. validation)
• Statistically managing our software development cycle time using critical chain concepts
• Statistical modeling & optimization of the performance/cost software design trade space
• Enabled decision-making through predictive defect modeling
• Stochastically modeled System Integration, Verification & Validation