
University of Southern California
Center for Systems and Software Engineering

COCOMO® III

Brad Clark, PhD
USC Center for Systems and Software Engineering

2017 Annual Research Review
April 4, 2017


The COCOMO® III Project
• COCOMO (COnstructive COst MOdel) is the most widely used, free, open source software cost estimation model in the world.
– Registered trademark for intellectual property protection
– COCOMO 81 and COCOMO II models are open and free for anyone to use
– Models have been commercialized
• It has been 17 years since the COCOMO II model was last updated and calibrated to new software engineering practices.

Apr 5, 2017 Copyright © USC-CSSE 2


COCOMO® III Project Purpose
• Broaden the audience of COCOMO® and address the scope of modern projects: mobile devices, web/internet, big data, cloud-targeted, and multi-tenant software
• Modernize model size inputs
• Consider the impact of modern development processes (e.g., Agile)
• Improve the accuracy and realism of estimates
– Improve driver definitions
– Add new and updated software cost drivers and adjust their ratings as needed
– Quality estimation capability
– Point and range estimates based on risk
• Improve the value of COCOMO® in decision-making

Apr 5, 2017 Copyright © USC-CSSE 3


COCOMO® III Project Scope
• COCOMO® III will produce estimates for:
– Effort, Schedule, Cost, Quality Level (Defects)
• COCOMO® III can be applied at various points in a project’s lifecycle:
– Early Estimation, Initial Project Estimation, Project Re-estimation
• COCOMO® III’s functional vision:
– Single and multiple component estimates
– Analysis of alternatives
– Analysis with Size-Effort-Schedule as independent variables
– Support for different lifecycle processes
– Lifecycle cost estimation
– Legacy system transformation

Apr 5, 2017 Copyright © USC-CSSE 4


Model Scope
• Model – Development Process Compatibility
• Application Domain
• Model Breadth and Depth
• Model WBS
• Workload Sizing
• Cost estimating relationships
– Parametric models
– Simple relationships

Apr 5, 2017 Copyright © USC-CSSE 5


Model – ICSM Common Cases
• Since 2000, a plethora of development processes have arisen:
– Non-Developmental Items (NDI) such as COTS
– Agile Development
– Brownfield Development
• A one-size-fits-all estimation model is no longer feasible
• 11 common development cases are presented next, with indications of where COCOMO III is suitable

Apr 5, 2017 Copyright © USC-CSSE 6

Source: Boehm, B., Lane, J., Koolmanojwong, S. and Turner, R. “The Incremental Commitment Spiral Model”, Addison-Wesley 2014, Chapter 11.


Model Match to ICSM Common Cases -1

Case 1: Use Non-Developmental Item (NDI)
• Example: Small accounting system
• Size: variable; Complexity: low
• Typical Change Rate/Month: Negligible
• Criticality: n/a
• NDI Support: Complete
• Personnel Capability: NDI-experienced (medium)
• Activities: Acquire NDI, Use NDI
• Time/Build: n/a
• Time/Increment: Vendor-driven

Case 2: Agile
• Example: E-services
• Size, Complexity: Low
• Typical Change Rate/Month: 1-30%
• Criticality: Low to medium
• NDI Support: Good, in place
• Personnel Capability: Agile-ready, medium-high experience
• Activities: Skip Valuation and Architecting phases; Scrum plus agile methods of choice
• Time/Build: <= 1 day
• Time/Increment: 2-6 weeks

Apr 5, 2017 Copyright © USC-CSSE 7


Model Match to ICSM Common Cases -2

Case 3: Architected Agile
• Example: Business data processing
• Size, Complexity: Medium
• Typical Change Rate/Month: 1-10%
• Criticality: Medium to high
• NDI Support: Good, most in place
• Organizational Personnel Capability: Agile-ready, medium to high experience
• Activities: Combine Valuation and Architecting phases. Complete NDI preparation. Architecture-based Scrum of Scrums
• Time/Build: 2-4 weeks
• Time/Increment: 2-6 months

Case 4: Formal Methods
• Example: Security kernel; safety-critical LSI chip
• Size, Complexity: Low
• Typical Change Rate/Month: 0.3%
• Criticality: Extra high
• NDI Support: None
• Organizational Personnel Capability: Strong formal methods experience
• Activities: Precise formal specification. Formally-based programming language; formal verification
• Time/Build: 1-5 days
• Time/Increment: 1-4 weeks

Apr 5, 2017 Copyright © USC-CSSE 8


Model Match to ICSM Common Cases -3

Case 5: Hardware with Embedded Software Component*
• Example: Multi-sensor control device
• Size, Complexity: Low
• Typical Change Rate/Month: 0.3-1%
• Criticality: Medium to very high
• NDI Support: Good, in place
• Organizational Personnel Capability: Experienced, medium-high
• Activities: Concurrent hardware/software engineering. CDR-level review. IOC development, LRIP, FRP. Concurrent version N+1 engineering
• Time/Build: Software: 1-5 days
• Time/Increment: Market-driven

Case 6: Indivisible IOC*
• Example: Complete vehicle platform
• Size, Complexity: Medium to high
• Typical Change Rate/Month: 0.3-1%
• Criticality: High to very high
• NDI Support: Some in place
• Organizational Personnel Capability: Experienced, medium to high
• Activities: Determine minimum-IOC likely, conservative cost. Add deferrable software features as risk reserve. Drop deferrable features to meet conservative cost. Strong award fee for features not dropped.
• Time/Build: Software: 2-6 weeks
• Time/Increment: Platform: 6-18 months

Apr 5, 2017 Copyright © USC-CSSE 9

* Indicates this process is suitable for COCOMO III


Model Match to ICSM Common Cases -4

Case 7: NDI-Intensive
• Example: Supply chain management
• Size, Complexity: Medium to high
• Typical Change Rate/Month: 0.3-3%
• Criticality: Medium to very high
• NDI Support: NDI-driven architecture
• Organizational Personnel Capability: NDI-experienced, medium to high
• Activities: Thorough NDI-suite life cycle cost-benefit analysis, selection, concurrent requirements/architecture definition. Pro-active NDI evolution influencing, NDI upgrade synchronization
• Time/Build: Software: 1-4 weeks
• Time/Increment: Systems: 6-18 months

Apr 5, 2017 Copyright © USC-CSSE 10


Model Match to ICSM Common Cases -5

Case 8: Hybrid Agile/Plan-Driven System*
• Example: C4ISR system
• Size, Complexity: Medium to very high
• Typical Change Rate/Month: Mixed parts; 1-10%
• Criticality: Mixed parts; medium to very high
• NDI Support: Mixed parts
• Organizational Personnel Capability: Mixed parts
• Activities: Full ICSM; encapsulated agile in high-change, low-to-medium criticality parts (often HMI, external interfaces). Full ICSM; three-team incremental development, concurrent V&V, next-increment re-baselining
• Time/Build: 1-2 months
• Time/Increment: 9-18 months

Case 9: Multi-Owner System of Systems*
• Example: Net-centric military operations
• Size, Complexity: Very high
• Typical Change Rate/Month: Mixed parts; 1-10%
• Criticality: Very high
• NDI Support: Many NDIs, some in place
• Organizational Personnel Capability: Related experience, medium to high
• Activities: Full ICSM; extensive multi-owner team building, negotiation. Full ICSM; large ongoing system/software engineering effort
• Time/Build: 2-4 months
• Time/Increment: 18-24 months

Apr 5, 2017 Copyright © USC-CSSE 11

* Indicates this process is suitable for COCOMO III


Model Match to ICSM Common Cases -6

Case 10: Family of Systems
• Example: Medical device product line
• Size, Complexity: Medium to very high
• Typical Change Rate/Month: 1-3%
• Criticality: Medium to very high
• NDI Support: Some in place
• Organizational Personnel Capability: Related experience, medium to high
• Activities: Skip Valuation and Architecting phases. Scrum plus agile methods of choice
• Time/Build: 1-2 months
• Time/Increment: 9-18 months

Case 11: Brownfield
• Example: Incremental legacy phaseout
• Size, Complexity: High to very high
• Typical Change Rate/Month: 0.3-3%
• Criticality: Medium-high
• NDI Support: NDI as legacy replacement
• Organizational Personnel Capability: Legacy re-engineering
• Activities (Incremental Definition): Re-engineer/refactor legacy into services. Incremental legacy phaseout
• Time/Build: 2-6 weeks/refactor
• Time/Increment: 2-6 months

Apr 5, 2017 Copyright © USC-CSSE 12


Best Fits of Estimation-Types to ICSM Common Cases

• Pure Agile: Planning Poker, Agile COCOMO III
• Architected Agile
– COSYSMO for architecting; Planning Poker, CAIV-SAIV for sprints, releases; IDPD for large systems
• Formal Methods: $/SLOC by Evaluated Assurance Level
• NDI/Services-Intensive: Oracle, SAP, other ERP
– RICE Objects: (R)eports, (I)nterfaces, (C)onversions, (E)nhancements
– COCOTS, Value-Added Function Points, Agile for portions
• Hybrid Agile/Plan-Driven
– Expert Delphi, COCOMO III, Agile for portions; IDPD
• Systems of Systems
– COSYSMO for Integrator; Hybrid Agile/Plan-Driven for component systems (COCOMO III)
• Family of Systems: COPLIMO
• Brownfield: COSYSMO for refactoring; above for rebuilding

8/23/2016 Copyright © USC-CSSE 13


COCOMO® III Suite of Models Concept

[Diagram] The COCOMO® III suite comprises COCOMO III together with derived and extension models: COPROMO, COPLIMO, COPSEMO, COINCOMO, Agile COCOMO III, and further model extensions to address other ICSM common cases.
Legend: some models have been calibrated with historical project data and expert (Delphi) data; others are derived from COCOMO III.

Apr 5, 2017 Copyright © USC-CSSE 14


New Feature: Application Domain Types

Real-Time
• Sensor Control and Signal Processing
• Vehicle Control
• Vehicle Payload
• Real Time Embedded
• Mission Processing

Engineering
• Systems Software
• Automation and Process Control
• Simulation & Modeling

Automated Information Systems
• Mission Planning
• Training
• Test
• Data Processing

Apr 5, 2017 Copyright © USC-CSSE 15

Selecting an Application Domain “pre-sets” model drivers
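A minimal sketch of the “pre-sets” idea in Python: choosing a domain seeds default cost-driver ratings that the estimator can then override. The domain names come from this slide; the specific driver ratings below are hypothetical, not published COCOMO III presets.

# Hypothetical presets: domain names are from the slide above; the ratings are
# illustrative placeholders, not published COCOMO III values.
DOMAIN_DRIVER_PRESETS = {
    "Sensor Control and Signal Processing": {"CPLX": "VH", "FAIL": "H", "PLAT": "H"},
    "Data Processing": {"CPLX": "N", "FAIL": "N", "PLAT": "N"},
}

def preset_drivers(domain, overrides=None):
    """Return the pre-set driver ratings for a domain, with user overrides applied."""
    drivers = dict(DOMAIN_DRIVER_PRESETS.get(domain, {}))
    drivers.update(overrides or {})  # explicit ratings take precedence over presets
    return drivers

print(preset_drivers("Data Processing", {"FAIL": "H"}))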


Model Breadth
• There are a number of different activities in software development:
– Requirements analysis
– Architecting
– Detailed Design
– Assembling or Coding
– Integration Testing
– System Testing
– Acceptance Testing
– Deployment
– Training
• COCOMO III will cover a subset of these activities

Apr 5, 2017 Copyright © USC-CSSE 16


Model Depth
• Development activities include/exclude different types of work:
– Management
– Requirements analysis
– Product design
– Programming
– Test and evaluation
– Configuration Management / Quality Assurance
– Documentation
• COCOMO III covers a number of work types (next slide)
– The work covered is an indicator of whether the model is suitable for estimating a development process (re: the 11 cases discussed earlier)

Apr 5, 2017 Copyright © USC-CSSE 17


COCOMO III Depth

Subsystem Work Breakdown Structure

Management:
• Cost, Schedule, Performance Management
• Contract Management
• Subcontract Management
• Customer Interface
• Branch Office Management
• Management Reviews & Audits

Engineering:
• Software Requirements
• Product Design
• Configuration Management
• End Item Acceptance
• Quality Assurance

Programming:
• Detailed Design
• Code and Unit Test
• Integration

Test & Evaluation:
• Product Test: Plans, Procedures, Test, Reports
• Acceptance Test: Plans, Procedures, Test, Reports
• Test Support: Test beds, Test tools, Test data

Data:
• Manuals

Apr 5, 2017 Copyright © USC-CSSE 18


Workload Sizing
• The amount of development work to be done is expressed as either a functional or product size
• The desire is for COCOMO III to use different size types organically as a size input
– Want to move away from converting one size type to another, e.g., Function Points to Source Lines of Code
• Size types:
– Software Requirements
– Function Points
– SNAP Points
– Fast Function Points
– COSMIC Points
– Automated Function Points
– Feature Points
– Use Case Points
– Story Points (Agile Development)
– Source Lines of Code

Apr 5, 2017 Copyright © USC-CSSE 19


Reused Functionality
• Currently COCOMO III uses the reuse model from COCOMO II
– Model is based on source lines of code

AAF = 0.4×DM + 0.3×CM + 0.3×IM

AAM = [AA + AAF×(1 + 0.02×SU×UNFM)] / 100,  for AAF ≤ 50
AAM = [AA + AAF + SU×UNFM] / 100,           for AAF > 50

Equivalent KSLOC = Adapted KSLOC × AAM

• AAF: Adaptation Adjustment Factor
• DM: percent of design modified
• CM: percent of code and unit test modified
• IM: percent of integration and test modified
• AAM: Adaptation Adjustment Multiplier
• SU: Software Understanding
• UNFM: Programmer Unfamiliarity
• AA: Assessment and Assimilation

Apr 5, 2017 Copyright © USC-CSSE 20
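As a quick check of the relations above, here is a minimal Python sketch of this reuse model; the parameter values in the example call are illustrative only.

# Reuse model from the slide: converts adapted code into equivalent new KSLOC.
# DM, CM, IM, SU, AA are percentages; UNFM is on a 0-1 scale.
def equivalent_ksloc(adapted_ksloc, dm, cm, im, su, unfm, aa):
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im                   # Adaptation Adjustment Factor
    if aaf <= 50:
        aam = (aa + aaf * (1 + 0.02 * su * unfm)) / 100.0   # Adaptation Adjustment Multiplier
    else:
        aam = (aa + aaf + su * unfm) / 100.0
    return adapted_ksloc * aam

# Illustrative values: 20 adapted KSLOC, 10% design / 20% code / 30% I&T modified
print(equivalent_ksloc(20, dm=10, cm=20, im=30, su=30, unfm=0.4, aa=4))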


COCOMO III Model

[Diagram] Inputs:
• Software product size estimate
• Software product, platform, personnel & project attributes
• Software reuse, maintenance, and increment parameters
• Defect removal profile levels
• Local calibration to the organization's data

Outputs – software development and maintenance estimates for:
• Effort
• Cost & Schedule, distributed by Phase, Activity, and Increment
• Quality

COCOMO is an open and free model

Apr 5, 2017 Copyright © USC-CSSE 21


COCOMO III Model Concept

[Diagram] The model comprises four sub-models: an Effort Model, a Schedule Model, a Defect Introduction Model, and a Defect Removal Model.

Inputs:
• Software product size estimate
• Software product, platform, personnel & project attributes
• Defect removal profile levels
• Labor rates

Outputs:
• Effort (Person Months)
• Schedule (Months)
• Staffing Levels
• Costs ($$)
• Number of est. non-trivial defects for Requirements, Design, & Code
• Number of est. residual defects and the residual defect density

Apr 5, 2017 Copyright © USC-CSSE 22


COCOMO III Effort & Schedule Estimation Model

Effort (PM) = A * Size^E * Product(19 Cost Drivers)
E = B + Sum(5 Cost Drivers)

Schedule (M) = C * PM^F * SCED% / 100
F = D + 0.2 * (E - B)

Where:
A, B, C, D are constants determined by calibration
E represents (dis)economies of scale and project-wide scale factors

8/23/2016 Copyright © USC-CSSE 23
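A minimal Python sketch of these two equations follows. The calibration constants A, B, C, D are not given in this deck, so the COCOMO II.2000 values are used as stand-ins, and the slide's "Sum(5 Cost Drivers)" is assumed to be scaled by 0.01 as in COCOMO II; both are assumptions, not COCOMO III calibrations.

# Sketch of the effort and schedule equations above. A, B, C, D default to the
# COCOMO II.2000 constants as placeholders; the 0.01 scaling of the summed
# exponential drivers is an assumption carried over from COCOMO II.
from math import prod

def effort_and_schedule(size_ksloc, exp_drivers, effort_multipliers, sced_pct=100,
                        A=2.94, B=0.91, C=3.67, D=0.28):
    E = B + 0.01 * sum(exp_drivers)                     # (dis)economies of scale
    effort_pm = A * size_ksloc ** E * prod(effort_multipliers)
    F = D + 0.2 * (E - B)
    schedule_months = C * effort_pm ** F * sced_pct / 100.0
    return effort_pm, schedule_months

# Example: 50 KSLOC, nominal exponential drivers (values from the Project
# Attributes table later in this deck) and all 19 effort multipliers at 1.0.
pm, months = effort_and_schedule(50, [3.72, 3.04, 4.24, 3.29, 4.68], [1.0] * 19)
print(round(pm, 1), round(months, 1))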


COCOMO III Defect Introduction and Removal Model

Defect Introduction (DI) = A * Size^E * Product(DI Drivers)
E = initially set to 1.0

Residual Defects = C * DI * Product(1 - DRF)

DRF: Defect Removal Fraction from 3 profiles:
1. Automated Analysis
2. People Reviews
3. Execution Testing

8/23/2016 Copyright © USC-CSSE 24
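A minimal sketch of these relations in Python; A, C, and the driver values are illustrative placeholders (only E = 1.0 is stated on the slide).

from math import prod

def defects_introduced(size_ksloc, di_drivers, A=10.0, E=1.0):
    # Defect Introduction (DI); A and the DI driver values are placeholders.
    return A * size_ksloc ** E * prod(di_drivers)

def residual_defects(di, drf_by_profile, C=1.0):
    # Each DRF is the Defect Removal Fraction for one profile:
    # Automated Analysis, People Reviews, Execution Testing.
    return C * di * prod(1.0 - drf for drf in drf_by_profile)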


Defect Estimation Scope
• Estimates only "non-trivial" defects:
– Critical: causes a system crash, unrecoverable data loss, or jeopardizes personnel
– High: causes impairment of a critical system function and no workaround solution exists
– Medium: causes impairment of a critical system function, though a workaround solution does exist
• Defect estimates are for only 3 artifacts:
– Requirements
– Design
– Code

Apr 5, 2017 Copyright © USC-CSSE 25


COCOMO III Cost Drivers -1
• Product Attributes
– Impact of Software Failure (FAIL) (formerly RELY)
– Product Complexity (CPLX)
– Developed for Reusability (RUSE)
– Required Software Security (SECU) - New
– Dropped:
• Documentation Match to Lifecycle Needs
• Database Size
• Platform Attributes
– Platform Constraints (PLAT) - New
– Platform Volatility (PVOL)

Apr 5, 2017 Copyright © USC-CSSE 26


COCOMO III Cost Drivers -2
• Personnel Attributes
– Analyst Capability (ACAP)
– Programmer Capability (PCAP)
– Personnel Continuity (PCON)
– Applications Experience (APEX)
– Language and Tool Experience (LTEX)
– Platform Experience (PLEX)

Apr 5, 2017 Copyright © USC-CSSE 27


COCOMO III Cost Drivers -3
• Project Attributes
– Precedentedness (PREC)
– Development Flexibility (FLEX)
– Opportunity and Risk Resolution (RESL)
– Stakeholder Team Cohesion (TEAM)
– Process Capability & Usage (PCUS) (formerly PMAT)
– Use of Software Tools (TOOL)
– Multisite Development (SITE)
• Defect Removal Profile
– Automated Analysis
– People Reviews
– Execution Testing and Tools

Apr 5, 2017 Copyright © USC-CSSE 28


Cost Driver Values -1

Product Attributes
Driver   VL     L      N      H      VH     XH     PR
FAIL     0.82   0.92   1.00   1.10   1.26   -      1.54
CPLX     0.73   0.87   1.00   1.17   1.34   1.74   2.38
RUSE     -      0.95   1.00   1.07   1.15   1.24   1.31
SECU

Platform Attributes
Driver   VL     L      N      H      VH     XH     PR
PLAT     -      -      1.00   1.08   1.23   1.54   1.54
PVOL     -      0.87   1.00   1.15   1.30   -      1.49

Apr 5, 2017 Copyright © USC-CSSE 29


Cost Driver Values -2

Personnel Attributes
Driver   VL     L      N      H      VH     XH     PR
ACAP     1.42   1.19   1.00   0.85   0.71   -      2.00
PCAP     1.34   1.15   1.00   0.88   0.76   -      1.76
PCON     1.29   1.12   1.00   0.90   0.81   -      1.51
APEX     1.22   1.10   1.00   0.88   0.81   -      1.51
LTEX     1.20   1.09   1.00   0.91   0.84   -      1.43
PLEX     1.19   1.09   1.00   0.91   0.85   -      1.40

Project Attributes
Driver   VL     L      N      H      VH     XH     PR
PREC     6.20   4.96   3.72   2.48   1.24   0.00   1.33
FLEX     5.07   4.05   3.04   2.03   1.01   0.00   1.26
RESL     7.07   5.65   4.24   2.83   1.41   0.00   1.39
TEAM     5.48   4.38   3.29   2.19   1.10   0.00   1.29
PCUS     7.80   6.24   4.68   3.12   1.56   0.00   1.43
TOOL     1.17   1.09   1.00   0.90   0.78   -      1.50
SITE     1.22   1.09   1.00   0.93   0.86   0.80   1.53

Apr 5, 2017 Copyright © USC-CSSE 30
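As a worked example of how these multiplier values feed the Effort equation, the sketch below takes the product of the personnel effort multipliers using values straight from the Personnel Attributes table; the chosen ratings are illustrative.

# Personnel effort multipliers from the table above (VL..VH columns).
PERSONNEL_EM = {
    "ACAP": {"VL": 1.42, "L": 1.19, "N": 1.00, "H": 0.85, "VH": 0.71},
    "PCAP": {"VL": 1.34, "L": 1.15, "N": 1.00, "H": 0.88, "VH": 0.76},
    "PCON": {"VL": 1.29, "L": 1.12, "N": 1.00, "H": 0.90, "VH": 0.81},
    "APEX": {"VL": 1.22, "L": 1.10, "N": 1.00, "H": 0.88, "VH": 0.81},
    "LTEX": {"VL": 1.20, "L": 1.09, "N": 1.00, "H": 0.91, "VH": 0.84},
    "PLEX": {"VL": 1.19, "L": 1.09, "N": 1.00, "H": 0.91, "VH": 0.85},
}

# Hypothetical project: capable analysts and programmers, very experienced in the domain.
ratings = {"ACAP": "H", "PCAP": "H", "PCON": "N", "APEX": "VH", "LTEX": "N", "PLEX": "N"}

em_product = 1.0
for driver, rating in ratings.items():
    em_product *= PERSONNEL_EM[driver][rating]
print(round(em_product, 3))  # ≈ 0.606: this term scales estimated effort down by roughly 39%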


Defect Removal Drivers

Defect Removal Profile   Artifact       VL     L      N      H      VH     XH
Automated Analysis       Requirements   0.00   0.00   0.10   0.27   0.34   0.40
                         Design         0.00   0.00   0.13   0.28   0.44   0.50
                         Code           0.00   0.10   0.20   0.30   0.48   0.55
People Reviews           Requirements   0.00   0.25   0.40   0.50   0.58   0.70
                         Design         0.00   0.28   0.40   0.54   0.70   0.78
                         Code           0.00   0.30   0.48   0.60   0.73   0.83
Execution Testing        Requirements   0.00   0.23   0.40   0.50   0.57   0.60
                         Design         0.00   0.23   0.43   0.54   0.65   0.70
                         Code           0.00   0.38   0.58   0.69   0.78   0.88

Apr 5, 2017 Copyright © USC-CSSE 31
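Plugging the Nominal Code-row values above into the Residual Defects relation from the model slide gives a quick worked example; the number of introduced defects is hypothetical.

# DRFs for the Code artifact at Nominal ratings, read from the table above.
code_drf_nominal = [0.20, 0.48, 0.58]   # Automated Analysis, People Reviews, Execution Testing

introduced = 100.0                       # hypothetical non-trivial defects introduced in Code
remaining = introduced
for drf in code_drf_nominal:
    remaining *= (1.0 - drf)             # Product(1 - DRF)
print(round(remaining, 1))               # ≈ 17.5 defects estimated to remain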


The New “Nominal”
• Cost driver definition refinement
• The new “nominal”
– A study of productivity trends over the past 40 years reveals a shift in “nominal” ratings
– The following slides show the shift in ratings for selected cost drivers, i.e., the new “nominal”

Apr 5, 2017 Copyright © USC-CSSE 32


Impact of Productivity Trends

Apr 5, 2017 Copyright © USC-CSSE 33

Kendall's rank correlation coefficients between the completion year and COCOMO II cost drivers (sorted by degree of correlation):

Cost driver                               Kendall's τ   p-value
TOOL  Use of Software Tools                   -0.37     2.20E-16
PMAT  Process Maturity (PCUS)                 -0.30     1.22E-13
STOR  Main Storage Constraint                 -0.29     1.31E-11
TIME  Execution Time Constraint               -0.26     6.62E-10
PLEX  Platform Experience                     -0.17     1.98E-05
PVOL  Platform Volatility                     -0.18     2.04E-05
APEX  Applications Experience                  0.17     4.88E-05
LTEX  Language and Tool Experience            -0.15     2.84E-04
DATA  Database Size                            0.13     1.81E-03
RELY  Required Software Reliability           -0.10     1.42E-02
CPLX  Product Complexity                      -0.10     1.58E-02
PREC  Precedentedness of Application          -0.09     2.13E-02
ACAP  Analyst Capability                       0.08     4.87E-02
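For reference, a sketch of how such a coefficient can be computed with SciPy; the project data below are made up for illustration and are not the dataset behind this table.

# Kendall's rank correlation between completion year and an ordinal cost-driver
# rating (e.g., TOOL, coded VL=1 .. XH=6). Data here are illustrative only.
from scipy.stats import kendalltau

completion_year = [1992, 1996, 1999, 2003, 2007, 2011, 2015]
tool_rating = [2, 3, 3, 4, 4, 5, 5]

tau, p_value = kendalltau(completion_year, tool_rating)
print(round(tau, 2), p_value)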

[Charts omitted from transcript] Slides 34-40 show the shift in ratings over time, i.e., the new "nominal", for the following cost drivers:
• Use of Software Tools (TOOL)
• Process Maturity (now PCUS)
• Execution Time Constraint (TIME)
• Main Storage Constraint (STOR)
• Platform Experience (PLEX)
• Applications Experience (APEX)
• Language and Tool Experience (LTEX)


Next Steps
• Refine the definition for Required Software Security
• Shift cost driver rating definitions to accommodate changes in "nominal"
• Create a Rosetta Stone for converting COCOMO II data to the COCOMO III format
– Test the model on past data
• Set up data collection

Apr 5, 2017 Copyright © USC-CSSE 41


Required Software Security (SECU)*
• Is this correlated with Process Maturity?
• Should it be an extension of FAIL?
• Should it address the threat?
– Malicious hackers
– Foreign governments
• Should it address exposure?
– On the network
– Under guard, separated from the outside environment
• Where are the descriptors?
– Confidentiality
– Integrity
– Availability

Apr 5, 2017 Copyright © USC-CSSE 42


Invitation to Participate
• CSSE invites you to collaborate on model development
– Review model formulation
– Submit data for model calibration:
• Actual Size
• Effort
• Schedule
• Defects
• Model Parameters
– Review the COCOMO III model
– If you contribute data for model calibration, you will receive:
• An advance copy of the new model
• A comparison of your data with the other data points used to calibrate the model
• Please talk with me afterwards if you are interested

Apr 5, 2017 Copyright © USC-CSSE 43

Want to participate? [email protected]

Make suggestions and be a model reviewer!


Questions?

For more information, contact:
Brad Clark
[email protected]

Apr 5, 2017 Copyright © USC-CSSE 44

