Naval Center for Cost Analysis
Air Force Cost Analysis Agency

Software Development Cost Estimating Handbook
Volume I

Developed by the Software Technology Support Center

September 2008

Resource manual for education and support in developing credible software development cost estimates

Executive Summary

The purpose of the Software Development Cost Estimating Handbook is to provide the cost analyst with a resource manual to use in developing credible software development cost estimates. A realistic estimate is based upon a solid understanding of the software development process and the historical data that forms a framework for the expected values. An estimating methodology that follows a proven process consistent with best practices and Department of Defense (DoD) policies further contributes to estimate validity.

The information is presented at two levels. One level helps the experienced analyst focus immediately on the material necessary to develop an estimate. The second level is for the novice, or infrequent user, to use as educational information regarding the software development and estimating processes.

The estimating process starts with determining the purpose of the estimate. Next, the cost (or effort) and schedule for the software development project are determined using three factors: effective size, development environment, and product complexity.
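To make the relationship among these factors concrete, the sketch below shows a first-order estimate that divides effective size by a productivity factor. The numbers are illustrative assumptions only, not handbook data; the handbook's productivity-factor tables govern real estimates.

```python
# First-order software cost sketch: effort = effective size / productivity.
# All values below are illustrative assumptions, not handbook data.

def first_order_effort(effective_size_sloc: float,
                       productivity_sloc_per_pm: float) -> float:
    """Development effort in person-months from effective size (SLOC)
    and a productivity factor (SLOC per person-month)."""
    return effective_size_sloc / productivity_sloc_per_pm

# Example: 50,000 effective SLOC at an assumed 200 SLOC/person-month
effort_pm = first_order_effort(50_000, 200)
print(f"{effort_pm:.1f} person-months")  # 250.0 person-months
```

A model this simple ignores the development environment and product complexity entirely; the later sections refine it, but it is a useful starting point for a sanity check.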

The key element in a software estimate is the effective size of the software product. Determining size can be approached from several directions, depending upon the software size measure (lines of code, function points, use cases, etc.) used by the development organization. A system developed by writing lines of code requires a different estimating approach than a previously developed or off-the-shelf application. The acquisition phase also influences the analyst's approach because of the amount and type of software development data available from the program or developers.
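The idea that pre-existing code needs a different treatment than new code can be sketched as an effective-size calculation: new code counts fully, while modified and reused code count only in proportion to the rework they require. The design/implementation/test weights and the example values below are assumptions for illustration; Section 6.4 develops the handbook's actual equation.

```python
# Effective-size sketch: new code counts fully; modified and reused code
# count in proportion to the rework (adaptation) they require.
# The 0.4/0.25/0.35 design/implementation/test split is an assumption here.

def adaptation_factor(design: float, implementation: float, test: float) -> float:
    """Fraction of new-code effort implied by partial rework of existing code."""
    return 0.4 * design + 0.25 * implementation + 0.35 * test

def effective_sloc(new: float, modified: float, reused: float,
                   mod_adapt: float, reuse_adapt: float) -> float:
    """Effective size in SLOC given adaptation factors for existing code."""
    return new + modified * mod_adapt + reused * reuse_adapt

# Example: modified code needs half the design but all the coding and test;
# reused code needs only partial retest.
mod = adaptation_factor(0.5, 1.0, 1.0)     # = 0.8 under these assumptions
reuse = adaptation_factor(0.0, 0.0, 0.25)  # retest only
size = effective_sloc(new=20_000, modified=10_000, reused=40_000,
                      mod_adapt=mod, reuse_adapt=reuse)
```

Note how 40,000 reused lines contribute far less effective size than 20,000 new lines; this is why counting total delivered SLOC alone overstates the work.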

The development environment is the next most important effort and schedule driver. The environment can be factored into five categories: (1) developer capability or efficiency, (2) personnel experience, (3) development system characteristics, (4) management characteristics, and (5) product characteristics. The last four categories are largely driven by the product requirements. These factors take into consideration the development environment itself, the capabilities and experience of the developers, the developing organization's management style, security requirements, and so on. Together with software size and complexity, these factors determine the productivity or efficiency with which a developer can "build" and test the software. Ultimately, these environment characteristics drive the cost and schedule of the software development and implementation of the system.
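One way to picture how these categories combine is as multipliers on a baseline productivity. The multiplier values and baseline below are hypothetical; the handbook's rating scales in Sections 8 through 10 supply real ones.

```python
# Sketch: environment categories as productivity multipliers applied to an
# assumed baseline. All values are hypothetical illustrations, not ratings
# from the handbook.

BASELINE_SLOC_PER_PM = 200.0  # assumed baseline productivity

environment = {                # > 1.0 helps productivity, < 1.0 hurts it
    "developer capability": 1.10,
    "personnel experience": 0.95,
    "development system":   1.00,
    "management":           0.90,
    "product requirements": 0.85,
}

productivity = BASELINE_SLOC_PER_PM
for multiplier in environment.values():
    productivity *= multiplier

effort_pm = 50_000 / productivity  # effort for 50 KSLOC effective size
```

Even these modest penalties compound: the net productivity falls roughly 20 percent below the baseline, which is why a careful environment evaluation matters as much as the size estimate itself.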

It is uncertain who first coined the phrase, "A fool with a tool is still a fool." Plugging numbers into a parametric model without knowing whether the results are realistic fits this adage. This handbook addresses estimate realism using historical data, industry best practices, and authoritative insight from experts in the fields of software development and cost estimating. This information helps the analyst conduct a "sanity check" of the estimate results. A well-understood and validated estimate offers a defensible position for program office analysts, component cost agency analysts, and independent evaluators. A reasonable estimate is useful in budgeting, milestone decision reviews, and determining the life cycle or other costs of the program.

The contents of this volume, ten sections and nine appendices, are grouped into four major parts. An introduction and the basics of the software development process lead off the tutorial. The next two parts cover the estimating process and related details. Concepts and examples presented in the sections are covered in greater detail in the appendices. The idea behind this structure is to present principles for instruction and reference in the core sections and then examine details and related examples.

This handbook was written for use by the Naval Center for Cost Analysis (NCCA) and the Air Force Cost Analysis Agency (AFCAA). The information herein is not intended to dictate policy or supplant guidance given in official documents; however, the authors hope that everyone within the software cost estimating community will find it useful. The information on software development and cost estimating presented within these pages is not intended to be all-inclusive, yet the handbook is meant to be comprehensive, providing a single-resource document for use in creating estimates.

Table of Contents

Executive Summary ............................................................................................................. i
Acknowledgements ............................................................................................................ xi
List of Figures ................................................................................................................... xii
List of Tables ................................................................................................................... xiii
List of Equations .............................................................................................................. xvi
Section 1 Introduction ...................................................................................................... 1-1
1.1 Development constraints ........................................................................................... 1-4
1.2 Major cost factors ...................................................................................................... 1-4
1.2.1 Effective size ....................................................................................................... 1-5
1.2.2 Product complexity ............................................................................................. 1-5
1.2.3 Development environment .................................................................................. 1-6
1.2.4 Product characteristics ........................................................................................ 1-6
1.3 Software support estimation ...................................................................................... 1-6
1.4 Handbook overview .................................................................................................. 1-7
Section 2 Software Development Process ....................................................................... 2-1
2.1 The Defense Acquisition System .............................................................................. 2-1
2.1.1 Framework Elements .......................................................................................... 2-1
2.1.2 User Needs and Technology Opportunities ........................................................ 2-2
2.1.3 Pre-Systems Acquisition ..................................................................................... 2-2
2.1.3.1 Concept Refinement Phase .......................................................................... 2-2
2.1.3.2 Milestone A ................................................................................................. 2-3
2.1.3.3 Technology Development Phase ................................................................. 2-3
2.1.3.4 Milestone B ................................................................................................. 2-3
2.1.4 Systems Acquisition ........................................................................................... 2-3
2.1.4.1 System Development & Demonstration ...................................................... 2-3
2.1.4.2 Milestone C ................................................................................................. 2-3
2.1.4.3 Production & Deployment .......................................................................... 2-4
2.1.5 Sustainment ........................................................................................................ 2-4
2.2 Waterfall Model ........................................................................................................ 2-4
2.2.1 Requirements Analysis and Specification .......................................................... 2-6
2.2.2 Full-Scale Development ..................................................................................... 2-6

2.2.3 System Integration and Test ............................................................................... 2-7
2.3 Software Development Products ............................................................................... 2-7
2.3.1 Software Development Plan ............................................................................... 2-8
2.3.1.1 Project Organization .................................................................................... 2-8
2.3.1.2 Schedule ...................................................................................................... 2-8
2.3.1.3 Software Design Document ........................................................................ 2-8
2.3.1.4 Quality Plan ................................................................................................. 2-9
2.3.2 Software Requirements Specification ................................................................ 2-9
2.3.3 Interface Control Document ............................................................................... 2-9
Section 3 Levels of Detail in Software Estimates ........................................................... 3-1
3.1 Estimate Foundation Factors ..................................................................................... 3-1
3.2 System-level estimating model ................................................................................. 3-2
3.3 Component-level estimating model ........................................................................... 3-4
3.4 Estimating Process .................................................................................................... 3-6
Section 4 System-Level Estimating Process ................................................................... 4-1
4.1 Product complexity ................................................................................................... 4-2
4.2 Size estimating process ............................................................................................. 4-3
4.2.1 Effective source lines of code (ESLOC) ............................................................ 4-3
4.2.2 Function point counting ...................................................................................... 4-4
4.3 Software size growth ................................................................................................. 4-4
4.4 Productivity factor ..................................................................................................... 4-5
4.4.1 Productivity factor table ..................................................................................... 4-6
4.4.2 ESC metrics ........................................................................................................ 4-6
4.5 System-level cost estimating ..................................................................................... 4-7
4.6 Reality check ............................................................................................................. 4-8
4.7 Allocate development effort ...................................................................................... 4-8
4.8 Allocate maintenance effort .................................................................................... 4-10
4.8.1 Software enhancement ...................................................................................... 4-10
4.8.2 Knowledge retention ........................................................................................ 4-11
4.8.3 Steady state maintenance effort ........................................................................ 4-11
Section 5 Component-Level Estimating Process ............................................................ 5-1
5.1 Staffing profiles ......................................................................................................... 5-3
5.2 Product complexity ................................................................................................... 5-4
5.3 Size estimating process ............................................................................................. 5-4

5.4 Development environment ........................................................................................ 5-5
5.4.1 Personnel evaluation ........................................................................................... 5-5
5.4.2 Development environment evaluation ................................................................ 5-5
5.4.3 Product impact evaluation .................................................................................. 5-5
5.4.4 Basic technology constant .................................................................................. 5-6
5.4.5 Effective technology constant ............................................................................ 5-6
5.5 Development cost and schedule calculations ............................................................ 5-7
5.6 Verify estimate realism ............................................................................................. 5-8
5.7 Allocate development effort and schedule ................................................................ 5-8
5.7.1 Effort allocation .................................................................................................. 5-8
5.7.2 Schedule allocation ............................................................................................. 5-9
5.8 Allocate maintenance effort .................................................................................... 5-10
Section 6 Estimating Effective Size ................................................................................ 6-1
6.1 Source code elements ................................................................................................ 6-2
6.1.1 Black box vs. white box elements ...................................................................... 6-2
6.1.2 NEW source code ............................................................................................... 6-3
6.1.3 MODIFIED source code ..................................................................................... 6-3
6.1.4 DELETED source code ...................................................................................... 6-3
6.1.5 REUSED source code ......................................................................................... 6-3
6.1.6 COTS software ................................................................................................... 6-3
6.1.7 Total SLOC ........................................................................................................ 6-4
6.2 Size Uncertainty ........................................................................................................ 6-4
6.3 Source line of code (SLOC) ...................................................................................... 6-5
6.3.1 Executable ........................................................................................................... 6-5
6.3.2 Data declaration .................................................................................................. 6-5
6.3.3 Compiler directives ............................................................................................. 6-6
6.3.4 Format statements ............................................................................................... 6-6
6.4 Effective source lines of code (ESLOC) ................................................................... 6-6
6.4.1 Effective size as work ......................................................................................... 6-6
6.4.2 Effective size equation ........................................................................................ 6-8
6.4.2.1 Design factor ............................................................................................... 6-9
6.4.2.2 Implementation factor ................................................................................. 6-9
6.4.2.3 Test factor .................................................................................................... 6-9
6.5 Size Growth ............................................................................................................. 6-10

6.5.1 Maximum size growth ...................................................................................... 6-11
6.6 Size Risk .................................................................................................................. 6-14
6.6.1 Source code growth .......................................................................................... 6-14
6.7 Function Points ....................................................................................................... 6-15
6.7.1 Function Point counting ................................................................................... 6-15
6.7.2 Function Point components .............................................................................. 6-16
6.7.2.1 Application boundary ................................................................................ 6-17
6.7.2.2 Internal Logical File .................................................................................. 6-18
6.7.2.3 External Interface File ............................................................................... 6-19
6.7.2.4 External Input ............................................................................................ 6-20
6.7.2.5 External Output ......................................................................................... 6-20
6.7.2.6 External Inquiry ......................................................................................... 6-20
6.7.2.7 Transforms ................................................................................................ 6-21
6.7.2.8 Transitions ................................................................................................. 6-22
6.7.3 Unadjusted Function Point Counting ............................................................... 6-23
6.7.4 Adjusted Function Points ................................................................................. 6-23
6.7.4.1 Value Adjustment Factor .......................................................................... 6-23
6.7.4.2 Adjusted Function Point calculation ......................................................... 6-25
6.7.5 Backfiring ......................................................................................................... 6-25
6.7.6 Function Points and Objects ............................................................................. 6-26
6.7.7 Zero Function Point Problem ........................................................................... 6-27
Section 7 Productivity Factor Evaluation ....................................................................... 7-1
7.1 Introduction ............................................................................................................... 7-1
7.2 Determining Productivity Factor .............................................................................. 7-2
7.2.1 ESC Metrics ....................................................................................................... 7-4
7.2.2 Productivity Index .............................................................................................. 7-6
7.3 System-level estimating ............................................................................................ 7-7
Section 8 Evaluating Developer Capability .................................................................... 8-1
8.1 Importance of developer capability .......................................................................... 8-1
8.2 Basic technology constant ........................................................................................ 8-2
8.2.1 Basic technology constant parameters ............................................................... 8-3
8.2.1.1 Analyst capability ........................................................................................ 8-3
8.2.1.2 Programmer capability ................................................................................ 8-5
8.2.1.3 Application domain experience ................................................................... 8-5

8.2.1.4 Learning curve ............................................................................................ 8-6
8.2.1.5 Domain experience rating ........................................................................... 8-7
8.2.1.6 Modern practices ......................................................................................... 8-7
8.2.1.7 Modern tools ............................................................................................... 8-8
8.2.2 Basic technology constant calculation ............................................................... 8-9
8.3 Mechanics of communication ................................................................................. 8-10
8.3.1 Information convection .................................................................................... 8-11
8.3.2 Radiation .......................................................................................................... 8-12
8.3.3 Communication barriers ................................................................................... 8-12
8.3.3.1 Skunk Works ............................................................................................. 8-12
8.3.3.2 Cube farm .................................................................................................. 8-13
8.3.3.3 Project area ................................................................................................ 8-13
8.3.4 Utensils for creative work ................................................................................ 8-13
Section 9 Development Environment Evaluation ........................................................... 9-1
9.1 Learning curve vs. volatility ..................................................................................... 9-2
9.2 Personnel experience characteristics ........................................................................ 9-2
9.2.1 Programming language experience .................................................................... 9-3
9.2.2 Practices and methods experience ..................................................................... 9-5
9.2.3 Development system experience ........................................................................ 9-5
9.2.4 Target system experience ................................................................................... 9-6
9.3 Development support characteristics ........................................................................ 9-6
9.3.1 Development system volatility .......................................................................... 9-6
9.3.2 Practices/Methods volatility .............................................................................. 9-7
9.4 Management characteristics ...................................................................................... 9-7
9.4.1 Multiple security classifications ........................................................................ 9-7
9.4.2 Multiple development organizations .................................................................. 9-7
9.4.3 Multiple development sites ................................................................................ 9-8
9.4.4 Resources and support location ......................................................................... 9-8
Section 10 Product Characteristics Evaluation .............................................................. 10-1
10.1 Product complexity ............................................................................................... 10-1
10.2 Display requirements ............................................................................................ 10-3
10.3 Rehosting requirements ........................................................................................ 10-4
10.4 Memory constraints .............................................................................................. 10-4
10.5 Required reliability ............................................................................................... 10-5

10.6 Real-time performance requirements .................................................................... 10-6
10.7 Requirements volatility ......................................................................................... 10-6
10.8 Security requirements ........................................................................................... 10-8
Appendix A Acronyms .................................................................................................. A-1
Appendix B Terminology .............................................................................................. B-1
Appendix C Bibliography .............................................................................................. C-1
Appendix D Software Life Cycle Approaches .............................................................. D-1
D.1 Waterfall .................................................................................................................. D-1
D.2 Spiral development .................................................................................................. D-2
D.3 Evolutionary development ...................................................................................... D-2
D.4 Incremental development ........................................................................................ D-3
D.5 Agile development (Extreme programming) .......................................................... D-3
D.6 Rapid application development ............................................................................... D-4
D.7 Other approaches ..................................................................................................... D-4
Appendix E Software Estimating Models ...................................................................... E-1
E.1 Analogy models ....................................................................................................... E-2
E.2 Expert judgment models .......................................................................................... E-2
E.2.1 Delphi Method ................................................................................................... E-3
E.2.2 Wideband Delphi Method ................................................................................. E-4
E.3 Bottom-up estimating .............................................................................................. E-4
E.4 Parametric models ................................................................................................... E-5
E.5 Origins and evolution of parametric software models ............................................. E-5
E.6 First-order models ................................................................................................... E-6
E.7 Second-order models ............................................................................................... E-8
E.8 Third-order model ................................................................................................... E-9
Appendix F System-Level Estimate Case Study ............................................................ F-1
F.1 HACS baseline size estimate .................................................................................... F-1
F.2 HACS size growth calculation ................................................................................. F-4
F.3 HACS effort calculation ........................................................................................... F-5
F.4 HACS reality check .................................................................................................. F-7
F.5 HACS development effort allocation ....................................................................... F-8
F.6 HACS maintenance effort calculation ...................................................................... F-9
Appendix G Component-Level Estimate Case Study .................................................... G-1
G.1 HACS baseline size estimate ................................................................................... G-1

G.2 HACS size estimate ................................................................................................ G-2
G.3 HACS size growth calculation ................................................................................ G-4
G.4 HACS environment ................................................................................................. G-6
G.4.1 HACS developer capability .............................................................................. G-6
G.4.2 Personnel evaluation ......................................................................................... G-6
G.4.3 Development environment ................................................................................ G-7
G.4.4 HACS product impact ....................................................................................... G-7
G.4.5 HACS effective technology constant ................................................................ G-8
G.5 HACS development effort and schedule calculations ............................................. G-8
G.6 Verify HACS estimate realism .............................................................................. G-10
G.7 Allocate HACS development effort and schedule ................................................. G-11
G.7.1 HACS effort allocation ................................................................................... G-11
G.7.2 Schedule allocation ......................................................................................... G-13
G.8 Allocate HACS maintenance effort ....................................................................... G-14
Appendix H The Defense Acquisition System .............................................................. H-1
H.1 Basic Definitions ..................................................................................................... H-2
H.2 Acquisition Authorities ........................................................................................... H-2
H.3 Acquisition Categories ............................................................................................ H-2
H.3.1 ACAT I ............................................................................................................. H-3
H.3.2 ACAT II ............................................................................................................ H-3
H.3.3 ACAT III .......................................................................................................... H-4
H.3.4 ACAT IV .......................................................................................................... H-4
H.3.5 Abbreviated Acquisition Programs (AAPs) ..................................................... H-4
H.4 Acquisition Management Framework ..................................................................... H-4
H.4.1 Framework Elements ........................................................................................ H-4

    H.4.2 User Needs and Technology Opportunities ...................................................... H-5

    H.4.3 Pre-Systems Acquisition ................................................................................... H-6

    H.4.3.1 Concept Refinement Phase ........................................................................ H-6

    H.4.3.2 Milestone A ................................................................................................ H-7

    H.4.3.3 Technology Development Phase ................................................................ H-7

    H.4.3.4 Milestone B ................................................................................................ H-7

    H.4.4 Systems Acquisition .......................................................................................... H-8

    H.4.4.1 System Development and Demonstration .................................................. H-8

    H.4.4.2 Milestone C ................................................................................................ H-9

  • ix

    H.4.4.3 Production and Deployment....................................................................... H-9

    H.4.5 Sustainment ..................................................................................................... H-10

    H.4.5.1 Sustainment Effort ................................................................................... H-10

    H.4.5.2 Disposal Effort ......................................................................................... H-10

    H.5 Cost Analysis ......................................................................................................... H-11

    H.5.1 Cost Estimating ............................................................................................... H-12

    H.5.2 Estimate Types ................................................................................................ H-13

    H.5.2.1 Life-Cycle Cost Estimate ......................................................................... H-13

    H.5.2.2 Total Ownership Cost .............................................................................. H-13

    H.5.2.3 Analysis of Alternatives ........................................................................... H-14

    H.5.2.4 Independent Cost Estimate....................................................................... H-14

    H.5.2.5 Program Office Estimate (POE) .............................................................. H-14

    H.5.2.6 Component Cost Analysis ........................................................................ H-14

    H.5.2.7 Economic Analysis .................................................................................. H-14

    H.6 Acquisition Category Information ......................................................................... H-15

    H.7 Acquisition References .......................................................................................... H-17

    H7.1 Online Resources ............................................................................................. H-17

    H7.2 Statutory Information ....................................................................................... H-17

    H7.3 Acquisition Decision Support Systems ............................................................ H-21

    Appendix I Data Collection .............................................................................................. I-1

    I.1 Software data collection overview .............................................................................. I-1

    I.1.1 Model comparisons............................................................................................... I-1

    I.1.2 Format ................................................................................................................... I-3

    I.1.3 Structure................................................................................................................ I-4

    I.2 Software data collection details ................................................................................... I-4

    I.3 CSCI description.......................................................................................................... I-4

    I.3.1 Requirements ........................................................................................................ I-6

    I.3.2 Systems integration............................................................................................... I-6

    I.4 Size data ....................................................................................................................... I-7

    I.4.1 Sizing data ............................................................................................................ I-7

    I.4.1.1 Source code (KSLOC) ................................................................................... I-7

    I.4.1.2 Reuse adjustments ......................................................................................... I-7

    I.4.1.3 Software source ............................................................................................. I-8

    I.4.1.4 Function points .............................................................................................. I-8

  • x

    I.4.1.5 Programming source language ...................................................................... I-8

    I.5 Development environment data ................................................................................... I-8

    I.6 Cost, schedule data ...................................................................................................... I-9

    I.7 Technology constants ................................................................................................ I-10

    I.8 Development environment attributes......................................................................... I-11

    I.8.1 Personnel ............................................................................................................ I-11

    I.8.2 Support................................................................................................................ I-16

    I.8.3 Management ....................................................................................................... I-24

    I.8.4 Product ................................................................................................................ I-27

  • xi

    Acknowledgements The size of this volume hardly represents the amount of effort expended to develop it. Since May 2005,

    nearly five person years have gone into the research, writing, editing, and production.

    We wish to thank the Naval Center for Cost Analysis (NCCA), specifically Susan Wileman for her

    confidence in our experience and knowledge and asking us to write this handbook. Of course, this effort

    would not have been possible without funding. We are grateful to NCCA and the Air Force Cost Analysis

    Agency (AFCAA), namely Wilson Rosa, for believing in the value of this effort enough to provide the

    financial means to accomplish it.

    We appreciate Susan, John Moskowitz, Mike Tran, and others for the reviews, discussions, and feedback

    during the many months of writing. We offer our gratitude to members of the United States Air Force

    Software Technology Support Center (STSC) for their expert editing and proof-reading: Thomas Rodgers,

    Gabriel Mata, Daniel Keth, Glen Luke, and Jennifer Clement. We offer a special thanks to Dr. David A.

    Cook, Dr. James Skinner, and Teresa Brown for their singular knowledge and perspective in the technical

    review. The final review conducted by the CrossTalk editors, namely Drew Brown and Chelene Fortier-

    Lozancich, uncovered a myriad of fine points we overlooked or assumed everyone would know or

    understand. Thanks Drew and Chelene!

    Most significantly, we acknowledge the extensive knowledge and experience of Dr. Randall W. Jensen

    applied in writing and refining this volume into what, arguably, will become a standard within the cost

    estimating community. Additional writing, editing, and final production by Leslie (Les) Dupaix and Mark

    Woolsey were also key efforts in creating this volume. We also wish to acknowledge the Defense

    Acquisition University for providing the basis of the acquisition related portions of the handbook.

    Finally, we want to thank our spouses, families, and anyone else, who would listen to our repetitive

    discussions on topics within the handbook, endured us working on it at home or during vacations, and were

    always interested (or at least pretended to be) in our efforts to create a useful resource for software cost

    estimators.

  • xii

    List of Figures Figure # Description Page Figure 1-1 Achievable development schedule 1-5 Figure 1-2 Development environment facets 1-6

    Figure 2-1 Defense acquisition management framework 2-1

    Figure 2-2 Waterfall development 2-4

    Figure 2-3 Software product activities relationship 2-5

    Figure 2-4 Computer software architecture 2-7

    Figure 3-1 Achievable effective size and schedule 3-5

    Figure 3-2 Software elements and relationship to estimate type 3-7

    Figure 4-1 Effective size growth distribution 4-4

    Figure 5-1 Rayleigh-Norden project staffing profile 5-3

    Figure 5-2 Effects of improper staffing 5-4

    Figure 6-1 Source of code taxonomy 6-2

    Figure 6-2 Black box description 6-2

    Figure 6-3 Normal distribution 6-4

    Figure 6-4 Impact of structure on effective size 6-6

    Figure 6-5 Effort required to incorporate changes 6-7

    Figure 6-6 Historic project data basis for growth algorithm 6-10

    Figure 6-7 Modified Holchin growth algorithm 6-11

    Figure 6-8 Effective size growth distribution 6-14

    Figure 6-9 Function point system structure 6-17

    Figure 6-10 State transition model 6-22

    Figure 8-1 Basic technology constant range 8-2

    Figure 8-2 Basic technology constant distribution 8-3

    Figure 8-3 Productivity gains from 1960 to present 8-3

    Figure 8-4 Learning curve impact 8-6

    Figure 8-5 Impact of Application Experience on development effort 8-7

    Figure 8-6 CMMI rating improvement over period 1987 to 2002 8-8

    Figure 8-7 Components of communication 8-10

    Figure 9-1 Learning curve impact 9-3

    Figure 9-2 Impact of programming language experience on development 9-4

    Figure 9-3 Impact of practices and methods experience on development 9-5

    Figure 9-4 Impact of development system experience on development 9-5

    Figure 9-5 Impact of target system experience on development 9-6

    Figure 10-1 Software complexity illustration 10-2

    Figure B-1 Rayleigh staffing profile B-5

    Figure D-1 Software waterfall process D-1

    Figure D-2 Spiral development process D-2

    Figure D-3 Evolutionary development process D-3

    Figure D-4 Incremental development process D-3

    Figure D-5 Agile development process D-4

    Figure H-1 Defense Acquisition System H-1

    Figure H-2 Acquisition Oversight H-3

    Figure H-3 Defense Acquisition Management Framework H-4

    Figure H-4 User Needs Activities H-5

    Figure H-5 Pre-Systems Acquisition Activity H-6

    Figure H-6 Systems Acquisition Activity H-7

    Figure H-7 Production and Deployment Phase H-9

    Figure H-8 Operations and Support Phase H-10

    Figure H-9 Life Cycle Cost Composition H-11

  • xiii

    List of Tables Table # Description Page Table 3-1 Typical productivity factors by size and software type 3-3

    Table 4-1 System concept information 4-1

    Table 4-2 Stratification of complexity data 4-2

    Table 4-3 Maximum software growth projections as a function of maturity and complexity 4-5

    Table 4-4 Mean software growth projections as a function of maturity and complexity 4-5

    Table 4-5 Typical productivity factors by size and software type 4-6

    Table 4-6 Electronic Systems Center reliability categories 4-7

    Table 4-7 Definition of complexity/reliability categories 4-7

    Table 4-8 Productivities for military applications by category 4-8

    Table 4-9 Total project effort distribution as a function of product size 4-9

    Table 5-1 Computer Software Configuration Item Size Estimates 5-1

    Table 5-2 Total project effort distribution as a function of product size 5-9

    Table 5-3 Approximate total schedule breakdown as a function of product size 5-10

    Table 6-1 Code growth by project phase 6-11

    Table 6-2 Modified Holchin maturity scale 6-12

    Table 6-3 Mean growth factors for normal complexity values as a function of maturity 6-12

    Table 6-4 Maximum growth factors for normal complexity values as a function of maturity 6-13

    Table 6-5 Function point rating elements 6-17

    Table 6-6 Table of weights for function point calculations 6-19

    Table 6-7 Ranking for Internal Logical and External Interface Files 6-19

    Table 6-8 Unadjusted function point calculation 6-19

    Table 6-9 Ranking for External Inputs 6-20

    Table 6-10 Ranking for External Outputs 6-20

    Table 6-11 Ranking for External Inquiries 6-21

    Table 6-12 Ranking for Transforms 6-22

    Table 6-13 Unadjusted function point calculation 6-23

    Table 6-14 General System Characteristics definition 6-24

    Table 6-15 General System Characteristic ratings 6-24

    Table 6-16 Online Data Entry rating definitions 6-24

    Table 6-17 Function Point to Source Lines of Code conversion 6-25

    Table 7-1 Typical productivity factors by size, type, and complexity value 7-3

    Table 7-2 Typical productivity factors by size and software type 7-3

    Table 7-3 Definition of complexity/reliability categories 7-5

    Table 7-4 Productivity for military applications by category 7-5

    Table 7-5 Productivity for military applications as a function of personnel capability 7-5

    Table 7-6 Relationship between Ck and PI values 7-6

    Table 7-7 Typical PI ranges for major application type from the QSM database 7-7

    Table 8-1 Analyst capability ratings 8-5

    Table 8-2 Programmer capability ratings 8-5

    Table 8-3 Traditional use of modern practices rating 8-7

    Table 8-4 Relationship between CMMI and MODP ratings 8-8

    Table 8-5 Modern tool categories and selection criteria 8-9

    Table 8-6 Use of automated tools support rating 8-9

    Table 9-1 Programming language mastery time 9-4

    Table 9-2 Development system volatility ratings 9-6

    Table 9-3 Practices/methods volatility ratings 9-7

    Table 9-4 Multiple security classifications ratings 9-7

    Table 9-5 Multiple development organizations ratings 9-7

    Table 9-6 Multiple development site ratings 9-8

    Table 9-7 Resources and support location ratings 9-8

    Table 10-1 Stratification of complexity data 10-1

    Table 10-2 Complexity rating matrix 10-3

  • xiv

    Table # Description Page

    Table 10-3 Special display requirements ratings 10-3

    Table 10-4 Rehosting requirements ratings 10-4

    Table 10-5 Memory constraint ratings 10-4

    Table 10-6 Required reliability ratings 10-5

    Table 10-7 Real time operation ratings 10-6

    Table 10-8 Requirements volatility ratings 10-7

    Table 10-9 Security requirements ratings 10-8

    Table E-1 Comparison of major software estimating methods E-2

    Table E-2 Typical productivity factors by size and software type E-7

    Table E-3 Environment factors used by common third-order estimation models E-10

    Table F-1 Baseline description of case study at concept stage F-1

    Table F-2 Case study unadjusted function point calculation F-2

    Table F-3 Case study general system characteristics ratings F-2

    Table F-4 Function point to Source Lines Of Code conversion F-3

    Table F-5 Baseline description of case study at concept stage F-3

    Table F-6 Software growth projections as a function of maturity and complexity F-4

    Table F-7 Baseline description of case study at concept stage F-4

    Table F-8 Definition of complexity/reliability categories F-5

    Table F-9 Productivity values for military applications by category F-5

    Table F-10 Productivity values for case study derived from the ESC database F-6

    Table F-11 Comparison of cost with mean and maximum size growth using ESC data F-6

    Table F-12 Comparison of cost with mean and maximum size growth using table F-7

    Table F-13 Comparison of worst case from component-level and system level F-8

    Table F-14 Total project effort distribution as a function of product size F-8

    Table F-15 Total project effort distribution for nominal case study development F-9

    Table F-16 Maximum effort analysis from system level including maintenance F-10

    Table G-1 Baseline description of case study at start of requirements review G-1

    Table G-2 Case study unadjusted function point calculation G-2

    Table G-3 Case study general system characteristics ratings G-3

    Table G-4 Baseline description of case study at start of requirements review G-4

    Table G-5 Software growth projections as a function of maturity and complexity G-5

    Table G-6 Baseline description of case study at start of requirements development G-5

    Table G-7 Parameter values for basic capability estimate calculation G-6

    Table G-8 Personnel parameter values for case study G-6

    Table G-9 Development environment parameter values for case study G-7

    Table G-10 Product impact parameter values for case study G-7

    Table G-11 Technology constant values for case study G-8

    Table G-12 Nominal effort and schedule analysis of case study at the component level G-9

    Table G-13 Worst-case effort and schedule analysis of case study at component level G-9

    Table G-14 Total project effort distribution for case study G-11

    Table G-15 Nominal effort allocation for the case study at the component level G-12

    Table G-16 Approximate schedule breakdown as a function of product size G-13

    Table G-17 Nominal schedule allocation for the case study at the component level G-14

    Table G-18 Nominal component level cost analysis of case study maintenance G-15

    Table H-1 DoD Instruction 5000.2 Acquisition Categories H-15

    Table H-2 SECNAV Instruction 5000.2C Acquisition Categories H-16

    Table I-1 Estimating model parameter comparison I-2

    Table I-2 Computer Software Configuration Item information I-4

    Table I-3 Project summary data I-4

    Table I-4 Requirements data I-6

    Table I-5 System integration data I-6

    Table I-6 Source code sizes I-7

    Table I-7 Reuse data I-7

    Table I-8 Reuse source I-8

    Table I-9 Function point data I-8

  • xv

    Table # Description Page

    Table I-10 Source language I-8

    Table I-11 Environment data I-9

    Table I-12 Development effort and schedule data I-10

    Table I-13 Technology constants I-10

    Table I-14 Analyst Capability (ACAP) rating values I-11

    Table I-15 Programmer Capability (PCAP) rating values I-12

    Table I-16 Productivity Factor (PROFAC) rating values I-12

    Table I-17 Application Experience (AEXP) rating values I-13

    Table I-18 Development System Experience (DEXP) rating values I-14

    Table I-19 Programming Language Experience (LEXP) rating values I-14

    Table I-20 Practices and Methods Experience (PEXP) rating values I-15

    Table I-21 Target System Experience (TEXP) rating values I-15

    Table I-22 Development System Complexity (DSYS) rating values I-16

    Table I-23 Development System Volatility (DVOL) rating values I-17

    Table I-24 Modern Practices use (MODP) rating values I-18

    Table I-25 Process Improvement (PIMP) rating values I-19

    Table I-26 Practices/methods Volatility (PVOL) rating values I-20

    Table I-27 Reusability level required (RUSE) rating values I-21

    Table I-28 Required Schedule (SCED) rating values I-21

    Table I-29 Automated tool support levels of automation I-22

    Table I-30 Automated tool use (TOOL) rating values I-23

    Table I-31 Multiple Classification Levels (MCLS) rating values I-24

    Table I-32 Multiple Development Organizations (MORG) rating values I-25

    Table I-33 Multiple Development Sites (MULT) rating values I-26

    Table I-34 Personnel Continuity (PCON) rating values I-26

    Table I-35 Product Complexity (CPLX) rating values I-27

    Table I-36 Database size rating values I-28

    Table I-37 Special Display requirements (DISP) rating values I-29

    Table I-38 Development Re-hosting (HOST) rating values I-30

    Table I-39 External Integration (INTEGE) requirements rating values I-31

    Table I-40 Internal Integration (INTEGI) requirements rating values I-31

    Table I-41 Target System Memory Constraints (MEMC) rating values I-32

    Table I-42 Software Platform (PLAT) rating values I-32

    Table I-43 Required Software Reliability (RELY) rating values I-33

    Table I-44 Real-time operations requirements (RTIM) rating values I-34

    Table I-45 System Requirements Volatility (RVOL) rating values I-34

    Table I-46 System Security Requirement (SECR) rating values I-35

    Table I-47 System CPU Timing Constraint (TIMC) rating values I-37

    Table I-48 Target System Volatility (TVOL) rating values I-38

  • xvi

    List of Equations Equation # Description Page Equation 3-1 First-order estimating model 3-2

    Equation 3-2 Simple size value 3-2

    Equation 3-3 Component-level estimating model 3-5

    Equation 3-4 Single CSCI development schedule approximation 3-6

    Equation 4-1 System-level estimating model 4-1

    Equation 4-2 Single CSCI development schedule approximation 4-1

    Equation 4-3 Total effort relationship 4-9

    Equation 4-4 Enhancement effort component 4-11

    Equation 4-5 Knowledge retention effort heuristic 4-11

    Equation 4-6 Steady-state maintenance effort 4-11

    Equation 5-1 Component-level development effort 5-1

    Equation 5-2 Component development schedule 5-2

    Equation 5-3 Rayleigh-Norden staffing profile relationship 5-4

    Equation 5-4 Basic technology constant 5-6

    Equation 5-5 Effective technology constant 5-6

    Equation 5-6 General component-level effort 5-7

    Equation 5-7 Development schedule 5-7

    Equation 5-8 Alternate schedule equation 5-8

    Equation 5-9 Total effort 5-9

    Equation 6-1 Mean size 6-4

    Equation 6-2 Standard deviation 6-5

    Equation 6-3 Adaptation adjustment factor 6-7

    Equation 6-4 Size adjustment factor 6-8

    Equation 6-5 Effective size (Jensen-based) 6-8

    Equation 6-6 Effective size (COCOMO-based) 6-8

    Equation 6-7 Ratio of development cost to development time 6-11

    Equation 6-8 Growth relationship (minimum) 6-12

    Equation 6-9 Growth relationship (maximum) 6-12

    Equation 6-10 Effective size growth 6-12

    Equation 6-11 Total size growth 6-13

    Equation 6-12 Effective size 6-13

    Equation 6-13 Effective size growth 6-13

    Equation 6-14 Mean growth size 6-14

    Equation 6-15 Value adjustment factor 6-24

    Equation 6-16 Adjusted function point count 6-25

    Equation 6-17 Function points to source lines of code conversion 6-26

    Equation 6-18 Effective size 6-27

    Equation 7-1 Development effort 7-1

    Equation 7-2 Hours per source line of code 7-1

    Equation 7-3 General form of SLIM® software equation 7-6

    Equation 7-4 Productivity factor relationship 7-6

    Equation 7-5 Development effort for upgrade 7-7

    Equation 7-6 Development schedule 7-8

    Equation 8-1 Basic technology constant 8-10

    Equation 8-2 Productivity factor 8-10

    Equation 10-1 Complexity function 10-1

    Equation E-1 First-order estimating model E-7

    Equation E-2 Effective size E-7

    Equation E-3 Second-order estimating model E-8

    Equation E-4 Effective size E-8

    Equation E-5 Third-order estimating model E-9

    Equation F-1 Function point value adjustment factor F-2

  • xvii

    Equation # Description Page Equation F-2 Adjusted function point count F-3

    Equation F-3 Total software source lines of code count F-3

    Equation F-4 Total effort relationship F-8

    Equation G-1 Function point value adjustment factor G-3

    Equation G-2 Adjusted function point count G-3

    Equation G-3 Total software source lines of code count G-3

    Equation G-4 Mean size growth factor G-5

    Equation G-5 Maximum growth size G-5

    Equation G-6 Productivity factor G-8

    Equation G-7 Median productivity factor G-8

    Equation G-8 Total effort G-12

    Equation I-1 Development effort I-3

    Equation I-2 Development productivity I-9

  • Section 1

    Introduction The term ―software crisis‖ refers to a set of problems, defined below, that

    highlights the need for changes in our existing approach to software

    development. One of the most dominant and serious complaints arising from

    the software crisis was the inability to estimate, with acceptable accuracy,

    the cost, resources, and schedule required for a software development

    project. The term ―software crisis‖ originated sometime in the late 1960s

    about the time of the 1968 NATO Conference on Software Engineering.

    Crisis is a strong word. It suggests a situation that demands resolution. The

    conditions that represent the crisis will be altered, either toward favorable

    relief or toward a potential disaster. According to Webster’s definition, a

    crisis is ―a crucial or decisive point or situation.‖ By now, the crisis should

    have been resolved one way or another.

    A notion pervading the conference was that we can engineer ourselves out

    of any problem. Hence, the term ―software engineering‖ was coined. One

    of the significant conference outputs was a software engineering

    curriculum. The curriculum happened to be identical to the computer

    science curriculum of that day.

    A list of software problems was presented as major development concerns

    at the 1968 NATO Conference. The problem list included software that

    was:

    Unreliable

    Delivered late

    Prohibitive in terms of modification costs

    Impossible to maintain

    Performing at an inadequate level

    Exceeding budget costs

    The software development problems listed in 1968 are still with us today.

    Each of these complaints can be traced to the inability to correctly estimate

    development costs and schedule. Traditional intuitive estimation methods

    have consistently produced optimistic results which contribute to the all too

    familiar cost overruns and schedule slips. In retrospect, the term exigence1

    fits the situation better than ―crisis‖ since there is no discernable point of

    change for better or worse.

    Most humans, especially software developers, are inherent optimists. When

    was the last time you heard something like, ―It can’t be that bad,‖ ―It

    shouldn’t take more than two weeks to finish,‖ or, best of all, ―We are 90

    percent complete?‖ Estimates need to be based on facts (data), not warm

    feelings or wishful thinking. In other words, hope is not a management

    strategy, nor is it an estimating approach.

    1 Exigence: The state of being urgent or pressing; urgent demand; urgency; a pressing

    necessity.

    Predicting is very hard, especially when it

    is about the future.

    Yogi Berra

    …It was also becoming painfully

    evident that estimating the cost of

    technologically state-of-the-art

    projects was an inexact science.

    The experts, in spite of their

    mountains of numbers, seemingly

    used an approach descended from

    the technique widely used to weigh

    hogs in Texas. It is alleged that in

    this process, after catching the hog

    and tying it to one end of a teeter-

    totter arrangement, everyone

    searches for a stone which, when

    placed on the other end of the

    apparatus, exactly balances the

    weight of the hog. When such a

    stone is eventually found, everyone

    gathers around and tries to guess

    the weight of the stone. Such is the

    science of cost estimating. But

    then, economics has always been

    known as the dismal science.

    Augustine’s Laws

  • 1-2

    The cost and schedule estimating problem can be described by the following

    statement:

    More software projects have gone awry for lack of calendar time than

    for all other causes combined. Why is this cause of disaster so

    common?

    First, our techniques of estimating are poorly developed. More

    seriously, they reflect an unvoiced assumption which is quite untrue, i.e.,

    that all will go well.

    Second, our estimating techniques fallaciously confuse effort with

    progress, hiding the assumption that men and months are

    interchangeable.

    Third, because we are uncertain of our estimates, software managers

    often lack the courteous stubbornness of Antoine’s chef.

    Fourth, schedule progress is poorly monitored. Techniques proven and

    routine in other engineering disciplines are considered radical

    innovations in software engineering.

    Fifth, when schedule slippage is recognized, the natural (and

    traditional) response is to add manpower. ’Like dousing a fire with

    gasoline, this makes matters worse, much worse. More fire requires

    more gasoline and thus begins a regenerative cycle that ends in

    disaster.’2

    The rapidly increasing cost of software has led customers for these products

    to become less willing to tolerate the uncertainty and losses associated with

    inaccurate cost and schedule estimates, unless the developer is willing to

    accept a significant portion of that risk. This customer pressure emphasizes

    the need to use an estimating method that can be applied early in the

    software development when tradeoff studies and investment decisions are

    made. The estimating method must consider the characteristics of the

    development organization and the environmental effects imposed by the

    development task, as well as the application size and complexity.

Estimating is magic for most estimators and managers. Well-known science fiction author Arthur C. Clarke's Third Law3 states: "Any sufficiently advanced technology is indistinguishable from magic." This illustrates one of the primary problems with software estimating today. The "magic" creates an environment of unreasonable trust in the estimate and a lack of rational thought, logical or otherwise.

Estimating tools produce estimates from a set of inputs that describe the software and the environment. The result is a development cost and schedule. If neither the estimator nor the manager understands the algorithm behind the estimate, then the estimate has crossed into the realm of magic. The result is a development cost and schedule estimate – which is never wrong. Estimate accuracy increases with the number of significant digits.

With magic we expect the impossible, and so it is with estimating as well. When something is magic, we don't expect it to follow logic, and we don't apply our common sense. When the estimate is not the cost and schedule we want, we can simply change the inputs to the algorithm and produce the estimate we desire. Since the tool has magical properties, we can suspend reality and make any estimate come true. That is why so many projects overrun and we consistently blame the failure on the projects, not the estimates.

2 Brooks, F.P. The Mythical Man-Month: Essays on Software Engineering. Addison-Wesley. Reading, MA: 1975.
3 Clarke, Arthur C. Profiles of the Future. 1961.

es·ti·mate: To make a judgment as to the likely or approximate cost, quality, or extent of; calculate approximately… estimate may imply judgment based on rather rough calculations.
— American Heritage Dictionary

Antoine's is a New Orleans restaurant whose menu states: good cooking takes time. If you are made to wait, it is to serve you better and to please you.
— From the menu, Antoine's Restaurant, New Orleans

    As demonstrated, software cost estimation is a discipline sometimes equated

    with mystical forms of prognostication, in which signs and indications are

    used to foretell or predict the future. Some suggest that a venerable person

    of recognized experience and foresight with far-seeing wisdom and prudence

    is required to create accurate cost estimates. These points have merit as cost

    estimation may fittingly be considered a blend of art and science. Yet, with

    proper knowledge and experience wisely applied, the discipline of software

    cost estimating becomes more science than knack.

    Several cost and schedule estimation methods have been proposed over the

    last 25 years with mixed success due, in part, to limitations of the estimation

    models. A significant part of the estimate failures can be attributed to a lack

    of understanding of the software development environment and the impact of

    that environment on the development schedule and cost. The environment

    imposed by the project manager is a major driver in the software equation.

    Organizations need both managers and estimators. Managers make

    infrequent estimates to support their decisions. Estimators, like any

    specialist, need to perform frequent estimates to track development progress,

    increase estimating experience, and maintain process and tool proficiency.

    Software cost estimating is an essential part of any system acquisition

    process. Fiscal constraints, the mission of service-level and command cost

    agencies, and program office responsibilities further highlight the

    importance of a solid understanding of software cost estimating principles

    and the need for credible resources.

Estimation seeks to answer questions such as:

- Is the estimate reasonable?
- Has a similar system been developed before?
- How long should the development take?
- How much should the development cost?
- How does size, cost, and schedule data from other projects relate to this system?
- If a developer claims their software development productivity is 1,000 lines of source code per month, is the claim realistic?
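The last question above can be answered with simple arithmetic. The sketch below is illustrative only: the claimed figure comes from the question, but the "typical" productivity range is an assumption for demonstration; real bounds should come from historical data for the specific application domain (the subject of Section 7).

```python
# Sanity-check a claimed software development productivity figure.
# The typical_range default is an assumed, illustrative band; actual
# bounds must come from historical data for comparable projects.

def productivity_is_plausible(claimed_sloc_per_pm, typical_range=(50, 500)):
    """Return True if the claimed SLOC-per-person-month figure falls
    inside the historical range observed for comparable projects."""
    low, high = typical_range
    return low <= claimed_sloc_per_pm <= high

# A claim of 1,000 SLOC per person-month, checked against the assumed band:
print(productivity_is_plausible(1_000))  # False: well above the assumed range
```

A claim that fails a check like this is not necessarily wrong, but it shifts the burden of proof to the developer to show comparable historical projects that achieved it.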

    A sound software cost estimate is developed by employing recognized and

    accepted estimating methodologies. As an analyst, you can determine that

    your estimate is credible by using historical data, cost estimating

    relationships, and having an understanding of the tools or models used.

    Estimating the cost, schedule and resources for a software development

    project requires training, experience, access to historical information related

    to the software domain under consideration, and the confidence to commit to

    the estimate even when the project information is qualitative and lacks detail.

All software estimates carry inherent risks from several views. For example, all estimating tools are the results of regression analysis (curve fitting) to historic project data that is inconsistent in nature. Data is collected from many sources, each with its own definition of size, complexity, productivity, and so on. Knowledge of the software project is somewhat subjective in terms of size, complexity, the environment, and the capabilities of the personnel working on project development.

Programming a computer does require intelligence. Indeed, it requires so much intelligence that nobody really does it very well. Sure, some programmers are better than others, but we all bump and crash around like overgrown infants. Why? Because programming computers is by far the hardest intellectual task that human beings have ever tried to do. Ever.
— G.M. Weinberg, 1988

    Risk represents the degree of uncertainty in the cost and schedule estimates.

    If the scope of the system under development is poorly understood, or the

    software requirements are not firm, uncertainty can become extreme.

    Software requirements in an ideal world should be complete and specified at

    a level that is sufficient to the maturity of the system. Interfaces should also

    be complete and stable to reduce the instability of the software development

    and the development estimate.

1.1 Development constraints

There is a model of software development as seen from the project control point of view. This model has only four variables:

- Cost
- Schedule
- Quality
- Scope

The stakeholders (users, customers, etc. – all external to the development) are allowed to set three of the four variables; the value of the fourth variable will be determined by the other three.

    Some managers attempt to set all four variables, which is a violation of the

    rules. When one attempts to set all four, the first visible failure is a decrease

    in product quality. Cost and schedule will then increase in spite of our most

    determined efforts to control them. If we choose to control cost and

    schedule, quality and/or scope become dependent variables.

    The values for these attributes cannot be set arbitrarily. For any given

    project, the range of each value is constrained. If any one of the values is

    outside the reasonable range, the project is out of control. For example, if

    the scope (size) is fixed, there is a minimum development time that must be

    satisfied to maintain that scope. Increasing funding to decrease the schedule

    will actually increase the schedule while increasing the project cost.

    Software estimating tools allow us to make the four variables visible so we

    can compare the result of controlling any, or all, of the four variables (look at

    them as constraints) and their effect on the product.
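The four-variable model can be sketched as a toy calculation: fix three variables and the fourth is an output, not a choice. The relation below is purely hypothetical (the exponents, the schedule floor, and the units are invented for illustration, not calibrated); it exists only to show that compressing the schedule below its natural minimum drives cost up rather than down.

```python
# Toy illustration of the four-variable model: cost, schedule, quality,
# and scope cannot all be dictated. Here scope, quality, and schedule
# are set by management, and cost falls out of an assumed (uncalibrated)
# relation with a minimum-schedule floor.

def required_cost(scope_ksloc, quality, schedule_months):
    """Hypothetical relation: cost grows with scope and quality, and
    rises sharply when the schedule is compressed below a minimum."""
    min_schedule = 3.0 * scope_ksloc ** 0.4           # assumed schedule floor
    compression = max(min_schedule / schedule_months, 1.0)
    return scope_ksloc * quality * compression ** 2   # notional cost units

# Fixing scope (50 KSLOC), quality (factor 10), and an aggressive
# 12-month schedule determines a higher cost than a relaxed 24-month one:
print(required_cost(50, 10, 12))
print(required_cost(50, 10, 24))
```

The specific numbers are meaningless; the shape of the relation is the point: once three variables are set, the fourth is constrained, and forcing it anyway shows up first as lost quality.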

1.2 Major cost factors

There are four major groups of factors that must be considered or accounted for when developing a software cost and schedule estimate. The factor groups are: (1) effective size, (2) product complexity, (3) development environment, and (4) product characteristics. The following sections briefly describe each of the major factor groups.

Project Uncertainty Principle: If you understand a project, you won't know its cost, and vice versa.
— Dilbert (Scott Adams)


1.2.1 Effective size

Software size is the most important cost and schedule driver, yet it is the most difficult to determine. Size prediction is difficult enough that many methods have been created to alleviate the problem. These measures include source lines of code (SLOC), function points, object points, and use cases, as well as many variants.

Size has a major impact on the management of software development in terms of cost and schedule. An Aerospace Corporation study by Long et al.4 that examined 130 military software development projects demonstrates some of the constraints on development imposed by the magnitude of the software project. Of the 130 projects shown in Figure 1-1, no Computer Software Configuration Item (CSCI) over 200 thousand source lines of code (KSLOC) was successfully completed or delivered. A project effective size of 200 KSLOC requires a team of approximately 100 development and test personnel, a development schedule of four years, and nearly 3,000 person-months of effort. The average turnover rate for software personnel is less than four years. Managing a team of 100 people in a single development area is not easy.

The Aerospace study also showed that schedule is not arbitrary. There is an apparent minimum development schedule related to size. The assumption that by front-loading project staff the project will decrease its schedule below a minimum development time is faulty. This is sometimes referred to as the software Paul Masson Rule; that is, "We will deliver no software before its time."

    Effective size is discussed in detail in Section 6.
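The magnitudes quoted above can be roughly reproduced with the productivity-factor approach described in Section 7 plus a cube-root schedule rule of thumb. Both constants below are assumptions chosen to be consistent with the figures in the text (about 67 SLOC per person-month, schedule coefficient 3.0), not calibrated values:

```python
# Rough system-level sizing math for a 200 KSLOC project, using an
# assumed productivity factor and an assumed cube-root schedule rule.

SLOC_PER_PM = 67       # assumed productivity (SLOC per person-month)
SCHEDULE_COEFF = 3.0   # assumed coefficient: months = c * effort^(1/3)

def estimate(effective_sloc):
    effort_pm = effective_sloc / SLOC_PER_PM                 # person-months
    schedule_months = SCHEDULE_COEFF * effort_pm ** (1 / 3)
    avg_staff = effort_pm / schedule_months                  # average heads
    return effort_pm, schedule_months, avg_staff

effort, months, staff = estimate(200_000)
print(f"effort   ~ {effort:,.0f} person-months")
print(f"schedule ~ {months / 12:.1f} years")
print(f"staff    ~ {staff:.0f} people on average")
```

With these assumptions the sketch lands near the study's figures: roughly 3,000 person-months, a schedule approaching four years, and an average staff around 70 (peak staffing, which exceeds the average, is what approaches the 100 people cited above).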

1.2.2 Product complexity

Software complexity is an all-embracing notion referring to factors that decide the level of difficulty in developing software projects. There are many facets to the value we refer to as complexity. Here we will only touch upon the effects of product complexity and its importance in cost and schedule estimation. First, complexity limits the rate at which a project can absorb development personnel. It also limits the total number of people that can effectively work on the product development. Small development teams are actually more productive per person than large teams; hence, the limiting action of complexity correlates with higher productivity. At the same time, the skill set necessary to build an operating system is not interchangeable with that of a payroll system developer.

    Complexity is discussed in detail in Section 10.

4 Long, L., K. Bell, J. Gayek, and R. Larson. "Software Cost and Productivity Model." Aerospace Report No. ATR-2004(8311)-1. Aerospace Corporation. El Segundo, CA: 20 Feb 2004.

Figure 1-1: Achievable development schedule based on 130 military software projects

We will sell no wine before its time.
— Paul Masson advertisement, 1980


1.2.3 Development environment

The development environment is a major, yet often ignored, factor in development productivity and the resulting development cost and schedule. The environment may be divided into two areas: developer capability and project-specific development environment. At the core of the development environment measure is the raw capability of the developer. This includes application experience, creativity, ability to work as a team, and the use of modern practices and tools. It is hard to directly assess the capability or the environment, but by looking at the individual, independent facets of the environment (shown in Figure 1-2), a cohesive, rational measure can be obtained.

    The development environment is discussed in detail in Sections 8 (developer

    capability) and 9 (development environment).

For estimates conducted early in the acquisition – when the developer and environment are unknown – typical productivity factors that assume a generic developer and environment can be used to obtain a "ballpark" estimate. This technique is described in Section 7.

1.2.4 Product characteristics

Product characteristics describe the software product to be developed. These characteristics are specific to a class of products; for example, a military space payload. The characteristics include timing and memory constraints, amount of real-time processing, requirements volatility, user interface complexity, and development standards (among others). Development standards, in turn, encompass the documentation, reliability, and test requirements characteristics of a project.

The product characteristics generally limit or reduce the development productivity. The characteristics for any application can be grouped into a product template that simplifies a system-level estimate.

    The elements of the product characteristics are discussed in detail in Section

    10.

1.3 Software support estimation

Software support costs usually exceed software development costs, primarily because software support costs involve much more than the cost of correcting errors. Typically, the software product development cycle spans one to four years, while the maintenance phase spans an additional five to 15 years for many programs. Over time, as software programs change, the software complexity increases (architecture and structure deteriorate) unless specific effort is undertaken to mitigate deterioration.5

Software maintenance cost estimates in 1976 ranged from 50 to 75 percent of the overall software life-cycle costs.6 The trend is about the same today in spite of the rapid increase in product size over time. Historic project data shows that a program with a software development cost of about $100 per source line can have maintenance costs that are near $4,000 per source line.
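These percentages and per-line figures can be reconciled with a line of arithmetic: maintenance's share of life-cycle cost follows directly from the maintenance-to-development cost ratio. A small sketch, using the ratios from the text:

```python
# Maintenance as a fraction of total life-cycle cost, computed from
# development and maintenance costs in any common unit.

def maintenance_share(dev_cost, maint_cost):
    """Maintenance cost as a fraction of total life-cycle cost."""
    return maint_cost / (dev_cost + maint_cost)

# If maintenance is 75% of life-cycle cost, it costs 3x the development:
assert maintenance_share(dev_cost=1.0, maint_cost=3.0) == 0.75

# The historic per-source-line figures ($100 to develop, $4,000 to
# maintain) describe a far more extreme case:
share = maintenance_share(dev_cost=100, maint_cost=4_000)
print(f"{share:.1%}")  # maintenance at about 97.6% of life-cycle cost
```

The per-line figures are thus an outlier, not the norm; the 50-to-75-percent range (ratios of 1:1 to 3:1) remains the planning baseline.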

5 Belady, L.M., and M.M. Lehman. "Characteristics of Large Systems." Research Directions in Software Technology. MIT Press. Cambridge, MA: 1979.
6 Boehm, B.W. "Software Engineering." IEEE Transactions on Computers. Dec. 1976: 1226-1241.

Figure 1-2: Development environment facets (people, process, project, and environment)

Murphy's Law is an adage that broadly states that things will go wrong in any given situation, if you give them a chance. "If there's more than one possible outcome of a job or task, and one of those outcomes will result in disaster or an undesirable consequence, then somebody will do it that way."


Software maintenance is defined by Dr. Barry Boehm7 as: "The process of modifying existing operational software while leaving its primary functions intact." The definition includes two types of activities: software repair and software enhancement.

Software repair is another way of saying "corrective maintenance" – fixing implementation, processing, and performance failures related to the specified software requirements. These failures may surface after delivery of the product in operational use. The repair can also be related to errors or deficiencies known before the product was delivered, but deferred to maintenance because of funding or schedule issues during development. Maintenance costs during the period immediately following product delivery are normally high due to errors not discovered during the software development.

Software enhancement results from software requirements changes during and following product delivery. Software enhancement includes:

- Redevelopment or modification of portions of the existing software product.
- Software adaptation ("adaptive") to new processing or data environments.
- Software performance and maintainability enhancements ("perfective").

    In addition to repair and enhancement, the cost of maintenance must also

    include the cost of maintaining knowledge of the software structure and

    code. The cost of this knowledge retention for a large software system often

    dominates the maintenance cost.

1.4 Handbook overview

This handbook is divided into several major sections centered on key software development cost estimating principles. Some of the sections discussed include the following:

- Software cost and schedule estimating introduction (Sections 1 & 2) – The history, art, and science behind developing reasonable cost estimates.
- Software development process (Section 2) – The evolution of software development, various methods or approaches, and key issues.
- Levels of detail in software estimates (Section 3) – Guidelines and theory to help determine the best or most appropriate method for evaluating given data and formulating an estimate. The cost of developing the system is just the tip of the iceberg when the cost over the entire life of the program is considered.
- System level estimating process (Section 4) – A concise introduction to the software cost estimating process at the system level, taking place prior to knowledge of the software architecture (Milestone A). This process assumes a generic developer and total effective software size, including growth. Estimates include validation and effort allocation.
- Component level estimating process (Section 5) – A concise introduction to the software cost and schedule estimating process at the component level (Milestone B) using the effective component (CSCI) size including growth, the developer capability and environment, and product constraints to determine the development cost and schedule. Estimates include cost and schedule validation, as well as effort and schedule allocation.
- Estimating effective size (Section 6) – A key element in determining the effort and subsequent cost and schedule is the size of the software program(s) within the system. This section explains two primary methods of size estimation: effective source lines of code and function points.
- Productivity factor evaluation (Section 7) – Software development effort estimates at the system level are dependent upon the effective size and the generic developer productivity for the given system type. Productivity factors can be derived from historic industry data or from specific developer data (if available).
- Evaluating developer capability (Section 8) – An important factor in a component level estimate is the developer's capability, experience, and specific qualifications for the target software system. This section explains the attributes that define developer capabilities in terms of efficiency or productivity.
- Development environment evaluation (Section 9) – The development environment (management, work atmosphere, etc.) and product traits (complexity, language, etc.), combined with the developer's skill and experience, directly impact the cost of the development. This section quantitatively describes the impacts of the development environment in terms of productivity.
- Product characteristics evaluation (Section 10) – The product characteristics (real-time operation, security, etc.) and constraints (requirements stability, memory, CPU, etc.) reduce the development efficiency and increase cost and schedule. This section quantitatively describes the impacts of the product characteristics on development cost and schedule and the associated estimates.
- Acronyms (Appendix A) – This appendix contains a list of the acronyms used in the handbook and by the estimating tools supported by the handbook.
- Terminology (Appendix B) – This section contains definitions and terminology common to this handbook and the software estimating discipline.
- Bibliography (Appendix C) – This section contains a list of useful resources for software development cost and schedule estimators. The list also includes material useful for software development planners and managers.
- Software life cycle approaches (Appendix D) – This appendix provides background information describing the common software development approaches and discusses topics such as spiral and incremental development.
- Software estimating models (Appendix E) – This section describes the myriad of estimate types and models available for the software cost and schedule estimator.
- System-level estimate case study (Appendix F) – This section walks the reader through a system-level estimate example, incorporating the processes, practices, and background material introduced in the core sections of the handbook.
- Component-level estimate case study (Appendix G) – This section, like the previous section, works through an in-depth example at the component level.
- The defense acquisition system (Appendix H) – This appendix provides background information describing the defense acquisition framework and how software estimates fit into that process.
- Data collection (Appendix I) – Data collection (as highlighted in the core sections) without efforts to validate and normalize it is of limited value. This section provides guidelines, instructions, and suggested formats for collecting useful data.

Figure: The defense acquisition framework – Concept Refinement, Technology Development, System Development & Demonstration, Production & Deployment, and Operations & Support phases, separated by the Concept Decision, Milestone A, Milestone B, Design Readiness Review, and Full-Rate Production (FRP) Decision points.
