University of Southern California
Center for Systems and Software Engineering
Risk Management
©USC-CSSE 1
Risk vs Issue
• A Risk is an uncertain event that could impact your chosen path should it be realised.
• Risks are events that are not currently affecting you – they haven’t happened yet.
• Once a risk is realised, it has the potential to become an Issue
Source: http://agile101.net/2009/07/26/agile-risk-management-the-difference-between-risks-and-issues/
Risk Management
• What is risk?
• What about problem, concern, issue?
Reactively Managing a Software Development Problem
System Integration Time:
• We just started integrating the various software components to be used in our project, and we found out that COTS* products A and B can’t talk to one another
• This is a problem, caused by a previously unrecognized risk that materialized: the risk that two COTS products might not be able to talk to one another (specifically, that A and B might not be able to talk to one another)
• We’ve got too much tied up in A and B to change
• Our best solution is to build wrappers around A and B to get them to talk via CORBA**
• This will result in:
– a 3-month schedule overrun = $100K contract penalty
– a $300K cost overrun

*COTS: Commercial off-the-shelf
**CORBA: Common Object Request Broker Architecture
Proactively Managing a Risk (assessment)
System Design Time:
– A and B are our strongest COTS choices
– But there is some chance that they can’t talk to one another
• Probability that A and B can’t talk to one another = probability of loss: P(L)
• From previous experience with COTS like A and B, we assess P(L) at 50%
– If we commit to using A and B, and we find out at integration time that they can’t talk to one another:
• Size of loss S(L) = $300K + $100K = $400K
• We have a risk exposure of RE = P(L) × S(L) = (0.5) × ($400K) = $200K
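The RE arithmetic above is simple enough to script. A minimal sketch of the calculation, using the slide's figures (the `risk_exposure` helper is illustrative, not part of any course tool):

```python
def risk_exposure(prob_of_loss: float, size_of_loss: float) -> float:
    """Risk exposure RE = P(L) * S(L): expected loss if the risk is left unmanaged."""
    return prob_of_loss * size_of_loss

# From the slide: P(L) = 0.5 that A and B can't talk to one another;
# S(L) = $300K cost overrun + $100K contract penalty = $400K.
re = risk_exposure(0.5, 300_000 + 100_000)
print(f"RE = ${re:,.0f}")  # prints: RE = $200,000
```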
Risk Management Strategy 1: Buying Information
System Design Time:
– Let’s spend $30K and 2 weeks prototyping the integration of A and B
– This will buy information on the magnitudes of P(L) and S(L)
– If RE = P(L) × S(L) is small, we’ll accept and monitor the risk
– If RE is large, we’ll use one/some of the other strategies
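The accept-or-act decision after buying information can be sketched as a simple threshold rule. This is hypothetical scaffolding: the slides never name a cutoff between a "small" and "large" RE, so `RE_THRESHOLD` below is an assumed value chosen purely for illustration.

```python
RE_THRESHOLD = 100_000  # assumed cutoff between "small" and "large" RE (not from the slides)

def strategy_after_prototyping(prob_of_loss: float, size_of_loss: float) -> str:
    """Pick a follow-up strategy once the $30K prototype has refined P(L) and S(L)."""
    re = prob_of_loss * size_of_loss
    if re <= RE_THRESHOLD:
        return "accept and monitor"
    return "apply avoidance / transfer / reduction"

# If the prototype shows A and B integrate cleanly, P(L) might drop to, say, 0.1:
print(strategy_after_prototyping(0.1, 400_000))  # accept and monitor
# If it confirms the original 50% assessment:
print(strategy_after_prototyping(0.5, 400_000))  # apply avoidance / transfer / reduction
```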
Other Risk Management Strategies
• Risk Avoidance
– COTS product C is almost as good as B, and we know, from having used A and C, that C can talk to A
– Delivering on time is worth more to the customer than the small performance loss
• Risk Transfer
– If the customer insists on using A and B, have them establish a risk reserve, to be used in case A and B can’t talk to each other
• Risk Reduction
– If we build the wrappers and the CORBA connections right now, we add cost but minimize the schedule delay
• Risk Acceptance
– If we can solve the A and B interoperability problem, we’ll have a big competitive edge on future procurements
– Let’s do this on our own money, and patent the solution
Software Risk Management
• Risk Assessment
– Risk Identification: checklists; decision driver analysis; assumption analysis; decomposition
– Risk Analysis: performance models; cost models; network analysis; decision analysis; quality factor analysis
– Risk Prioritization: risk exposure; risk leverage; compound risk reduction
• Risk Control
– Risk Management Planning: buying information; risk avoidance; risk transfer; risk reduction; risk element planning; risk plan integration
– Risk Resolution: prototypes; simulations; benchmarks; analyses; staffing
– Risk Monitoring: milestone tracking; top-10 tracking; risk reassessment; corrective action
Top 10 Risk Categories: 1989 and 1995
1989                                    1995
1.  Personnel shortfalls                1.  Personnel shortfalls
2.  Schedules and budgets               2.  Schedules, budgets, process
3.  Wrong software functions            3.  COTS, external components
4.  Wrong user interface                4.  Requirements mismatch
5.  Gold plating                        5.  User interface mismatch
6.  Requirements changes                6.  Architecture, performance, quality
7.  Externally-furnished components     7.  Requirements changes
8.  Externally-performed tasks          8.  Legacy software
9.  Real-time performance               9.  Externally-performed tasks
10. Straining computer science          10. Straining computer science
Primary CS577 Risk Categories (all on 1995 list) and Examples
• Personnel shortfalls: commitment (This is team member’s last course; only needs C to graduate); compatibility; communication problems; skill deficiencies (management, Web design, Java, Perl, CGI, data compression, …)
• Schedule: project scope too large for 24 weeks; IOC content; critical-path items (COTS, platforms, reviews, …)
• COTS: see next slide re multi-COTS
• Rqts, UI: mismatch to user needs (recall overdue book notices)
• Performance: #bits; #bits/sec; overhead sources
• Externally-performed tasks: Client/Operator preparation; commitment for transition effort
COTS and External Component Risks
• COTS risks: immaturity; inexperience; COTS incompatibility with application, platform, other COTS; controllability
• Non-commercial off-the-shelf components: reuse libraries, government, universities, etc.
– Qualification testing; benchmarking; inspections; reference checking; compatibility analysis
Risk Exposure Factors (Satellite Experiment Software)
Unsatisfactory Outcome (UO)                                  Prob(UO)  Loss(UO)  Risk Exposure
A. S/W error kills experiment                                  3-5       10        30-50
B. S/W error loses key data                                    3-5        8        24-40
C. Fault tolerance features cause unacceptable performance     4-8        7        28-56
D. Monitoring software reports unsafe condition as safe        5          9        45
E. Monitoring software reports safe condition as unsafe        5          3        15
F. Hardware delay causes schedule overrun                      6          4        24
G. Data reduction software errors cause extra work             8          1        8
H. Poor user interface causes inefficient operation            6          5        30
I. Processor memory insufficient                               1          7        7
J. DBMS software loses derived data                            2          2        4
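The table's risk-exposure column can be regenerated mechanically: RE = Prob(UO) × Loss(UO) on the slide's relative 0-10 scales, with probability ranges such as "3-5" handled by computing both endpoints. A sketch over a few rows transcribed from the table (the helper name is illustrative):

```python
def exposure_range(p_lo: int, p_hi: int, loss: int) -> tuple:
    """RE = Prob(UO) * Loss(UO); returns (low, high) for a probability range."""
    return p_lo * loss, p_hi * loss

# (item, (prob_low, prob_high), loss) -- values from the table above.
risks = [
    ("A. S/W error kills experiment", (3, 5), 10),
    ("B. S/W error loses key data", (3, 5), 8),
    ("C. Fault tolerance causes unacceptable performance", (4, 8), 7),
    ("D. Unsafe condition reported as safe", (5, 5), 9),
    ("I. Processor memory insufficient", (1, 1), 7),
]

for item, (p_lo, p_hi), loss in risks:
    lo, hi = exposure_range(p_lo, p_hi, loss)
    re = str(lo) if lo == hi else f"{lo}-{hi}"
    print(f"{item}: RE = {re}")
```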
Risk Reduction Leverage (RRL)

RRL = (RE before - RE after) / Risk Reduction Cost

• Spacecraft Example

                     Long Duration Test    Failure Mode Tests
Loss(UO)             $20M                  $20M
Prob(UO) before      0.2                   0.2
RE before            $4M                   $4M
Prob(UO) after       0.05                  0.07
RE after             $1M                   $1.4M
Cost                 $2M                   $0.26M
RRL                  (4 - 1)/2 = 1.5       (4 - 1.4)/0.26 = 10
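The two RRL figures can be checked directly from the definition. A short sketch (function name illustrative; dollar figures in $M, as on the slide):

```python
def rrl(re_before: float, re_after: float, cost: float) -> float:
    """Risk Reduction Leverage = (RE before - RE after) / risk reduction cost."""
    return (re_before - re_after) / cost

# Both candidate tests start from RE = 0.2 * $20M = $4M.
print(rrl(4.0, 1.0, 2.0))             # long-duration test  -> 1.5
print(round(rrl(4.0, 1.4, 0.26), 1))  # failure-mode tests  -> 10.0
```

The comparison is the point: the failure-mode tests buy almost as much risk reduction for a fraction of the cost, so they have far higher leverage.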
Risk Management Plans
For Each Risk Item, Answer the Following Questions:
1. Why? Risk Item Importance, Relation to Project Objectives
2. What, When? Risk Resolution Deliverables, Milestones, Activity Nets
3. Who, Where? Responsibilities, Organization
4. How? Approach (Prototypes, Surveys, Models, …)
5. How Much? Resources (Budget, Schedule, Key Personnel)
Risk Management Plan: Fault Tolerance Prototyping
1. Objectives (The “Why”)
– Determine and reduce the level of risk of the software fault tolerance features causing unacceptable performance
– Create a description of, and a development plan for, a set of low-risk fault tolerance features
2. Deliverables and Milestones (The “What” and “When”)
– By Week 3:
1. Evaluation of fault tolerance options
2. Assessment of reusable components
3. Draft workload characterization
4. Evaluation plan for prototype exercise
5. Description of prototype
– By Week 7:
6. Operational prototype with key fault tolerance features
7. Workload simulation
8. Instrumentation and data reduction capabilities
9. Draft description of, and plan for, fault tolerance features
– By Week 10:
10. Evaluation and iteration of prototype
11. Revised description of, and plan for, fault tolerance features
Risk Management Plan: Fault Tolerance Prototyping (concluded)
• Responsibilities (The “Who” and “Where”)
– System Engineer: G. Smith
• Tasks 1, 3, 4, 9, 11; support of tasks 5, 10
– Lead Programmer: C. Lee
• Tasks 5, 6, 7, 10; support of tasks 1, 3
– Programmer: J. Wilson
• Tasks 2, 8; support of tasks 5, 6, 7, 10
• Approach (The “How”)
– Design-to-schedule prototyping effort
– Driven by hypotheses about fault tolerance-performance effects
– Use real-time OS; add prototype fault tolerance features
– Evaluate performance with respect to representative workload
– Refine prototype based on results observed
• Resources (The “How Much”)
– $60K: Full-time system engineer, lead programmer, programmer (10 weeks × 3 staff × $2K/staff-week)
– $0K: 3 dedicated workstations (from project pool)
– $0K: 2 target processors (from project pool)
– $0K: 1 test co-processor (from project pool)
– $10K: Contingencies
– $70K: Total
Risk Monitoring
• Milestone Tracking
– Monitoring of Risk Management Plan milestones
• Top-10 Risk Item Tracking
– Identify top-10 risk items
– Highlight these in monthly project reviews
– Focus on new entries and slow-progress items
– Focus review on manager-priority items
• Risk Reassessment
• Corrective Action
Project Top 10 Risk Item List: Satellite Experiment Software
Risk Item                                            This Mo.  Last Mo.  #Mo.  Risk Resolution Progress
Replacing Sensor-Control Software Developer             1         4       2    Top replacement candidate unavailable
Target Hardware Delivery Delays                         2         5       2    Procurement procedural delays
Sensor Data Formats Undefined                           3         3       3    Action items to software, sensor teams; due next month
Staffing of Design V&V Team                             4         2       3    Key reviewers committed; need fault-tolerance reviewer
Software Fault-Tolerance May Compromise Performance     5         1       3    Fault tolerance prototype successful
Accommodate Changes in Data Bus Design                  6         -       1    Meeting scheduled with data bus designers
Testbed Interface Definitions                           7         8       3    Some delays in action items; review meeting scheduled
User Interface Uncertainties                            8         6       3    User interface prototype successful
TBDs in Experiment Operational Concept                  -         7       3    TBDs resolved
Uncertainties in Reusable Monitoring Software           -         9       3    Required design changes small, successfully made
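Top-10 tracking is mechanical enough to automate: compare each item's rank this month against last month's and flag what the review should focus on. The helper below is an illustrative sketch, not a course tool; `None` stands for "not on the list that month", and rank 1 is the highest risk.

```python
def review_flag(this_rank, last_rank):
    """Classify a Top-10 risk item for the monthly project review."""
    if last_rank is None:
        return "new entry: focus of review"
    if this_rank is None:
        return "dropped off the list"
    if this_rank < last_rank:
        return "rose in ranking: slow progress, focus of review"
    if this_rank > last_rank:
        return "fell in ranking: resolution progressing"
    return "unchanged"

# A few rows from the satellite-experiment table above:
print(review_flag(1, 4))     # sensor-control developer: rose in ranking
print(review_flag(6, None))  # data bus changes: new entry
print(review_flag(5, 1))     # fault tolerance: fell (prototype successful)
print(review_flag(3, 3))     # sensor data formats: unchanged
```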
Early Risk Management in 577
Project tasks, with associated risk management skills and skill-building activities:
• Select projects; form teams: project risk identification; staffing risk assessment and resolution (readings, lectures, homework, case study, guidelines)
• Plan early phases: schedule/budget risk assessment and planning; risk-driven processes (ICSM) (readings, lectures, homework, guidelines, planning and estimating tools)
• Formulate, validate concept of operation: risk-driven level of detail (readings, lecture, guidelines, project)
• Manage to plans: risk monitoring and control (readings, lecture, guidelines, project)
• Develop, validate FC package: risk assessment and prioritization (readings, lecture, guidelines, project)
• FC Architecture Review: risk-driven review process; review of top-N project risks (readings, lecture, case studies, review)
Software Risk Management Techniques

Source of Risk: Risk Management Techniques
1. Personnel shortfalls: Staffing with top talent; key personnel agreements; team-building; training; tailoring process to skill mix; walkthroughs
2. Schedules, budgets, process: Detailed, multi-source cost and schedule estimation; design to cost; incremental development; software reuse; requirements descoping; adding more budget and schedule; outside reviews
3. COTS, external components: Benchmarking; inspections; reference checking; compatibility prototyping and analysis
4. Requirements mismatch: Requirements scrubbing; prototyping; cost-benefit analysis; design to cost; user surveys
5. User interface mismatch: Prototyping; scenarios; user characterization (functionality, style, workload); identifying the real users
Software Risk Management Techniques (continued)

6. Architecture, performance, quality: Simulation; benchmarking; modeling; prototyping; instrumentation; tuning
7. Requirements changes: High change threshold; information hiding; incremental development (defer changes to later increments)
8. Legacy software: Reengineering; code analysis; interviewing; wrappers; incremental deconstruction
9. COTS, externally-performed tasks: Pre-award audits; award-fee contracts; competitive design or prototyping
10. Straining computer science: Technical analysis; cost-benefit analysis; prototyping; reference checking
Validation Results on Process Adoption
• Incidents of Process Selection and Direction Changes

# of Teams   Results on Project Process Selection
8/14         Selected the right process pattern from the beginning
3/14         Unclear project scope; re-selected the right pattern at the end of the Exploration phase
1/14         Minor changes in project scope; right pattern at the end of the Valuation phase
1/14         Major change in the Foundations phase
1/14         Infeasible project scope
Top 10 Risk Categories: 1995 and 2010
1995                                     2010
1.  Personnel shortfalls                 1.  Customer-developer-user team cohesion
2.  Schedules, budgets, process          2.  Personnel shortfalls
3.  COTS, external components            3.  Architecture complexity; quality tradeoffs
4.  Requirements mismatch                4.  Budget and schedule constraints
5.  User interface mismatch              5.  COTS and other independently evolving systems
6.  Architecture, performance, quality   6.  Lack of domain knowledge
7.  Requirements changes                 7.  Process quality assurance
8.  Legacy software                      8.  Requirements volatility; rapid change
9.  Externally-performed tasks           9.  User interface mismatch
10. Straining computer science           10. Requirements mismatch
Top 11 - Risk distribution in CSCI577
Comparing risks between Fall and Spring
Conclusions
• Risk management starts on Day One
– Delay and denial are serious career risks
– Data is provided to support early investment
• The Win Win spiral model provides a process framework for early risk resolution
– Stakeholder identification and win condition reconciliation
– Anchor point milestones
• Risk analysis helps determine “how much is enough”
– Testing, planning, specifying, prototyping, …
– Buying information to reduce risk
Quality Management
Outline
• Quality Management
– In CMMI 1.3
– In ISO 15288
– In CSCI577ab
Objectives of QM
• To ensure a high-quality process in order to deliver high-quality products
Quality Management in CMMI 1.3
Process Areas
• Causal Analysis and Resolution (CAR)
• Configuration Management (CM)
• Decision Analysis and Resolution (DAR)
• Integrated Project Management (IPM)
• Measurement and Analysis (MA)
• Organizational Performance Management (OPM)
• Organizational Process Definition (OPD)
• Organizational Process Focus (OPF)
• Organizational Process Performance (OPP)
• Organizational Training (OT)
• Process and Product Quality Assurance (PPQA)
• Product Integration (PI)
• Project Monitoring and Control (PMC)
• Project Planning (PP)
• Quantitative Project Management (QPM)
• Requirements Development (RD)
• Requirements Management (REQM)
• Risk Management (RSKM)
• Supplier Agreement Management (SAM)
• Technical Solution (TS)
• Validation (VAL)
• Verification (VER)
PPQA - Product and Process Quality Assurance
PPQA for Agile development
CM – Configuration Management
MA – Measurement and Analysis
VER - Verification
VAL - Validation
Quality Management in ISO 15288

Activities
a) Plan quality management.
1. Establish quality management policies.
2. Establish organization quality management objectives.
3. Define responsibilities and authority for implementation of quality management.
b) Assess quality management.
1. Assess customer satisfaction and report.
2. Conduct periodic reviews of project quality plans.
3. Monitor the status of quality improvements on products and services.
c) Perform quality management corrective action.
1. Plan corrective actions when quality management goals are not achieved.
2. Implement corrective actions and communicate results through the organization.
Configuration Management in ISO 15288

Activities
a) Plan configuration management.
1. Define a configuration management strategy.
2. Identify items that are subject to configuration control.
b) Perform configuration management.
1. Maintain information on configurations with an appropriate level of integrity and security.
2. Ensure that changes to configuration baselines are properly identified, recorded, evaluated, approved, incorporated, and verified.
Quality Management in 577ab
• IIV&V
• Configuration Management
• Defect Reporting and Tracking
• Testing
• Buddy Review
• Architecture Review Board
• Core Capability Drive-Through
• Design Code Review
• Document templates
• Sample artifacts
Quality Guidelines
• Design Guidelines
– Describe design guidelines on how to improve or maintain modularity, reuse, and maintainability
– Describe how the design will map to the implementation
• Coding Guidelines
– Describe how to document the code in such a way that it can easily be communicated to others
Coding Guidelines
• C: http://www.gnu.org/prep/standards/standards.html
• C++: http://geosoft.no/development/cppstyle.html
• Java: http://geosoft.no/development/javastyle.html
• Visual Basic: http://msdn.microsoft.com/en-us/library/h63fsef3.aspx
Quality Guidelines
• Version Control and History
– Chronological log of the changes introduced to this unit
• Implementation Considerations
– Detailed design and implementation for as-built considerations
• Unit Verification
– Unit / integration test
– Code walkthrough / review / inspection
Quality Assessment Methods
• Methods, tools, techniques, and processes that can identify problems
– Detect and report the problem
– Measure the quality of the software system
• Three methods of early defect identification:
– Peer review, IIV&V, automated analysis
Peer Review
• Reviews performed by peers in the development team
– Can range from Fagan inspections to simple buddy checks
• Peer Review Items
• Participants / Roles
• Schedule

Defect Removal Profiles