CSE 7314 Software Testing and Reliability
Robert Oshana
Lecture #1
Industry facts
Software testing accounts for 50% of pre-release costs and 70% of post-release costs [Cigital Corporation]
30-40% of errors detected after deployment are run-time errors [U.C. Berkeley, IBM’s TJ Watson Lab]
The amount of software in a typical device doubles every 18 months [Reme Bourguignon, VP of Philips Holland]
Defect densities have been stable over the last 20 years: 0.5 to 2.0 software failures per 1000 lines [Cigital Corporation]
Critical SW Applications
Critical software applications which have failed:
• Mariner 1 (NASA, 1962): missing ‘-’ in Fortran code; rocket bound for Venus destroyed
• Therac-25 (Atomic Energy of Canada Ltd, 1985-87): data conversion error in a radiation therapy machine for cancer treatment
• Long Distance Service (AT&T, 1990): a single line of bad code; service outages up to nine hours long
• Patriot Missiles (U.S. military, 1991): endurance errors in the tracking system; 28 US soldiers killed in barracks
• Tax Calculation Program (Intuit, 1995): incorrect results; SW vendor paid tax penalties for users
Introduction
Chapter 1
History
• STEP (the Systematic Test and Evaluation Process) was originally developed out of frustration!
• STEP does not establish rules that must be followed; it provides guidelines
History of testing definitions
Preventive testing
• Philosophy that testing can actually improve the quality of the software if it occurs early enough
• The process of writing test cases to test a requirement can itself identify flaws in the requirement
Preventive testing
• “Withdraw $200 from an account with $165 in it”
• “Withdraw $168.46 from an account with $200 in it”
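Both scenarios above are boundary tests derived directly from a requirement: one withdrawal just over the available balance, one requiring exact cents. A minimal sketch of how such tests might look (the `withdraw` function and the use of integer cents are illustrative assumptions, not code from the course):

```python
def withdraw(balance_cents, amount_cents):
    """Return the new balance; reject invalid or overdrawing withdrawals.

    Working in integer cents is an illustrative choice that sidesteps
    floating-point rounding, one defect class early test writing can expose.
    """
    if amount_cents <= 0:
        raise ValueError("withdrawal amount must be positive")
    if amount_cents > balance_cents:
        raise ValueError("insufficient funds")
    return balance_cents - amount_cents

# "Withdraw $200 from an account with $165 in it" -- must be rejected
try:
    withdraw(16500, 20000)
    raise AssertionError("expected an overdraft rejection")
except ValueError:
    pass

# "Withdraw $168.46 from an account with $200 in it" -- exact cents handled
assert withdraw(20000, 16846) == 3154  # $31.54 remains
```

Writing these cases while the requirement is still on paper is what makes the testing preventive: an ambiguity (for example, is a zero-dollar withdrawal valid?) surfaces before any code exists.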
Waterfall model
Why is testing so difficult?
• Ambiguous and incorrect requirements
• Tight time schedules
• Many other reasons!
STEP
• Introduced in 1985
• Built on IEEE 1008-1987
• Covers a broad range of software evaluation activities
Views of testing
End of Lecture
Lecture #2
Introduction
Chapter 1
Views of testing
Elements of STEP
STEP architecture
STEP activities
Activity timing of various levels of test
Parallel Mutually Supportive Development
Roles and responsibilities in STEP
Summary of STEP
Chapter 2
Risk analysis
• No way to guarantee a software system will be perfect
• Only a small number of projects meet the criteria for success
• Most of us realize it’s impossible to test everything in even the most trivial of systems
• A shortage of trained testers doesn’t help
End of Lecture
Lecture #3
Chapter 2
Risk analysis
• No way to guarantee a software system will be perfect
• Only a small number of projects meet the criteria for success
• Most of us realize it’s impossible to test everything in even the most trivial of systems
• A shortage of trained testers doesn’t help
Domain of all possible test cases
What is risk?
• “the chance of injury, damage, or loss; a dangerous chance; a hazard”
• Everyone subconsciously performs risk analysis hundreds of times a day
• Risk management: risk analysis, avoidance, and control
• The IEEE 829 test plan standard defines a section for “Risks and contingencies”
Risk analysis activities
Software risk analysis
• Purpose is to determine what to test, the testing priority, and the depth of testing
• Also determining what not to test
• Should be done by a team of experts from various groups in the organization
Software risk analysis
• Done as early as possible
• First cut after high level requirements
• Results of the analysis should be reviewed periodically because things change
Software risk analysis process
Process
• 1. Form a brainstorming team to increase the number of ideas
• 2. Reduce the list to a workable size
• 3. Compile a list of features
• 4. Assign an indicator of relative likelihood of failure
Likelihood of failure
4. Determine the impact
5. Assign numerical values
• H, M, L
• Is your system safety-critical?
• Add together likelihood of failure and impact of failure
End of Lecture
Lecture #4
Chapter 2
4. Determine the impact
5. Assign numerical values
• H (3), M (2), L (1)
• Is your system safety-critical?
• Add together likelihood of failure and impact of failure
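Steps 4 and 5 can be sketched as a direct computation: map H/M/L to 3/2/1 for both likelihood and impact, then sum the two per feature (the feature names and ratings below are hypothetical):

```python
RATING = {"H": 3, "M": 2, "L": 1}

# (feature, likelihood of failure, impact of failure) -- hypothetical values
features = [
    ("funds transfer", "H", "H"),
    ("login", "M", "H"),
    ("report export", "L", "L"),
]

# Priority = likelihood + impact, so every feature scores between 2 and 6.
priorities = {name: RATING[lik] + RATING[imp] for name, lik, imp in features}
# funds transfer -> 6, login -> 5, report export -> 2
```

The “Is your system safety-critical?” question above hints at one common variation: weighting the impact term more heavily than likelihood rather than using a plain sum.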
Risk priority
Summed priorities
7. Review/modify the values
• Modify based on additional information from analysis
• Likelihood-of-failure indicators: team history, complexity, usability, new technology, defect history
8. Prioritize the features
9. Determine the cut line
10. Consider mitigation
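Steps 8-10 can then be sketched as sorting by summed priority and drawing a cut line: features at or above the line get tested, while features below it become candidates for mitigation rather than full testing (the scores and threshold are illustrative):

```python
# Summed priority scores from the risk analysis (hypothetical values).
scores = {
    "funds transfer": 6,
    "login": 5,
    "audit log": 4,
    "report export": 2,
}

CUT_LINE = 4  # features scoring below this get minimal or no direct testing

ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
to_test = [name for name, score in ranked if score >= CUT_LINE]
to_mitigate = [name for name, score in ranked if score < CUT_LINE]
# to_test == ["funds transfer", "login", "audit log"]
# to_mitigate == ["report export"]
```

Where the cut line falls is a judgment call driven by schedule and resources, which is why the process ends with reviewing the values and considering mitigation for what falls below it.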
Planning risks and contingencies
• Determine the best contingencies in the event that one of the planning risks occurs
• Schedules are usually ambitious or even impossible
• Planning risks are anything that adversely affects the planned testing effort
Planning risks and contingencies
• Identifying planning risks and contingencies helps you make intelligent, informed decisions
• Possible contingencies: reduce the scope, delay implementation, add resources, reduce quality processes
Planning risks and contingencies
• Planning risks help us ask “what if…” and develop contingencies
• The entire testing strategy is planned around the concept of using risk to prioritize the testing effort
Project assumptions
• Project and test plans usually have a section called “assumptions”
• Assumptions become planning risks if they turn out to be false
Summary
• Software risk analysis helps you decide what features and attributes should be tested and helps you assign priorities to these items
• Planning risk analysis helps you decide what to do in the event that an unplanned problem arises
End of Lecture