ISA 201: Information System Acquisition
Lesson 16: Software Testing
Learning Objectives
Today we will learn to:
Overall: Given a DoD IT/SW acquisition scenario, apply software T&E best practices that result in system acceptance.
• Identify the Test & Evaluation (T&E) responsibilities of the Program Manager (PM).
• Describe the purpose of DoD software Test & Evaluation (T&E).
• Describe the DoD software T&E environment.
• Describe the DoD software T&E process.
• Identify what Independent Verification and Validation (IV&V) is.
• Recognize the different types of software testing tools.
• Identify common DoD software testing issues and risks.
• Recognize the Agile software development T&E challenges.
• Recognize the software T&E best practices.
Lesson Overview
• Software Testing Overview
• Software Development and Testing Process
• Software T&E Considerations, Issues and Trends
• Software Testing Lessons Learned & Best Practices
• Summary
Lesson Plan
Can we test quality into our software?
Testing Software
Program Manager’s Responsibilities for Software T&E
The PM is responsible for defining the level of Software Quality for their program. This includes defining the level of maturity at which the software must perform in order to declare software T&E a success.
Testing must focus on the performance of those items the Program Manager (PM) identifies as software quality goals for the success of their program.
The software T&E mission has two primary objectives:
1) Demonstrate performance of the whole software system
2) Assist in the detection and correction of faults
There are four (4) primary ways to support the success of the software T&E mission:
1) Your plans should include an incremental test strategy
2) Identify and correct software defects as early as possible in the lifecycle
3) Provide scientifically based measures for progress and quality
4) Provide data evaluation to support acquisition decisions
Software Test and Evaluation Mission
Example: Typical Costs of Software Fixes*

Lifecycle SW Development Activity          Initial $ Spent   Errors Introduced   Errors Found   Relative Cost
Requirements Analysis                      5%                55%                 18%            1.0
Design Activities                          25%               30%                 10%            1.0-1.5
Testing Activities                         60%               10%                 50%            1.5-5.0
Documentation                              10%               ---                 ---            ---
Post-Deployment Software Support (PDSS)    ---*              5%                  22%            10-100

*Once a system is fielded, PDSS costs are typically 50-70% of total system lifecycle costs. (B. Boehm, Software Engineering Economics)
Current State of Software T&E
• National Institute of Standards & Technology (NIST) states that inadequate testing methods and tools cost the U.S. economy between $22.2 and $59.5 billion annually.
• Testing methods in general are relatively ineffective. Testing typically only identifies from one-fourth to one-half of defects (found over a typical software lifespan) while other verification methods (e.g., formal inspections) are more effective in finding defects earlier.
• Inadequate and ineffective testing results in approximately 2 to 7 undiscovered defects per thousand lines of code (KLOC) when the software is deployed. This means that major software-reliant systems are being delivered and placed into operation with hundreds or even thousands of residual defects. If you include software vulnerabilities as defects, the defect rates are even more troubling.
What are the problems with Software T&E?
https://insights.sei.cmu.edu/sei_blog/2013/04/common-testing-problems-pitfalls-to-prevent-and-mitigate.html
Known Problems with Software T&E
• Requirements-related testing: Requirements ambiguous, missing, incomplete, incorrect, or unstable.
• Test planning and scheduling: Improper planning, improper test schedules.
• Stakeholder involvement and commitment: Not testing items important to stakeholders; Leadership commitment lacking.
• Management-related testing: Test resource management lacking.
• Test organization and professionalism: Lack of independence, unclear testing responsibilities, and inadequate testing expertise.
• Test & Evaluation process: T&E and engineering processes poorly integrated.
• Test tools and environments: Over-reliance on manual or COTS testing tools; testing environments inadequate to fully test software.
• Test communication: Test documents inadequate, not maintained, not communicated.
https://insights.sei.cmu.edu/sei_blog/2013/04/common-testing-problems-pitfalls-to-prevent-and-mitigate.html
Software Test Plan Relationships
Test & Evaluation Master Plan (TEMP): Documents the overall structure and objectives of the T&E program. Includes critical operational effectiveness and suitability parameters for [among other items]... SOFTWARE and COMPUTER RESOURCES... it addresses SOFTWARE MATURITY and the degree to which SOFTWARE DESIGN has stabilized.
Defense Acquisition Guidebook
Developmental Testing: substantiate system technical performance
Operational Testing: meet CDD requirements
Software Development Plan (SDP)
• Coding and Unit Testing
• Unit Integration & Test
• SI Qualification Testing
• SI/HI Integration & Test
• System Qualification Testing
• Software Quality Assurance
• Corrective Action Process
Software Requirements
Software Test Plan (STP)
• Identifies Test Items (SIs, Units)
• Identifies Personnel Resources
• Identifies Test Environment
• Provides Test Schedules
• Traces to Requirements
Example of a Software Testing Priority Classification Scheme
Category Classifications
• Project Plans
• Operational Concept
• System/Software Requirements
• Design
• Code
• Databases/Data Files
• Test Plans, Descriptions, Reports
• User, Operator or Support Manuals
• Other Software Products
Priority Classification Scheme
Priority 1: Prevents mission accomplishment or jeopardizes safety or another “critical” requirement
Priority 2: Adversely affects mission accomplishment or cost, schedule, performance or software support, and no work-around exists
Priority 3: Adversely affects mission accomplishment or cost, schedule, performance or software support, but a work-around exists
Priority 4: User/operator or support inconvenience
Priority 5: All other problems
J-STD-016 Classification Schemes, Rev 2.2
“Alternate category and priority schemes may be used by developers if approved by the acquirer” (J-STD-016, Annex M)
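As a hypothetical sketch (field names are illustrative, not taken from the standard), the priority scheme above can be expressed as a simple decision rule over a problem report's attributes:

```python
from dataclasses import dataclass

@dataclass
class ProblemReport:
    prevents_mission: bool = False
    jeopardizes_safety: bool = False
    adversely_affects: bool = False   # mission, cost, schedule, performance, or support
    workaround_exists: bool = False
    inconvenience_only: bool = False

def classify(pr: ProblemReport) -> int:
    """Return priority 1 (most severe) through 5, following the scheme above."""
    if pr.prevents_mission or pr.jeopardizes_safety:
        return 1
    if pr.adversely_affects and not pr.workaround_exists:
        return 2
    if pr.adversely_affects and pr.workaround_exists:
        return 3
    if pr.inconvenience_only:
        return 4
    return 5
```

Encoding the scheme this way lets a problem-report system assign a consistent priority automatically and be audited against the agreed classification.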
Types of Software Tests (Human-based and Computer-based)

COMPUTER-based
• White Box – Uses source code knowledge and executes Software Unit tests to ensure correctness
• Black Box – No knowledge of internal source code; perspective of a hacker
• Gray Box – Some knowledge of internal code; perspective of a smart hacker

HUMAN-based
• Desk Checking – Programmer inspected
• Walkthroughs (Informal Inspections) – Programmer team inspected
• Formal Inspections – provide the BEST ROI
Acceptance Testing occurs after each development phase to ensure the requirements from the previous phase are met. Final Acceptance Testing is with the warfighter and is the most important test!
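To make the Black Box / White Box distinction above concrete, here is an illustrative sketch (the function and its tests are hypothetical, not from the lesson): a black-box test exercises only the documented interface, while a white-box test uses knowledge of the source to cover every branch:

```python
def clamp(value, low, high):
    """Return value limited to the closed interval [low, high]."""
    if value < low:
        return low
    if value > high:
        return high
    return value

# Black-box tests: derived from the specification alone (boundary behavior),
# with no knowledge of how clamp is implemented.
assert clamp(5, 0, 10) == 5
assert clamp(-1, 0, 10) == 0
assert clamp(11, 0, 10) == 10

# White-box tests: written after reading the source, to execute every branch,
# including the equality cases that sit exactly on the branch boundaries.
assert clamp(0, 0, 10) == 0    # value == low takes the fall-through path
assert clamp(10, 0, 10) == 10  # value == high takes the fall-through path
```

Gray Box testing sits between the two: the tester knows enough of the internal structure to pick revealing inputs, but still drives the software through its external interface.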
Lesson Overview
• Software Testing Overview
• Software Development and Testing Process
• Software T&E Considerations, Issues and Trends
• Software Testing Lessons Learned & Best Practices
• Summary
Lesson Plan
Software Development V-Model

Decomposition side (test planning is concurrent with each activity):
• Operational Requirements (user needs) & Project Plans
• System Requirements Analysis
• System Design
• Software Requirements Analysis
• Software Design
• Software Unit Coding

Integration & test side:
• Software Unit Testing
• Software Unit Integration & Testing
• SI Qualification Testing
• HI and SI Integration & Testing
• Subsystem Integration & Testing
• System Qualification Testing
• Operational & Acceptance Tests
(Developmental Test & Evaluation (DT&E) spans the contractor test activities; Operational Test & Evaluation (OT&E) covers the final Operational & Acceptance Tests.)
Remember: Typically the Software Specification Review (SSR) results in approval of the Software Requirements Specification (SRS) and the Interface Requirements Specification (IRS).
Test Readiness Review (TRR)

Inputs/Entry Criteria
• Identified tested requirements (SRS)
• Traceability of test requirements to SRS & IRS has been established
• All software item test procedures have been completed
• Test objectives identified
• Applicable documentation, procedures and plans are complete and under control
• Method of documenting and dispositioning “test anomalies” is acceptable

Outputs/Exit Criteria
• Software Test Descriptions are defined, verified & baselined
• Testing is consistent with any defined incremental approaches
• Test facilities available and acceptable
• Tested software under CM
• Lower-level software tests done
• Software metrics indicate readiness to start testing
• Software problem report system is defined and implemented
• Software test baseline established
• Development estimates updated
• Non-tested requirements at the SI level are identified for later testing
Software OT Maturity Criteria

Software maturity “must be demonstrated prior to OT&E”*

• Software Problems: No Priority I or II problems; impact analysis for Priority III.
• Functionality: Functionality available before OT&E; DT completed; external interfaces functionally certified.
• Management: PMO identifies unmet critical parameters. The Acquisition Executive must certify (and the Operational Tester must agree) that: SW requirements & design are stable; depth & breadth of testing is adequate; DT has demonstrated required functions.
• CM Process: Deficiency reporting process in place; software CM system in place; system is baselined; Test Agency has access to the CM system.
• Fixes: All changes completed.

*OSD (OT&E) Memo, Subject: Software Maturity Criteria
Independent Verification and Validation (IV&V)

Validation: Ensuring that the software meets the users’ needs. Answers the question: “Did I build the right system?”

Verification: Ensuring that the system is well engineered. Answers the question: “Did I build the system right?”
System DT&E Decision Exit Criteria

• Test activities (based on the TEMP and Software Test Plan (STP)) demonstrate that the system meets all critical technical parameters and identify technological and design risks documented in the Software Requirements Specification (SRS), Interface Requirements Specification (IRS) and Data Design Specification (DDS).
• Ready to perform system OT&E
• No open Priority 1 or 2 problems
• Acceptable degrees of:
  - Requirements traceability / stability (includes cybersecurity)
  - Computer resource utilization
  - Design stability
  - Breadth & depth of testing
  - Fault profiles
  - Reliability & interoperability
  - Software maturity
Software Evaluations During IOT&E
• Does the software support system performance? Will the system be accepted by our customer?
• Is the software:- Mature & reliable?- Usable by typical operators?- Sustainable/maintainable?
• Testers’ bottom line: in all cases of software testing, reproducible failures facilitate software fixes. When you identify a problem, document it so the failure can be reproduced.
Lesson Overview
• Software Testing Overview
• Software Development and Testing Process
• Software T&E Considerations, Issues and Trends
• Software Testing Lessons Learned & Best Practices
• Summary
Lesson Plan
Software T&E Consideration: Software Testing Tools
• Automated software testing is critical for testing large and complex DoD software applications. Tool support is very useful for repetitive tasks: the computer does not get bored and can repeat the same tests exactly, without mistakes.
• Here are some of the benefits of using automated testing tools. Test tools can:
  - automatically verify key functionality
  - perform automated regression testing
  - test interfaces
  - test Graphical User Interfaces (GUIs)
  - provide scenario testing models for Mission Critical Threads (MCTs)
  - automate cloud services testing
  - help test teams run large numbers of tests in a shorter period of time
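A minimal sketch of automated regression testing (the unit under test and its expected values are hypothetical): once expected behavior is captured as assertions, a build server can re-run the whole suite after every change at no marginal labor cost, which is what makes the benefits above possible.

```python
def dms_to_decimal(degrees, minutes, seconds):
    """Illustrative unit under test: convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60.0 + seconds / 3600.0

def run_regression_suite():
    """Re-run every captured expectation; a CI job would call this on each build.

    Each assertion encodes previously verified behavior, so a change that
    breaks old functionality fails loudly instead of slipping through.
    """
    assert abs(dms_to_decimal(38, 30, 0) - 38.5) < 1e-9
    assert abs(dms_to_decimal(0, 0, 0)) < 1e-9
    assert abs(dms_to_decimal(10, 0, 36) - 10.01) < 1e-9
    return "all regression checks passed"

print(run_regression_suite())
```

In practice the same idea scales up through a test framework (e.g., discovering hundreds of such checks and reporting which ones failed) rather than a single function.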
Software T&E Consideration: Software Testing Tools
• Successful programs use Static and Dynamic Analysis tools in combination.
- The use of Static Analysis and Dynamic Analysis Tools should be put on contract to ensure COTS vendors provide a picture of their code for Government purposes (Security and Product Support).
• Five (5) automated test tool types:
  - Test Management Tools
  - Static Testing Tools
  - Test Specification Tools
  - Test Execution and Logging Tools
  - Performance and Monitoring Test Tools
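As a hypothetical illustration of what a static testing tool does, the sketch below inspects source text without executing it, using Python's `ast` module to flag bare `except:` clauses (one common defect class; real static analyzers check hundreds of such rules across many languages):

```python
import ast

# Illustrative code under review (not from the lesson). Note the bare except.
SOURCE = """
def read_config(path):
    try:
        return open(path).read()
    except:
        return None
"""

def find_bare_excepts(source):
    """Return line numbers of bare `except:` handlers, found by parsing the code, not running it."""
    tree = ast.parse(source)
    return [node.lineno for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]

findings = find_bare_excepts(SOURCE)
print(f"{len(findings)} bare except clause(s) at line(s): {findings}")
```

Dynamic analysis is the complement: it observes the program while it runs (coverage, memory errors, timing), which is why successful programs pair the two.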
Software T&E Consideration: Software Testing Tools
[Figure: Example results of static code analysis, comparing the original software code with the software code after many changes.]
Software T&E Issues
• Software design issues that may affect the T&E strategy (TEMP) and Software Test Plan (STP):
  - Size & complexity of the software; degree of software reuse; amount of COTS software
  - Role the software will play (safety, security risks, activities to be performed by the software, critical or high-risk functions, technology maturity)
  - Number of interfaces / complexity of integration / number of outside systems requiring interoperability
  - Cost & schedule available for software testing; previous test results
  - Software development strategy; stability of software requirements
COTS/NDI Software T&E Issues & Risks

• Real-time performance may be marginal
• Robustness & reliability lower when compared to custom code
• Higher COTS use in a system generally implies more difficult system-level integration testing (4 or more COTS products is a critical risk)
• Frequent market-driven releases complicate regression testing (promote intelligent bypass)
• Unit-level/component testing is generally impossible (Black or Gray Box testing only)
• Incomplete documentation
• Multiple complex, non-standard interfaces (interoperability is a risk)
• Market leverage may not exist to force vendor bug fixes
• Formal requirements documents unavailable
• COTS usage may not match original design environment
Software T&E Trends
• IT trends that may affect T&E (all of these tend to add more complexity to software testing):
  - Larger & more complex software packages
  - More emphasis on software reuse
  - More challenges / more emphasis on cybersecurity
  - More challenges / more emphasis on interoperability (part of the Net-Ready KPP)
  - Rapid pace of technology advances
  - Open architecture
Six (6) DoD Software Domains T&E Emphasis
• Mission Systems software:
  - Focus is on White Box testing (especially on embedded software applications)
  - Does the unique hardware/software system function correctly in its intended operating environment?
  - System safety & mitigation of key risks
  - Interoperability with other known systems
  - Reliability usually critical

• C4ISR, DBS, Infrastructure, Cybersecurity and Modeling & Simulation (M&S) software:
  - Focus is on White, Black and Gray Box testing
  - Does the software function correctly (often in an office or command center environment on varying hardware)?
  - Cybersecurity, to include privacy, are key risks
  - Interoperability through standards compliance
  - Reliability may not be as critical
• Agile software development requires:
  - a more detailed test plan, as test events occur more frequently (Agile time-boxed development)
  - intense configuration management of test results due to the frequent nature of testing

• Inadequate Test Coverage and Application Programming Interface (API) Testing:
  - With each Sprint, use code analysis tools to monitor code changes to ensure adequate test cases
  - Use tools that allow testers who don't have strong coding skills to test the API

• Code Broken Accidentally due to Frequent Builds:
  - Add tests to an automated script so you can use automated testing on a regular basis
  - Automated testing tools are required to finish regression test requirements for each test
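One hypothetical shape for such an automated script (the check names are invented stand-ins for real checks against the application under test): collect the checks in one place and run them all after every build, so a build that breaks existing code fails loudly instead of silently:

```python
def check_login_screen():
    # Stand-in for a real check against the application under test.
    return True

def check_message_parser():
    # Another stand-in; a real check would exercise the build just produced.
    return True

# Registry of checks; new tests are appended here as functionality is added.
SMOKE_CHECKS = [check_login_screen, check_message_parser]

def run_smoke_suite():
    """Run every registered check and return the names of any that failed."""
    failures = [check.__name__ for check in SMOKE_CHECKS if not check()]
    for name in failures:
        print(f"REGRESSION: {name} failed")
    return failures

failures = run_smoke_suite()
print("smoke suite:", "PASS" if not failures else "FAIL")
```

A build server would run this script on each build and treat a non-empty failure list as a broken build, which is the "automated testing on a regular basis" the bullet above calls for.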
Agile SW Development T&E Challenges
• Early Detection of Defects (catch errors as early as possible!):
  - Use static analysis tools to find missing error routines, coding standard deviations, data type mismatches and other errors that can arise from frequent build-and-test cycles

• Performance Bottlenecks:
  - Identify which areas of your code are causing performance issues and how performance is being impacted over time
  - Use load testing tools to help identify slow areas and track performance over time to more objectively document performance from release to release
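A minimal sketch of that idea (the operation under test and the time budget are hypothetical): time a critical operation over many iterations and compare the mean against a budget recorded for the previous release, so slowdowns show up as hard numbers rather than anecdotes:

```python
import time

def critical_operation():
    # Stand-in for the code path being load-tested.
    return sum(i * i for i in range(1000))

def measure(operation, iterations=1000):
    """Return mean seconds per call over `iterations` runs."""
    start = time.perf_counter()
    for _ in range(iterations):
        operation()
    return (time.perf_counter() - start) / iterations

# Illustrative per-call budget, as might be recorded for the prior release.
BUDGET_SECONDS = 0.01

mean = measure(critical_operation)
print(f"mean {mean * 1e6:.1f} us/call; within budget: {mean <= BUDGET_SECONDS}")
```

Real load testing tools add concurrency, ramp-up profiles and percentile reporting, but the release-to-release comparison against a recorded baseline is the same.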
Agile SW Development T&E Challenges(Continued)
Lesson Overview
• Software Testing Overview
• Software Development and Testing Process
• Software T&E Considerations, Issues and Trends
• Software Testing Lessons Learned & Best Practices
• Summary
Lesson Plan
Software T&E Lessons Learned
From “A Real Life Experience in Testing a 1,000,000+ SLOC Ada Program” STC 1994
• Formulate test strategy prior to contract award that accommodates cost/schedule constraints
• Test strategy should be able to:
  - Verify all critical software requirements of the system
  - Test in a way that isolates faults
• Phase testing to focus on:
  - Qualification Testing
  - Operational Thread Testing
  - Performance/Stress Testing
• Resolve the “requirements” vs. “design” information argument early-on (Watch out for Gold-Plating)
• Plan ahead for 1) Adequate schedule, 2) Test Regression Strategy, 3) Timing and format of deliverables and 4) Accommodating incremental builds.
• Understand the test--be prepared to prioritize test cases
• Be flexible and attuned to end-of-schedule pressures
• Cop an attitude--know when to fall on your sword
• Understand the politics of the acquisition
• Ensure you have Separate Development, Test and Evaluation organizations to provide independent development of software and T&E.
Arlie Council 16 Best Practices
Product Integrity & Stability (Tester):
14) Inspect Requirements and Design (use Formal Inspections!)
15) Manage Testing as a Continuous Process (integration & acceptance throughout!)
16) Compile & Smoke Test Frequently (use daily compile & smoke testing!)
One example of software management “best practices”: the Arlie Council 16 Software Management Best Practices*

*As developed by the Software Program Managers Network (www.spmn.com)
Elephant Bungee Wisdom #9: Avoiding Diseases that Are Fun to Catch

Universal Truth #9: COTS products are not necessarily the best solution. They bring risks as well as benefits... understand both!

Management Emphasis
• Investigate the pricing structure
• Select established products with large installed bases
• Budget for the complete cost of COTS integration and maintenance
• “Fly before you buy” + TEST a lot
• Adapt your requirements to COTS
Employing IV&V on Agile Projects: Lessons Learned
• Identify the Control Points in the Agile Process. The control points are where decisions need to be made. Decisions on concept documentation, use cases and test results can be independently assessed by the IV&V team to ensure accuracy, completeness, consistency and traceability to the war-fighter requirements.
• Review the Backlog. IV&V is used to ensure accuracy, completeness, consistency and testability of user stories in the backlog. IV&V assesses the desired change for impact on existing tasks and products.
• Manage the Roadmap. IV&V is used to ensure each iteration within a Sprint is in concert with the overall Agile Project Roadmap.
• Question Business Value. IV&V will review the Agile Minimum Viable Product (MVP) to ensure that what was produced in a Sprint matches the MVP. The IV&V team will validate each new user story or business requirement to ensure it adds the proper value to the warfighter.
• Validate Consistent Practices. The IV&V team can act as the “process cop” to ensure the agreed-to repeatable processes remain in place from Sprint to Sprint.
Class Exercise (Which Team Can Solve It the Quickest?)
Short Range Assault Weapon (SRAW) Program
Class Exercise
ROUTINE 6-1 Contractor Software Code
Class Exercise
SRAW SOFTWARE DESIGN DIAGRAM
Class Exercise
SRAW CONTRACTOR TEST REPORT
Class Exercise
What type of testing did you do to discover the error?
Lesson Overview
• Software Testing Overview
• Software Development and Testing Process
• Software T&E Considerations, Issues and Trends
• Software Testing Lessons Learned & Best Practices
• Summary
Lesson Plan
Summary
Today we learned to:
Overall: Given a DoD IT/SW acquisition scenario, apply software T&E best practices that result in system acceptance.
• Identify the T&E responsibilities of the IT Program Manager. The PM is responsible for defining software maturity as part of the program's Software Quality definition, so we know when T&E is considered successful (done).
• Describe the purpose of DoD software Test & Evaluation (T&E). Demonstrate performance of the whole software system; assist in the detection and correction of faults.
• Describe the DoD software T&E environment. Identify defects as early as possible! Always use FORMAL INSPECTIONS.
• Describe the DoD software T&E process. DoD software T&E ensures that we built the system right (verification, per spec) and that we built the right system (validation, what the warfighter wanted).
• Identify what Independent Verification and Validation (IV&V) is. IV&V means you have an independent team doing independent testing on critical risk areas of your software (e.g., safety).
Summary
Today we learned to:
• Recognize the different types of software testing tools. Use of automated test tools is critical for testing today's large and complex DoD software applications. Successful programs use Static and Dynamic test tools in combination.
• Identify common DoD software testing issues and risks. Each DoD software domain has a different architecture. Ensure you plan your tests to address size, complexity, reuse, COTS and integration challenges of that domain.
• Recognize the Agile software development T&E challenges. T&E events happen more frequently in an Agile development. Use of automated tools and more frequent test plan adjustments are critical to success.
• Recognize the Software T&E Best Practices. There are a number of Software T&E Best Practices and Lessons Learned. Emphasize Formal Inspections for requirements, design, test plans and source code to ensure accuracy.
Future Lesson Material
ELO 22.1.1.13 Given a safety-critical system scenario, determine the specific software T&E challenges.

MT 13.1 When doing T&E on a safety-critical system, it is a best practice to follow the procedures and recommendations in the “Joint Software Systems Safety Engineering Handbook,” Version 1.0, August 27, 2010.
MT 13.2 Use of statistical testing is required to lower safety risk. Statistical testing is where you test a representative subset of the complex code and extrapolate the test results across the entire application to predict the overall quality of the code.
MT 13.3 Cleanroom is one example of a software development/T&E option for a safety-critical environment. Cleanroom is an example of extreme team desk checking.
MT 13.4 Use of static and dynamic analysis tools is critical when doing T&E on safety-critical systems because the goal is highly optimized, defect-free code.
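The extrapolation step of statistical testing (MT 13.2) can be sketched as follows; the module inventory, sample size and defect count are all invented for illustration. Test a random, reproducible sample of modules, compute the observed defect density, then project it across the whole application:

```python
import random

# Invented inventory: module name -> lines of code (40 modules, 48,000 LOC total).
modules = {f"module_{i:02d}": 1200 for i in range(40)}
total_loc = sum(modules.values())

random.seed(16)  # fixed seed so the sampled subset is reproducible
sample = random.sample(sorted(modules), k=8)  # representative subset to test
sample_loc = sum(modules[m] for m in sample)

# Suppose testing the sampled modules surfaced this many defects.
defects_found_in_sample = 24

# Extrapolate observed defects/KLOC across the entire application.
defects_per_kloc = defects_found_in_sample / (sample_loc / 1000)
predicted_total_defects = defects_per_kloc * (total_loc / 1000)

print(f"{defects_per_kloc:.1f} defects/KLOC; ~{predicted_total_defects:.0f} predicted overall")
```

Real statistical testing also weights the sample by operational usage profile so the prediction reflects how the code will actually be exercised in the field.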
Learning Points in Notes Section
ELO 22.1.1.14 Given a Cybersecurity-critical system scenario, determine the specific software T&E challenges.

MT 14.1 Planning and executing cybersecurity DT&E should begin before MS A, or as early as possible in the acquisition lifecycle.
MT 14.2 Test activities should integrate RMF security control assessments with tests of commonly exploited and emerging vulnerabilities early in the acquisition life cycle. More information on RMF security controls is available in the RMF KS at https://rmfks.osd.mil.
MT 14.3 The TEMP should detail how testing will provide the information needed to assess cybersecurity and inform acquisition decisions. Historically, TEMPs and associated test plans have not adequately addressed cybersecurity measures or resources.
MT 14.4 The cybersecurity T&E phases support the development and testing of mission-driven cybersecurity requirements, which may require specialized systems engineering and T&E expertise. The Chief Developmental Tester may request assistance from SMEs such as vulnerability testers and adversarial testers (Red Team-type representatives) to assist in implementation of cybersecurity testing.
MT 14.5 Cybersecurity T&E requires additional time, money and possibly resources (e.g., National Security Agency (NSA) certified assessors) to perform Red Team events (e.g., adversarial assessment).
MT 14.6 The Adversarial DT&E Team (i.e., Red Team) should meet with the Chief Developmental Tester to develop a detailed test plan. The team will share its rules of engagement and will describe its threat portrayal based on its knowledge and the information provided by the program. Through its analysis, the team will identify assets of value, system processes, vulnerabilities, attack plans and methods, and scheme types and indicators.
Learning Points in Notes Section
ELO 22.1.1.15 Given a Privacy-critical system scenario, determine the specific software T&E challenges.

MT 15.1 When doing T&E for Defense Business Systems (DBS), it is a best practice to use NIST SP 800-53, “Security and Privacy Controls for Federal Information Systems and Organizations,” to identify test procedures to ensure Personally Identifiable Information (PII) is secure.
MT 15.2 When testing large Defense Business Systems (DBS), don't use the full production data volume for software unit, integration, regression and quality assurance testing, as the volume is too large to test in a reasonable period of time. There are sampling techniques and automated tools to help test the large volumes of data used by a DBS.
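The sampling technique MT 15.2 mentions can be sketched as follows (the record layout and sample size are invented for illustration): draw a small, reproducible random sample of records for the test database instead of copying the full production volume:

```python
import random

# Invented stand-in for a large DBS table: 100,000 records.
production_records = [{"id": i, "amount": i % 500} for i in range(100_000)]

def sample_for_testing(records, k, seed=0):
    """Return a reproducible random sample of k records for test use."""
    rng = random.Random(seed)  # fixed seed: every test run sees the same data
    return rng.sample(records, k)

test_records = sample_for_testing(production_records, k=1_000)
print(f"testing against {len(test_records)} of {len(production_records)} records")
```

Fixing the seed keeps regression runs comparable from build to build; in a real DBS the sample would also be stratified by record type, and any PII would be masked before the data reaches a test environment.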
Learning Points in Notes Section