Date posted: 06-May-2015
West Virginia University
Slide 1 – SENG 530 Verification & Validation
Part III: Execution – Based Verification and Validation
Katerina Goseva-Popstojanova, Lane Department of Computer Science and Electrical Engineering
West Virginia University, Morgantown, WV
[email protected] www.csee.wvu.edu/~katerina
Slide 2
Outline
- Introduction
  - Definitions, objectives and limitations
  - Testing principles
  - Testing criteria
- Testing techniques
  - Black box testing
  - White box testing
  - Fault based testing
    - Mutation testing
    - Fault injection
Slide 3
Outline
- Testing levels
  - Unit testing
  - Integration testing
    - Top-down
    - Bottom-up
  - Regression testing
  - Validation testing
Slide 4
Outline
- Non-functional testing
  - Configuration testing
  - Recovery testing
  - Safety testing
  - Security testing
  - Stress testing
  - Performance testing
Slide 5
Software Quality Assurance
- Quality of software is the extent to which the software satisfies its specifications – it is not “excellence”
- The development team and the SQA group should be independent; neither manager should be able to overrule the other
- Does an independent SQA group add considerably to the cost of software development?
Slide 6
Software Verification & Validation
- Testing is an integral part of the software process and must be carried out throughout the life cycle
- Verification – determine whether a phase was completed correctly: “Are we building the product right?”
- Validation – determine whether the product as a whole satisfies its requirements: “Are we building the right product?”
Slide 7
V&V and software life cycle
Phase – Activities

- Requirements specification: determine the test strategy; test the requirements specification; generate functional test data
- Design: check consistency between design and requirements specification; evaluate the software architecture; test the design; generate structural and functional test data
- Implementation: check consistency between design and implementation; test the implementation; generate structural and functional test data; execute tests
- Maintenance: repeat the above tests in accordance with the degree of redevelopment
Slide 8
Cost of software life cycle phases
- Requirements analysis: 3%
- Specification: 3%
- Design: 5%
- Coding: 7%
- Testing: 15%
- Maintenance: 67%
Slide 9
Cost of finding and fixing faults
[Figure: the cost of finding and fixing a fault rises steeply from the requirements phase through implementation to deployment]
Slide 10
Cost of finding and fixing faults
Changing a requirements document during its first review is inexpensive. It costs more when requirements change after code has been written: the code must be rewritten.
Fixing faults is much cheaper when programmers find their own errors. There is no communication cost. They don’t have to explain an error to anyone else. They don’t have to enter the fault into a fault tracking database. Testers and managers don’t have to review the fault status. The fault does not block or corrupt anyone else’s work.
Fixing a fault before releasing a program is much cheaper than maintenance or the consequences of failure.
Slide 11
Fault & Failure
Definitions:
- Failure – departure from the specified behavior
- Fault – defect in the software that, when executed under particular conditions, causes a failure

- What we observe during testing are failures
- A failure may be caused by more than one fault
- A fault may cause different failures
Slide 12
Execution-based testing
Two types of testing:
- Non-execution-based (walkthroughs and inspections)
- Execution-based

Execution-based testing is the process of inferring certain behavioral properties of a product based, in part, on the results of executing the product in a known environment with selected inputs
Slide 13
Test data and test cases

- Test data – inputs which have been devised to test the system
- Test cases – inputs to test the system and the predicted outputs from these inputs if the system operates according to its specification
- All test cases must be planned beforehand (including expected output) and retained afterwards
Slide 14
Structure of the software test plan
- Testing process – description of the major phases of the testing process
- Requirements traceability – testing should be planned so that all requirements are individually tested
- Tested items – the products which are to be tested should be specified
- Testing schedule – the overall testing schedule and the resource allocation for this schedule
Slide 15
Structure of the software test plan
- Test recording procedures – the results of the tests must be systematically recorded
- Hardware and software requirements – software tools required and estimated hardware utilization
- Constraints – constraints affecting the testing process, such as staff shortages
Slide 16
Testing process
Design test cases → Prepare test data → Run program with test data → Compare results to test cases

(the steps produce, respectively: test cases, test data, test results, and test reports)
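The four steps above can be sketched in Python; the program under test (`square`) and the test values are invented for illustration:

```python
# Sketch of the testing process: design test cases -> prepare test data ->
# run the program with the test data -> compare results to test cases.

def square(n):                # the "program being tested" (illustrative)
    return n * n

# 1. Design test cases: each pairs an input with its predicted output
test_cases = [(2, 4), (-3, 9), (0, 0)]

# 2. Prepare test data: the inputs alone
test_data = [inp for inp, _ in test_cases]

# 3. Run the program with the test data, collecting test results
test_results = [square(inp) for inp in test_data]

# 4. Compare results to test cases and produce a test report
test_report = [
    (inp, expected, actual, "PASS" if actual == expected else "FAIL")
    for (inp, expected), actual in zip(test_cases, test_results)
]
for row in test_report:
    print(row)
```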
Slide 17
Testing workbenches
- Testing is expensive
- Testing workbenches provide a range of tools to reduce the time required and the total testing costs
[Figure: a testing workbench – a test manager coordinates a test data generator (driven by the specification and the source code), the program being tested (run under a simulator, with a dynamic analyser producing an execution report), an oracle supplying test predictions, a file comparator checking the test results against those predictions, and a report generator producing the test results report]
Slide 18
Debugging
When software testing detects a failure, debugging is the process of locating and fixing the fault that caused it
[Figure: the debugging cycle – executing the test cases produces results; debugging starts from suspected causes, devises additional tests, and narrows them down to identified causes; corrections are then made and checked with regression tests]
Slide 19
Simple non-software example
- A lamp in my house does not work
- If nothing in the house works, the cause must be in the main circuit breaker or outside
- I look around to see whether the neighborhood is blacked out
- I plug the suspect lamp into a different socket and a working appliance into the suspect circuit
Slide 20
Debugging approaches
- Brute force – most common, least efficient
- Backtracking – effective for small programs; beginning from the site where a symptom has been uncovered, the source code is traced backward (manually) until the site of the cause is found
- Cause elimination – a list of all possible causes is developed and tests are conducted to eliminate each; if initial tests indicate that a particular cause shows promise, the data are refined in an attempt to isolate the bug
Slide 21
What should be tested
- Utility
- Correctness
- Robustness
- Non-functional testing (configuration, recovery, safety, security, stress, and performance testing)
Slide 22
Utility
Utility – the extent to which a user’s needs are met when a correct product is used under conditions permitted by its specifications

Does the product meet the user’s needs?
- Functionality
- Ease of use
- Cost-effectiveness
Slide 23
Correctness
A software product is correct if it satisfies its specification when operated under permitted conditions
What if the specifications themselves are incorrect?
Slide 24
Correctness of specifications
Specification for a sort
Function trickSort which satisfies this specification:
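The specification and the trickSort code itself are not reproduced here; a Python sketch of this classic example, assuming the incorrect specification only requires the output to be sorted (the name `trick_sort` and the sample values are illustrative), might look like:

```python
# Assumed incorrect specification for sort(a):
#   precondition:  a is a list of integers
#   postcondition: a[i] <= a[i+1] for all valid i   ("the result is sorted")

def trick_sort(a):
    """Satisfies the postcondition above without sorting anything:
    it simply overwrites every element with 0."""
    for i in range(len(a)):
        a[i] = 0
    return a

# trick_sort is "correct" with respect to the flawed specification:
result = trick_sort([3, 1, 2])
assert all(result[i] <= result[i + 1] for i in range(len(result) - 1))

# A corrected specification must also require the output to be a
# permutation of the input; trick_sort violates that clause:
assert sorted([3, 1, 2]) != result
```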
Slide 25
Correctness of specifications
Incorrect specification for a sort
Corrected specification for the sort
Slide 26
Correctness
Correctness of a product is meaningless if its specification is incorrect
Correctness is NOT sufficient
Slide 27
Robustness
A software product is robust if it works satisfactorily on invalid inputs

Deliberately test the product on invalid inputs (error-based testing)
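A minimal sketch of error-based testing; the `parse_age` function and its invalid inputs are invented for illustration:

```python
def parse_age(text):
    """Parse a person's age; reject invalid inputs instead of crashing."""
    try:
        age = int(text)
    except (TypeError, ValueError):
        raise ValueError(f"not a number: {text!r}")
    if not 0 <= age <= 150:
        raise ValueError(f"out of range: {age}")
    return age

# Deliberately feed invalid inputs and check the product fails gracefully.
for bad in ["abc", "", None, "-5", "1000"]:
    try:
        parse_age(bad)
        print(f"NOT ROBUST: accepted {bad!r}")
    except ValueError:
        pass  # rejected cleanly, as a robust product should
```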
Slide 28
Testing principles

- All tests should be traceable to the requirements definition and specification
- Tests should be planned long before testing begins
- The Pareto principle applies to software testing – 80% of all faults detected during testing will likely be traceable to 20% of the program modules
- Exhaustive testing is not possible:
  - the number of possible input data is exceptionally large
  - the number of path permutations for even a moderately sized program is exceptionally large
Slide 29
Example
A simple program like

for i from 1 to 100 do
    print(if a[i] = true then 1 else 0 endif);

has 2^100 different outcomes; exhaustively testing this program would take about 3 × 10^14 years
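A quick back-of-the-envelope check of these numbers; the tests-per-second rate is an assumption, since the slide does not state one:

```python
# Sanity-check the exhaustive-testing estimate.
outcomes = 2 ** 100                # one outcome per combination of a[1..100]
tests_per_second = 10 ** 8         # assumed: a very fast 100 million tests/s
seconds_per_year = 60 * 60 * 24 * 365

years = outcomes / tests_per_second / seconds_per_year
print(f"{outcomes:.3e} outcomes -> about {years:.1e} years")
# ~4e14 years, the same order of magnitude as the slide's 3 x 10^14
```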
Slide 30
Limitations of testing
Dijkstra, 1972 – “Testing can be a very effective way to show the presence of faults, but it is hopelessly inadequate for showing their absence”

If no faults are detected during testing – is that good news or bad news?
Slide 31
Test objective
- Provoking failures and detecting faults
- Increasing the confidence in failure-free behavior
Slide 32
Test adequacy criteria
A test adequacy criterion can be used as:
- Stopping rule – when has sufficient testing been done? (e.g., the statement coverage criterion: stop testing once all statements have been executed)
- Measurement – a mapping from the test set to the interval [0,1], e.g., the percentage of statements executed
- Test case generator – if 100% statement coverage has not yet been achieved, select an additional test case that covers one or more statements not yet tested
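These three uses can be illustrated with a hand-instrumented sketch; the function and its statement numbering are invented for illustration:

```python
# Each statement records its id when executed, so coverage is a
# measurement in [0, 1] and 100% coverage acts as the stopping rule.
executed = set()

def classify(x):
    executed.add(1)           # statement 1
    if x < 0:
        executed.add(2)       # statement 2
        return "negative"
    executed.add(3)           # statement 3
    return "non-negative"

TOTAL_STATEMENTS = 3

def coverage():
    return len(executed) / TOTAL_STATEMENTS

classify(5)                   # executes statements 1 and 3
print(coverage())             # 2/3 -> stopping rule not met; generate more tests
classify(-1)                  # a new test case covering statement 2
print(coverage())             # 1.0 -> 100% statement coverage: stop testing
```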
Slide 33
Test adequacy criteria
Test adequacy criteria are closely linked to test techniques. A coverage-based adequacy criterion (e.g., statement coverage) does not help us assess whether all error-prone points have been tested.
Slide 34
Strategic issues
- Specify product requirements in a quantifiable manner long before testing commences
- State testing objectives explicitly (e.g., coverage, mean time to failure, the cost to find and fix faults)
- Understand the users of the software and develop a profile for each user category
- Build robust software that is designed to test itself
Slide 35
Strategic issues
- Use reviews (walkthroughs and inspections) as a filter prior to execution-based testing
- Develop a continuous improvement approach for the testing process
Slide 36
Testing techniques
Black box testing, also called functional or specification-based testing – test cases are derived from the specification; the internal software structure is not considered

White box testing, also called structural or glass box testing – considers the internal software structure in the derivation of test cases; test adequacy criteria are specified in terms of coverage (statements, branches, paths, etc.)
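A small illustration of the black box view, assuming a hypothetical leap-year specification; the test cases are derived from the rules in the spec, never from the code:

```python
# Assumed specification: leap_year(y) is True iff y is divisible by 4,
# except century years, unless the year is divisible by 400.

def leap_year(y):
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

# Specification-derived cases: one per rule in the spec, not per branch in code.
spec_cases = {2024: True, 2023: False, 1900: False, 2000: True}
for year, expected in spec_cases.items():
    assert leap_year(year) == expected
print("all specification-derived cases pass")
```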
Slide 37
Testing techniques
Fault-based techniques:
- Fault injection – artificially seed a number of faults in the software
- Mutation testing – a (large) number of variants (mutants) of the software is generated; each variant differs slightly from the original version

Experiments indicate that there is no “best” testing technique:
- Different techniques tend to reveal different types of faults
- The use of multiple testing techniques results in the discovery of more faults
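A minimal mutation-testing sketch; the function and the mutated operator are invented for illustration. A mutant is “killed” when some test case distinguishes it from the original, so surviving mutants point at weaknesses in the test set:

```python
def price_with_discount(total):
    """Original: 10% off for orders of 100 or more."""
    if total >= 100:
        return total * 0.9
    return total

def price_mutant(total):
    """Mutant: the relational operator >= was mutated to >."""
    if total > 100:
        return total * 0.9
    return total

tests = [50, 150, 100]   # the boundary value 100 is what kills this mutant

killed = any(price_with_discount(t) != price_mutant(t) for t in tests)
print("mutant killed" if killed else "mutant survived: add more tests")
```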