Verification and Validation Overview

Page 1: Verification and Validation Overview

Verification and Validation Overview

References:

Schach, Object-Oriented and Classical Software Engineering

Pressman, Software Engineering: A Practitioner’s Approach, McGraw Hill

Pfleeger, Software Engineering: Theory and Practice, Prentice Hall

Davis, Software Requirements: Objects, Functions, and States, Prentice Hall

1209

Page 2: Verification and Validation Overview

Purpose of V&V

• Groups of 3
• 5 minutes
• What is verification and validation?
• What is the purpose of verification and validation?

Page 3: Verification and Validation Overview

Purpose of V&V

• Give programmers information they can use to prevent faults
• Give management information to evaluate risk
• Provide software that is reasonably defect free
• Achieve a “testable” design (one that can be easily verified)
• Validate the software (demonstrate that it works)

Page 4: Verification and Validation Overview

Definitions-1 (note: usual definitions, but they do not match all authors or the 1990 IEEE glossary)

• Validation
  – Evaluation of an object to demonstrate that it meets expectations. (Did we build the right system?)
• Verification
  – Evaluation of an object to demonstrate that it meets its specification. (Did we build the system right?)
  – Evaluation of the work product of a development phase to determine whether the product satisfies the conditions imposed at the start of the phase.
• Correct Program
  – Program matches its specification
• Correct Specification
  – Specification matches the client’s intent

Page 5: Verification and Validation Overview

Definitions-2

• Error (a.k.a. mistake)
  – A human activity that leads to the creation of a fault
  – A human error results in a fault, which may, at runtime, result in a failure
  – Kaner: “It’s an error if the software doesn’t do what the user intended”
• Fault (a.k.a. bug)
  – May result in a failure
  – A discrepancy between what something should contain (in order for failure to be impossible) and what it does contain
  – The physical manifestation of an error
• Failure (a.k.a. symptom, problem, incident)
  – Observable misbehavior
  – Actual output does not match the expected output
  – Can only happen when a thing is being used

Page 6: Verification and Validation Overview

Definitions-3

• Fault identification
  – Process of determining what fault caused a failure
• Fault correction
  – Process of changing a system to remove a fault
• Debugging
  – The act of finding and fixing program errors

Page 7: Verification and Validation Overview

Definitions-4

• Testing
  – The act of designing, debugging, and executing tests
• Test
  – A sample execution to be examined
• Test case
  – A particular set of inputs and the expected output
• Oracle
  – Any means used to predict the outcome of a test
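To make these terms concrete, here is a minimal hypothetical sketch (the `sort_ascending` function and its test data are invented for illustration): each test case pairs an input with its expected output, and the oracle here is simply a direct comparison against that expected value.

```python
# Minimal sketch of test cases and an oracle (hypothetical example, not from the slides).

def sort_ascending(xs):
    """Implementation under test."""
    return sorted(xs)

# Test cases: each is (input, expected output).
test_cases = [
    ([3, 1, 2], [1, 2, 3]),
    ([],        []),
    ([5, 5, 1], [1, 5, 5]),
]

for inputs, expected in test_cases:
    actual = sort_ascending(inputs)
    # Oracle: any means of predicting/judging the outcome; here, a direct comparison.
    verdict = "pass" if actual == expected else "FAIL"
    print(f"{inputs} -> {actual} (expected {expected}): {verdict}")
```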

Page 8: Verification and Validation Overview

Definitions-5

• Significant test case
  – A test case with a high probability of detecting an error
  – One test case may be more significant than another
• Significant test set
  – A test set with a high probability of detecting an error
  – A test set is more significant than another if the first is a superset of the second
  – The number of test cases does not determine the significance
• Regression testing
  – Rerun a test suite to see if
    • a change fixed a bug
    • a change introduced a new one
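A minimal sketch of regression testing under the definition above (all names and data are hypothetical): the same test suite is rerun after a change, both to confirm that the change fixed the bug and to check that it did not introduce a new one.

```python
# Hypothetical regression-testing sketch: rerun the same suite after every change.

def run_suite(function_under_test, suite):
    """Return the list of failing test cases for the given implementation."""
    return [(x, expected) for x, expected in suite
            if function_under_test(x) != expected]

suite = [(0, 0), (1, 1), (2, 4), (-3, 9)]   # (input, expected square)

def square_v1(x):          # original, buggy version
    return x * 2           # fault: doubles instead of squaring

def square_v2(x):          # changed version after the fix
    return x * x

print(run_suite(square_v1, suite))  # failures reveal the original bug
print(run_suite(square_v2, suite))  # empty list: the fix holds and no regression appears
```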

Page 9: Verification and Validation Overview

Definitions-6

• Let S be a relation, a specification of a program.
• Let P be the implementation of the program.
• R is the range; r ∈ R.
• D is the domain; d ∈ D.
• S(r, d): the specification.
• P(r, d): the implementation.

Page 10: Verification and Validation Overview

Definitions-7

• Failure
  – P(r, d) but not S(r, d)
• Test case
  – A pair (r, d) such that S(r, d)
• Test set T
  – A finite set of test cases
• P passes T if ∀ t ∈ T, t = (r, d): S(r, d) ⇒ P(r, d)
• T is ideal if (∃ d, r | S(r, d) ∧ ¬P(r, d)) ⇒ (∃ t ∈ T | t = (r', d') ∧ S(r', d') ∧ ¬P(r', d'))

(Reminder: R is the range, r ∈ R; D is the domain, d ∈ D; S(r, d) is the specification; P(r, d) is the implementation.)
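These definitions can be read as executable checks. The sketch below is a hypothetical example (S and P are tiny finite relations so everything can be brute-forced): P passes T when every test case in T that satisfies S also satisfies P, and T is ideal when, if P disagrees with S anywhere, some test case in T exposes that disagreement.

```python
# Hypothetical sketch of the formal definitions, with S and P as finite relations
# over a small domain D and range R so they can be checked exhaustively.

D = range(4)                                  # domain
R = range(8)                                  # range

def S(r, d):
    """Specification relation: r is the correct output for input d (here, r = 2*d)."""
    return r == 2 * d

def P(r, d):
    """Implementation relation: r is the output the program actually produces."""
    return r == (2 * d if d < 3 else 0)       # faulty for d == 3

def passes(T):
    """P passes T: every test case (r, d) in T with S(r, d) also satisfies P(r, d)."""
    return all(P(r, d) for (r, d) in T if S(r, d))

def ideal(T):
    """T is ideal: if P fails S anywhere, some test case in T exposes a failure."""
    failure_exists = any(S(r, d) and not P(r, d) for d in D for r in R)
    exposed = any(S(r, d) and not P(r, d) for (r, d) in T)
    return (not failure_exists) or exposed

T1 = [(0, 0), (2, 1)]          # P passes T1, but T1 is not ideal (misses d == 3)
T2 = [(0, 0), (2, 1), (6, 3)]  # the case (6, 3) exposes the failure, so T2 is ideal
print(passes(T1), ideal(T1))   # True  False
print(passes(T2), ideal(T2))   # False True
```

Note that P passes T1 even though P contains a fault, which is exactly the point of the take-home question on the slide that follows.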

Page 11: Verification and Validation Overview

Definitions-7

• Failure
  – P(r, d) but not S(r, d)
• Test case
  – A pair (r, d) such that S(r, d)
• Test set T
  – A finite set of test cases
• P passes T if ∀ t ∈ T, t = (r, d): S(r, d) ⇒ P(r, d)
• T is ideal if (∃ d, r | S(r, d) ∧ ¬P(r, d)) ⇒ (∃ t ∈ T | t = (r', d') ∧ S(r', d') ∧ ¬P(r', d'))

In groups, translate these into English

Page 12: Verification and Validation Overview

Take Home Message

• If your test set does not identify any bugs, was your testing successful?

Page 13: Verification and Validation Overview

Parnas: “There are only three engineering techniques for verification”

• Mathematical analysis
• Exhaustive case analysis
• Prolonged realistic testing

Page 14: Verification and Validation Overview

Parnas: “There are only three engineering techniques for verification”

• Mathematical analysis
  – Works well for continuous functions (software engineering is more difficult than other engineering)
  – Cannot interpolate reliably for discrete functions
• Exhaustive case analysis
• Prolonged realistic testing

Page 15: Verification and Validation Overview

Parnas: “There are only three engineering techniques for verification”

• Mathematical analysis
• Exhaustive case analysis
  – Only possible for systems with a small state space (see the sketch below)
• Prolonged realistic testing
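As a sketch of what exhaustive case analysis can look like when the state space is small (hypothetical example, not from the slides): every possible input of an 8-bit function is checked against its specification.

```python
# Hypothetical sketch of exhaustive case analysis: the input space is small enough
# that every case can be checked against the specification.

def is_power_of_two(n):
    """Implementation under verification (inputs limited to 0..255)."""
    return n != 0 and (n & (n - 1)) == 0

def spec(n):
    """Specification: n is a power of two iff it appears in the enumerated set."""
    return n in {1, 2, 4, 8, 16, 32, 64, 128}

# Exhaustive check over the entire (small) input space.
mismatches = [n for n in range(256) if is_power_of_two(n) != spec(n)]
print("verified" if not mismatches else f"counterexamples: {mismatches}")
```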

Page 16: Verification and Validation Overview

Hierarchy of V&V techniques (not exhaustive)

[Diagram: V&V divides into Static Techniques and Dynamic Techniques. Labels in the figure: Formal Analysis, Informal Analysis, Static Analysis, Model Checking, Symbolic Execution, Proofs, Testing, and Review, with Inspection and Walkthrough under Review.]

Page 17: Verification and Validation Overview

Hierarchy of V&V techniques

[Same hierarchy diagram as the previous slide.]

Note: This does not match Van Vliet’s definition of testing: I use “testing” to mean that the program is being executed. He does not.

Page 18: Verification and Validation Overview

Types of Faults

• In groups
• 3 minutes
• List all the types and causes of faults: what can go wrong in the development process?

Page 19: Verification and Validation Overview

Some Types of Faults

• Algorithmic: algorithm or logic does not produce the proper output for the given input
• Syntax: improper use of language constructs
• Computation (precision): formula’s implementation is wrong, or the result is not computed to the correct degree of accuracy (see the sketch below)
• Documentation: documentation does not match what the program does
• Stress (overload): data structures filled past capacity
• Capacity: system’s performance becomes unacceptable as activity reaches its specified limit
• Timing: code coordinating events is inadequate
• Throughput: system does not perform at the required speed
• Recovery: system does not behave correctly when a failure is encountered
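As one hypothetical illustration of a computation (precision) fault from the list above: the formula itself is straightforward, but the chosen representation cannot deliver the exact result the requirement demands.

```python
# Hypothetical illustration of a computation (precision) fault: the formula is
# "correct", but accumulating binary floating point cannot meet an exact-cents requirement.

total = 0.0
for _ in range(100):
    total += 0.01            # intended: 100 payments of one cent == 1.00 exactly

print(total)                 # e.g. 1.0000000000000007 -- not exactly 1.0
print(total == 1.00)         # False: fails an exact-equality requirement

# One remedy for exact decimal arithmetic:
from decimal import Decimal
print(sum(Decimal("0.01") for _ in range(100)) == Decimal("1.00"))  # True
```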

Page 20: Verification and Validation Overview

Causes of Faults

[Diagram: development phases (Requirements, System Design, Program Design, Program Implementation, Unit Testing, System Testing) annotated with where faults are introduced: incorrect or missing requirements, incorrect translation, incorrect design specification, incorrect design interpretation, incorrect semantics, incorrect documentation, incomplete testing, and new faults introduced while correcting others.]

Page 21: Verification and Validation Overview

Some Verification and Validation Techniques

[Diagram: lifecycle phases (Requirements, Design, Implementation, Testing, Operation, Maintenance) with V&V techniques spanning them: reviews (walkthroughs/inspections), synthesis, model checking, correctness proofs, traditional testing, and runtime monitoring.]

Page 22: Verification and Validation Overview

Effectiveness of Fault Detection Techniques

[Bar chart: percent of errors discovered (y-axis, 0-70%) in each phase (Requirements, Design, Coding, Documentation) for several fault detection techniques: prototyping, requirements review, design review, code inspection, and unit testing.]

Page 23: Verification and Validation Overview

• Groups: 2 min.
• What does the chart on the previous slide say?

Page 24: Verification and Validation Overview

Error Estimates:

• 3 errors per 1000 keystrokes for trained typists
• 1 bug per 100 lines of code (after publication)
• 1.5 bugs per line of code (all together, including typing errors)
• Testing is 30-90% of the cost of a product
• Probability of correctly changing a program:
  – 50% if fewer than 10 lines
  – 20% if between 10 and 50 lines

