
Tools and Techniques for Software Testing - Barbara Russo SwSE - Software and Systems Engineering group


Verification and Validation


Groups:
• Customers/Users
• Requirements Analysts
• Designers
• Developers

Exercise - why testing?


Steps:
• Customers/Users look at a GUI and memorize it (5’)

• Customers/Users describe the functionality of the GUI to Requirements Analysts (without revealing the name of the app) while the analysts write down the requirements (5+5=10’)

• Requirements Analysts describe the system to Designers while the designers graphically represent entities and relations (5+5=10’)

• Designers describe the system to Developers (5+5=10’)

• Developers draw a GUI that satisfies the design on the blackboard (2’)

• Customers/Users validate the GUI against the original (2’)

• Each group independently lists the problems they encountered (15’)

Exercise


• Software products are imperfect, as they are created by human beings

• Verification and Validation are processes that use techniques and methods to ensure the quality of the final product

• Testing is one of these processes

Verification and Validation


• OK, but what are they in your experience?
• Are they synonyms? Is there any difference?

• Verification is:

• Validation is:

Verification and Validation


• Check the consistency of an implementation with a specification

• It is about “How”, i.e., the process of building
• “Are we building the product right?” (B. Boehm)

• Example: A music player plays the music (it does play) when I press Play

Verification
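In code, a verification step like the music-player example can be sketched as a unit check of the implementation against its specification. The `MusicPlayer` class below is a hypothetical stand-in, not part of the course material:

```python
# Minimal sketch of verification as a unit test, assuming a hypothetical
# MusicPlayer whose specification says: "pressing Play starts playback".

class MusicPlayer:
    def __init__(self):
        self.playing = False

    def press_play(self):
        self.playing = True   # the implementation under verification

def test_press_play_starts_playback():
    player = MusicPlayer()
    player.press_play()
    assert player.playing     # implementation is consistent with the spec

test_press_play_starts_playback()
```

Validation, by contrast, would ask the users whether playing music (rather than, say, showing a video) is what they actually wanted; it checks the product against requirements and cannot be fully automated this way.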


• Check the degree to which a software system fulfills the users’ requirements

• It is about “What”, i.e., the product itself
• “Are we building the right product?” (B. Boehm)

• Example: A music player plays a song (it does not show a video) when I press Play

Validation


• Requirements are goals of a software system
• Specifications are solutions to achieve such goals
• Validation: software that matches requirements ⇒ useful software
• Verification: software that matches specifications ⇒ dependable software

Usefulness vs. dependability


Verification and validation activities

Do you know any of them?

What do they have in common?


• What is what (Ver or Val)?
• Acceptance test (with customer): negotiated with the customer. It defines the input and the output of each software feature

• Alpha test (acceptance test with user): performed by users in a controlled environment. Captures operational profiles decided by the organisation

• Beta test (acceptance test with user): performed by users in their own environment. Captures real operational profiles

Exercise


• Are they synonyms?
• Are there any differences?

• Verification is:

• Validation is:

Verification and Validation

No / Yes



Testing as a verification process


• We have seen that

• Dependability is the degree to which a software system complies with its specifications

Dependability


• Software analysis and review are verification processes that examine a software artifact in order to approve it

• Software testing is a verification process that detects differences between existing and required conditions and evaluates the features of the software item

IEEE definition

Types of Verification process

Tools and Techniques for Software Testing - Barbara Russo SwSE - Software and Systems Engineering group


What is the relation between testing and dependability?


• Testing aims at verifying four software dependability properties:
• Correctness: consistency with the specification
• Reliability: statistical approximation to correctness; the probability that a system does not deviate from the expected behavior

Goal of testing


• Robustness: being able to maintain operation under exceptional circumstances, possibly with reduced functionality

• Safety: robustness in the case of hazardous behavior (e.g., attacks)

Goal of testing


Relations

Source: Mauro Pezze’ and Michal Young


• Reliability: built according to central scheduling and practice

• Robustness, safety: degraded function when possible; never signal conflicting greens
• Blinking red / blinking yellow is better than no lights
• No lights is better than conflicting greens

Source: Mauro Pezze’ and Michal Young
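The degraded-function ordering above (blinking beats no lights, and no lights beat conflicting greens) can be sketched as a mode selector; the controller model below is hypothetical:

```python
# Hypothetical sketch of safe degradation for a traffic-light controller:
# prefer degraded function (blinking) over no function (off), and never
# allow the hazardous state of conflicting greens.

def select_mode(controller_ok: bool, lamps_ok: bool) -> str:
    """Pick the safest available operating mode."""
    if controller_ok and lamps_ok:
        return "normal"       # full function (reliability)
    if lamps_ok:
        return "blinking"     # degraded but still useful (robustness)
    return "off"              # no function, but never conflicting greens (safety)

assert select_mode(True, True) == "normal"
assert select_mode(False, True) == "blinking"
assert select_mode(False, False) == "off"
```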


• Testing is a process
• Different testing techniques can be used all along the process

Testing techniques


• Pay attention: testing does not question specifications! Thus, it can be affected by specifications that lack:
• Consistency: no conflicts between one specification and another
• Unambiguity: not open to interpretation or uncertainty
• Adherence to standards: consistency with benchmarks

Specification Self-consistency


• How can we check whether our software satisfies any of the dependability properties?

• Can we use a “proof”?
• For example, correctness: given a set of specifications and a program, we want to find some logical procedure (e.g., a proof) to show that the program satisfies the specifications

Checking dependability


Some problems cannot be solved by any computer program (Alan Turing)

Undecidability of problems


Given a program P and an input I, it is not decidable whether P will eventually halt when run with input I or run forever

The halting problem
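Turing's diagonalization argument can be expressed as a code sketch: assume a halting oracle exists, then build a program that contradicts it. The names `halts` and `paradox` are illustrative:

```python
# Sketch of Turing's diagonalization argument. Assume, for contradiction,
# that a perfect halting oracle `halts(program, argument)` existed.

def halts(program, argument):
    """Hypothetical oracle: True iff program(argument) eventually halts."""
    raise NotImplementedError("no total, correct oracle can exist")

def paradox(program):
    """Do the opposite of whatever the oracle predicts about program(program)."""
    if halts(program, program):
        while True:          # loop forever if the oracle says "halts"
            pass
    return "done"            # halt if the oracle says "runs forever"

# Consider paradox(paradox): if halts(paradox, paradox) returned True,
# paradox(paradox) would loop forever; if it returned False, it would halt.
# Either answer is wrong, so no correct `halts` can exist.
```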


• Undecidability implies that given a program P and a set of verification techniques, we do not know whether the techniques can verify the program in finite time

• ... and even when it is feasible it might be very expensive

Checking a program


• Thus, testing is inaccurate and can be expensive
• ⇒ modern testing aims at automation

Inaccuracy of testing


• Thus, techniques for verification are inaccurate when checking dependability properties

• Optimistic and pessimistic inaccuracy of a verification technique

Inaccuracy of techniques for verification


• A technique that verifies a dependability property can return TRUE on programs that do not have the property (FALSE POSITIVE)

Optimistic Inaccuracy


• Testing is an optimistic technique for correctness

• It may report that a program is correct even though no finite number of tests can guarantee correctness

Example
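A minimal sketch of optimistic inaccuracy, assuming a deliberately buggy function: a finite test suite passes (the technique answers TRUE) although the program does not have the correctness property:

```python
# Hypothetical sketch: a buggy absolute-value function that passes a finite
# test suite. The suite reports "correct" for a program that is not correct,
# i.e., a false positive (optimistic inaccuracy).

def buggy_abs(x):
    if x == -7:
        return -7              # hidden bug on one untested input
    return x if x >= 0 else -x

test_inputs = [0, 1, -1, 5, -5, 100, -100]   # finite suite misses -7
suite_passes = all(buggy_abs(x) == abs(x) for x in test_inputs)

assert suite_passes               # testing says: the program is correct...
assert buggy_abs(-7) != abs(-7)   # ...but it is not
```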


• Pessimistic inaccuracy: a technique that verifies a property can return FALSE on programs that have the property (FALSE NEGATIVE)

Pessimistic Inaccuracy


• Automatic testing is pessimistic for reliability as it typically uses rules

Example
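A minimal sketch of pessimistic inaccuracy: a naive rule-based checker (hypothetical) rejects any program whose source contains a division, even one that guards against division by zero and therefore has the property being checked:

```python
# Hypothetical sketch: a conservative rule-based checker returns FALSE on a
# program that actually has the "never divides by zero" property,
# i.e., a false negative (pessimistic inaccuracy).

def naive_checker(program_source: str) -> bool:
    """Rule: report TRUE ("safe") only if the source contains no division."""
    return "/" not in program_source

program = '''
def safe_ratio(a, b):
    if b == 0:
        return 0.0   # guarded: never divides by zero
    return a / b
'''

namespace = {}
exec(program, namespace)

assert namespace["safe_ratio"](1, 0) == 0.0   # the property actually holds
assert naive_checker(program) is False        # but the checker reports FALSE
```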


Accuracy: confusion matrix

              Pred. TRUE   Pred. FALSE
Truth TRUE        TP           FN
Truth FALSE       FP           TN

(rows: the truth; columns: predicted by the technique)
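The four cells (TP, FN, FP, TN) can be computed from ground-truth and predicted labels; this sketch uses boolean labels where TRUE means "the program has the property":

```python
# Minimal sketch: build the confusion matrix of a verification technique
# against ground truth. FP counts optimistic inaccuracy, FN pessimistic.

def confusion_matrix(truth, predicted):
    tp = sum(t and p for t, p in zip(truth, predicted))
    fn = sum(t and not p for t, p in zip(truth, predicted))
    fp = sum(not t and p for t, p in zip(truth, predicted))
    tn = sum(not t and not p for t, p in zip(truth, predicted))
    return {"TP": tp, "FN": fn, "FP": fp, "TN": tn}

truth     = [True, True, False, False, True]
predicted = [True, False, True, False, True]   # one FN, one FP

m = confusion_matrix(truth, predicted)
assert m == {"TP": 2, "FN": 1, "FP": 1, "TN": 1}

accuracy = (m["TP"] + m["TN"]) / len(truth)    # 3 correct out of 5
```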



Introduction to the course and exam procedure


• Teacher: Barbara Russo - POS 115
• Lab instructors: Barbara Russo and Florian Hofer (POS 114)

People


• To be able to select, use, customize, and deploy tools and apply techniques for software testing

Goal


• Introduction to Testing
• Techniques for black box and white box testing
• Automated testing
• Dynamic testing
• Static testing
• Performance and monitoring
• Introduction to search-based testing

Syllabus


• We give you hints

Highly practical


• You will create your own pipeline for testing

Highly practical


• My public page: http://www.inf.unibz.it/~russo/ToolsTechniquesST.html

• Timetable (check it periodically)
• Ole portal (emails, marks, non-public material)
• GitLab (your code)

Sources and material


• Tuesdays 14:00 - 16:00, POS 115
• Appointment by email

Office hours


• Lab: assignments
• Written exam

• Mark = 80% Lab + 20% written exam

Milestones


• The Lab work is incremental: except for the first one, each assignment depends on the previous one

• Each lab is assessed on a scale of 0-33, with 18 = “pass”
• The final Lab mark is the average of the assignment marks
• Warm-up assignments are used to decide in cases of uncertain student performance

Lab Assessment


• Except for the first one, each assignment includes fixing the previous one (if needed)

• The last assignment (“Project completion”) must be passed

• If the average is below 18 or the last assignment has not been delivered, the lab assignments can be redone (or revised) and re-submitted all at once, one week before the exam date of the first session

• If the Lab is not passed at the first exam session, a new system (but the same assignments) must be selected

Lab Assessment


• To be admitted to the written exam, you must have passed the Lab with 18 or greater

• If the Lab assessment is positive, the mark is kept for all three exam sessions (February, June, September)

Final assessment


• The written exam is a set of 5-6 questions, which can be small exercises or theoretical questions

• The written exam is assessed on a scale of 0-33, with 18 = “pass”

• The same for the whole course

Written exam



Questions?