Objectives of Test Planning – COMP 4004, Fall 2008
Source: people.scs.carleton.ca/~sbtajali/4004/slides-2/will-lec... (retrieved 2009-09-15)

© Dr. A. Williams, Fall 2008 Software Quality Assurance – Lec12 1

Objectives of Test Planning

COMP 4004 – Fall 2008

Notes Adapted from Dr. A. Williams


Objectives of Test Planning

• Record, organize, and guide testing activities

• Schedule testing activities according to the test strategy and project deadlines

• Describe how confidence in the system will be demonstrated to customers

• Provide a basis for re-testing during system maintenance

• Provide a basis for evaluating and improving the testing process


IEEE 829 Standard Test Plan

• Revision: 1998

• Describes the scope, approach, resources, and schedule of intended testing activities.

• Identifies
  – test items,
  – features to be tested,
  – testing tasks,
  – who will do each task, and
  – any risks requiring contingency planning.


IEEE 829 Test Plan Outline

1. Test plan identifier

2. Introduction (refers to Project plan, Quality assurance plan, Configuration management plan, etc.)

3. Test items – identify test items including version/revision level (e.g. requirements, design, code, etc.)

4. Features to be tested


5. Features not to be tested

6. Testing Approach

7. Significant constraints on testing (test item availability, testing-resource availability, deadlines).

8. Item pass/fail criteria

9. Suspension criteria and resumption requirements


10. Test deliverables (e.g. test design specifications, test case specifications, test procedure specifications, test logs, test incident reports, test summary report)

11. Testing tasks

12. Environmental needs

13. Responsibilities


14. Staffing and training needs

15. Schedule

16. Risks and contingencies

17. Approvals

18. References


IEEE 829 Test Design Specification

1. Test design specification identifier

2. Features to be tested – features addressed by this document

3. Approach refinements

4. Test identification – identifier and description of the test cases associated with the design

5. Feature pass/fail criteria – pass/fail criteria for each feature


IEEE 829 Test Case Specification

1. Test case specification identifier

2. Test items – items and features to be exercised by the test case

3. Input specifications – input required to execute the test case (databases, files, etc.)

4. Output specifications

5. Environmental needs

6. Special procedural requirements

7. Inter-case dependencies


IEEE 829 Test Procedure Outline

1. Purpose

2. Special requirements

3. Procedure steps
   a) Log – how to log results
   b) Set Up – how to prepare for testing
   c) Start – how to begin procedure execution
   d) Proceed – procedure actions
   e) Measure – how test measurements will be made
   f) Shut Down – how to suspend the testing procedure
   g) Restart – how to resume the testing procedure
   h) Stop – how to bring execution to an orderly halt
   i) Wrap Up – how to restore the environment
   j) Contingencies – how to deal with anomalous events during execution


IEEE 829 Test Log Outline

1. Test log identifier

2. Description – information on all the entries in the log

3. Activity and event entries
   a) Execution Description
   b) Procedure Results – observable results (e.g. messages), outputs
   c) Environmental Information specific to the entry
   d) Anomalous Events (if any)
   e) Incident Report Identifiers (identifiers of any test incident reports generated)


IEEE 829 Test Incident Report (1)

1. Test incident report identifier

2. Summary – items involved, references to linked documents (e.g. procedure, test case, log)

3. Incident description
   a. Inputs
   b. Expected results
   c. Actual results
   d. Date and time
   e. Anomalies
   f. Procedure step
   g. Environment
   h. Attempts to repeat
   i. Testers
   j. Observers


IEEE 829 Test Incident Report (2)

4. Impact – on the testing process
   – S: Show Stopper – testing totally blocked, bypass needed
   – H: High – major portion of test is partially blocked; test can continue with severe restrictions; bypass needed
   – M: Medium – test can continue but with minor restrictions; no bypass needed
   – L: Low – testing not affected; problem is insignificant


IEEE 829 Test Summary Report

1. Test summary report identifier

2. Summary - Summarize the evaluation of the test items, references to plans, logs, incident reports

3. Variances – of test items (from specification), plan, procedure, ...

4. Comprehensiveness assessment – of the testing process against the comprehensiveness criteria specified in the test plan

5. Summary of results – issues (resolved, unresolved)

6. Evaluation - overall evaluation of each test item including its limitations

7. Summary of activities

8. Approvals - names and titles of all persons who must approve the report


System Testing

• Performed after the software has been assembled.

• Test of entire system, as customer would see it.

• High-order testing criteria should be expressed in the specification in a measurable way.


• Check if system satisfies requirements for:
  – Functionality
  – Reliability
  – Recovery
  – Multitasking
  – Device and Configuration
  – Security
  – Compatibility
  – Stress
  – Performance
  – Serviceability
  – Ease/Correctness of installation


• Acceptance Tests
  – System tests carried out by customers or under customers' supervision
  – Verify that the system works according to the customers' expectations

• Common Types of Acceptance Tests
  – Alpha testing: end-user testing performed on a system that may have incomplete features, within the development environment
    – Performed by an in-house testing panel including end users
  – Beta testing: end-user testing performed within the user environment


Functional Testing

• Ensure that the system supports its functional requirements.

• Test cases derived from the statement of requirements.
  – traditional form
  – use cases


Deriving Test Cases from Requirements

• Involves clarification and restatement of the requirements to put them into a testable form.
  – Obtain a point-form formulation
  – Enumerate single requirements
  – Group related requirements
  – For each requirement:
    – Create a test case that demonstrates the requirement.
    – Create a test case that attempts to falsify the requirement (for example, try something forbidden).
  – Test boundaries and constraints when possible


• Example: Requirements for a video rental system

• The system shall allow rental and return of films
  1. If a film is available for rental then it may be lent to a customer.
     1.1 A film is available for rental until all copies have been simultaneously borrowed.
  2. If a film was unavailable for rental, then returning the film makes it available.
  3. The return date is established when the film is lent and must be shown when the film is returned.
  4. It must be possible for an inquiry on a rented film to reveal the current borrower.
  5. An inquiry on a member will reveal any films they currently have on rental.


• Test situations for requirement 1
  – Attempt to borrow an available film.
  – Attempt to borrow an unavailable film.

• Test situations for requirement 1.1
  – Attempt to borrow a film for which there are multiple copies, all of which have been rented.
  – Attempt to borrow a film for which all copies but one have been rented.


• Test situations for requirement 2
  – Borrow an unavailable film.
  – Return a film and borrow it again.

• Test situations for requirement 3
  – Borrow a film, return it, and check the dates.
  – Check the date on a non-returned film.


• Test situations for requirement 4
  – Inquiry on a rented film.
  – Inquiry on a returned film.
  – Inquiry on a film which has just been returned.

• Test situations for requirement 5
  – Inquiry on a member with no films.
  – Inquiry on a member with 1 film.
  – Inquiry on a member with multiple films.


Deriving Test Cases from Use Cases

• For all use cases:
  1. Develop a graph of scenarios
  2. Determine all possible scenarios
  3. Analyze and rank scenarios
  4. Generate test cases from scenarios to meet a coverage goal
  5. Execute test cases


Scenario Graph

• Generated from a use case

• Nodes correspond to points where the system waits for an event – environment event, system reaction

• There is a single starting node

• End of use case is finish node

• Edges correspond to event occurrences– May include conditions and looping edges

• Scenario: – Path from starting node to a finish node


Use Case Scenario Graph (1)

Title: User login

Actors: User

Precondition: System is ON

1. User inserts a card

2. System asks for personal identification number (PIN)

3. User types PIN

4. System validates user identification

5. System displays a welcome message to user

6. System ejects card

Postcondition: User is logged in

[Scenario graph: main path 1 → 2 → 3 → 4 → 5 → 6; branch 1a (card is not valid): 1 → 1a.1 → 1a.2; branch 4a (PIN invalid and attempts < 4): 4 → 4a.1 → back to 2; branch 4b (PIN invalid and attempts ≥ 4): 4 → 4b.1 → 4b.2]


Use Case Scenario Graph (2)

Alternatives:

1a: Card is not valid
  1a.1: System emits alarm
  1a.2: System ejects card

4a: User identification is invalid AND number of attempts < 4
  4a.1: Ask for PIN again and go back

4b: User identification is invalid AND number of attempts ≥ 4
  4b.1: System emits alarm
  4b.2: System ejects card



Scenarios

• Paths from start to finish

• The number of times loops are taken needs to be restricted to keep the number of scenarios finite.

• ID 1 – User login with regular card. Correct PIN on first try. Normal scenario.
  Events: 1-2-3-4-5-6

• ID 2 – User login with non-regular card.
  Events: 1-1a.1-1a.2

• ID 3 – User login with regular card. Wrong PIN on first try. Correct PIN on second try.
  Events: 1-2-3-4-2-3-4-5-6

• ID 4 – User login with regular card. Wrong PIN on all four tries.
  Events: 1-2-3-4-(2-3-4)×3-4b.1-4b.2


Scenario Ranking

• If there are too many scenarios to test:
  – Ranking may be based on criticality and frequency
  – Can use an operational profile, if available
    – "Operational profile": statistical measurement of typical user activity of the system.
    – Example: what percentage of users would typically be using any particular feature at any time.

• Always include the main scenario
  – Should be tested first


Test Case Generation

• Satisfy a coverage goal. For example:
  – All branches in the graph of scenarios (minimal coverage goal)
  – All scenarios
  – The n most critical scenarios


Example of Test Case

Test Case: TC1

Goal: Test the main course of events for the ATM system.

Scenario Reference: 1

Setup: Create a Card #2411 with PIN #5555 as valid user identification; System is ON.

Course of test case:
  1. External event: User inserts card #2411
     Reaction: System asks for personal identification number (PIN)
  2. External event: User types PIN #5555
     Reaction: System validates user identification. System displays a welcome message to the user.

Pass criteria: User is logged in


Forced-Error Test (FET)

• Objective: force the system into all error conditions
  – Basis: the set of error messages for the system.

• Checks
  – Consistency of error-handling design and communication methods
  – Detection and handling of common error conditions
  – System recovery from each error condition
  – Correction of unstable states caused by errors


• Verification of error messages to ensure:
  – Message matches the type of error detected.
  – Description of the error is clear and concise.
  – Message does not contain spelling or grammatical errors.
  – User is offered reasonable options for getting around or recovering from the error condition.


• How to obtain a list of error conditions?
  – Obtain the list of error messages from the developers
  – Interview the developers
  – Inspect the string data in a resource file
  – Gather information from specifications
  – Use a utility to extract text strings out of the binary or scripting sources
  – Analyze every possible event with an eye to error cases
  – Use your experience
  – Use a standard valid/invalid input test matrix


• For each error condition:
  1. Force the error condition.
  2. Check the error detection logic.
  3. Check the handling logic.
     – Does the application offer adequate forgiveness and allow the user to recover from mistakes gracefully?
     – Does the application itself handle the error condition gracefully?
     – Does the system recover gracefully?
     – When the system is restarted, is it possible that not all services will restart successfully?


4. Check the error communication
   – Determine whether an error message appears
   – Analyze the accuracy of the error message
   – Note that the communication can be in another medium, such as an audio or visual cue

5. Look for further problems


Usability Testing

• Checks the ability to learn and use the system to perform a required task
  – Usability requirements are usually not explicitly specified

• Factors influencing the ease of use of a system
  – Accessibility: Can users enter, navigate, and exit the system with relative ease?
  – Responsiveness: Can users do what they want, when they want, in an intuitive/convenient way?
  – Efficiency: Can users carry out tasks in an optimal fashion with respect to time, number of steps, etc.?
  – Comprehensibility: Can users quickly grasp how to use the system, its help functions, and associated documentation?


• Typical activities for usability testing
  – Controlled experiments in simulated working environments using novice and expert end users.
  – Post-experiment protocol analysis by human factors experts, psychologists, etc.

• Main objective: collect data to improve the usability of the software


Installability Testing

• Focus on requirements related to installation
  – relevant documentation
  – installation processes
  – supporting system functions


• Examples of test scenarios
  – Install and check under the various options given (e.g. minimum setup, typical setup, custom setup).
  – Install and check under the minimum configuration.
  – Install and check on a clean system.
  – Install and check on a dirty (loaded) system.
  – Install upgrades targeted to an operating system.
  – Install upgrades targeted to new functionality.
  – Reduce the amount of free disk space during installation.
  – Cancel the installation midway.
  – Change the default target installation path.
  – Uninstall and check that all program files and installed (empty) directories have been removed.


• Test cases should include
  – Start / entry state
  – Requirement to be tested (goal of the test)
  – Install/uninstall scenario (actions and inputs)
  – Expected outcome (final state of the system)


Serviceability Testing

• Focus on maintenance requirements
  – Change procedures (for various adaptive, perfective, and corrective service scenarios)
  – Supporting documentation
  – All system diagnostic tools


Performance/Stress/Load Testing

• Performance Testing
  – Evaluate compliance to specified performance requirements for:
    – Throughput
    – Response time
    – Memory utilization
    – Input/output rates
    – etc.
  – Look for resource bottlenecks


• Stress testing – focus on system behavior at, near, or beyond capacity conditions
  – Push the system to failure
  – Often done in conjunction with performance testing
  – Emphasis near specified load and volume boundaries
  – Checks for graceful failures and non-abrupt performance degradation


• Load testing – verifies handling of a particular load while maintaining acceptable response times
  – Done in conjunction with performance testing


Performance Testing Phases

• Planning phase

• Testing phase

• Analysis phase


Performance Testing Process – Planning Phase

• Define objectives, deliverables, expectations

• Gather system and testing requirements
  – environment and resources
  – workload (peak, low)
  – acceptable response time

• Select performance metrics to collect
  – e.g. transactions per second (TPS), hits per second, concurrent connections, throughput, etc.


• Identify tests to run and decide when to run them.
  – Often selected functional tests are used as the test suite.
  – Use an operational profile to match "typical" usage.
  – Decide how to run the tests:
    – Baseline test
    – 2x/3x/4x baseline tests
    – Longevity (endurance) test

• Decide on a tool/application service provider option
  – to generate loads (replicate numerous instances of test cases)

• Write the test plan, design user scenarios, create test scripts


Performance Testing Process – Testing Phase

• Testing phase
  – Generate test data
  – Set up the test bed
    – System under test
    – Test environment performance monitors
  – Run tests
  – Collect results data


Performance Testing Process – Analysis Phase

• Analyze results to locate the source of problems
  – Software problems
  – Hardware problems

• Change the system to optimize performance
  – Software optimization
  – Hardware optimization

• Design additional tests (if the test objective is not met)


Configuration Testing

• Configuration testing – test all supported hardware and software configurations
  – Factors:
    – Hardware: processor, memory
    – Operating system: type, version
    – Device drivers
    – Run-time environments: JRE, .NET

• Consists of running a set of tests under different configurations, exercising the main set of system features


• Huge number of potential configurations

• Need to select the configurations to be tested
  – decide the types of hardware that need to be tested
  – select hardware brands, models, and device drivers to test
  – decide which hardware features, modes, and options are possible
  – pare down the identified configurations to a manageable set
    – e.g. based on popularity, age


Compatibility Testing

• Compatibility testing – test for
  – compatibility with other system resources in the operating environment
    – e.g. software, databases, standards, etc.
  – source- or object-code compatibility with different operating environment versions
  – compatibility/conversion testing
    – when conversion procedures or processes are involved


Security Testing

• Focus on vulnerabilities to unauthorized access or manipulation

• Objective: identify any vulnerabilities and protect the system's
  – Data
    – Integrity
    – Confidentiality
    – Availability
  – Network computing resources


Security Testing – Threat Modelling

• To evaluate a software system for security issues
  – Identify areas of the software susceptible to being exploited for security attacks.

• Threat modelling steps:
  1. Assemble the threat modelling team (developers, testers, security experts)
  2. Identify assets (what could be of interest to attackers)
  3. Create an architecture overview (major technological pieces, how they communicate, and the trust boundaries between pieces)
  4. Decompose the application (identify how/where data flows through the system and what the data protection mechanisms are)
     – Based on data flow and state diagrams


• Threat modelling steps (continued):
  5. Identify the threats
     – Consider each component as a target
     – How could components be improperly viewed/attacked/modified?
     – Is it possible to prevent authorized users' access to components?
     – Could anyone gain access and take control of the system?
  6. Document the threats (description, target, form of attack, countermeasure)
  7. Rank the threats based on:
     – Damage potential
     – Reproducibility
     – Exploitability
     – Affected users
     – Discoverability


Security Testing – Common System Vulnerabilities

• Buffer overflow

• Command line (shell) execution

• Backdoors

• Web scripting language weakness

• Password cracking

• Unprotected access

• Information leaks
  – Hard-coding of id/password information
  – Revealing error messages
  – Directory browsing


Security Testing – Buffer Overflow

• One of the most commonly exploited vulnerabilities.

• Caused by:
  1. The fact that in x86 systems, the program stack can mix both data (local function variables) and executable code.
  2. The way the program stack is allocated in x86 systems (the stack grows downward, from high to low addresses).
  3. A lack of boundary checks on buffer space allocated in memory by the program code (a typical bug).

Page 30: Objectives of Test Planning COMP 4004 – Fall 2008 Notes …people.scs.carleton.ca/~sbtajali/4004/slides-2/will-lec... · 2009-09-15 · Objectives of Test Planning COMP 4004 –

© Dr. A. Williams, Fall 2008 Software Quality Assurance – Lec12 59

Security TestingBuffer Overflow

void parse(char *arg)
{
    char param[1024];
    int localdata;
    strcpy(param, arg);   /* no bounds check on arg */
    .../...
    return;
}

main(int argc, char **argv)
{
    parse(argv[1]);
    .../...
}

Stack layout during parse() (high to low addresses):
  Exit address (4 bytes)
  Main stack (N bytes)
  Return address (4 bytes)
  param (1024 bytes)
  localdata (4 bytes)


Security Testing – SQL Injection

• Security attack consisting of:
  – Entering unexpected SQL code in a form in order to manipulate a database in unanticipated ways
  – Attacker's expectation: back-end processing is supported by an SQL database

• Caused by:
  – The ability to string multiple SQL statements together and to execute them in a batch
  – Using text obtained from the user interface directly in SQL statements


• Example: the designer intends to complete this SQL statement with values obtained from two user interface fields:

  SELECT * FROM bank
  WHERE LOGIN = '$id' AND PASSWORD = '$password'

• Malicious user enters:

  Login = ADMIN
  Password = anything' OR 'x'='x

• Result:

  SELECT * FROM bank
  WHERE LOGIN = 'ADMIN' AND PASSWORD = 'anything' OR 'x'='x'


• Avoidance:
  – Do not copy text directly from input fields into SQL statements.
  – Sanitize input (define acceptable field contents with regular expressions).
  – Escape/quote-safe the input (using predefined quote functions).
  – Use bound parameters (e.g. prepareStatement).
  – Limit database access.
  – Use stored procedures for database access.
  – Configure error reporting (so that it does not give away too much information).


Security Testing – Techniques

• Penetration Testing – try to penetrate a system by exploiting crackers' methods.
– Look for default accounts that were not protected by system administrators.

• Password Testing – use password-cracking tools.
– Example: passwords should not be words in a dictionary.


Security Testing – Techniques

• Buffer Overflows – systematic testing of all buffers
– Send large amounts of data.
– Check boundary conditions on buffers:
– data that is exactly the buffer size
– data with length (buffer size – 1)
– data with length (buffer size + 1)
– Write escape and special characters.
– Ensure safe string functions are used.

• SQL Injection
– Enter invalid characters in form fields (escapes, quotes, SQL comments, ...).
– Check error messages.


Concurrency Testing

• Investigate simultaneous execution of multiple tasks / threads / processes / applications.

• Potential sources of problems:
– interference among the executing sub-tasks
– interference when multiple copies are running
– interference with other executing products

• Tests are designed to reveal possible timing errors, force contention for shared resources, etc.
– Problems include deadlock, starvation, race conditions, and memory errors.


Multitasking Testing

• Difficulties:
– Test reproducibility is not guaranteed:
– the order of task execution varies
– the same test may find a problem in one run and find none in other runs
– tests need to be run several times
– Behavior can be platform dependent (hardware, operating system, ...).


Multitasking Testing

• Logging can help detect problems – log when:
– tasks start and stop
– resources are obtained and released
– particular functions are called
– ...

• System model analysis can be effective for finding some multitasking issues at the specification level:
– using FSM-based approaches, SDL, TTCN, UCM, ...
– intensively used in telecommunications


Recovery Testing

• Ability to recover from failures and exceptional conditions associated with hardware, software, or people:
– detecting failures or exceptional conditions
– switchovers to standby systems
– recovery of execution state and configuration (including security status)
– recovery of data and messages
– replacing failed components
– backing out of incomplete transactions
– maintaining audit trails
– external procedures (e.g., storing backup media, various disaster scenarios)


Reliability Testing

• Popularized by the Cleanroom development approach (from IBM).

• Application of statistical techniques to data collected during system development and operation (an operational profile) to specify, predict, estimate, and assess the reliability of software-based systems.

• Reliability requirements may be expressed in terms of:
– probability of no failure in a specified time interval
– expected mean time to failure (MTTF)


Reliability Testing – Statistical Testing

• Statistical testing based on a usage model.

1. Development of an operational usage model of the software

2. Random generation of test cases from the usage model

3. Interpretation of test results according to mathematical and statistical models to yield measures of software quality and test sufficiency.


Reliability Testing – Statistical Testing

• Usage model
– Represents possible uses of the software.
– Can be specified under different contexts, for example:
– normal usage context
– stress conditions
– hazardous conditions
– maintenance conditions
– Can be represented as a transition graph where:
– nodes are usage states
– arcs are transitions between usage states


Reliability Testing – Statistical Testing

• Example – Security alarm system
– For use on doors, windows, boxes, etc.
– Has a detector that sends a trip signal when motion is detected.
– Activated by pressing the Set button.
– Light in the Set button is illuminated when the security alarm is on.
– An alarm is emitted if a trip signal occurs while the device is active.
– A 3-digit code must be entered to turn off the alarm.
– If a mistake is made while entering the code, the user must press the Clear button before retrying.
– Each unit has a hard-coded deactivation code.


Security alarm – Stimuli and Responses

Stimulus       | Description
Set (S)        | Device is activated
Trip (T)       | Signal from detector
Bad digit (B)  | Incorrect digit for 3-digit code
Good digit (G) | Digit that is part of 3-digit code
Clear (C)      | Clear entry

Response  | Description
Light on  | Set button illuminated
Light off | Set button not illuminated
Alarm on  | High-pitched sound activated
Alarm off | High-pitched sound deactivated


Reliability Testing – Statistical Testing

• Usage model for the alarm system (transition-graph figure).


Usage Model for Alarm System with Probabilities

• Usage probability assumptions:
– The Trip stimulus probability is 0.05 in states Ready, Entry Error, 1_OK, and 2_OK.
– Other stimuli that cause a state change have equal probabilities.

• The result is a Markov chain.


Reliability Testing – Statistical Testing

• Usage probabilities are obtained from:
– field data
– estimates from customer interviews
– instrumentation of prior versions of the system

• For the approach to be effective, the probabilities must reflect future usage.


Reliability Testing – Statistical Testing

• Usage model analysis is based on standard calculations on a Markov chain.

• It is possible to obtain estimates for:
– Long-run occupancy of each state – the usage profile as a percentage of time spent in each state.
– Occurrence probability – probability of occurrence of each state in a random use of the software.
– Occurrence frequency – expected number of occurrences of each state in a random use of the software.
– First occurrence – for each state, the expected number of uses of the software before it first occurs.
– Expected sequence length – the expected number of state transitions in a random use of the software; the average length of a use case or test case.


Reliability Testing – Statistical Testing

• Test case generation:
– Traverse the usage model, guided by the transition probabilities.
– Each test case:
– starts at the initial node and ends at an exit node
– consists of a succession of stimuli
– Test cases are random walks through the usage model.
– Random selection is used at each state to determine the next stimulus.


Randomly generated test case

#  | Stimulus | Next state
1  | S        | Ready
2  | G        | 1_OK
3  | G        | 2_OK
4  | C        | Ready
5  | B        | Entry error
6  | C        | Ready
7  | B        | Entry error
8  | C        | Ready
9  | G        | 1_OK
10 | G        | 2_OK
11 | G        | Software terminated


Reliability Testing – Statistical Testing

• Measures of test sufficiency (when to stop testing?)

• Usage chain – the usage model that generates test cases:
– used to determine each state's long-run occupancy

• Testing chain – used during testing to track actual state traversals:
– add a counter to each arc, initialized to 0
– increment the counter of an arc whenever a test case executes that transition

• Discriminant – the difference between the usage chain and the testing chain (the degree to which the testing experience is representative of the expected usage).

• ⇒ Testing can stop when the discriminant plateaus.


Reliability Testing – Statistical Testing

• Reliability measurement

• Failure states – added to the testing chain as failures occur during testing.

• Software reliability – the probability of taking a random walk through the testing chain from invocation to termination without encountering a failure state.

• Mean Time To Failure (MTTF) – the average number of test cases until a failure occurs.


Example: All tests pass

100.0100.00.0019Pass30100.0100.00.0020Pass29

100.0100.00.0055Pass16100.0100.00.0059Pass1596.8100.0- -Pass14

67.7100.0- -Pass358.1100.0- -Pass222.660.0- -Pass1% arcs% statesD(U,T)Verdict#


Example with failure cases

0.7864.667Pass140.7694.333Pass130.754.0Fail12

0.6673.0Fail31.0- -Pass2

0.8185.5Pass110.85.0Pass100.7784.5Pass90.754.0Pass80.7143.5Fail70.8336.0Pass60.85.0Pass50.754.0Pass4

1.0- -Pass1ReliabilityMTTFVerdict#


Regression Testing

• Purpose: in a new version of software, ensure that the functionality of previous versions has not been adversely affected.
– Example: in release 4 of the software, verify that all (unchanged) functionality of versions 1, 2, and 3 still works.

• Why is it necessary?
– One of the most frequent occasions when software faults are introduced is when the software is modified.


Regression Test Selection (1)

• In version 1 of the software, choose a set of tests (usually at the system level) that has the "best coverage" given the resources available to develop and run the tests.

• Usually take system tests that were run manually prior to the release of version 1, and create a version of the test suite that can run automatically:
– boundary tests
– tests that revealed bugs
– tests for customer-reported bugs

• Depends on the tools available, etc.


Regression Test Selection (2)

• With a new version N of the software, the regression test suite will need updating:
– new tests, for new functionality
– updated tests, for previous functionality that has changed
– deleted tests, for functionality that has been removed

• There is a tendency for "infinite growth" of regression tests.
– Periodic analyses are needed to keep the size of the test suite manageable, even for automated execution.


Regression Test Management (1)

• The regression package for the previous version(s) must be preserved as long as the software version is being supported.

• Suppose that version 12 of the software is currently being developed.
– If versions 7, 8, 9, 10, and 11 are still being used by customers and are officially supported, all five of these regression packages must be kept.

• Configuration management of regression suites is essential!


Configuration Management of Regression Suites

Software version:  7   8   9   10   11   12 (under development)
Regression suite:  7   8   9   10   11

(Regression suites include all functionality up to the indicated version number.)


When to run the regression suite?

• The version 11 suite would be run on version 12 of the software prior to the version 12 release.
– It may be run at all version 12 recompilations during development.

• If a problem report results in a bug fix in version 9 of the software, the version 9 suite would be run to ensure the bug fix did not introduce another fault.
– If the fix is propagated to versions 10, 11, and 12, then the version 10 regression suite would be run against product version 10, and the version 11 regression suite would be run against product version 11 and the new version 12.


Bug fixes in prior releases

(Figure: software versions 7–12 with their regression suites. A bug fix in one prior version means that version's suite must be re-run, and the suites of later supported versions must be updated and re-run as the fix propagates forward.)


Constraints on Regression Execution

• "Ideal": we would like to re-run the entire regression test suite for each recompile of a new software version, or each bug fix in a previous version.

• Reality: test execution and (especially) result analysis may take too long, especially with large regression suites.
– Test resources are often shared with new-functionality testing.

• It may only be possible to re-run the entire test suite at significant project milestones.
– Example: prior to product release.


Test selection

• When a code modification takes place, can we run "only" the regression tests related to the code modification?
– Issues:
– How far does the effect of a change propagate through the system?
– Traceability: keeping a link that stores the relationship between a test and the parts of the software covered by the test.


Change Impact Analysis

• Module firewall strategy: if module M7 changes, retest all modules with a "use" or "used-by" relationship to it.

(Figure: module dependency graph M1–M10, with the firewall enclosing M7 and its related modules.)


OO Class Firewalls

• Suppose B is a superclass of A, and B is modified. Then:
– B should be retested.
– A should be retested if B has an effect on inherited members of A.

• Suppose A is an aggregate class that includes B, and B is modified. Then A and B should be retested.

• Suppose class A is associated with class B (by access to data members or message passing), and B is modified. Then A and B should be retested.

• The transitive closure of such relationships also needs to be retested.


Example

(Figure: class diagram with classes A–O, showing a modified class O, the classes to re-test, and the resulting "firewall"; edges distinguish aggregation, association, and inheritance.)

