Date posted: 11-May-2015
Category: Technology
Uploaded by: techwellpresentations
ME AM Tutorial
4/7/2014 8:30 AM

“Essential Test Management and Planning”

Presented by:
Rick Craig
Software Quality Engineering

Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888‐268‐8770 ∙ 904‐278‐0524 ∙ [email protected] ∙ www.sqe.com
Rick Craig, Software Quality Engineering

A consultant, lecturer, author, and test manager, Rick Craig has led numerous teams of testers on both large and small projects. In his twenty-five years of consulting worldwide, Rick has advised and supported a diverse group of organizations on many testing and test management issues. From large insurance providers and telecommunications companies to smaller software services companies, he has mentored senior software managers and helped test teams improve their effectiveness. Rick is coauthor of Systematic Software Testing and is a frequent speaker at testing conferences, including every STAR conference since its inception.
© 2014 SQE Training V3.2
Rick Craig ∙ [email protected]
ESSENTIAL TEST MANAGEMENT AND PLANNING
Administrivia
• Course timing
• Breaks
Course Agenda
1. The Culture of Testing and Quality
2. Introduction to STEP and Preventive Testing
3. Test Levels
4. Master Test Plan
5. The Test Summary Report
1. THE CULTURE OF TESTING AND QUALITY
What are your problems?
What are your testing challenges?
What is quality?
• Meeting requirements (stated and/or implied)
What is testing?
• The process of determining conformance to requirements—stated and/or implied
Class Questionnaire
• The overall quality of the software systems/products at my organization is:
– Outstanding - one of the best
– Acceptable - OK
– Poor - must be improved
– Unknown - a mystery to me
Class Questionnaire
• The time, effort, and money that my organization spends trying to achieve high software quality is:
– Too much - needs to be reduced
– About right - OK
– Too little - needs to be increased
– Unknown - a mystery to me
Corporate Culture
• Them and Us vs. One Team
• Early vs. Late
Economics of Test and Failure
• The cost of testing
• The cost of failure
• The savings of preventive testing and defect prevention

[Chart: Cost to correct a defect (1970s $) rises through the life cycle: Requirements $139, Design $455, Code $977, Test $7,136, Production $14,102]
(TRW, IBM, and Rockwell Landmark Study)
Software Psychology
What is “good enough”?
[Chart: # of bugs found over time]
2. INTRODUCTION TO STEP AND PREVENTIVE TESTING
Select a Method/Process
• Systematic Test and Evaluation Process (STEP™) – SQE
• MS Test Process – Microsoft
• TMap®
• Exploratory Testing

[Graphic: example testing methodologies]
STEP™
• Level Plans
– Acceptance
– System
– Integration
– Unit

[Diagram: Project testing is divided into Unit, Integration, System, and Acceptance levels; each level proceeds through Plan, Acquire (Analyze, Design, Implement), and Measure activities.]
STEP™ Activities
Plan strategy
• P1. Establish master test plan
• P2. Develop detailed test plans

Acquire testware
• A1. Analyze test objectives
• A2. Design tests
• A3. Implement plans and designs

Measure software and testware
• M1. Execute tests
• M2. Check test set adequacy
• M3. Evaluate software and process
Level Timing

[Diagram: Timeline from project plan through requirements specification, high-level design, detailed design, and implementation. Master test planning begins at the project plan; Acceptance and System level plans begin during requirements and high-level design, Integration and Unit plans during detailed design. Each level proceeds through Plan, Acquire, and Measure, guided by methodology, standards, and guidelines.]
Preventive Testing
• Testing begins early (i.e., during requirements) and test cases are used as requirements models
• Testware design leads to software design
• Defects are detected earlier or prevented altogether
• Defects are analyzed systematically
• Testers and developers work together
Opportunities for Defects
Business Needs
Requirements
Design
Coding
Installation
Operation
Testers Improve Requirements
• Testers ask:
– What does this mean?
– Are there any unspecified situations?
– How can this requirement be adequately demonstrated?
– What problems might occur?
• Tests define and help specify requirements
– “A function/method or constraint specification is unfinished until its test is defined”
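The idea that a specification is unfinished until its test is defined can be sketched in code. The lockout requirement, the `Account` class, and the three-attempt limit below are all hypothetical, invented only to show how writing the test first surfaces unspecified situations:

```python
# Hypothetical requirement: "An account locks after three failed logon attempts."
# Writing the test first forces unspecified details (exactly three? does a
# success reset the count?) to be decided before the software is designed.

class Account:
    """Minimal model used only to make the requirement executable."""
    MAX_FAILURES = 3

    def __init__(self):
        self.failures = 0
        self.locked = False

    def logon(self, password_ok: bool) -> bool:
        if self.locked:
            return False
        if password_ok:
            self.failures = 0          # assumed: a success resets the count
            return True
        self.failures += 1
        if self.failures >= self.MAX_FAILURES:
            self.locked = True         # third consecutive failure locks the account
        return False

def test_lockout_after_three_failures():
    acct = Account()
    for _ in range(3):
        assert not acct.logon(password_ok=False)
    assert acct.locked                        # the requirement is now demonstrable
    assert not acct.logon(password_ok=True)   # locked even with a good password
```

Deciding whether a successful logon resets the counter is exactly the kind of “unspecified situation” that defining the test forces out into the open.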
Two Views
Food and Drug Administration (FDA): “You CAN’T test quality into your software.”
SQE: “You MUST test quality into your software.”
3. TEST LEVELS
What is a Level?
A level is defined by the environment.
An environment is the collection of:
• Hardware
• System software
• Application software
• Documentation
• People
• Interfaces
• Data
The “V” Model

[Diagram: Requirements pairs with Acceptance testing, high-level design with System testing, detailed design with Integration testing, and coding with Unit testing; planning for each test level begins at the corresponding development phase.]
What determines the number of levels?
• Risk
• Politics
• Organization
• Objectives
Acceptance Testing
Objective: To demonstrate a system’s readiness for operational use and customer acceptance
• Focuses on user requirements
• Follows systems testing and is expected to work
• Is a final stage of confidence building
• Provides protection to ensure production readiness
• Allows customers to “sign off” on the product
Ends: When the system is approved for use
System Testing
Objective: To develop confidence that a system is ready for acceptance testing. It forms the basis of a regression test set and often represents the bulk of the testing effort.

• Most comprehensive testing
• Usually tests software design and requirements
• Usually consumes most test resources
Ends: When a system is turned over for acceptance testing or moved into production
Integration Testing
Objective: To progressively test the interfaces between units and modules
• Focuses on interfaces• Can be top‐down, bottom‐up, or functional
Ends: When the entire system has been integrated and stability has been demonstrated
Unit Testing
Objective: To determine that each unit meets its requirements (program specification) and that internal consistency has been achieved
• Can be black‐box as well as white‐box• Primary means of code coverage
Ends: When each unit has met its exit criteria
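The black-box/white-box distinction at the unit level can be sketched with a small example. The leap-year routine below is a stand-in, not part of the course material:

```python
# A minimal sketch of unit-level testing for a hypothetical leap-year routine.
def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Black-box cases come from the specification (the divisibility rules),
# without looking at the code:
def test_spec_cases():
    assert is_leap_year(2024)        # divisible by 4
    assert not is_leap_year(1900)    # century year, not divisible by 400
    assert is_leap_year(2000)        # divisible by 400
    assert not is_leap_year(2023)    # not divisible by 4

# White-box view: the three boolean conditions above define the branches a
# coverage tool would ask us to exercise; the four cases together do so.
```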
4. MASTER TEST PLAN
Master Test Plan

[Diagram: The Master Test Plan sits above the Acceptance, System, Integration, and Unit test plans, which in turn drive the test cases.]
Process vs. Documentation
The process may be more important than the product (i.e., paper)
Audience
Who will read a Master Test Plan?
IEEE Style Master Test Plan (2008)
1. Introduction
– Document identifier
– Scope
– References
– System overview and key features
– Test overview
• Organization
• Master test schedule
• Integrity level schema
• Resources summary
• Responsibilities
• Tools, techniques, methods, and metrics
2. Details of the MTP
– Test processes, including definition of test levels
• Management
• Acquisition
• Supply
• Development
• Operation
• Maintenance
– Test documentation requirements
– Test administration
– Test reporting
3. General
– Glossary
– Document change procedures and history
Master Test Plan Sections
1. Test Plan Identifier
2. Introduction
3. Test Items
4. Features to be tested
5. Features not to be tested
6. Software Risks
7. Planning Risks and Contingencies
8. Approach
9. Item Pass/Fail Criteria
10. Suspension Criteria/Resumption Requirements
11. Test Deliverables
12. Testing Tasks
13. Environmental Needs
14. Responsibilities
15. Staffing and Training Needs
16. Schedule
17. Approvals
1. Test Plan Identifier / 2. Introduction
• Test Plan Identifier
• Introduction– Scope of project– Scope of plan (functional)
3. Test Items
• What is to be tested, from the library management perspective
• Which build/version of:
– Application software
– Documentation
– Databases
– etc.

[Graphic: a build labeled “Version 2.1”]
4. Features to Be Tested / 5. Not to Be Tested
To test or not to test? That is the question!
6. Risk Analysis
Two kinds of risk:
• Software/product risk → drives testing priority
• Project/planning risk → drives contingencies and risk mitigation
6. Example Software/Product Risks
• Scalability
• Scope of use
• Environment
• Accessibility
• Functional complexity
• Performance
• Reliability
• Third‐party products
• Usability
• Interface complexity
• Technical complexity
• Data integrity
• Recoverability
• New technology
6. Inventory Features and Attributes
Features
• Upload a file
• Add a record
• Logon to the system
• Update a user profile
• Generate a report

Attributes
• Accessibility
• Reliability
• Security
• Performance
• Backward compatibility
6. Risk Likelihood and Impact
Likelihood:
• High
• Medium
• Low

Impact:
• High
• Medium
• Low
6. Initial Risk Priority
Likelihood × Impact = Risk (Test) Priority (a proxy for potential $ loss)
High (3), Medium (2), Low (1)

                  Impact
              Low (1)  Med (2)  High (3)
Likelihood
  High (3)       3        6        9
  Medium (2)     2        4        6
  Low (1)        1        2        3
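The matrix can be sketched as code, assuming the High=3/Medium=2/Low=1 mapping shown on the slide; the three inventory rows are taken from the course's web-site example:

```python
# Sketch of the slide's scoring scheme: priority = likelihood x impact,
# with the assumed mapping High=3, Medium=2, Low=1.
SCORE = {"High": 3, "Medium": 2, "Low": 1}

def risk_priority(likelihood: str, impact: str) -> int:
    return SCORE[likelihood] * SCORE[impact]

# A few attributes from the web-site risk inventory example:
inventory = [
    ("Spelling", "High", "Low"),
    ("Slow performance", "High", "High"),
    ("Does not work with Browser X", "Medium", "High"),
]

# Rank the inventory highest priority first, as on the sorted slide.
ranked = sorted(inventory,
                key=lambda r: risk_priority(r[1], r[2]),
                reverse=True)
```

With these inputs, “Slow performance” (High × High = 9) ranks first, matching the sorted inventory slide.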
6. Risk Inventory ― Example

Web Site Attribute             Likelihood  Impact  Priority
Spelling                       High        Low     3
Invalid mail-to                Low         Medium  2
Email viruses                  Medium      Medium  4
Wrong tel #s                   Low         High    3
Slow performance               High        High    9
Poor usability                 Medium      Medium  4
Ugly site                      Low         Medium  2
Does not work with Browser X   Medium      High    6
Hacker spam attack             Low         Medium  2
Site intrusion                 Low         High    3
6. Risk Inventory ― Example (cont.)

Web Site Attribute             Likelihood  Impact  Priority
Slow performance               High        High    9
Does not work with Browser X   Medium      High    6
Poor usability                 Medium      Medium  4
Email viruses                  Medium      Medium  4
Spelling                       High        Low     3
Wrong tel #s                   Low         High    3
Site intrusion                 Low         High    3
Invalid mail-to                Low         Medium  2
Ugly site                      Low         Medium  2
Hacker spam attack             Low         Medium  2
7. Class Questionnaire
What are your greatest planning risks?
1) _________________________________
2) _________________________________
3) _________________________________
4) _________________________________

What are your contingencies?
1) _________________________________
2) _________________________________
3) _________________________________
4) _________________________________
7. Example Planning Risk Checklist
• Delivery dates
• Staff availability
• Lack of product requirements
• Budget
• Environment options
• Tool inventory
• Acquisition schedule
• Participant buy-in/marketing
• Training needs
• Scope of testing
• Risk assumptions
• Usage assumptions
• Resource availability
7. Contingencies
Contingency Options:
+ Time
- Scope/Size
+ Resources
- Quality/Risk
8. Approach / Strategy
The set of directions that has a major impact on the effectiveness or efficiency of the testing effort (i.e., forks in the road)
8. Methodology (Political) Decisions
• When will testers become involved in the project?
• What testing levels will be employed?
– Acceptance
– System
– Integration
– Unit
• How many (if any) beta sites will be used?
• Will there be a pilot?
• What testing techniques will be utilized?
– Inspections
– Walkthroughs
8. Entrance and Exit Criteria Decisions
• What criteria will be used?
• Are the criteria flexible/negotiable?
• Who decides?
8. Smoke Tests as Entrance Criteria
Smoke tests are often used to evaluate whether or not a release is REALLY ready for system testing.
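What a smoke test actually checks is project-specific; the sketch below is a hypothetical example (the `create_app` factory and both checks are invented) of encoding “REALLY ready” as a handful of broad, shallow checks run before committing the team to full system testing:

```python
# Hypothetical smoke test: broad, shallow checks only - does the release
# come up at all, not whether every feature works.
def create_app():
    """Stand-in for the real application factory."""
    return {"db": "connected", "version": "2.1"}

def smoke_test(app) -> list:
    """Return a list of failure descriptions; empty means the smoke test passed."""
    failures = []
    if app.get("db") != "connected":
        failures.append("database unreachable")
    if not app.get("version"):
        failures.append("version missing")
    return failures
```

The entrance criterion is then mechanical: the release enters system testing only if `smoke_test` returns no failures.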
8. Test Coverage Decisions
What strategy should be used for determining “where to make the cut”?
• Functional coverage should be measured
– For every test domain
• Program-based coverage should be used
– On all code written (given sufficient resources)
– As a criterion for finishing unit testing
8. S/W Config Management Decisions
• Who owns/is responsible for each component?
• How many environments will be used, and what will each be used for?
• How often will the SUT be updated?
• How much regression testing is required for each build?
– Full
– Partial
• Who controls changes to the test environment?
8. Full Regression Strategy ― Arguments

For:
• Easy to manage (one test set per level)
• Lowest risk

Against:
• Heavy and often impractical resource requirements (time and dollars)

Full regression usually requires maintenance of full, effective test sets.
8. Partial Regression Strategy ― Arguments

For:
• Solves execution resource problem
• Only viable strategy for an emergency

Against:
• Must develop and maintain adequate relationship information between functions and tests and functions and components
• Must know which components have changed
• Must decide which features need to be re‐tested and assemble subsets
• Adds risk for features not re‐tested
8. Test Environment(s) Decisions
• Will multiple platforms be needed?
• What hardware will be used?
• Data
– Where will it come from?
– How much volume will be needed?
– How fresh (fragility) must it be?
– How inter-dependent (referential integrity) does it need to be?
• What user profiles will be used?
• Are simulators or other “test” software required?
8. Automation Decisions
• The four questions
– What?
– When?
– Who?
– How?
Automated Testing
Tools are not THE answer
9. Item Pass/Fail Criteria
• Pass/fail criteria can be expressed using a number of different measurements:
– % of test cases passed/failed
– Number of bugs
• Type of bugs
• Severity of bugs
• Location of bugs

Bugs and/or test cases may be “weighted”
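Weighting can be made concrete with a small sketch; the weights and the threshold mentioned in the comment are hypothetical, not values the course prescribes:

```python
# Hypothetical weighting scheme: each test case carries a weight, and the
# weighted pass rate is the passed weight divided by the total weight.
def weighted_pass_rate(results):
    """results: list of (weight, passed) tuples."""
    total = sum(w for w, _ in results)
    passed = sum(w for w, ok in results if ok)
    return passed / total

# Critical cases weigh more, so a single critical failure hurts the rate
# more than several trivial ones. The item passes or fails by comparing
# this rate against the threshold stated in the plan (e.g., 0.95).
results = [(3, True), (2, True), (1, False)]
```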
10. Suspension and Resumption Criteria
Examples of suspension criteria include:
• Blocking bugs
• Test case failures > 10%
• GUI response > 5 seconds
• Requirements < 80% complete
• Environmental problems
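Thresholds like these can be checked mechanically. The function below is a sketch using the slide's sample numbers, which each project would replace with its own criteria:

```python
# Sketch of automating the example suspension thresholds above (the numbers
# are the slide's examples, not universal limits).
def should_suspend(failure_rate, gui_response_s, req_complete, blocking_bugs):
    """Return the list of tripped suspension criteria (empty = keep testing)."""
    reasons = []
    if blocking_bugs > 0:
        reasons.append("blocking bugs")
    if failure_rate > 0.10:
        reasons.append("test case failures > 10%")
    if gui_response_s > 5:
        reasons.append("GUI response > 5 seconds")
    if req_complete < 0.80:
        reasons.append("requirements < 80% complete")
    return reasons

# Testing suspends when any reason is returned; the matching resumption
# criteria would re-check the same thresholds before restarting.
```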
11. Testing Deliverables
• Test Plan
• Test Design Specification
• Test Case Specification
• Test Procedure Specification
• Test Log
• Test Incident Report
• Test Summary Report
12. Testing Tasks
[Graphic: a to-do list of testing tasks]
13. Environmental Needs
• Hardware configuration
• Data
• Interfaces
• Facilities
• Publications
• Security access
• System software
• Documentation
14. Responsibilities
15. Staffing and Training Needs
Successful Testing Recipe
Take 1 experienced project leader
Add 5 senior testers, plus a handful of novice testers
Throw in liberal amounts of training
Allow to mix and then add tools to taste
Garnish with appropriate amounts of hardware
16. Schedule
A timeline with milestones
16. Class Discussion
Where do milestones come from?
I need it today!
• Marketing/business decision
• Wishful thinking
• Estimates/guesstimates
• Politics
• WAGs
17. Approvals
Who calls the shots?
Users?
Product manager?
Development manager?
Test manager?
Test Summary Report
• Report Identifier
• References
– Test items (with revision #s)
– Environments
– Test Plan
• Variances (Deviations)
– From test plan or requirements
– Reasons for deviations
• Summary of Incidents
– Resolved incidents
– Defect patterns
– Unresolved incidents
• Adequacy Assessment
– Evaluation of coverage
– Identify uncovered attributes
• Summary of Activities
– System/CPU usage
– Staff time
– Elapsed time
• Software Evaluation
– Limitations
– Failure likelihood
• Approvals
Shameless Commercial Message