System Test Planning
SYST TPLAN

Contents
1. Location of Test Planning
2. Responsibilities for Test Planning
3. Results of Test Planning
4. Structure of a Test Plan
5. Test Definitions
6. Identifying Objects to be tested
7. Identifying Functions to be tested
8. Definition of the Test Exit Criteria
9. Definition of the Test Results
10. Definition of the Test Phases
11. Definition of the Test Tasks
12. Definition of the Test Responsibilities
13. Definition of the Test Environment
14. Definition of the Test Resources
15. Test Efforts
16. Test Efforts according to Activity Type
17. Test Efforts according to Phases
18. System Test Effort Estimation
19. Effort for the Calendar Test
20. Test Plan for the Calendar Test
21. Test Planning Metrics
22. Estimating with Test-Points
Location of Test Planning
SYST TPLAN-1

[Figure: the test process from Init to Closure: Test Planning, Test Conception, Test Specification, Test Execution, Test Evaluation, with their respective results: Test Plan, Test Concept, Test Cases, Test Results, Test Documentation.]

Test planning initiates the test project; its result is the Test Plan.
Responsibilities for Test Planning
SYST TPLAN-2

Test planning is shared by the Project Manager, the Test Manager, and the Test Team Manager.
Results of Test Planning
SYST TPLAN-3

Test planning takes standards, requirements, and resources as input and produces the plan for the test project.
Structure of a Test Plan
SYST TPLAN-4

According to ANSI/IEEE Standard 829:

1. Test Plan Identifier
2. Test Plan Introduction
3. Test Objects or Items
4. Test Features
5. Test Restrictions
6. Test Strategy
7. Test Acceptance Criteria
8. Regression Test Criteria
9. Test Deliverables
10. Testing Tasks
11. Test Infrastructure and Environment
12. Test Responsibilities
13. Test Staffing
14. Test Schedule
15. Test Risks and Contingencies
16. Approvals
Test Definitions
SYST TPLAN-5

Test Object: Component or system to be tested.
Test Feature: Feature of the software to be tested.
Acceptance Criteria: Rule for deciding whether a test has passed or not.
Test Case: A set of input values, execution preconditions, expected results, and execution postconditions, developed for a particular test.
Test Case Specification: Document specifying a set of test cases for a test item (objective, inputs, test actions, expected results, preconditions).
Test: A set of logically related test cases executed together in one single session.
Test Scenario: Document specifying a sequence of actions for the test execution.
Test Event: Any unexpected event during the test.
Test Log: Chronological report of details about the test execution.
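The definitions of Test Case and Test Case Specification map naturally onto a small data structure. This is an illustrative sketch; the class and field names are assumptions, not from any standard library:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """Per the definition above: input values, execution preconditions,
    expected results and execution postconditions for one particular test."""
    name: str
    inputs: dict
    preconditions: list = field(default_factory=list)
    expected_results: dict = field(default_factory=dict)
    postconditions: list = field(default_factory=list)

@dataclass
class TestCaseSpecification:
    """Document-like record grouping the test cases for one test item."""
    test_item: str
    objective: str
    cases: list = field(default_factory=list)

# Hypothetical example, loosely based on the ordering-system functions
# shown on a later slide
spec = TestCaseSpecification(
    test_item="OrderingSystem",
    objective="Verify the CheckCustomer function",
    cases=[TestCase(
        name="CheckCustomer-valid",
        inputs={"customer_id": 4711},
        preconditions=["customer 4711 exists in the customer database"],
        expected_results={"status": "accepted"},
    )],
)
```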
Identifying Objects to be tested
SYST TPLAN-6

Job Procedures, Programs, GUIs, Reports, System Imports, System Exports, Databases, Files, Messages
Identifying Functions to be tested
SYST TPLAN-7

Functions of the Ordering System: Check Request, Check Customer, Check Item, Order Item, Create Order Item, Create Reserve, Create Bill, Log Order
Definition of the Test Exit Criteria
SYST TPLAN-8

1. At least 85% of all program branches are tested (branch coverage).
2. At least 90% of the fields of each database or file are confirmed (data field coverage).
3. Each function is tested at least twice (function coverage).
4. At least 99% of all results are checked against the expected values (result coverage).
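The four criteria translate into a simple pass/fail check. A minimal sketch; the function name and argument shapes are illustrative:

```python
def exit_criteria_met(branch_cov, field_cov, min_function_runs, result_cov):
    """Return True when all four exit criteria hold: >= 85% branch coverage,
    >= 90% data field coverage, every function tested at least twice,
    and >= 99% of results checked against expected values."""
    return (branch_cov >= 0.85 and
            field_cov >= 0.90 and
            min_function_runs >= 2 and
            result_cov >= 0.99)

# A run that meets all four criteria
ok = exit_criteria_met(0.88, 0.92, 3, 0.995)
# A run that fails: branch coverage below the 85% threshold
fail = exit_criteria_met(0.80, 0.92, 3, 0.995)
```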
Definition of the expected Results
SYST TPLAN-9

Unit Test: test path report, test result report, test coverage report, error report
Integration Test: test execution report, interface report, test coverage report, error report
System Test: system lists, system export reports, file/database reports, GUI reports
Definition of the Test Phases
SYST TPLAN-10

1. Phase: Unit Test
2. Phase: Integration Test
3. Phase: System Test
4. Phase: Acceptance Test
Definition of the Test Tasks
SYST TPLAN-11

Tasks of the System Test:
- Generation of the Test Databases
- Preparation of the System Inputs
- Preparation of the expected Results
- Description of the System Results
- Setup of the System Environment
Definition of the Test Responsibilities
SYST TPLAN-12

The tasks (Generation of the Test Databases, Setup of the Test Environment, Preparation of the expected Results, Description of the Test Scenarios) are assigned to the team members Lisa, Robert, Eva, and Harry.
Definition of the Test Environment
SYST TPLAN-13

[Figure: the test environment: an IBM 3090 host running CICS, RS 6000 servers connected via DCOM, and Windows NT clients.]
Definition of the Test Resources
SYST TPLAN-14

Human: Tester, User Representative
Software: Test Tools, Operating System, Database System
Hardware: Test Workstations, Calculation Capacity, Disk Capacity
Test Efforts
SYST TPLAN-15

1. Test Design = n1 Hours per Test Cycle / Test Run / Use Case
2. Test Specification = n2 Hours per Test Case
3. Test Execution = n3 Hours per Test Procedure + n4 Hours per Test
4. Test Evaluation = n5 Hours per Test Protocol + n6 Hours per Error Message
5. Test Management = (n1:6) * Overhead Factor
Test Efforts according to Activities
SYST TPLAN-16

[Figure: bar chart of the tester's effort per activity, with each activity's deliverables]
- Test Planning (Test Plan, Test Concept): 8%
- Test Design (Test Scenario, Test Environment): 12%
- Test Specification (Test Cases, Test Procedures): 40%
- Test Execution (Test Results, Test Info): 25%
- Test Evaluation (Test Protocol, Error Messages): 15%
Test Efforts according to Phases
SYST TPLAN-17

[Figure: bar chart of test effort per phase, with the responsible role]
- Test Planning and Design (Documents): 20%
- Unit Test (Modules, Programs; Developer): 36%
- Integration Test (Components, Sub Systems; Developer and Tester): 24%
- System Test (Application Systems; Tester): 20%
System Test Effort Estimation
SYST TPLAN-18

T1 = Number of Transactions or Use Cases
T2 = Estimated Number of Test Cases per Transaction or Use Case
T3 = Estimated Effort per Test Case
S1 = Number of Sub Systems, A1 = Effort per Sub System
S2 = Number of User Interfaces, A2 = Effort per User Interface
S3 = Number of Databases / Files, A3 = Effort per Database
S4 = Number of System Interfaces, A4 = Effort per Interface
QF = Quality Factor = 2 - Quality Grade (0:1)
NR = Number of Repeats
RF = Repeating Factor = 1 - Degree of Automation (%)
OF = Overhead Factor (%)

System Test Effort = {[(T1 * T2 * T3) * QF] * [NR * RF]}
                   + {[(S1*A1) + (S2*A2) + (S3*A3) + (S4*A4)] * (1 + OF)}
Effort for the Calendar Test
SYST TPLAN-19

T1 = 5 Use Cases
T2 = 3 Test Cases per Use Case
T3 = 0.5 Hours per Test Case, Quality Grade = 0.5
S1 = 1 Sub System, A1 = 8 Hours, QF = 1.50
S2 = 1 User Interface, A2 = 4 Hours, NR = 3 Repetitions
S3 = 1 Database, A3 = 12 Hours, RF = 0.50
OF = 0.30

Calendar Testing Effort = {[(5 * 3 * 0.5) * 1.5] * [3 * 0.50]} + {[8 + 4 + 12] * (1 + 0.3)}
Calendar Testing Effort = {[7.5 * 1.5] * 1.5} + {24 * 1.3}
Calendar Testing Effort = {11.25 * 1.5} + {31.2}
Calendar Testing Effort = 16.875 + 31.2 = 48.075
Calendar Testing Effort = 48 Hours (rounded)
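The estimation formula and the calendar figures translate directly into code. A sketch (the function name is illustrative) that reproduces the 48-hour estimate:

```python
def system_test_effort(T1, T2, T3, QF, NR, RF, subsystem_efforts, OF):
    """System Test Effort per the SYST TPLAN-18 formula:
    {[(T1*T2*T3)*QF] * [NR*RF]} + {sum(Si*Ai) * (1 + OF)}.
    subsystem_efforts collects the Si*Ai products (subsystems,
    user interfaces, databases, system interfaces)."""
    dynamic_part = (T1 * T2 * T3) * QF * (NR * RF)
    static_part = sum(subsystem_efforts) * (1 + OF)
    return dynamic_part + static_part

# Calendar test: 5 use cases, 3 test cases each, 0.5 h per test case,
# QF = 2 - 0.5 = 1.5, 3 repetitions, RF = 0.5,
# subsystem/GUI/database efforts of 8 + 4 + 12 hours, OF = 0.3
effort = system_test_effort(5, 3, 0.5, QF=1.5, NR=3, RF=0.5,
                            subsystem_efforts=[8, 4, 12], OF=0.3)
# 16.875 + 31.2 = 48.075 hours, which the slide rounds to 48
```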
Test Plan for the Calendar Test
SYST TPLAN-20

Test Goal: Confirm that each employee can manage a calendar with 52 weeks, 7 days per week, and 12 activities per day.
Test Objects: A calendar GUI and a database with 4 objects: Calendar, Weeks, Days, Activities.
Test Environment: Client/server, with the calendar on the server.
Test Strategy: Use-case oriented.
Test Phases: Test Design, Test Case Specification, Test Execution, Test Evaluation.
Use Cases: Create Calendar, Create Week, Create Day, Insert and Delete Activities, Delete Calendar.
Test Amount: Execute all use cases with all paths, trigger all error messages, try to cause all system breakdowns.
Test Effort: 48 Person Hours.
Concept Quantity Metrics
Number of Text Documents analyzed: 1
Number of Text Lines analyzed: 575
Number of Text Titles analyzed: 9
Number of Text Segments analyzed: 25
Number of Sentences contained in texts: 396
Number of Bullet Points contained in texts: 77
Number of Words used in texts: 1859
Number of Nouns used in texts: 371
Number of Verbs used in texts: 413
Number of Conditions used in texts: 34
Number of Requirements specified: 16
Number of User Interfaces specified: 8
Number of User Reports specified: 3
Number of Objects specified: 105
Number of Objects referenced: 287
Number of Predicates specified: 58
Number of numeric Constants specified: 51
Number of Literals specified: 0
Number of States identified: 16
Number of Actions identified: 93
Number of Rules identified: 50
Number of System Actors specified: 8
Number of Use Cases specified: 6
Number of Use Case Triggers specified: 1
Number of Use Case Steps specified: 39
Number of Use Case Preconditions specified: 6
Number of Use Case Postconditions specified: 6
Number of Use Case Exceptions specified: 6
Number of Use Case Attributes specified: 120
Number of Test Cases extracted: 167

Concept Size Metrics
Number of Function-Points: 211
Number of Data-Points: 884
Number of Object-Points: 682
Number of Use Case Points: 146
Number of Test Case Points: 201
Requirement Metrics from the Text Analysis
SYST TPLAN-21
Test Metric Report
Product: TESTPROD, System: TESTSYS, Date: 16.12.06

Absolute Counts:
Number of Test Cases specified: 167
Number of Test Cases executed: 122
Number of Code Modules: 19
Number of Code Statements: 3551
Number of Methods & Procedures coded: 201
Number of Methods & Procedures tested: 172

Number of Defects predicted: 20
Number of Defects in total: 14
Number of Critical Defects (weight 8): 0
Number of Severe Defects (weight 4): 2
Number of Major Defects (weight 2): 3
Number of Medium Defects (weight 1): 5
Number of Minor Defects (weight 0.5): 4
Number of Weighted Defects (Weighted Count): 21

Relational Scales:
Defect Density Rate: 0.0039
Weighted Defect Density Rate: 0.0059
Case Coverage Rate: 0.730
Code Coverage Rate: 0.855
Test Coverage Rate: 0.624
Defect Coverage Rate: 0.700
Remaining Error Probability: 0.002
Weighted Error Probability: 0.003
System Trust Coefficient: 0.878
Test Effectiveness Coefficient: 0.795
Test Metrics from the Test Documentation
SYST TPLAN-22
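The relational metrics in the report appear to be ratios of the absolute counts, printed with truncation rather than rounding (122/167 = 0.7305... appears as 0.730). A sketch reproducing six of them under that assumption; the interpretation of Test Coverage as the product of case and code coverage is also an assumption:

```python
import math

def trunc(x, digits):
    """Truncate (not round) x to the given number of decimal digits,
    matching how the report seems to print its ratios. The tiny epsilon
    guards against floating-point artifacts like 0.7 * 1000 == 699.99..."""
    factor = 10 ** digits
    return math.floor(x * factor + 1e-9) / factor

# Absolute counts from the report
cases_specified, cases_executed = 167, 122
statements = 3551
methods_coded, methods_tested = 201, 172
defects_total, defects_weighted, defects_predicted = 14, 21, 20

defect_density = trunc(defects_total / statements, 4)        # 0.0039
weighted_density = trunc(defects_weighted / statements, 4)   # 0.0059
case_coverage = trunc(cases_executed / cases_specified, 3)   # 0.730
code_coverage = trunc(methods_tested / methods_coded, 3)     # 0.855
defect_coverage = trunc(defects_total / defects_predicted, 3)  # 0.700
# Assumption: test coverage as case coverage times code coverage
test_coverage = trunc(case_coverage * code_coverage, 3)      # 0.624
```

The weighted defect count also checks out against the severity weights: 0*8 + 2*4 + 3*2 + 5*1 + 4*0.5 = 21.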
Role of Testing in the System Life Cycle
TEST TPLN-24

[Figure: V-model from problem to solution. Development phases on the left (System Analysis with Requirements, System Specification, System Design with UML Diagrams, Code Production) are matched to test levels on the right (Acceptance Test, System Test, Program/Integration Test, Module Test), spanning black-box, grey-box, and white-box testing. Static analysis applies to the specifications, dynamic analysis to the tests; test cases and test data are derived from the specifications.]