The TestSPICE PAM

Process Assessment Model

Title: TestSPICE - Process Assessment Model

Author(s): TestSPICE SIG

Version 3.0

Date: 2014-10-10

Confidentiality: Public

Status: Released

Table of Contents

The TestSPICE PAM .......................................................................................................................................... 1

1 About this Document ................................................................................................................................ 6

2 Scope ....................................................................................................................................................... 7

2.1 Introduction .................................................................................................................................... 7

2.2 Definitions ...................................................................................................................................... 7

2.3 Warning .......................................................................................................................................... 7

3 Statement of compliance .......................................................................................................................... 8

4 Acknowledgements .................................................................................................................................. 8

5 Process Assessment Model ..................................................................................................................... 8

5.1 Purpose .......................................................................................................................................... 8

5.2 Introduction .................................................................................................................................... 9

5.3 Process Dimension ........................................................................................................................ 9

5.3.1 Business Life Cycle Processes Category ..................................................................... 11

5.3.2 Technical Life Cycle Processes Category .................................................................... 13

5.4 Capability dimension .................................................................................................................... 15

5.5 Mapping ....................................................................................................................................... 17

5.6 Assessment Indicators ................................................................................................................. 18

5.7 Process Capability Indicators ....................................................................................................... 20

5.8 Process Performance Indicators .................................................................................................. 21

5.9 Measuring process capability ....................................................................................................... 22

6 Process Performance Indicators (level 1) .............................................................................................. 23

6.1 Agreement Processes for Testing Services (AGT) ...................................................................... 23

6.1.1 AGT.1 Testing Service Acquisition ............................................................................... 23

6.1.2 AGT.1a Acquisition Preparation ................................................................................... 24

6.1.3 AGT.1b Supplier Selection ........................................................................................... 26

6.1.4 AGT.1c Contract Agreement ........................................................................................ 28

6.1.5 AGT.1d Testing Service Monitoring ............................................................................. 30

6.1.6 AGT.1e Testing Service Acceptance ........................................................................... 32

6.1.7 AGT.2 Testing Service Supply ..................................................................................... 33

6.1.8 AGT.2a Test Supplier tendering ................................................................................... 33

6.1.9 AGT.2b Testing Service Delivery ................................................................................. 35

6.1.10 AGT.2c Testing Service / Test Product Acceptance Support ...................................... 37

6.2 Testing Process Group (TST) ...................................................................................................... 38

6.2.1 TST.1 Provision of required Test Inputs ....................................................................... 38

6.2.2 TST.2 Test Analysis & Design ...................................................................................... 41

6.2.3 TST.3 Test Realization and Execution ......................................................................... 44

6.2.4 TST.4 Test Results Analysis and Reporting ................................................................. 46

6.3 Test Process Management Process Group (TPM) ...................................................................... 47

6.3.1 TPM.1 Organizational Test Strategy Management ...................................................... 47

6.3.2 TPM.1a Organizational Test Strategy Development .................................................... 48

6.3.3 TPM.1b Organizational Test Strategy Deployment ...................................................... 50

6.3.4 TPM.2 Test Requirements Analysis ............................................................................. 52

6.3.5 TPM.3 Test Planning .................................................................................................... 54

6.3.6 TPM.4 Test Monitoring and Control ............................................................................. 58

6.3.7 TPM.5 Test Closing and Reporting .............................................................................. 60

6.4 Test Regression Reuse and Maintenance Process Group (TRM) .............................................. 62

6.4.1 TRM.1 Test Asset Management ................................................................................... 62

6.4.2 TRM.2 Test Work Products Reuse Management......................................................... 64

6.4.3 TRM.3 Regression Test Management ......................................................................... 65

6.4.4 TRM.4 Testware Maintenance ..................................................................................... 67

6.5 Test Environment Management Process Group (TEM)............................................................... 69

6.5.1 TEM.1 Test Environment Requirements Analysis ........................................................ 69

6.5.2 TEM.2 Test Environment Design and Configuration Planning ..................................... 71

6.5.3 TEM.3 Test Environment Assembly ............................................................................. 73

6.5.4 TEM.4 Test Environment Testing ................................................................................. 75

6.5.5 TEM.5 Test Environment Operation ............................................................................. 77

6.5.6 TEM.6 Test Environment User Support ....................................................................... 78

6.5.7 TEM.7 Test Environment Disassembly ........................................................................ 81

6.6 Test Data Management Process Group (TDM) ........................................................................... 82

6.6.1 TDM.1 Test Data Requirements Management ............................................................. 83

6.6.2 TDM.2 Test Data Provision Planning ........................................................................... 85

6.6.3 TDM.3 Test Data Set Up .............................................................................................. 87

6.7 Test Automation Process Group (TAU) ....................................................................................... 89

6.7.1 TAU.1 Test Automation Needs & Requirements Elicitation ......................................... 89

6.7.2 TAU.2 Test Automation Design .................................................................................... 91

6.7.3 TAU.3 Test automation Implementation ....................................................................... 92

6.7.4 TAU.4 Test Case Implementation ................................................................................ 93

6.7.5 TAU.5 Test Automation Usage ..................................................................................... 94

6.7.6 TAU.6 Test Automation process monitoring ................................................................. 96

7 Process Capability Indicators (level 1 to 5) ............................................................................................ 98

7.1 Level 1: Performed process ......................................................................................................... 98

7.1.1 PA 1.1 Process performance attribute. ........................................................................ 98

7.1.1.1 Generic Practices for PA 1.1 ........................................................................................ 98

7.1.1.2 Generic Resources for PA 1.1 ...................................................................................... 98

7.2 Level 2: Managed process ........................................................................................................... 98

7.2.1 PA 2.1 Performance management attribute ................................................................. 99

7.2.1.1 Generic Practices for PA 2.1 ........................................................................................ 99

7.2.1.2 Generic Resources for PA 2.1 .................................................................................... 100

7.2.2 PA 2.2 Work product management attribute .............................................................. 100

7.2.2.1 Generic Practices for PA 2.2 ...................................................................................... 100

7.2.2.2 Generic Resources for PA 2.2 .................................................................................... 101

7.3 Level 3: Established process ..................................................................................................... 101

7.3.1 PA 3.1 Process definition attribute ............................................................................. 101

7.3.1.1 Generic Practices for PA 3.1 ...................................................................................... 102

7.3.1.2 Generic Resources for PA 3.1 .................................................................................... 102

7.3.2 PA 3.2 Process deployment attribute ......................................................................... 103

7.3.2.1 Generic Practices for PA 3.2 ...................................................................................... 103

7.3.2.2 Generic Resources for PA 3.2 .................................................................................... 104

7.4 Level 4: Predictable process ...................................................................................................... 104

7.4.1 PA 4.1 Process measurement attribute ...................................................................... 104

7.4.1.1 Generic Practices for PA 4.1 ...................................................................................... 104

7.4.2 PA 4.2 Process control attribute ................................................................................. 106

7.4.2.1 Generic Practices for PA 4.2 ...................................................................................... 106

7.4.2.2 Generic Resources for PA 4.2 .................................................................................... 106

7.5 Level 5: Optimizing process ....................................................................................................... 107

7.5.1 PA 5.1 Process innovation attribute ........................................................................... 107

7.5.1.1 Generic Practices for PA 5.1 ...................................................................................... 107

7.5.1.2 Generic Resources for PA 5.1 .................................................................................... 108

7.5.2 PA 5.2 Process optimization attribute ........................................................................ 108

7.5.2.1 Generic Practices of PA 5.2 ....................................................................................... 108

7.5.2.2 Generic Resources for PA 5.2 .................................................................................... 109

8 Annex A: Conformity of the Process Assessment Model..................................................................... 110

9 Annex B: Work product characteristics ................................................................................................ 114

10 Annex C: Terminology .......................................................................................................................... 190

11 Annex D: Key Concepts Schematic ..................................................................................................... 191

12 Annex E: Bidirectional Traceability ....................................................................................................... 191

13 Annex F: Reference Standards and Relevant Documents .................................................................. 192

14 Annex G: Usage of TestSPICE in multiple test stages ........................................................................ 193

15 Annex H: TestSPICE Support for agile projects .................................................................................. 196

15.1.1 AMP.1 Backlog Management ..................................................................................... 196

15.1.2 AMP.2 Impediment Management ............................................................................... 198

15.1.3 AMP.3 Service Class and WIP Limit Management .................................................... 200

15.1.4 AMP.4 Technical debt Management .......................................................................... 202

15.1.5 AMP.5 Knowledge debt Management ........................................................................ 203

15.1.6 AMP.6 Definition of Done (DoD) Management .......................................................... 206

15.1.7 AMP.7 Organizational Capacity Management ........................................................... 208

16 Annex I: Planning and delivering technical artifacts ............................................................................. 210

17 Annex J: Integrated Scoping ................................................................................................................ 210

18 Annex K: Mapping from TestSPICE to ISO/IEC 29119 ....................................................................... 210

19 Annex L: Exemplar Test Strategies ...................................................................................................... 210

20 Annex M: Interpretation of TestSPICE Practices and Outcomes for Domains .................................... 210

1 About this Document

This document “TestSPICE PAM” describes the TestSPICE Process Assessment Model and the TestSPICE Process Reference Model (PRM), which is included in this PAM (new in Version 3.0).

This document reproduces relevant material from ISO/IEC 15504:2003 Information Technology – Process Assessment – Part 2: Performing an assessment and ISO/IEC 15504:2006 Information Technology – Process Assessment – Part 5: An exemplar Process Assessment Model.

ISO/IEC 15504 Part 2 provides the following copyright release:

‘Users of this part of ISO/IEC 15504 may freely reproduce relevant material as part of any Process Assessment Model, or as part of any demonstration of conformance with this international standard, so that it can be used for its intended purpose.’

ISO/IEC 15504 Part 5 provides the following copyright release:

‘Users of this part of ISO/IEC 15504 may freely reproduce the detailed descriptions contained in the exemplar assessment model as part of any tool or other material to support the performance of process assessments, so that it can be used for its intended purpose.’

Permission has been obtained from ISO to incorporate the relevant material under the copyright release notice.

Distribution

The TestSPICE PAM may be distributed under the following conditions:

The document must be distributed in whole as-is and at no cost.

Derivative Works

You may not alter, transform, or build upon this work without the prior consent of the SIG Partners. Such consent may be given provided ISO copyright is not infringed.

The detailed descriptions contained in this document may be incorporated as part of any tool or other material to support the performance of process assessments, so that this Process Assessment Model can be used for its intended purpose, provided that any such material is not offered for sale.

For further information about TestSPICE visit:

- www.testspice.info

- www.sqs-group.com

- www.intacs.info

- www.spiceusergroup.com

The TestSPICE SIG:
c/o The SQS-Group
Stollwerckstr. 11
51149 Köln
Germany

The SPICE User Group:
6 Wilmslow Road, Unit 50
Manchester M14 5TD
United Kingdom

intacs:
David-Gilly-Str. 1
D-14469 Potsdam
Germany

2 Scope

The scope of the TestSPICE PAM covers Software & System Testing and corresponding testing services.

2.1 Introduction

The TestSPICE Process Assessment Model (PAM) has been developed on the basis of more than 25 years of experience and the expertise of more than 1,500 consultants in software and systems testing.

The TestSPICE Process Assessment Model (PAM) is available for use when performing conformant assessments of the test process capability in accordance with the requirements of ISO/IEC 15504-2.

The TestSPICE Process Reference Model (PRM) is used in conjunction with the TestSPICE Process Assessment Model (PAM) when performing an assessment.

The TestSPICE Process Reference Model (PRM), which is included in this PAM (since Version 3.0), is derived from practical experience based on the ISTQB syllabus.

The FULL scope of TestSPICE contains all the processes from the TestSPICE Process Reference Model (PRM).

This TestSPICE Process Assessment Model contains a set of indicators to be considered when interpreting the intent of the TestSPICE Process Reference Model. These indicators may also be used when implementing a process improvement program following an assessment.

2.2 Definitions

PAM Process Assessment Model

PRM Process Reference Model

SIG Special Interest Group

SPICE Software Process Improvement and Capability Determination

ISTQB International Software Testing Qualifications Board

2.3 Warning

This document is subject to revision.

3 Statement of compliance

The TestSPICE Process Assessment Model is conformant with the ISO/IEC 15504-2 requirements for a Process Assessment Model, and can be used as the basis for conducting an assessment of process capability.

A statement of compliance of the Process Assessment Model with the requirements of ISO/IEC 15504-2: 2003 can be found in Annex A.

4 Acknowledgements

The TestSPICE SIG would like to extend thanks to the reviewers of TestSPICE, who so generously contributed to this PAM:

Contributor Country

Antonio Amescua Spain

Ramiro Carballo Spain

Vito Antonio Coletta Italy

Martin Dienelt Germany

Damjan Ekert Austria

Jörn Johansen Denmark

Norbert Kastner Germany

Thomas Kömmerling Germany

Wolfgang Kroworsch Germany

Özden Özcan Top Turkey

Susumu Sasabe Japan

Ana Sanz Esteban Spain

Mario Winter Germany

5 Process Assessment Model

5.1 Purpose

The purpose of the TestSPICE process assessment model is to provide a model for the assessment of software testing processes, aligned with the TestSPICE PRM and the measurement framework of ISO/IEC 15504, thereby providing a consistent view of process capability for development (as described in ISO/IEC 15504 Part 5) and testing (as described in this PAM).

5.2 Introduction

The TestSPICE Process Assessment Model (PAM) comprises a set of assessment indicators of process performance and process capability. The indicators are used as a basis for collecting the objective evidence that enables an assessor to assign ratings.

The TestSPICE Process Reference Model (PRM), with the associated process attributes defined in ISO/IEC 15504-2, provides a common basis for performing assessments of process capability, allowing for the reporting of results using a common rating scale.

The Process Assessment Model defines a two-dimensional model of process capability. In one dimension, the process dimension, the processes are defined and classified into process categories. Within a process category, processes are grouped into process groups at a second level according to the type of activity they address.

In the other dimension, the capability dimension, a set of process attributes grouped into capability levels is defined. The process attributes provide the measurable characteristics of process capability.

Figure 1 - Relationship between the Process Assessment Model and its inputs

Figure 1 shows the relationship between the general structure of the Process Assessment Model, ISO/IEC 15504-2 and the TestSPICE Process Reference Model.

5.3 Process Dimension

For the process dimension, the TestSPICE Process Reference Model (PRM) provides the set of processes. The processes are classified into Process Categories and Process Groups.

There are 2 Process Categories:

1) Business Life Cycle Processes

2) Technical Life Cycle Processes.

Each process is described in terms of a purpose statement. These statements contain the unique functional objectives of the process when performed in a particular environment. A list of specific outcomes is associated with each of the process purpose statements, as a list of expected positive results of the process performance.

Figure 2 - Process Categories and Process Groups

TestSPICE V3.0 is designed to be a standalone assessment model, but it is also designed to be used in joint assessments together with processes from ISO/IEC 15504 Part 5 or Part 6. In this sense, TestSPICE can be seen as a complementary set of process groups and processes that can be added to the assessment scope of a development project or organization.

Figure 3 - TestSPICE V3.0 & ISO 15504-5:2012: A joint approach.

5.3.1 Business Life Cycle Processes Category

Figure 4 - TestSPICE Business life cycle process category.

The Business Life Cycle processes category consists of testing processes from the business perspective.

The Business Life Cycle processes category consists of the following groups:

- the Agreement for Testing Services process group;

- the Testing process group;

- the Test Process Management process group;

- the Test Regression Reuse and Maintenance process group.

The Agreement for Testing Services process group (AGT) consists of processes that are performed by the customer, or by the supplier when it in turn acts as a customer for its own suppliers, in order to acquire or supply a testing service.

Any contract performed will be managed by processes in the Test Process Management process group (TPM) and executed by the processes in the Testing process group (TST).

Table 1 - Business Life Cycle Processes – AGT process group

Process Identification   PRM Process name
AGT.1                    Testing Service Acquisition
AGT.1a                   Acquisition Preparation
AGT.1b                   Supplier Selection
AGT.1c                   Contract Agreement
AGT.1d                   Testing Service Monitoring
AGT.1e                   Testing Service Acceptance
AGT.2                    Testing Service Supply
AGT.2a                   Test Supplier Tendering
AGT.2b                   Testing Service Delivery
AGT.2c                   Testing Service / Test Product Acceptance Support

The Testing process group (TST) consists of processes that directly elicit and manage the product or testing requirements, and specify, implement, or maintain the testing of software or systems.

Note: TestSPICE does not contain any test stage (e.g. unit, integration or system testing) or test method (e.g. equivalence classes or boundary values) as a process. The processes of the TST process group are designed to be applied to any test stage or test type. Therefore the processes of the TST process group may be assessed in several instances – one for each test stage or test type that is to be assessed.

For further information see Annex G, Usage of TestSPICE in multiple test stages.

Table 2 - Business Life Cycle Processes – TST process group

Process Identification   Process name
TST.1                    Provision of required Test Inputs (the Test Basis)
TST.2                    Test Analysis & Design
TST.3                    Test Realization and Execution
TST.4                    Test Results Analysis and Reporting

The Test Process Management process group (TPM) consists of processes performed in order to plan, monitor, control and report the testing.

Table 3 - Business Life Cycle Processes – TPM process group

Process Identification   Process name
TPM.1a                   Organizational Test Strategy Development
TPM.1b                   Organizational Test Strategy Deployment
TPM.2                    Test Requirements Analysis
TPM.3                    Test Planning
TPM.4                    Test Monitoring and Control
TPM.5                    Test Closing & Reporting

Note 1: A further set of agile management processes and practices is provided in Annex H “TestSPICE Support for agile projects”.

Note 2: A distinction must be made between the development and deployment of the test strategy process, which is subject to PA 3.1 and PA 3.2, and the test strategy itself as an outcome that has to be developed and deployed.

The Test Regression and Reuse & Maintenance process group (TRM) consists of processes performed in order to systematically exploit reuse opportunities in organizations, to support reuse programs and to manage regression tests.

Table 4 - Business Life Cycle Processes - TRM process group

Process Identification   Process name
TRM.1                    Test Asset Management
TRM.2                    Test Work Products Reuse Management
TRM.3                    Regression Test Management
TRM.4                    Testware Maintenance

Note: There is also an agile management process group available in Annex H. This group is a draft in TestSPICE 3.0; its further handling will be defined based on practical experience from assessments.

5.3.2 Technical Life Cycle Processes Category

Figure 5 - TestSPICE Technical life cycle process category.

The Test Environment Management process group (TEM) consists of processes which directly address definition, planning, set up and support of test environments.

Table 5 - Technical Life Cycle Processes – TEM process group

Process Identification   Process name
TEM.1                    Test Environment Requirements Analysis
TEM.2                    Test Environment Design and Configuration Planning
TEM.3                    Test Environment Assembly
TEM.4                    Test Environment Testing
TEM.5                    Test Environment Operation
TEM.6                    Test Environment User Support
TEM.7                    Test Environment Disassembly

The Test Data Management process group (TDM) consists of processes which directly address the provision and usage of test data.

Table 6 - Technical Life Cycle Processes – TDM process group

Process Identification   Process name
TDM.1                    Test Data Requirements Management
TDM.2                    Test Data Provision Planning
TDM.3                    Test Data Set Up

The Test Automation process group (TAU) consists of processes which directly address definition, planning, set up and support of test automation.

Table 7 - Technical Life Cycle Processes – TAU process group

Process Identification   Process name
TAU.1                    Test Automation Needs & Requirements Elicitation
TAU.2                    Test Automation Design
TAU.3                    Test Automation Implementation
TAU.4                    Test Case Implementation
TAU.5                    Test Automation Usage
TAU.6                    Test Automation Process Monitoring

5.4 Capability dimension

For the capability dimension, the process capability levels and process attributes are identical to those defined in ISO/IEC 15504-2.

Evolving process capability is expressed in the TestSPICE Process Assessment Model in terms of process attributes grouped into capability levels. Process attributes are features of a process that can be evaluated on a scale of achievement, providing a measure of the capability of the test process. They are applicable to all processes of the TestSPICE PRM. Each process attribute describes a facet of the overall capability of managing and improving the effectiveness of a test process in achieving its purpose and contributing to the business goals of the organization. A capability level is a set of process attributes that work together to provide a major enhancement in the capability to perform a process. The levels constitute a rational way of progressing through improvement of the capability of any process and are defined in ISO/IEC 15504-2.

There are six capability levels, incorporating nine process attributes.

Level 0: Incomplete process
The process is not implemented, or fails to achieve its process purpose. At this level, there is little or no evidence of any systematic achievement of the process purpose.

Level 1: Performed process
The implemented process achieves its process purpose.

Level 2: Managed process
The previously described Performed process is now implemented in a managed fashion (planned, monitored and adjusted) and its work products are appropriately established, controlled and maintained.

Level 3: Established process
The previously described Managed process is now implemented using a defined process that is capable of achieving its process outcomes.

Level 4: Predictable process
The previously described Established process now operates within defined limits to achieve its process outcomes.

Level 5: Optimizing process
The previously described Predictable process is continuously improved to meet relevant current and projected business goals.

Within the Process Assessment Model, the measure of capability is based upon the nine process attributes (PA) defined in ISO/IEC 15504-2. Process attributes are used to determine whether a test process has reached a given capability. Each attribute measures a particular aspect of the process capability.

At each level there is no ordering between the process attributes; each attribute addresses a specific aspect of the capability level. The list of process attributes is shown in Table 8.

Table 8 - Capability levels and process attributes

Process Attribute ID   Capability Levels and Process Attributes

                       Level 0: Incomplete process
                       Level 1: Performed process
PA 1.1                 Process performance
                       Level 2: Managed process
PA 2.1                 Performance management
PA 2.2                 Work product management
                       Level 3: Established process
PA 3.1                 Process definition
PA 3.2                 Process deployment
                       Level 4: Predictable process
PA 4.1                 Process measurement
PA 4.2                 Process control
                       Level 5: Optimizing process
PA 5.1                 Process innovation
PA 5.2                 Process optimization

The process attributes are evaluated on a four point ordinal scale of achievement, as defined in ISO/IEC 15504-2. They provide insight into the specific aspects of process capability required to support improvement and capability determination of the test process.
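
As an illustration only, the capability dimension can also be captured as data. The following minimal Python sketch encodes Table 8 and the four-point rating scale; the percentage bands are those defined in ISO/IEC 15504-2 (they are not restated in this PAM), and names such as PROCESS_ATTRIBUTES and rate_attribute are illustrative, not part of TestSPICE.

```python
# Table 8 as data: the nine process attributes and the capability level each belongs to.
# (Level 0, "Incomplete process", has no process attributes.)
PROCESS_ATTRIBUTES = {
    "PA 1.1": ("Process performance", 1),
    "PA 2.1": ("Performance management", 2),
    "PA 2.2": ("Work product management", 2),
    "PA 3.1": ("Process definition", 3),
    "PA 3.2": ("Process deployment", 3),
    "PA 4.1": ("Process measurement", 4),
    "PA 4.2": ("Process control", 4),
    "PA 5.1": ("Process innovation", 5),
    "PA 5.2": ("Process optimization", 5),
}

def rate_attribute(achievement_percent: float) -> str:
    """Map a degree of achievement to the four-point ordinal scale.

    The percentage bands are the ones defined in ISO/IEC 15504-2;
    they are an assumption here, not restated in this PAM."""
    if achievement_percent <= 15:
        return "N"  # Not achieved
    if achievement_percent <= 50:
        return "P"  # Partially achieved
    if achievement_percent <= 85:
        return "L"  # Largely achieved
    return "F"      # Fully achieved

# Example: 90 % achievement of PA 2.1 ("Performance management") rates as "F".
print(PROCESS_ATTRIBUTES["PA 2.1"][0], rate_attribute(90))
```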

5.5 Mapping

The TestSPICE PRM is included in this TestSPICE PAM, so the process description is divided into two parts:

1) “Process Reference” – PRM

2) “Process Indicators” – PAM

The capability levels and the process attributes use the same names, IDs and content as described in ISO/IEC 15504 Part 2.

Indicators are used to provide a more detailed understanding of processes and capability levels.

Note: The next version will include the reference and mapping to ISO/IEC 29119 “Software and Systems Engineering – Software Testing”, Parts 1-4. The basic and fundamental alignment is shown in the following figure:

Figure 6 - Mapping ISO 29119 to TestSPICE

5.6 Assessment Indicators

The Process Assessment Model is based on the principle that the capability of a process can be assessed by demonstrating the achievement of process attributes on the basis of evidence related to assessment indicators.

There are two types of assessment indicators: process capability indicators, which apply to capability levels 1 to 5, and process performance indicators, which apply exclusively to capability level 1.

The process attributes in the capability dimension have a set of process capability indicators that provide an indication of the extent of achievement of the attribute in the instantiated process. These indicators concern significant activities, resources or results associated with the achievement of the attribute purpose by a process.

The process capability indicators are:

- Generic Practice (GP)

- Generic Resource (GR)

- Generic Work Products (GWP)

As additional indicators for supporting the assessment of a process at Level 1, each process in the process dimension has a set of process performance indicators which is used to measure the degree of achievement of the process performance attribute for the process assessed.

The process performance indicators are:

- Base Practice (BP)

- Work Product (WP)

The performance of Base Practices (BPs) provides an indication of the extent of achievement of the process purpose and process outcomes. Work Products (WPs) are either used, produced or both, when performing the process.

Figure 7 - Assessment indicators
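
The indicator taxonomy above lends itself to a small data model. The following Python sketch is illustrative only: IndicatorKind, Evidence and applies_to_level are invented names, and the evidence-record layout is a hypothetical format for assessment notes, not something this PAM prescribes.

```python
from dataclasses import dataclass
from enum import Enum

class IndicatorKind(Enum):
    """The two families of assessment indicators described in section 5.6."""
    # Process capability indicators (capability levels 1 to 5)
    GENERIC_PRACTICE = "GP"
    GENERIC_RESOURCE = "GR"
    GENERIC_WORK_PRODUCT = "GWP"
    # Process performance indicators (capability level 1 only)
    BASE_PRACTICE = "BP"
    WORK_PRODUCT = "WP"

PERFORMANCE_INDICATORS = {IndicatorKind.BASE_PRACTICE, IndicatorKind.WORK_PRODUCT}

@dataclass
class Evidence:
    """One piece of objective evidence mapped onto an indicator (hypothetical record)."""
    indicator_kind: IndicatorKind
    indicator_id: str   # e.g. a GP number or a base practice of a specific process
    source: str         # e.g. "work product examination" or "interview statement"
    note: str

def applies_to_level(kind: IndicatorKind, level: int) -> bool:
    """Performance indicators support level 1 only; capability indicators support levels 1-5."""
    if kind in PERFORMANCE_INDICATORS:
        return level == 1
    return 1 <= level <= 5
```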

The process performance and process capability indicators defined in the Process Assessment Model represent types of objective evidence that might be found in an instantiation of a process and therefore could be used to judge achievement of capability.

The Figure above shows how the assessment indicators are related to process performance and process capability.

5.7 Process Capability Indicators

There are two types of process capability indicators related to levels 1 to 5, as identified in the figure below. They are intended to be applicable to all processes.

Figure 8 - Process capability indicators

All the process capability indicators relate to the process attributes defined in the capability dimension of the Process Assessment Model. They represent the type of evidence that would substantiate judgments of the extent to which the attributes are achieved. Evidence of their effective performance or existence supports the judgment of the degree of achievement of the attribute. The generic practices are the principal indicators of process capability.

Generic Practices (GP) are activities of a generic type and provide guidance on the implementation of the attribute's characteristics. They are designed around the achievement of the process attribute and many of them concern management practices, i.e. practices that are established to support the process performance as it is characterized at level 1.

During the evaluation of process capability, the primary focus is on the instantiation of the generic practices. In general, performance of all generic practices is expected for full achievement of the process attribute.

Generic Resources (GR) are associated resources that may be used when performing the process in order to achieve the attribute. These resources may include human resources, tools, methods and infrastructure.

The availability of a resource indicates the potential to fulfil the purpose of a specific attribute.

Because Level 1 capability of a process is characterized only by the measure of the extent to which the process purpose is achieved, the process performance attribute (PA 1.1) has a single generic practice indicator (GP 1.1.1). In order to support the assessment of PA 1.1 and to amplify the process performance achievement analysis, additional process performance indicators are defined in the Process Assessment Model.

Generic Work Products (GWP) include sets of characteristics that would be expected to be evident in work products of generic types as a result of achievement of an attribute.

The generic work products form the basis for the classification of specific work products defined as process performance indicators.

Generic work products are described in the Work Product section of this PAM (Annex B).

5.8 Process Performance Indicators

There are two types of process performance indicators:

- Base Practices (BP) and

- Work Products (WP).

Process performance indicators relate to individual processes defined in the process dimension of the Process Assessment Model and are chosen to explicitly address the achievement of the defined process purpose.

Evidence of performance of the base practices, and the presence of work products with their expected work product characteristics, provide objective evidence of the achievement of the purpose of the process.

A Base Practice is an activity that addresses the purpose of a particular process. Consistently performing the base practices associated with a process will help the consistent achievement of its purpose. A coherent set of base practices is associated with each process in the process dimension. The base practices are described at an abstract level, identifying "what" should be done without specifying "how".

Implementing the base practices of a process should achieve the basic outcomes that reflect the process purpose. Base practices represent only the first step in building process capability; however, they represent the unique, functional activities of the process, even if that performance is not systematic. The performance of a process produces work products that are identifiable and usable in achieving the purpose of the process. In this assessment model, each work product has a defined set of example work product characteristics that may be used when reviewing the work product to assess the effective performance of a process.

5.9 Measuring process capability

The process performance and process capability indicators in this model give examples of evidence that an assessor might obtain, or observe, in the performance of an assessment. The evidence obtained in the assessment, through observation of the implemented process, can be mapped onto the set of indicators to enable correlation between the implemented process and the processes defined in this assessment model.

These indicators provide guidance for assessors in accumulating the necessary objective evidence to support judgments of capability. They are not intended to be regarded as a mandatory set of checklists to be followed.

An indicator is defined as an objective characteristic of a practice or work product that supports the judgment of the performance or capability of an implemented process. The assessment indicators, and their relationship to process performance and process capability, are shown in Figure 9.

Assessment indicators are used to confirm that certain practices were performed, as shown by observable evidence collected during an assessment. All such evidence comes either from the examination of work products of the processes assessed, or from statements made by the performers and managers of the processes.

The existence of base practices, work products, and work product characteristics, provide evidence of the performance of the processes associated with them. Similarly, the existence of process capability indicators provides evidence of process capability.

The evidence obtained should be recorded in a form that clearly relates to an associated indicator, so that the support for the assessor’s judgment can be readily confirmed or verified as required by ISO/IEC 15504-2.

The output from a process assessment is a set of process profiles, one for each process within the scope of the assessment. Each process profile consists of a set of the process attribute ratings for an assessed process.

Each attribute rating represents a judgment by the assessor of the extent to which the attribute is achieved.

Figure 9 - Relationship between assessment indicators and process capability
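
To make the output structure concrete, the following Python sketch models a process profile and derives a capability level from its attribute ratings. It assumes the rating scale and level-rating rules of ISO/IEC 15504-2 (a level is reached when its attributes are at least largely achieved and all lower-level attributes are fully achieved), which this PAM references but does not restate; the class and function names are illustrative, not defined by TestSPICE.

```python
from dataclasses import dataclass
from enum import Enum

class Rating(Enum):
    """Four-point ordinal scale of achievement from ISO/IEC 15504-2."""
    N = "Not achieved"
    P = "Partially achieved"
    L = "Largely achieved"
    F = "Fully achieved"

# Process attributes per capability level (see Table 8).
LEVEL_ATTRIBUTES = {
    1: ("PA 1.1",),
    2: ("PA 2.1", "PA 2.2"),
    3: ("PA 3.1", "PA 3.2"),
    4: ("PA 4.1", "PA 4.2"),
    5: ("PA 5.1", "PA 5.2"),
}

@dataclass
class ProcessProfile:
    """One profile per assessed process: a rating for each attribute in the assessment scope."""
    process_id: str              # e.g. "TST.2"
    ratings: dict[str, Rating]   # e.g. {"PA 1.1": Rating.F, "PA 2.1": Rating.L, ...}

def capability_level(profile: ProcessProfile) -> int:
    """Derive a capability level from attribute ratings (assumed ISO/IEC 15504-2 rule)."""
    achieved = 0
    for level in range(1, 6):
        current_ok = all(profile.ratings.get(pa) in (Rating.L, Rating.F)
                         for pa in LEVEL_ATTRIBUTES[level])
        lower_ok = all(profile.ratings.get(pa) == Rating.F
                       for lower in range(1, level)
                       for pa in LEVEL_ATTRIBUTES[lower])
        if current_ok and lower_ok:
            achieved = level
        else:
            break
    return achieved

# Example: PA 1.1 fully achieved, both level-2 attributes largely achieved -> level 2.
profile = ProcessProfile("TST.2", {"PA 1.1": Rating.F,
                                   "PA 2.1": Rating.L,
                                   "PA 2.2": Rating.L})
print(capability_level(profile))  # 2
```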

6 Process Performance Indicators (level 1)

The processes in the process dimension can be directly mapped to the processes defined in the TestSPICE Process Reference Model.

The individual processes are described in terms of Process Name, Process Purpose, and Process Outcomes as defined in the TestSPICE Process Reference Model. Additional components are the Process Identifier and, when needed, Process Notes. In addition, the process dimension of the Process Assessment Model provides information in the form of the following elements (a small illustrative sketch of how they fit together follows the list):

- a set of base practices for the process, providing a definition of the tasks and activities needed to accomplish the process purpose and fulfil the process outcomes;

- a number of output work products associated with each process;

- characteristics associated with each work product.
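
As an illustration of how these elements fit together, the following Python sketch records a partial process description using data from AGT.1 Testing Service Acquisition (section 6.1.1). The data structures and field names are illustrative, not defined by TestSPICE, and only a subset of the base practices and work products is shown.

```python
from dataclasses import dataclass, field

@dataclass
class BasePractice:
    activity_id: int
    name: str
    description: str
    outcome_refs: list[int]      # outcomes the practice contributes to

@dataclass
class WorkProduct:
    wp_id: str                   # e.g. "08-02"
    name: str
    outcome_refs: list[int]

@dataclass
class ProcessDescription:
    process_id: str              # e.g. "AGT.1"
    name: str
    purpose: str
    outcomes: dict[int, str]
    base_practices: list[BasePractice] = field(default_factory=list)
    input_wps: list[WorkProduct] = field(default_factory=list)
    output_wps: list[WorkProduct] = field(default_factory=list)

# Partial example taken from AGT.1 Testing Service Acquisition.
agt1 = ProcessDescription(
    process_id="AGT.1",
    name="Testing Service Acquisition",
    purpose="Acquire a suitable testing service.",
    outcomes={
        1: "A testing service suitable for the needs of the organization or the project is acquired.",
        2: "Governance about acquired testing services is demonstrated by the acquirer as needed.",
    },
    base_practices=[
        BasePractice(1, "Prepare Acquisition",
                     "Prepare the acquisition of the testing service by defining selection criteria.", [1]),
        BasePractice(3, "Agree on contract",
                     "Negotiate and fix an agreement about the delivery of a testing service, "
                     "including acceptance criteria.", [1, 2]),
    ],
    input_wps=[WorkProduct("05-02", "Business Goals", [1, 2])],
    output_wps=[WorkProduct("08-02", "Acquisition plan", [1, 2]),
                WorkProduct("02-00", "Contract", [2])],
)
print(agt1.process_id, len(agt1.base_practices))  # AGT.1 2
```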

6.1 Agreement Processes for Testing Services (AGT)

The processes of this process group belong to the business life cycle processes (see 5.3).

Note: This process group contains all testing service processes (acquisition and supply). In line with ISO/IEC 15504 Part 5:2012, acquisition and supply are available as high-level processes (AGT.1 and AGT.2) or as a detailed breakdown of acquisition and supply (AGT.1a-e and AGT.2a-c). During assessment scoping a decision has to be made whether the assessment is performed at the high level or at the detailed level.

6.1.1 AGT.1 Testing Service Acquisition

Process Reference

Process ID: AGT.1
Process name: Testing Service Acquisition
Process purpose: The purpose of the Testing Service Acquisition process is to acquire a suitable testing service.
Process outcomes:
Outcome 1: A testing service suitable for the needs of the organization or the project is acquired.
Outcome 2: Governance about acquired testing services is demonstrated by the testing service acquirer as needed.

Note regarding “as needed”:
- When the acquired testing services are part of a service that the acquirer himself provides for his customers, then “governance about acquired testing services” is needed.
- Otherwise it depends on the acquirer’s risk assessment whether “governance about acquired testing services” is rated “as needed” or not.

Performance Indicators

Base Practices (numbered by Activity ID; contributing outcomes in brackets):

1. Prepare Acquisition: Prepare the acquisition of the testing service by defining selection criteria. [Outcome 1]

2. Select Supplier: Select the supplier that will deliver the testing service based on defined criteria for supplier selection. [Outcome 1]

3. Agree on contract: Negotiate and fix an agreement about the delivery of a testing service, including acceptance criteria for testing services. [Outcomes 1, 2]

4. Monitor the testing service: Monitor the fulfillment of the testing service and the related SLA (Service Level Agreement). [Outcomes 1, 2]

5. Accept the testing service: Accept the testing service according to agreed criteria. [Outcomes 1, 2]

Work Products (outcome references in brackets)

Inputs:
05-02 Business Goals [1, 2]

Outputs:
08-02 Acquisition plan [1, 2]
02-00 Contract [2]
02-03 SLA [2]
18-01 Acceptance criteria [2]
18-08 Supplier selection criteria [2]

6.1.2 AGT.1a Acquisition Preparation

Process Reference

Process ID: AGT.1a
Process name: Acquisition Preparation
Process purpose: The purpose of the Acquisition Preparation process is to establish the needs and goals of the acquisition of testing resources (e.g. a test tool, a testing service, testware, or other test activities) to be outsourced or out-tasked, and to communicate these to the potential suppliers.
Process outcomes:
Outcome 1: The concept or the need for the acquisition of testing resources is established and communicated to stakeholders.
Outcome 2: The acquisition requirements defining the project needs are defined and validated.
Outcome 3: The customer’s known requirements are defined and validated.
Outcome 4: An acquisition strategy is developed.
Outcome 5: Supplier selection criteria are defined.

Performance Indicators

Base Practices (numbered by Activity ID; contributing outcomes in brackets):

1. Establish the needs: Establish a need to acquire, develop, or enhance a test tool, a testing service or testware, or to outsource test activities. [Outcome 1]

2. Define the requirements: Identify the customer / stakeholder requirements, including acceptance criteria, for a test tool, a testing service or testware, or for outsourced test activities. [Outcomes 2, 3]

   Note 1: When “governance about acquired testing services” is needed, these requirements shall include:
   - the acquirer’s authority to control the definition of the processes regarding the subcontracted testing services, and their interfaces to other processes;
   - the acquirer’s authority to determine process performance and compliance with process requirements;
   - the acquirer’s authority to control the planning and prioritizing of process improvements.

   Note 2: Regarding “when governance … is needed”:
   - When the acquired testing services are part of a service that the acquirer himself provides for his customers, then “governance about acquired testing services” is needed.
   - Otherwise it depends on the acquirer’s risk assessment whether “governance about acquired testing services” is rated “as needed” or not.

3. Review requirements: Analyze and validate the defined requirements against the identified needs. Validate the requirements to reduce the risk of misunderstanding by the potential suppliers. [Outcome 3]

4. Develop acquisition strategy: Develop a strategy for the acquisition of testing resources (e.g. a test tool, a testing service, testware, or other test activities to be outsourced), according to the acquisition needs. [Outcome 4]

5. Define selection criteria: Establish and agree on supplier selection criteria and the means of evaluation to be used. [Outcomes 4, 5]

6. Communicate the need: Communicate the need for acquisition to interested parties through the identified channels. [Outcome 1]

Work Products (outcome references in brackets)

Inputs:
05-02 Business Goals [1]
15-26 Needs assessment report [1, 3, 4]

Outputs:
08-02 Acquisition plan [4]
12-01 Request for proposal [4, 5]
13-19 Review record [3]
17-03 Customer requirements [3]
17-09 Product requirements [3]
17-10 Service requirements [3]
18-01 Acceptance criteria [2, 4]
18-08 Supplier selection criteria [5]
13-04 Communication Record [1]

6.1.3 AGT.1b Supplier Selection

Process Reference

Process ID: AGT.1b
Process name: Supplier Selection
Process purpose: The purpose of the Supplier Selection process is to choose the testing service organization that will be mandated to supply a test tool, a testing service, testware, or other test activities to be acquired by the customer.
Process outcomes:
Outcome 1: The supplier selection criteria are established and used to evaluate potential suppliers.
Outcome 2: The supplier is selected based upon the evaluation of the supplier’s proposals, process capabilities, and other factors.
Outcome 3: An agreement proposal is developed to support negotiation between the customer and the supplier.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Evaluate stated or perceived supplier capability.

1 Evaluate stated or perceived supplier capability against the stated requirements, according to the supplier selection criteria.

1

Select supplier 2 Evaluate supplier's proposal against the stated requirements, according to the supplier selection criteria to select supplier.

2

Page 27: The TestSPICE PAM V3 - Ningapi.ning.com/.../TheTestSPICEPAMV3.0.pdf · The TestSPICE PAM – (TestSPICE SIG) Release Date 2014_10_10 Version 3.0 – Released Page 6 of 210 1 About

The TestSPICE PAM – (TestSPICE SIG)

Release Date 2014_10_10 Version 3.0 – Released Page 27 of 210

Prepare agreement proposal

3 Prepare a supplier agreement proposal that clearly expresses the customer expectations and the relative responsibilities of the supplier and customer.
Note 1: When "governance about acquired testing services" is needed, this agreement shall include:
• The acquirer's authority to control the definition of the processes regarding the subcontracted testing services, and their interfaces to other processes;
• The acquirer's authority to determine process performance and compliance with process requirements;
• The acquirer's authority to control the planning and prioritizing of process improvements.
Note 2: regarding "when governance … is needed":
• When the acquired testing services are part of a service the acquirer himself provides for his customers, then "governance about acquired testing services" is needed.
• In other cases it depends on the acquirer's risk assessment whether "governance about acquired testing services" is rated "as needed" or not.

3
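The evaluation of supplier capability and proposals against the agreed selection criteria (base practices 1 and 2 above) is often operationalized as a weighted scoring of the candidates. The following minimal Python sketch illustrates one possible way to do this; the criteria, weights, scores and supplier names are invented placeholders and not part of TestSPICE.

```python
# Minimal weighted-scoring sketch for supplier proposal evaluation.
# Criteria, weights, scores and supplier names are illustrative placeholders only.

def weighted_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Return the weighted total score of one proposal (scores assumed on a 0-10 scale)."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

weights = {"process capability": 0.4, "price": 0.3, "delivery time": 0.3}

proposals = {
    "Supplier A": {"process capability": 8, "price": 6, "delivery time": 7},
    "Supplier B": {"process capability": 6, "price": 9, "delivery time": 8},
}

# Rank the candidate suppliers by their weighted total score.
for supplier in sorted(proposals, key=lambda s: weighted_score(proposals[s], weights), reverse=True):
    print(supplier, round(weighted_score(proposals[supplier], weights), 2))
```

In practice the criteria, weights and scoring scale would be taken from the agreed supplier selection criteria (work product 18-08) rather than hard-coded.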

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

09-04 Supplier selection policy 2 02-01 Commitment / agreement 3
12-01 Request for proposal 2 08-02 Acquisition plan 1
12-04 Supplier proposal response 2,3 09-04 Supplier selection policy 1
15-13 Assessment report 2 13-05 Contract review record 3
15-24 Audit report 2 13-10 Review record 2
17-09 Product requirements 2 14-05 Preferred suppliers register 2
17-10 Service requirements 2 15-01 Analysis report 2
18-08 Supplier selection criteria 2 15-13 Assessment report 1
15-21 Supplier evaluation report 1,2


15-24 Audit report 1

18-08 Supplier selection criteria 1

6.1.4 AGT.1c Contract Agreement

Process Reference

Process ID AGT.1c

Process name Contract Agreement

Process purpose The purpose of the Contract Agreement process is to negotiate and approve a contract / agreement that clearly and unambiguously specifies the expectations, responsibilities, services, work products / deliverables and liabilities of both the testing service supplier and the customer.

Note: A testing service supplier's organization can range from an external or internal test lab, through a test department, to a test company.

Process outcomes Outcome 1 A contract or agreement is negotiated, reviewed, approved and awarded to the supplier

Outcome 2 Mechanisms for monitoring the capability and performance of the supplier and for mitigation of identified risks are reviewed and considered for inclusion in the contract conditions

Outcome 3 Proposers/tenderers are notified of the result of proposal/tender selection

Outcome 4 Necessary agreement changes are negotiated between the customer and the supplier and documented in the agreement

Note: The need is identified in AGT.1d.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Negotiate the contract / agreement.

1 Negotiate all aspects of the contract / agreement with the supplier

1

Approve contract 2 The contract is approved by relevant stakeholders 1


Review contract for supplier capability monitoring

3 Review and consider including a mechanism for monitoring the capability and performance of the supplier in the contract conditions.
Note 1: When "governance about acquired testing services" is needed, this mechanism shall include:
• The acquirer's authority to control the definition of the processes regarding the subcontracted testing services, and their interfaces to other processes;
• The acquirer's authority to determine process performance and compliance with process requirements.
Note 2: regarding "when governance … is needed":
• When the acquired testing services are part of a service the acquirer himself provides for his customers, then "governance about acquired testing services" is needed.
• In other cases it depends on the acquirer's risk assessment whether "governance about acquired testing services" is rated "as needed" or not.

2

Review contract for risk mitigation actions

4 Review and consider including a mechanism for the mitigation of identified risks in the contract conditions.
Note: When "governance about acquired testing services" is needed, this mechanism shall include:
• The acquirer's authority to control the planning and prioritizing of process improvements.

2

Award contract 5 The contract is awarded to the successful proposer / tenderer

1

Communicate results to tenderers

6 Notify proposers / tenderers of the results of the proposal / tender selection. After contract award, inform all tenderers of the decision

3


Agree on changes

7 Changes proposed by either party are negotiated and the results are documented in the agreement.
Note 1: When "governance about acquired testing services" is needed, this mechanism shall include:
• The acquirer's authority to control the planning and prioritizing of process improvements.
Note 2: regarding "when governance … is needed":
• When the acquired testing services are part of a service the acquirer himself provides for his customers, then "governance about acquired testing services" is needed.
• In other cases it depends on the acquirer's risk assessment whether "governance about acquired testing services" is rated "as needed" or not.

4

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

08-19 Risk management plan 2 02-00 Contract 1,2

12-01 Request for proposal 1 02-01 Commitment / agreement 1,3

12-04 Supplier proposal response 1 13-05 Contract review record 1
15-08 Risk analysis report 2 15-08 Risk analysis report 2
15-18 Process performance report 2

17-09 Product requirements 1

17-10 Service requirements 1

6.1.5 AGT.1d Testing Service Monitoring

Process Reference

Process ID AGT.1d

Process name Testing Service Monitoring

Process purpose The purpose of the Testing Service Monitoring process is to track and assess performance of the testing service supplier against agreed requirements.

Process outcomes Outcome 1 Joint activities between the customer and the supplier are agreed and performed as needed


Outcome 2 Information on technical progress is exchanged regularly with the supplier

Outcome 3 Performance of the supplier is monitored against the agreed requirements

Outcome 4 The need for agreement changes is identified and requested

Performance Indicators

Base Practices Activity ID Description Outcome reference

Establish and maintain communication link

1 Establish and maintain a communication link between the customer and the supplier (e.g. define interfaces, schedule, agenda, messages, documents, meetings, joint reviews)

1, 2

Exchange information on technical progress

2 Use the communication link to exchange information on technical progress of the supply, including the risks that may inhibit a successful completion.

1, 2

Review supplier performance

3 Review performance aspects of the supplier (e.g. technical, quality, cost, and schedule) on a regular basis against the agreed requirements.
Note: This may also include process reviews (audits, appraisals, assessments).

3

Monitor the acquisition

4 Monitor the acquisition against the agreed acquisition documentation; analyse the information from the reviews with the supplier to evaluate the progress and ensure that specified constraints such as cost, schedule, and quality are met

3

Analyse the need for contract changes

5 Analyse and communicate the need for contract changes.

4

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

02-00 Contract 1 13-16 Change Request 4

02-01 Commitment / agreement 3,4 13-01 Acceptance record 3

13-14 Progress status record 2 13-14 Progress status record 2

13-16 Change Request 4 13-10 Review record 2

13-17 Customer request 4 15-01 Analysis report 3

15-21 Supplier evaluation report 3


6.1.6 AGT.1e Testing Service Acceptance

Process Reference

Process ID AGT.1e

Process name Testing Service Acceptance

Process purpose The purpose of the Testing Service Acceptance process is to approve the testing service supplier’s services and/or products when all acceptance criteria are satisfied.

Process outcomes Outcome 1 The testing service supplier’s services and/or products are evaluated with regard to the agreement

Outcome 2 The customer’s acceptance decision is based on the agreed acceptance criteria

Outcome 3 The testing service supplier’s services and/or products are accepted by the customer – or rejected (when acceptance criteria are not satisfied)

Performance Indicators

Base Practices Activity ID Description Outcome reference

Evaluate the delivered services/products

1 Carry out the evaluation of the testing service supplier’s services and/or products using the defined acceptance criteria

1,2

Claim compliance with agreement

2 Resolve any acceptance issues in accordance with the procedures established in the agreement and confirm that the delivered services/products comply with the agreement.

2

Accept services/products

3 Accept the delivered services/products and communicate acceptance to the supplier.

3

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

02-00 Contract 1 13-01 Acceptance record 3

02-01 Commitment / agreement 1 13-07 Problem record 1

08-01 Acceptance test plan 1,2 15-10 Test incident report 2

08-02 Acquisition plan 1

11-11 Testware 1,3

11-14 Testing service 1,3

18-01 Acceptance criteria 2

17-03 Customer requirements 2


6.1.7 AGT.2 Testing Service Supply

Process Reference

Process ID AGT.2

Process name Testing Service Supply

Process purpose The purpose of the Testing Service Supply process is to supply testing services according to market needs.

Process outcomes Outcome 1 Opportunities to supply testing services are identified

Outcome 2 Contracts and SLAs are negotiated and agreed

Outcome 3 Testing service is delivered

Outcome 4 Testing service acceptance support is provided

Performance Indicators

Base Practices Activity ID Description Outcome reference

Respond to tender

1 A communication interface is established and maintained in order to respond to customer inquiries or requests for proposal

1,2

Deliver service 2 Deliver service according to contract and SLA 3

Support customer in services acceptance

3 Support customer in accepting the delivered service

4

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

05-02 Business goals 2 02-00 Contract 2

12-01 Request for proposal 1,2,3,4 02-01 Commitment / agreement 2

13-11 Personnel performance review record 4 12-04 Supplier proposal response 1,2

13-05 Contract review record 1,2,3,4

13-15 Proposal review record 1,2

6.1.8 AGT.2a Test Supplier Tendering

Process Reference

Process ID AGT.2a

Process name Test Supplier Tendering


Process purpose The purpose of the Test Supplier Tendering process is to prepare and submit proposals in response to customer requests for proposal.

Process outcomes Outcome 1 A communication interface is established and maintained in order to respond to customer inquiries and requests for proposal

Outcome 2 Requests for proposal are evaluated according to defined criteria to determine whether or not to submit a proposal

Outcome 3 The need to undertake preliminary surveys or feasibility studies is determined

Outcome 4 Suitable resources are identified to perform the proposed work

Outcome 5 A supplier proposal is prepared, reviewed and submitted in response to the customer request

Outcome 6 Formal confirmation of the agreement is sent to the customer and, vice versa, obtained from the customer

Performance Indicators

Base Practices Activity ID Description Outcome reference

Establish communication interface

1 A communication interface is established and maintained in order to respond to customer inquiries or requests for proposal

1

Perform customer enquiry screening

2 Perform customer enquiry screening to ensure that the source of the lead is genuine and that the nature or type of product or service is clearly established. The right person, who will be tasked to progress the lead, is identified quickly.

1

Establish customer proposal evaluation criteria

3 Establish appropriate evaluation criteria to determine whether or not to submit a proposal

2

Evaluate customer request for proposal

4 Requests for proposal are evaluated according to appropriate criteria

2

Determine need for preliminary evaluations or feasibility studies

5 Determine need for preliminary evaluations or feasibility studies to ensure that a firm quotation can be made based on available requirements.

3

Identify and nominate staff

6 Identify and nominate staff with appropriate competency for the assignment.

4

Perform preliminary overall estimation

7 Estimate total costs, resources, and needed delivery date

4,5

Prepare supplier proposal or tender

8 Prepare a supplier proposal or tender in response to the customer request and review it before submission to the customer.

5


Negotiate contract / agreement with the customer

9 Negotiate all relevant aspects of the contract / agreement with the customer.

5,6

Establish confirmation of contract / agreement

10 Formally confirm the contract / agreement to protect the interests of both parties

6

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

05-02 Business goals 2 02-00 Contract 5,6

12-01 Request for proposal 1,2,3,4,5 02-01 Commitment / agreement 5

13-11 Personnel performance review record 4 12-04 Supplier proposal response 5

13-05 Contract review record 6

13-15 Proposal review record 5

6.1.9 AGT.2b Testing Service Delivery

Process Reference

Process ID AGT.2b

Process name Testing Service Delivery

Process purpose The purpose of the Testing Service Delivery process is to prepare the testing service to be delivered and to deliver it to the customer.

Process outcomes Outcome 1 The testing service to be delivered is determined.
Note: The testing service may include a test product (testware) or may consist only of a test product

Outcome 2 The testing service to be delivered is assembled from configured items. This also includes the assembly of a product from configured items

Outcome 3 The documentation for the testing service to be delivered is defined and produced. This includes the necessary documentation for a test product

Outcome 4 The testing service delivery mechanism and media are determined

Outcome 5 The approval of the testing service to be delivered is effected against defined criteria. This may include approval of test products

Outcome 6 The testing service is delivered to the customer

Outcome 7 The confirmation of delivery is obtained


Performance Indicators

Base Practices Activity ID Description Outcome reference

Define the testing services / test product to be delivered

1 Define the testing services associated with the product.

1

Prepare the testing services / test product for delivery

2 Update and prepare the deliverable testing services. Establish a baseline for the service, including the testware.

2

Assemble the test product to be delivered from configuration items

3 Build the test product or service from configured test assets to ensure integrity

2

Communicate the type, level and duration of support

4 The type, level and duration of support for the release are identified and communicated.

3

Define and produce the documentation for the release

5 Ensure that all documentation to support the release is produced, reviewed, approved and available

3

Determine the delivery media type and the delivery mechanism for the testing services / test product

6 The media type and delivery mechanism for testware and testing service delivery are determined in accordance with the needs of the end user, including security aspects

4

Ensure release approval before delivery

7 Ensure that the criteria for the release are satisfied and that approval is given before delivery takes place

5

Deliver the testing services / test product to the customer

8 Deliver the testing services / test product to the customer and obtain confirmation of delivery

6, 7

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

11-14 Testing service 1 08-16 Release plan 3,4

11-15 Test product 1 11-09 Release information 1


08-16 Release plan 3,4 11-10 Release package 2,3

17-03 Customer requirements 1 13-06 Delivery record 6,7

18-09 Release criteria 1,5 13-29 Release approval record 5

19-14 Release strategy 4 17-04 Delivery instructions 4

6.1.10 AGT.2c Testing Service / Test Product Acceptance Support

Process Reference

Process ID AGT.2c

Process name Testing Service / Test Product Acceptance Support

Process purpose The purpose of the Testing Service / Test Product Acceptance Support process is to assist the customer to achieve confidence in taking ownership of the testing services / test product delivered by the test supplier.

Process outcomes Outcome 1 The testing service / testing product is completed and delivered to the customer

Outcome 2 The testing product is put into operation in the customer's test environment

Outcome 3 The testing services are utilized in the customer test projects

Outcome 4 Customer acceptance tests and reviews for the delivered test product are supported

Outcome 5 Customer service level fulfilment reviews for the delivered testing services are supported

Performance Indicators

Base Practices Activity ID Description Outcome reference

Deliver the test product to the customer

1 The test product is completed and handed over to the customer with detailed configurations and technical / operational documents.

1

Import the test product into the customer’s test environment

2 The test product should be adapted and evaluated in parallel with the existing systems or processes

2

Support the customer in using the testing services

3 The deployment of the testing services in the customer’s test projects is supported

3


Support the customer in evaluating the acceptance of the test product

4 Provide support for customer reviews of the test product

4

Support the customer in evaluating the acceptance of the testing services

5 Provide support for customer reviews of the testing services

5

Provide training to the customer

6 Provide training and support to the customer as specified in the contract

4,5

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

06-03 Installation guide 2 01-06 Test configuration 1

10-03 Customer support procedure 4,5 13-06 Delivery record 1

17-03 Customer requirements 2,3 13-08 Installation record 2

08-24 Training plan 4,5 13-10 Review record 4,5

13-01 Acceptance record 4,5

6.2 Testing Process Group (TST)

The processes of this process group belong to the business life cycle processes (see 5.3).

Note: TestSPICE does not contain any test stages or test types as a process. The processes of the TST Process Group are designed to be applied to any test stage or test type. Therefore the processes of the TST Process Group may be assessed in several instances – one for each test stage or test type which shall be assessed.

For further information see Annex G Usage of TestSPICE in multiple test stages

6.2.1 TST.1 Provision of required Test Inputs

Process Reference

Process ID TST.1

Process name Provision of required Test Inputs


Process purpose The purpose of the Provision of required Test Inputs process is to ensure that the test input documents (the test basis) permit the establishment of tests which satisfy the purpose of the corresponding test stage or test type.

Note: The "Provision of required Test Inputs" process should be evaluated as an entry check at the beginning of the performance of the corresponding test stage or test type:
• This evaluation should confirm that the test basis satisfies the corresponding entry criteria.
• This evaluation may help to identify the root causes of poor performance of the TST processes.
• In a TestSPICE assessment the evaluation of this process acts as a baseline for the understanding of the respective test stage or test type.

Process outcomes Outcome 1 A (comprehensive) set of testable requirements is provided to the tester / test team as needed for the corresponding test stage or test type;

Note: "testable requirements" are stated in terms that permit establishment of test designs (and subsequently test cases) and execution of tests to determine whether the requirements have been met.

Outcome 2 A testable design is provided to the tester / test team as needed for the corresponding test stage or test type;

Outcome 3 A test object documentation is provided to the tester / test team as needed for the corresponding test stage or test type;

Note: Outcomes 1, 2 & 3 together constitute the "test basis".

Outcome 4 Consistency is established and confirmed between the documents of the test basis and the corresponding test object.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Provide a set of testable requirements

1 Provide a set of testable requirements

These requirements shall be analysed for correctness, completeness, consistency, feasibility and testability. These requirements shall define acceptance criteria for the tests with respect to the corresponding test stage or test type.
Note: In case of incomplete requirements, define mechanisms to inform the customer and agree on the further steps.

1


Provide a design for the test object

2 Provide an architectural and/or detailed design for the test object to the tester / test team.

This design shall reflect the relevant design decisions and directives as needed for the corresponding test stage or test type.

This design shall specify and document the external and internal interfaces between the software/system items

This design shall be analysed for correctness and testability to ensure that the software/system items can be built and tested.

2

Provide a test object documentation

3 Provide a test object documentation to the tester / test team as needed for the corresponding test stage or test type.

Note: "Test object documentation" may comprise:
• System documentation
• SW-unit documentation
• Customer manuals
• Installation guidelines
• Integration plans

3

Confirm consistency

4 Confirm consistency between the documents of the test basis and the corresponding test object:
• Consistency between the documents of the test basis themselves.
• Consistency between the test basis and the test object.
Note: Consistency means that the information is correct, without discrepancies, up to date and relating to the actual version of the software/system under test.
Any inconsistencies shall be reported by the tester / test team to the development team and subsequently eliminated as needed (a minimal version-check sketch follows these base practices).

4
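Confirming consistency (BP 4) typically includes checking that every test basis document refers to the same version of the software/system under test. The following Python sketch shows one simple way such a version check could be automated; the document names, the version field and the example values are hypothetical and serve only as illustration.

```python
# Hypothetical sketch: check that all test basis documents reference the same
# version of the test object. Document names and version values are invented.

test_object_version = "2.3.1"  # e.g. taken from the release information of the test object

test_basis = {
    "system requirements": "2.3.1",
    "software requirements": "2.3.1",
    "architectural design": "2.3.0",        # out of date -> inconsistency
    "test object documentation": "2.3.1",
}

inconsistencies = [
    document for document, version in test_basis.items()
    if version != test_object_version
]

if inconsistencies:
    # In practice such findings are reported to the development team (BP 4).
    print("Inconsistent with test object version", test_object_version, ":", inconsistencies)
else:
    print("Test basis is consistent with the test object.")
```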

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

17-12 System requirements 1 17-12 System requirements 1

17-11 Software requirements 1 17-11 Software requirements 1
04-04 High level software design 2 04-04 High level software design 2


04-05 Low level software design 2 04-05 Low level software design 2
04-06 System architectural design 2 04-06 System architectural design 2
04-08 Architectural model 2 04-08 Architectural model 2
06-10 Test object documentation 3
15-01 Analysis report 4
13-22 Traceability record 4

6.2.2 TST.2 Test Analysis & Design

Process Reference

Process ID TST.2

Process name Test Analysis and Design

Process purpose The purpose of the Test Analysis and Design process is to provide logical test cases as well as a description of the needed test data.

Process outcomes Outcome 1 The testability of the system is analyzed and assessed;

Outcome 2 The frame conditions of the test cases are defined;

Outcome 3 The logical test cases are available;

Outcome 4 The needed test data are identified;

Outcome 5 Bidirectional traceability to system-, software-, and test-requirements is established.

Performance Indicators

Base Practices Activity ID Description Outcome reference


Assess the testability of the system

1 Assess the testability of
• requirements
• architecture
• design
• interfaces

1

Establish the frame conditions of the test cases

2 1) Identify the test conditions

2) Identify the test requirements

3) Identify the test criteria

2

Design the logical test cases

3 Design the logical test cases with test design techniques.
Describe the following:
1) preconditions,
2) test tasks to be done,
3) expected results,
4) post conditions (e.g. database conditions).
Consider both explorative and systematic techniques for the development of test cases (an illustrative sketch follows these base practices).
Note 1: The test planning process (TPM.3) provides information on
• which test design techniques must be used
• which test coverage has to be achieved.
If this information is not contained in the test plan, it has to be derived by this base practice.
Note 2: This base practice correlates to activity TD3 (Derive Test Coverage Items) in ISO/IEC/IEEE 29119-3

3

Page 43: The TestSPICE PAM V3 - Ningapi.ning.com/.../TheTestSPICEPAMV3.0.pdf · The TestSPICE PAM – (TestSPICE SIG) Release Date 2014_10_10 Version 3.0 – Released Page 6 of 210 1 About

The TestSPICE PAM – (TestSPICE SIG)

Release Date 2014_10_10 Version 3.0 – Released Page 43 of 210

Identify the needed test data

4 Identify needed test data

Those test data might be:

1) Direct test data

a) Preconditions

b) Data used to execute the test

c) Post conditions

Indirect test data = All test data that are needed to support the execution of a test case,. e.g.

� Concrete conditions of the test environment

� The concrete parameters of all tools that run in the test environment

Note: This activity addresses all types of test data.

The definition of conditions and parameters of the

test environment is described in the TEM

Processes.

4

Establish traceability

5 Establish bi directional traceability to system-, software and test requirements to ensure that all information is correct and complete.

5
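As an illustration of base practice 3 (designing logical test cases with test design techniques), the Python sketch below derives boundary-value test cases for a single numeric input range. The input range, preconditions, postconditions and expected results are invented examples; TestSPICE itself does not prescribe any particular test design technique.

```python
# Illustrative sketch: boundary value analysis for one numeric input range.
# The valid range [minimum, maximum] and the expected verdicts are invented examples.

def boundary_value_cases(minimum: int, maximum: int) -> list[dict]:
    """Derive logical test cases around the boundaries of a valid input range."""
    candidates = [minimum - 1, minimum, minimum + 1,
                  maximum - 1, maximum, maximum + 1]
    cases = []
    for value in candidates:
        valid = minimum <= value <= maximum
        cases.append({
            "precondition": "system ready to accept input",
            "input": value,
            "expected_result": "accepted" if valid else "rejected",
            "postcondition": "input stored" if valid else "input discarded",
        })
    return cases

for case in boundary_value_cases(1, 100):
    print(case)
```

A systematic technique like this would typically be complemented by exploratory test charters, as the base practice suggests.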

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

17-00 Requirement specification 1 17-16 Test data specification 4
06-07 System documentation 1 17-14 Test case specification 3
17-13 Test design specification 3 17-13 Test design specification 2
08-31 Test plan 3,4 13-22 Traceability record 5
04-01 Database Design 3,4
04-04 High level software design 1
04-05 Low level software design 1,2,4

17-11 Software requirements 1,2,4

17-12 System requirements 1,2,4

13-22 Traceability record 5


6.2.3 TST.3 Test Realization and Execution

Process Reference

Process ID TST.3

Process name Test Realization and Execution

Process purpose The purpose of the Test Realization and Execution process is to provide test data and test procedures as well as to create test results.

Note: This process covers manual and automated test execution. The automation tasks are part of the TAU Process Group.

Process outcomes Outcome 1 The low level test cases are available;

Outcome 2 The sequence of test cases is defined;

Outcome 3 The test objects fulfill the entry criteria for test execution;

Outcome 4 The tests are executed within the planned test environment;

Outcome 5 The test execution and its results are recorded

Note: Requirements for the recording are an aspect of PA 2.2c;

Outcome 6 The test incidents are analyzed and communicated;

Outcome 7 The test data are available;

Outcome 8 The test procedures are available;

Outcome 9 The test execution is planned.

Note 1: This outcome covers only technical scheduling (sequence of procedures and test cases from the execution perspective) as often used for test automation.
Note 2: Any other planning is addressed in the test management processes or in PA 2.1b

Performance Indicators

Base Practices Activity ID Description Outcome reference

Plan the test execution

1 Prepare a plan for the test execution:
• Which tests are planned to be executed
• When are they planned to be executed
• Who will execute the tests
Note 1: This practice covers only technical scheduling (sequence of procedures and test cases from the execution perspective) as often used for test automation.
Note 2: This also applies if confirmation tests (also known as re-tests) have to be executed

9

Request the test data

2 Prepare or request the test data 7


Check the test data for completeness and appropriateness

3 Check the delivered or prepared test data for quality, completeness and appropriateness, also in relation to the volume of the data, so that different variations of data can be tested or the effectiveness of some tests can be increased

7

Design the low level test cases

4 Design the low level test cases 1

Define a sequence for the test cases

5 1) Prioritize the test cases taking into account the risks associated with the test cases

2) Develop test sequences

3) Identify sets of associated test cases commonly called sets, scenarios or suites

2

Define the test procedures

6 Define the test procedures 8

Check the delivery of the test objects to the test

7 1) Check the delivery of the test objects to the test against defined criteria, e.g.
a) Identity of the test object (version/part ID)
b) Corresponding documentation, e.g. requirements, design, release notes, … (including their version/part ID)
c) Corresponding release information – relationship to the product release.
2) In case of confirmation tests it also has to be checked whether expected corrections were reported as implemented.
3) Execute the intake tests for the test objects

3

Use the planned test environment for test execution

8 1) Implement and use the planned test environment.

2) Coordinate the test workstations and the testware

4

Execute the tests 9 Execute the test cases 4, 5

Log the tests 10 1) Record the test execution and results

2) Develop test logs

5

Evaluate test results

11 Compare the actual test results with the expected results and record deviations (a minimal logging sketch follows these base practices)

5

Deal with the found incidents

12 1) Analyse and document found issues
2) Communicate issues
Note: Use the mechanism provided by (or implemented as defined in) problem resolution management.
3) Execute confirmation tests to verify resolved issues.

6
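Base practices 9 to 11 (execute the tests, log the tests, evaluate the test results) can be illustrated with a very small execution loop that records a test log and collects deviations between expected and actual results. The sketch below is Python; the test object (a multiply function), the test cases and the log fields are invented examples and are not prescribed by TestSPICE.

```python
# Minimal sketch of test execution and logging (BP 9-11): execute the test cases,
# record a test log and collect deviations between expected and actual results.
# The test object (multiply) and the test cases are invented examples.

from datetime import datetime, timezone

def multiply(a, b):            # stand-in for the real test object
    return a * b

test_cases = [
    {"id": "TC-01", "inputs": (2, 3), "expected": 6},
    {"id": "TC-02", "inputs": (4, 5), "expected": 20},
    {"id": "TC-03", "inputs": (0, 7), "expected": 1},   # deliberately wrong expectation
]

test_log, deviations = [], []
for case in test_cases:
    actual = multiply(*case["inputs"])
    verdict = "pass" if actual == case["expected"] else "fail"
    entry = {"id": case["id"], "actual": actual, "verdict": verdict,
             "timestamp": datetime.now(timezone.utc).isoformat()}
    test_log.append(entry)                  # BP 10: record the execution
    if verdict == "fail":
        deviations.append(entry)            # BP 11/12: candidate for an issue report

print(f"executed={len(test_log)}  failed={len(deviations)}")
```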

Work Products


Input Output

Result type Outcome reference Result type Outcome reference

17-14 Test case specification 1,7 17-18 Test procedure specification / Test scenario 1,2,8
17-16 Test data specification 1,7 14-14 Test schedule 9
13-33 Test object intake checklist 3 15-39 Test object delivery report 3
17-18 Test procedure specification / Test scenario 4 14-04 Test log 5

14-14 Test schedule 4 15-36 Issue report 6

14-04 Test log 6 15-11 Defect report 6

04-01 Database Design 1,7 03-07 Test Data 7

01-01 Product configuration 4 10-02 Test procedure 8

11-04 Product release package 4 13-07 Problem record 5,6

01-03 Software item 4 13-30 Error Note 5,6

11-01 Software product 4

11-05 Software unit 4

11-06 System 4

11-08 System element 4

15-23 Test item transmittal report 3

6.2.4 TST.4 Test Results Analysis and Reporting

Process Reference

Process ID TST.4

Process name Test Results Analysis and Reporting

Process purpose The purpose of the Test Results Analysis and Reporting process is to check whether test coverage and completion criteria are fulfilled.

Process outcomes Outcome 1 The test coverage and the test completeness are checked and analyzed using predefined criteria;

Outcome 2 The assessment of the test object is complete and recorded.

Note: Requirements for the recording and documentation are an aspect of PA 2.2c

Outcome 3 The assessment results are documented and communicated to the relevant stakeholders.


Performance Indicators

Base Practices Activity ID Description Outcome reference

Evaluate the test logs

1 1) Check the test end criteria

2) Evaluate the test coverage (a coverage-evaluation sketch follows these base practices)

1

Analyze the test object

2 Identify and communicate test cases that have to be adapted / revised and test activities that have to be replicated.

1, 3

Prepare test verdict

3 Deliver and communicate a judgment on the test object. This judgment refers to the acceptance criteria and is based on the test strategy

2, 3
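As an illustration of base practice 1 (check the test end criteria and evaluate the test coverage), the Python sketch below computes requirements coverage from a test log and compares it against a coverage threshold taken as an example test end criterion. All identifiers and numbers are invented.

```python
# Illustrative sketch for BP 1: evaluate requirements coverage from a test log and
# check it against a predefined test end criterion. All identifiers are invented.

test_log = [
    {"test_case": "TC-01", "requirement": "REQ-1", "verdict": "pass"},
    {"test_case": "TC-02", "requirement": "REQ-2", "verdict": "fail"},
    {"test_case": "TC-03", "requirement": "REQ-2", "verdict": "pass"},
]
all_requirements = {"REQ-1", "REQ-2", "REQ-3"}
required_coverage = 0.9          # example test end criterion from the test plan

covered = {entry["requirement"] for entry in test_log}
coverage = len(covered & all_requirements) / len(all_requirements)

print(f"requirements coverage: {coverage:.0%}")
print("test end criterion met" if coverage >= required_coverage else "test end criterion not met")
```

In a real assessment context the coverage type (requirements, model, code, etc.) and the threshold would come from the test plan and the test strategy, not from the script.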

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

08-31 Test plan 1 15-30 Test summary report 1

14-04 Test log 1,2 15-37 Test report 2

15-30 Test summary report 2

19-19 Test strategy 2

6.3 Test Process Management Process Group (TPM)

The processes of this process group belong to the business life cycle processes (see 5.3).

6.3.1 TPM.1 Organizational Test Strategy Management

Process Reference

Process ID TPM.1

Process name Organizational Test Strategy Management

Process purpose The purpose of the Organizational Test Strategy Management process is to develop, establish and control a common test strategy for all testing process instances within a testing organization.

Process outcomes Outcome 1 Overall rules and principles of the test process in the organization fulfilling all common requirements of the stakeholders are defined, communicated and agreed between all stakeholders of the test process.

Outcome 2 The Organizational Test Strategy is adopted by all testing processes that are instantiated within the organization and regularly adapted to the needs of the organization.


Performance Indicators

Base Practices Activity ID Description Outcome reference

Define Organizational Test Strategy

1 Establish and agree on the Organizational Test Strategy addressing all common requirements of the organization on test process, test stages, test approach, test methods, tools and legal constraints.

Note: This activity is covered in more detail by the process Organizational Test Strategy Development (TPM.1a).

1,2

Use Organizational Test Strategy

2 Use the rules, methods and principles described in the selected test strategy for development and maintenance of the master test plan.

Note: This is linked to the processes TPM.2 and TPM.3

2

Monitor Organizational Test Strategy

3 Perform evaluations and measurements on effectiveness and efficiency of the test strategy throughout the test organization.

2

Update Organizational Test Strategy

4 Update the test strategy to reflect the evaluation results and to improve usage and usefulness of the test strategy in future projects.

2

Communicate Organizational Test Strategy

5 Make the test strategy available to all relevant stakeholders and inform them about relevant changes of the test strategy.

1,2

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

09-05 Test policy 1 19-19 Test strategy 1,2

18-13 Laws 1 08-31 Master test plan 2

18-10 Engineering standards 1 13-04 Communication record 1,2
15-00 Test strategy usage report 2

6.3.2 TPM.1a Organizational Test Strategy Development

Process Reference

Process ID TPM.1a

Process name Organizational Test Strategy Development


Process purpose The purpose of the Organizational Test Strategy Development process is to define the overall rules and principles of the test process.

Note 1: This process is linked to the TPM.1b and TPM.3 processes.
Note 2: In many organizations this process is the starting point of the test process documentation system.
Note 3: Without a documented and agreed test strategy there is a high risk that the test process of an organization does not follow a common test approach. This might cause uncertainties about the validity and the hierarchy of test documents.

Process outcomes Outcome 1 The organizational goals of the test process are defined, communicated and agreed;

Outcome 2 The general test levels/stages and their content are defined, communicated and agreed;

Outcome 3 The available test methods for each test level/stage are defined, communicated and agreed;

Outcome 4 The approach for scalability of methods and tailoring of the test processes is defined, communicated and agreed;

Outcome 5 The frame conditions of the test process are evaluated, analysed and communicated;

Outcome 6 The legal and organizational requirements for the test process are identified, analysed and communicated;

Outcome 7 The test automation approach is defined, communicated and agreed;

Outcome 8 The need for synchronisation of test data provision, test automation and test environment design is addressed;

Outcome 9 Guidelines for the resolution of conflicts between needs and approaches of test automation, test data provision and test environment design are developed.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Establish goals for the test process

1 Define the goals for the test process based on the test policy and communicate the goals to all relevant stakeholders

1

Define test stages and their content

2 1) Determine the test levels/stages (e.g. component test, integration test, system test, acceptance test)

2) Define the goals, responsibilities and main activities of each test level/stage

3) Communicate the information to all relevant parties

2

Define test methods

3 1) Choose test case design techniques to be used at each test level/stage

2) Choose test types to be carried out at each test level/stage

3) Communicate the information to all relevant parties

3


Define the frame conditions of the project specific tailoring

4 1) Determine overall test model (e.g. V-model, incremental life cycle)

2) Describe the environments in which the tests shall be executed

3) Communicate the information to all relevant parties

4

Identify the legal and organizational requirements

5 1) Identify and analyse the legal requirements (e.g. laws, engineering standards)

2) Identify and analyse the organizational requirements (e.g. engineering standards)

3) Communicate the information to all relevant parties

5

Define a test automation approach

6 1) Define for each test level/stage an approach for test automation

2) Communicate the information to all relevant parties

6

Synchronize technical testing approaches

7 Synchronize the approaches for test automation, test data provision and test environment design

8

Develop guidelines for the resolution of technical conflicts

8 Make sure that conflicts between test automation needs and the planning for test data provision and test environment delivery are identified as soon as possible.

Note: Take into account that the responsibility for the delivery and usage of test environments, test data and test automation solutions might be owned by different units of the organization (they may also be outsourced). Make sure that these stakeholders establish effective communication and problem solving mechanisms regarding needs and constraints of technical testing processes.

9

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

09-05 Test policy 1 19-19 Test strategy 1-7

18-13 Laws 5

18-10 Engineering standards 5

6.3.3 TPM.1b Organizational Test Strategy Deployment

Process Reference

Process ID TPM.1b


Process name Organizational Test Strategy Deployment

Process purpose The purpose of the Organizational Test Strategy Deployment process is to ensure that an organizational test strategy is adopted by all testing processes that are instantiated within the organization and regularly adapted to the needs of the organization.

Process outcomes Outcome 1 An organizational test strategy is actually used in each testing process within the organization;

Outcome 2 The organizational test strategy is monitored for effectiveness and efficiency on a regular basis;

Outcome 3 The organizational test strategy is updated regularly to reflect the need for improvement derived from the monitoring.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Select an appropriate test strategy

1 Select from the available test strategies one that is appropriate for the project goals, the system to be tested and other constraints in the context of usage.

Note: There may be different test strategies available within the organization for different purposes.

1

Adopt the test strategy

2 Use the rules, methods and principles described in the selected test strategy for development and maintenance of the master test plan.

Note: This is linked to the processes TPM.2 and TPM.3.

2

Initiate monitoring of the test strategy

3 Include evaluations and/or measures in the test plan that support monitoring the test strategy for effectiveness and efficiency.

2

Regularly monitor the test strategy

4 Perform the defined evaluations and measurements on a regular basis to collect data and information about use and usefulness of the selected test strategy.

2

Evaluate monitoring data

5 Use the collected data and information to evaluate the usage and usefulness of the test strategy.

2

Communicate the evaluation results

6 Use the evaluation results to give feedback about usage and usefulness of the test strategy.

3

Collect and interpret evaluation results

7 Gather and consolidate evaluation results from different projects using the same test strategy.

3

Update the test strategy

8 Update the test strategy to reflect the evaluation results and to improve usage and usefulness of the test strategy in future projects.

3


Communicate changes of the test strategy

9 Communicate existence of the changed test strategy and contents of the changes to the prospective users.

3

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

09-05 Test policy 1 15-00 Test strategy usage report 2
19-19 Test strategy 1, 2, 3 19-19 Test strategy 3
15-00 Test strategy usage report 2 08-31 Master test plan 1

13-04 Communication record 3

6.3.4 TPM.2 Test Requirements Analysis

Process Reference

Process ID TPM.2

Process name Test Requirements Analysis

Process purpose The purpose of the Test Requirements Analysis is to establish the requirements for testware elements and test services.

Note 1: The goal is to identify the needs for defining all necessary activities to plan and execute all testing related activities. This includes the quality criteria to be verified / validated as well as the needs for acquiring services and infrastructure.
Note 2: Outcomes of this process may be documented in a draft version of the master test plan which is finalized during process TPM.3 afterwards.

Process outcomes Outcome 1 The test objects are derived from the architectural model;

Outcome 2 The external and internal standards containing requirements for test activities, test artifacts, test roles or the test environment are known and analyzed;

Outcome 3 The test levels/stages which are necessary to fulfill the test requirements are identified;

Outcome 4 The specific test requirements are allocated to each test object;

Outcome 5 The operational environment of each test object is determined;

Outcome 6 The requirements are analyzed for correctness and testability;

Outcome 7 The consistency and traceability are established between the test requirements and system requirements;

Outcome 8 The prioritization for implementing the test requirements is defined;

Outcome 9 The test requirements are approved and updated as needed;


Outcome 10 Changes to the test requirements are evaluated for cost, schedule and technical impact;
Outcome 11 The test requirements are baselined and communicated to all affected parties.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Derive the test objects

1 Derive the test objects from the architectural model.

1

Identify the relevant standards

2 Identify and analyze external or internal standards as well as further applicable documents that might contain requirements for test activities, test artifacts, test roles or the test environment.

2

Define requirements for the test stages, test steps and test objects

3 1) Define requirements for test stages and test steps

2) Define requirements for the test objects in the master test plan, e.g. entry criteria.

2,3

Allocate the test requirements to the test objects

4 Allocate the test requirements to the appropriate test objects.

4

Analyze the requirements for correctness and testability

5 Review the requirements for correctness and testability

6

Identify the operational environment

6 Identify the operational environment for each test object, which is predetermined by the system architecture

Note: For better understanding and to have a better overview of high level test steps it is recommended to visualize the operational environment with all test objects graphically ('big picture')

5

Identify requirements with impact on the test environments

7 Analyze and document the requirements with impact on the test environments.

Note: These requirements will be transformed into requirements for the test environment as described in TEM.1

5

Ensure consistency and traceability between the test and the system requirements

8 Ensure consistency and traceability between the test requirements and the appropriate system requirements (a minimal traceability-check sketch follows these base practices).

7


Prioritize the test requirements

9 Prioritize the test requirements.

Note: Test requirements might be prioritized regarding the test process, one or more releases and/or one or more features. At the very end all test requirements that are relevant for a test item are prioritized.

8

Approve and update the test requirements as needed

10 Approve and update the test requirements as needed.

9

Evaluate changes to the test requirements

11 Evaluate the impact of proposed changes and new requirements for cost, schedule, risk and technical impact.

10

Baseline the test requirements and communicate them

12 1) Establish a baseline for the test requirements.

2) Communicate the baseline to the relevant stakeholders.

11
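Base practice 8 (ensure consistency and traceability between test requirements and system requirements) can be supported by a simple bidirectional check of the traceability records. The Python sketch below is only an illustration under assumed data structures: the requirement identifiers and the shape of the traceability record are invented.

```python
# Hypothetical sketch for BP 8: check bidirectional traceability between system
# requirements and test requirements. Identifiers and records are invented examples.

system_requirements = {"SYS-1", "SYS-2", "SYS-3"}

# Traceability record: test requirement -> system requirements it verifies.
trace = {"TR-1": {"SYS-1"}, "TR-2": {"SYS-2"}, "TR-3": set()}

traced_system_reqs = set().union(*trace.values())
system_reqs_without_test = system_requirements - traced_system_reqs       # forward gap
test_reqs_without_system = {tr for tr, refs in trace.items() if not refs}  # backward gap

print("system requirements without test requirement:", system_reqs_without_test)
print("test requirements without system requirement:", test_reqs_without_system)
```

Both result sets would feed the traceability record (13-22) and trigger an update or approval cycle of the test requirements as described in BP 10 and BP 11.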

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

04-08 Architectural model 1 11-16 Test object 1

19-19 Test strategy 2, 3, 8 17-26 Test requirements specification 2,3,4,9
17-17 Test environment requirements specification 5
15-00 Test requirement baseline report 11

13-22 Traceability record 3,4,7

6.3.5 TPM.3 Test Planning

Note 1: Level 2 of this process covers the activity TP2 (Organize Test Plan Development) of ISO/IEC/IEEE 29119 part 2.
Note 2: This process is usually instantiated more than once in a typical test project:
• once for developing the master test plan
• once for each test level (e.g. unit test, software integration test, software test, system integration test, system test)

Process Reference

Process ID TPM.3

Process name Test Planning


Process purpose The purpose of the Test Planning process is to identify and plan the activities, tasks and resources necessary for a project to test a product, in the context of the project’s requirements and constraints.

Process outcomes Outcome 1 The scope of work for the test project is defined;

Outcome 2 The feasibility of achieving the goals of the test project with available resources and constraints is evaluated;

Outcome 3 The test goals are defined;

Outcome 4 The test activities and the test end criteria are defined;

Outcome 5 The tasks and resources necessary to complete the work are sized and estimated;

Outcome 6 The usage of test environments, the test automation and the provision of test data are planned;

Outcome 7 The project requirements in terms of team size and competencies are defined;

Outcome 8 The efforts are aligned with the superior project planning;

Outcome 9 The relevant stakeholders are identified and their commitment is achieved;

Outcome 10 Delivery and communication of the test plan to all relevant stakeholders is assured;
Outcome 11 Test goals, scope and test end criteria are aligned with product risks.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Define the scope of work

1 Determine the scope of work for the test project. 1

Define the test goals

2 Define the test goals. 3

Define the test end criteria

3 1) Define the test end criteria including suspension and resumption criteria.
2) Define the requested test coverage.
Note: Test coverage may be measured, among others, in terms of requirements coverage, system element coverage, model coverage and code coverage.

4


Analyse product risks

4 Align scope, test goals and test end criteria with the identified product risks.
Note: Information about the product risks to be covered by the tests may be
• contained in a product risk register
• gained by reviewing product specifications and other appropriate documentation, through workshops, interviews or by other suitable means
• covered by contractual obligations.

11

Analyse the feasibility of achieving the goals of the test

5 Analyse the feasibility of achieving the goals of the test project with the available resources and constraints.

2

Define the test activities

6 Define the test activities needed to reach the scope and goals of the test, including the sequence of the activities and the responsibilities for them.

Note: This may cover the definition of test design techniques to be used during Test Analysis and Design (TST.2).

4

Prioritise the test objects

7 Prioritize the test objects e.g. based on their risk and complexity.

Note: According to common testing strategies there are two options for test scheduling:

� Test strictly according to the given priorities

� Advantage: the most relevant items are tested first

� Risk: probably not all features will be tested

� Test a broad sample first, then focus on the prioritized test objects

� Advantage: the test delivers a rough overall quality picture

� Risk: too little time/resources for the high-priority test objects

4,5
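As an illustration of the first scheduling option (testing strictly by priority), the following Python sketch orders test objects by a combined risk and complexity score. The 1-5 rating scale, the scoring formula and the sample objects are assumptions, not prescribed by TestSPICE.

    # Illustrative sketch (not part of the PAM): prioritising test objects by
    # risk and complexity, both rated on an assumed scale of 1 (low) to 5 (high).

    test_objects = [
        {"name": "payment service",  "risk": 5, "complexity": 4},
        {"name": "report generator", "risk": 2, "complexity": 3},
        {"name": "login component",  "risk": 4, "complexity": 2},
    ]

    def priority(obj):
        # Simple multiplicative score; any agreed scoring model could be used instead.
        return obj["risk"] * obj["complexity"]

    for obj in sorted(test_objects, key=priority, reverse=True):
        print(f'{obj["name"]}: priority score {priority(obj)}')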

Estimate the efforts of the test

8 Estimate the efforts for the test activities using estimation approaches like:

� Experience

� Company standards and norms

� Work-breakdown-structures

� formula-based

� Function Point Analysis

� Test Point Analysis

5
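A minimal sketch of a formula-based estimate in the spirit of Test Point Analysis follows. The complexity weights, the productivity factor and the overhead allowance are placeholder assumptions; a real estimate would use the organisation's own standards and norms.

    # Illustrative sketch (not part of the PAM): a simplified formula-based
    # test effort estimate. All weights and factors are assumed values.

    test_items = [
        # (number of test cases, complexity weight 1-3)
        (40, 1),   # simple
        (25, 2),   # medium
        (10, 3),   # complex
    ]

    test_points = sum(count * weight for count, weight in test_items)
    HOURS_PER_TEST_POINT = 1.5   # assumed productivity factor
    OVERHEAD_FACTOR = 1.2        # assumed allowance for planning and reporting

    effort_hours = test_points * HOURS_PER_TEST_POINT * OVERHEAD_FACTOR
    print(f"Test points: {test_points}, estimated effort: {effort_hours:.0f} hours")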


Plan the performance of technical testing processes

9 Plan the usage of test environments, of test automation and the provision of test data.

Note 1: Design, assembly, operation, support and

disassembly of test environments is described in

the TEM process group.

Note 2: Provision of test data is described in the

Test Data Management Process Group (TDM).

Note 3: Design and implementation of Test

Automation is described in the Test Automation

Process Group (TAU).

Note 4: This practice also includes the prevention

of uncoordinated concurring usage of test

environments and test data.

6

Determine the personnel requirements

10 1) Define the staff needs, based on quantity, qualification and time [G chapter 1].

2) Ensure that the needed test functions are covered.

7

Align the efforts with the superior project planning

11 Align the efforts with the superior project management.

8

Identify relevant stakeholders and achieve their commitment to the test planning

12 Identify the relevant stakeholders.

Make sure that all stakeholders have all the latest relevant information.

Establish and maintain plan commitment.

9, 10

Deliver and communicate the test plan

13 Deliver and communicate the test plan to the relevant stakeholders

10

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

19-19 Test strategy 1 08-37 Quality assurance plan 1

08-32 Project process plan 1 08-31 Test plan 1, 3, 4, 5, 6, 7,9

08-12 Project plan 2,8 14-14 Test schedule 2, 4, 5,7

04-06 System architecture design 1,2 17-17 Test environment requirements specification 6

14-00 Product risk register 11 14-15 List of tools 6

17-26 Test requirements specification 2,3 08-34 Stakeholder plan 9

04-09 Role model 7

04-10 Job descriptions 7


05-07 Test goals 3

6.3.6 TPM.4 Test Monitoring and Control

Process Reference

Process ID TPM.4

Process name Test Monitoring and Control

Process purpose The purpose of the Test Monitoring and Control process is to monitor the planned activities, tasks, interfaces and resources necessary for a test project of a product, in the context of the project’s requirements and constraints. Define and control activities in case of deviations. Adjust planning to ensure the execution of the planned activities.

Note 1: This process contains a management view on the whole operative testing

process. It is not restricted to the “Test Execution” process TST.3

Note 2: This process covers both monitoring and control activities according to

TMC2 and TMC3 in ISO/IEC/IEEE 29119-2.

Process outcomes Outcome 1 The planned activities are monitored to ensure compliance with the planning;

Outcome 2 The organizational interfaces are monitored;

Outcome 3 The test planning is up to date;

Outcome 4 The open issues are identified and analyzed;

Outcome 5 Actions to correct deviations from the planning and to prevent reoccurrence of problems are identified in the project and are taken when project goals are at risk or not achieved;

Outcome 6 The effectiveness of the corrective actions is monitored;

Outcome 7 The test status is communicated;

Outcome 8 The test environment is used in compliance with the plan;

Outcome 9 The test project roles are covered with qualified personnel.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Initiate monitoring

1 Identify and make available all necessary measures for monitoring the progress against the test plan.

1


Monitor the execution of planned test activities

2 Monitor the execution of planned test activities according to the test plan and the test schedule.

1

Monitor the organizational interfaces of the test project

3 Communicate the test planning, deviations from the planning, risks etc. to the project management.

2

Update the test planning in cases of uncorrectable deviations

4 Update the test planning with the information from test monitoring and controlling.

3

Identify and analyse the cause of deviations from planning

5 Identify and analyse the cause of deviations from planning.

4

Initiate the necessary actions to restore compliance with the plan

6 Initiate corrective actions if necessary, based on information and metrics from test monitoring.

If there is a deviation in the test progress, either effort-neutral actions (e.g. reducing the test scope or the test quality) or effort-changing actions have to be taken.

5

Monitor the effectiveness of the taken actions

7 Monitor the effectiveness of the taken actions and whether conformity with the planning has been restored.

6

Communicate the test status

8 1) Summarize the information about the test activities.

2) Monitor and document the test progress, the reached test coverage and reached test end criteria.

7
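The summarised test status required by this practice can be derived from a few counters on executed test cases and defects. The Python sketch below is an assumed illustration; the field names and figures are sample data, not a prescribed reporting format.

    # Illustrative sketch (not part of the PAM): summarising the test status
    # for communication to the stakeholders. All figures are assumed sample data.

    status = {
        "planned_test_cases": 120,
        "executed": 85,
        "passed": 70,
        "failed": 15,
        "open_defects": 9,
    }

    progress = status["executed"] / status["planned_test_cases"]
    pass_rate = status["passed"] / status["executed"]

    print(f"Execution progress: {progress:.0%}")
    print(f"Pass rate of executed tests: {pass_rate:.0%}")
    print(f"Open defects: {status['open_defects']}")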

Manage the test team

9 1) Ensure that the personnel/staff is qualified.

2) Monitor the test progress by initiating regular meetings and discussing the test progress, the quality of the test process and found issues.

1, 9

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

15-30 Test summary report 3 15-30 Test summary report 1

14-14 Test schedule 1 14-14 Test schedule 3,4,5

08-12 Project plan 2 08-33 Test infrastructure plan 8

08-33 Test infrastructure plan 8 13-12 Personnel record 9

04-10 Job description 9 08-24 Training plan 9

13-12 Personnel record 9 15-38 Test Controlling Report 7


13-40 Personnel profile 9

11-03 Product release information 1,4,8

15-23 Test item transmittal report 1,3,4,8

6.3.7 TPM.5 Test Closing and Reporting

Process Reference

Process ID TPM.5

Process name Test Closing and Reporting

Process purpose The purpose of the Test Closing and Reporting process is to close all activities, prepare for maintenance, store and analyse process data.

Process outcomes Outcome 1 The decision of acceptance is made and communicated;

Outcome 2 The activities of the test project are closed;

Outcome 3 The open issues and Error Notes are decided and closed;

Outcome 4 The resources are de-allocated;

Outcome 5 The test environment is de-allocated;

Outcome 55 The design of the test environment is archived in order to enable reconstruction of the test environment;

Outcome 6 The test-ware is transferred to maintenance;

Outcome 7 Process data is stored and analyzed;

Outcome 8 Lessons learned are collected and analyzed.

Performance Indicators

Base Practices Activity ID Description Outcome reference

Make and communicate the acceptance decision

1 Decide the acceptance of the delivered product, change or service.

Communicate the acceptance decision to all involved parties (stakeholders).

1

Decide on the further actions for the open issues

2 List all currently open issues.

Decide how to handle these issues.

Transfer issues that have to be solved into the change management process.

Close all other issues.

2, 3


Decide on the further actions for all open themes of the test project

3 List all open themes.

Decide on open themes.

Communicate decision

2, 3

Implement the decisions and close all error Notes and open themes

4 Check if all issues and themes are transferred to maintenance or change management.

Implement and close all other decisions.

2, 3

Archive all test artefacts

5 Check for all artefacts if and how long they need to be archived.

Archive test artefacts to media that cover the requirements.

Communicate successful archiving of artefacts.

6, 7

Transfer the testware to the maintenance organization

6 Transfer all testware to maintenance if needed. 6

Archive the process data of the test project

7 Check process data if it needs to be archived. 7

Close all test activities of the test project

8 Full performance of this activity requires evidence that all activities are really closed.

2, 3, 4

De-allocate the test environment

9 De-allocate all used test environments to make them available to other test projects.

Archive the test environment design data in order to enable reconstruction in case of need.

Note: Details are mentioned in the TEM process group.

5,55

De-allocate the test team

10 De-allocate the test team members and make them available to other projects.

Note: If a project is not able to close activities and

issues it is also not really able to effectively de-

allocate a test team.

4

Evaluate the test project and the test activities

11 Check if the project reached the intended performance.

7

Perform a lessons learned workshop

12 Perform a lessons learned workshop. Make sure that all ideas for improvement are discussed, stored and evaluated for improvement of the test process.

8

Analyse the process data of the test project

13 Analyse the data of the test project if any variance can be identified.

Use analysis as input for the improvement of the test process.

8


Identify and communicate improvement potentials

14 Evaluate the results of lessons learned and data analysis and check if they give any hint how the test process can be improved.

7, 8

Work Products

Input Output

Result type Outcome reference Result type Outcome reference

15-30 Test summary report 3 15-30 Test summary report 1

14-14 Test schedule 1 15-29 Acceptance report 1,2

13-30 Error Note 3 15-32 Test closure report 3,4,5,6,7,8

15-16 Improvement Opportunity 8

15-33 Experience report 8

6.4 Test Regression Reuse and Maintenance Process Group (TRM)

The processes of this process group belong to the business life cycle processes (see 5.3).

6.4.1 TRM.1 Test Asset Management

Process Reference

Process ID TRM.1

Process name Test Asset Management

Process purpose The purpose of the Test Asset Management process is to manage the life of reusable test assets from conception to retirement.

Process outcomes Outcome 1 A test asset management strategy is documented

Outcome 2 A test asset classification scheme is established

Outcome 3 Criteria for test asset acceptance, certification and retirement are defined

Outcome 4 A test asset storage and retrieval mechanism is operated

Outcome 5 The use of test assets is recorded

Outcome 6 Changes to the test assets are controlled

Outcome 7 Users of test assets are notified of problems detected, modifications made, new versions created and deletion of test assets from the storage and retrieval mechanism


Performance Indicators

Base Practices Activity ID Description Outcome reference

Define and document a test asset management strategy

1 Define and document a test asset management strategy for reuse.

1

Establish a classification scheme for test assets

2 Provide a classification scheme for test assets to support their reuse

2
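One possible shape of a classification scheme entry for reusable test assets is sketched below in Python. The attributes shown (asset type, domain, status) and the sample entries are assumptions used for illustration, not a normative scheme.

    # Illustrative sketch (not part of the PAM): a possible record structure for
    # classifying reusable test assets. Attribute names and values are assumed.

    from dataclasses import dataclass

    @dataclass
    class TestAsset:
        asset_id: str
        name: str
        asset_type: str   # e.g. "test suite", "test script", "test data set"
        domain: str       # e.g. "authentication", "billing"
        status: str       # e.g. "candidate", "certified", "retired"

    register = [
        TestAsset("TA-001", "Login regression suite", "test suite", "authentication", "certified"),
        TestAsset("TA-002", "Invoice test data set", "test data set", "billing", "candidate"),
    ]

    # Retrieve certified assets of a given domain for reuse.
    reusable = [a for a in register if a.domain == "authentication" and a.status == "certified"]
    print(reusable)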

Define criteria for test assets

3 Define acceptance, certification and retirement criteria for test assets

3

Establish the test asset storage and retrieval mechanisms

4 Establish the test asset storage and retrieval mechanisms, and make them available to users for storing and retrieving and for providing information on reusable test assets

4

Identify reusable test assets

5 Identify test assets to be made available for reuse 2

Accept reusable test assets

6 Certify, classify, record and baseline test assets that are submitted for storage and make them available for reuse.

3,4

Operate test asset storage

7 Provide and control operation of test asset storage, retrieval and distribution mechanisms.

4,6

Record use of test assets

8 Keep track of each reuse of test assets and record information about actual reuse of test assets.

5

Notify re-users of test asset status

9 Notify all test asset re-users of any problems detected in the test assets, modifications, new versions, and deletions from the test asset storage and retrieval mechanism

7

Retire test assets 10 Retire test assets from the asset storage and retrieval mechanism following the defined test asset management strategy

3,6,7

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

05-02 Business goals 1 01-02 Reusable object 6,7

13-07 Problem record 7 03-09 Test asset use data 5

13-21 Change control record 6,7 13-04 Communication record 7

14-16 Test asset register 1,5,7 13-21 Change control record 6

16-07 Test asset repository 4,6 14-16 Test asset register 5,7


19-13 Test asset management strategy 2,3 16-07 Test asset repository 2,4

15-03 Configuration status report 7 15-03 Configuration status report 6

16-05 Reuse library 4

17-15 Test asset specification 2,3

19-13 Test asset management strategy 1

13-27 Retirement notification 7

12-02 Retirement request 6 12-02 Retirement request 6

6.4.2 TRM.2 Test Work Products Reuse Management

Process Reference

Process ID TRM.2

Process name Test Work Products Reuse Management

Process purpose The purpose of the Test Work Product Reuse Management process is to plan, establish, manage, control, and monitor an organization’s test work product reuse program and to systematically exploit reuse opportunities for test work products.

Process outcomes Outcome 1 The organization's testware reuse strategy, including its purpose, scope, goals and objectives, is defined

Outcome 2 The domains in which to investigate reuse opportunities, or in which reuse will be practiced, are identified

Outcome 3 The organization's systematic reuse capability is assessed

Outcome 4 The reuse potential of each test domain is assessed

Outcome 5 Reuse proposals are evaluated to ensure the reuse product is suitable for the proposed application

Outcome 6 The reuse strategy is implemented in the test organization

Outcome 7 Feedback, communication, and notification mechanisms are established, that operate between affected parties

Outcome 8 The reuse program is monitored and evaluated

Performance Indicators

Base Practices Activity ID Description Outcome reference

Define organizational test reuse strategy

1 Define the reuse program and necessary supporting infrastructure for the test organization

1

Identify domains for potential reuse

2 Identify set(s) of systems and their components in terms of common properties that can be organized into a collection of reusable assets that may be used to construct systems in the domain

2

Assess reuse maturity

3 Gain an understanding of the reuse readiness and capability of the test organization, to provide a baseline and success criteria for test reuse program management

3

Assess test domains for potential reuse

4 Assess test domain to identify potential use and applications of reusable components and products.

4

Evaluate reuse proposals

5 Evaluate suitability of the provided reusable components and product(s) to proposed use

5

Implement the reuse program

6 Perform the defined activities identified in the reuse program

6

Collect and manage learning

7 Collect learning and information from project and related processes, analyse them and store them into the process repository

7

Get feedback from reuse

8 Establish feedback, assessment, communication and notification mechanism to control the progress of the reuse program

7,8

Monitor reuse 9 Monitor the implementation of the reuse program periodically and evaluate its suitability to actual needs

6,8

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

08-03 Process assessment plan 3 08-17 Reuse plan 5,6

08-17 Reuse plan 5 09-03 Reuse policy 1

09-03 Reuse policy 6 12-03 Reuse proposal 4

12-03 Reuse proposal 5 13-04 Communication record 7

05-02 Business goals 1 15-07 Reuse evaluation report 5,6,8

14-16 Test asset register 1 15-13 Assessment report 3,4

16-05 Reuse library 3,4 15-24 Audit report 3,4

19-05 Reuse strategy 2,6,7,8 19-05 Reuse strategy 1

6.4.3 TRM.3 Regression Test Management

Process Reference

Process ID TRM.3

Process name Regression Test Management


Process purpose The purpose of the Regression Test Management process is to identify, plan and perform the necessary regression tests.

Note 1: In contrast to re-testing, which verifies successful defect removal, regression testing focuses on re-testing already tested software-based systems to ensure that changes have not induced any unexpected behaviour.

Note 2: For managing and executing regression tests, the principles and practices of the TST and TPM processes should be applied.

Process outcomes Outcome 1 The regression test strategy is defined, communicated and established.

Outcome 2 The regression tests are estimated and planned

Outcome 3 The need for regression test sets is elicited and analysed

Outcome 4 The requirements for regression tests are identified, analysed and accepted

Outcome 5 The test sets for the planned regression tests are available

Performance Indicators

Base Practices Activity ID Description Outcome reference

Define a strategy for regression tests

1 Define a strategy for the use of regression tests during testing

1

Determine and estimate the regression testing

2 Determine the regression tests during test planning:

� scope of the regression tests

� which test cases should be used for regression testing

� which type of testing should be used (manual or automated)

Plan the resources and estimate the budget for the regression tests

2

Analyse the needs for regression testing

3 Analyse the needs for regression testing and regression test sets

3

Define the requirements for regression tests

4 Identify and analyse the requirements for regression testing and communicate them to the relevant stakeholders

4

Develop test sets for regression testing

5 1) Identify the impacted components (e.g. through an impact analysis)

2) Identify the tests to be used for the regression test (reduce them e.g. through elimination of redundant tests and combining multiple tests in one test run) and integrate them into test suites

3) As regression tests are executed very often, they are suitable for automation

4) Develop scripts for the test automation solution

5
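A minimal sketch of steps 1) and 2) is given below: selecting regression tests for the components identified by an impact analysis. The component names and the test-to-component mapping are assumed sample data.

    # Illustrative sketch (not part of the PAM): selecting regression tests for
    # the components identified by an impact analysis. Sample data is assumed.

    impacted_components = {"payment", "invoicing"}

    # Assumed mapping of existing test cases to the components they exercise.
    test_catalogue = {
        "TC-010": {"payment"},
        "TC-011": {"invoicing", "payment"},
        "TC-020": {"reporting"},
        "TC-030": {"invoicing"},
    }

    regression_suite = sorted(
        test_id
        for test_id, components in test_catalogue.items()
        if components & impacted_components
    )
    print("Regression test suite:", regression_suite)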


Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

19-17 regression test strategy 2 19-17 regression test strategy 1

15-40 Result of impact-analysis 5 08-31 Test plan 2

14-18 Test set 5 15-40 Result of impact-analysis 5

14-19 Test suite 5 14-19 Test suite 5

11-12 Test automation solution 5

17-25 Requirements for regression tests 4

08-15 Regression test plan 2

6.4.4 TRM.4 Testware Maintenance

Process Reference

Process ID TRM.4

Process name Testware Maintenance

Process purpose The purpose of the Testware Maintenance process is to make sure that the testware (test cases, test data, scripts etc.) is consistent with the current requirements

Note: requirements may consist of system and/or software requirements

Process outcomes Outcome 1 A maintenance strategy is developed to manage modification, migration and retirement of test work products according to the release strategy

Outcome 2 The impact of changes to the existing system on organization, operations or interfaces for the testware are identified

Outcome 3 The affected test documentation is updated as needed

Outcome 4 The modified test work products are developed with associated tests that demonstrate that requirements are not compromised

Outcome 5 The test work product upgrades are migrated to the customer’s environment

Outcome 6 The test work products are retired from use in a controlled manner that minimizes disturbance to the customers based on user or organizational requests

Outcome 7 The testware modification is communicated to all affected parties.


Performance Indicators

Base Practices Activity ID Description Outcome reference

Establish a maintenance strategy

1 1) Identify which work products need to be maintained (test cases, system documentation, test objects, test documentation, …)

2) Develop a strategy for delivery, capturing, storage and usage

1

Identify impact on the testware

2 Identify the impact on the testware if there are changes to the existing system, the organization or the interfaces

2

Update the test documentation

3 Changes to the testware are documented via configuration management and communicated to all impacted users

3

Develop tests for modified test work products

4 Develop associated tests when test work products are changed, to demonstrate that the requirements are not compromised

4

Establish relationships between the test cases and the requirements

5 1) Ensure that every requirement is covered by one or more test cases and vice versa

2) These relationships have to be monitored at the level of the individual versions

4
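Monitoring the bidirectional relationship between requirements and test cases reduces to two simple checks, as the Python sketch below illustrates. The identifiers are assumed sample data; a real check would read the traceability record.

    # Illustrative sketch (not part of the PAM): checking bidirectional
    # traceability between requirements and test cases. Sample data is assumed.

    requirement_to_tests = {
        "REQ-1": {"TC-01", "TC-02"},
        "REQ-2": {"TC-03"},
        "REQ-3": set(),              # not covered by any test case
    }
    test_to_requirements = {
        "TC-01": {"REQ-1"},
        "TC-02": {"REQ-1"},
        "TC-03": {"REQ-2"},
        "TC-99": set(),              # test case without a requirement
    }

    uncovered_requirements = [r for r, tests in requirement_to_tests.items() if not tests]
    orphan_test_cases = [t for t, reqs in test_to_requirements.items() if not reqs]

    print("Requirements without test cases:", uncovered_requirements)
    print("Test cases without requirements:", orphan_test_cases)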

Migrate the test work product upgrades to the customer's environment

6 Test work product upgrades are migrated to the customer's test environment

5

Archive the testware and deliver it to the maintenance organization

7 1) Document and archive the testware, test environment and the test infrastructure for later reuse

2) Deliver the testware to the maintenance organization

3) Archiving is performed, based on user or organizational requests, in a way that does not affect the customers

6

Communicate changes

8 Communicate changes to all relevant stakeholders 7

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

17-00 Requirement Specification 4 08-31 Test plan 1

01-05 Test cases 4 08-04 Configuration management plan 1


11-13 Test documentation 6 16-08 Archive of testware 6

11-11 Testware 4,5,6 19-06 Maintenance strategy 1

13-22 Traceability record 4

6.5 Test Environment Management Process Group (TEM)

The processes of this process group belong to the technical life cycle processes (see 5.3).

6.5.1 TEM.1 Test Environment Requirements Analysis

Process Reference

Process ID TEM.1

Process name Test Environment Requirements Analysis

Process purpose The purpose of the Test Environment Requirements Analysis Process is to establish the requirements for the test environment, test environment operation and test environment user support.

Process outcomes Outcome 1 The impact of the test requirements on the test environment is understood.

Outcome 2 The requirements on the test environment, including the test automation requirements are specified.

Note: Multiple test environments can be used concurrently.

Outcome 3 Detailed requirements for the

� test environment intake procedure

� test environment operation and

� test environment user support

are developed in correspondence with the agreed requirements of the test environment.

Outcome 4 Traceability is established between the test requirements and test environment requirements to ensure completeness and consistency.

Outcome 5 Requirements for the disassembly of the test environment are gathered

Outcome 6 Changes to the test environment requirements are evaluated for cost, schedule and technical impact

Outcome 7 The test environment requirements are baselined and communicated to all affected parties


Performance Indicators

Base Practices Activity ID Description Outcome reference

Analyze test requirements regarding the impact on the test environment

1 Analyze test requirements regarding the impact on the test environment

1

Specify test environment requirements

2 Specify the requirements on the test environment (if necessary for multiple Test Environments).

2

Specify test automation requirements

3 Specify the test automation requirements on the test environment.

2

Develop the test environment intake procedure, test environment Operational Level Agreements (OLA) and test environment Service Level Agreement (SLA)

4 Develop the detailed requirements for the

� test environment intake procedure,

� test environment Operational Level Agreements (OLA) and

� test environment Service Level Agreement (SLA)

in line with the agreed requirements of the test environment (as draft documents)

3

Ensure consistency

5 Ensure consistency of the test environment requirements with the appropriate test requirements. Consistency is supported by establishing and maintaining traceability between the test environment requirements and the test requirements when needed

4

Evaluate changes to the test environment requirements

6 Evaluate proposed changes and new requirements for cost, schedule, risk and technical impact

Note: Any Test Requirement change can cause Test

Environment Requirement changes.

6

Gather disassembly requirements

7 Gather requirements regarding disassembly of test environments

5

Establish requirements baseline

8 Regularly establish the baseline of test environment requirements

7

Communicate requirements

9 Communicate test environment requirements to all parties affected.

7

Work Products

Inputs Outputs


Result type Outcome reference Result type Outcome reference

17-26 Test requirements Specification 2 17-17 test environment requirements specification 1, 2, 3, 4, 6

08-31 Test Plan 1,2 08-31 Test Plan 1, 2, 4, 6

13-16 Change Request 6 17-27 test environment intake procedure requirements 3

17-28 test environment OLA requirements 3

17-29 test environment SLA requirements 3

17-30 test environment disassembly requirements specification 5

13-16 Change Request 6

6.5.2 TEM.2 Test Environment Design and Configuration Planning

Process Reference

Process ID TEM.2

Process name Test Environment Design and Configuration Planning

Process purpose The purpose of the Test Environment Design and Configuration Planning Process is to establish the design and configuration plan for test environment with all needed detail information necessary for assembling the test environment.

Process outcomes Outcome 1 The test environment design is defined, based on the test environment requirements specification including all needed test utilities and test automation tools

Outcome 2 The test environment configuration plan is defined, based on the test environment requirements specification and the test environment design, with all detailed information regarding the HW, SW versions, test utilities and test automation tools, network configuration data and test data, based on the test data specification

Outcome 3 Constraints that affect the design of test environments are identified and dealt with.

Note 1: Such constraints may result from budget or availability limitations.

Note 2: Such constraints might lead to restrictions and/or risks like

concurring usage of an environment for several test items of a project or

even of several projects. In this case procedures have to be established

with the capability to mitigate the risk of technical or test data corruption in

the test environment

Outcome 4 The test environment operation manual is prepared

Outcome 5 The test environment disassembly plan is prepared based on test environment disassembly requirements specification


Outcome 6 The Test design for test environment testing (TEM.4) is available

Performance Indicators

Base Practices Activity ID Description Outcome reference

Create the test environment design

1 Create the test environment design, based on the test environment requirements specification to have an overview about the test environment needs including all needed test utilities and test automation tools

1

Identify test environment constraints

2 Identify constraints that might raise risks for test projects using the test environment

3

Deal with test environment constraints

3 Deal with constraints that might raise risks for test projects using the test environment.

Note 1: This includes the definition of technical

solutions for this type of issues

Note 2: This might lead to escalation on test

planning and test strategy level

3

Create the test environment configuration plan

4 Create the test environment configuration plan with all needed information regarding the components necessary to establish the test environment

Note 1: the test environment configuration plan

contains all information necessary to set up,

deliver, maintain and support the test environment.

This might contain

� the HW,

� operation systems,

� SW programs,

� stubs,

� test drivers,

� test utilities,

� test automation tools,

� test data,

� User access rights,

� and all needed infrastructure.

Note 2: The location of all HW shall be defined in

the test environment configuration plan (e. g. the

designated server or client room shall be defined).

Note 3: The test environment configuration plan

might also include all needed information regarding

network (e.g. IP addresses) and optional redundant

systems

2
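The information listed in Note 1 and Note 2 could be captured in a structured configuration plan. The Python sketch below shows one possible, assumed shape of such a plan and a simple completeness check; all concrete values are placeholders.

    # Illustrative sketch (not part of the PAM): one possible structure for a
    # test environment configuration plan. All concrete values are assumed.

    test_environment_configuration = {
        "hardware": [{"host": "test-srv-01", "location": "server room B"}],
        "operating_systems": {"test-srv-01": "Linux (assumed release)"},
        "software": [{"name": "application under test", "version": "2.4.1"}],
        "stubs_and_drivers": ["payment stub", "message queue driver"],
        "test_utilities": ["log analyser"],
        "test_automation_tools": ["UI automation framework (assumed)"],
        "test_data": {"source": "anonymised production extract"},
        "network": {"test-srv-01": "10.0.0.15"},
        "user_access_rights": {"testers": "read/write", "developers": "read"},
    }

    # Check the plan for completeness against an assumed set of required sections.
    required = {"hardware", "software", "test_data", "network", "user_access_rights"}
    missing = required - set(test_environment_configuration)
    print("Missing configuration sections:", missing or "none")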

Create the test environment operation manual

5 Prepare the test environment operation manual as input for test environment testing (TEM.4) and test environment operation (TEM.5)

4

Plan the disassembly of test environments

6 Plan the disassembly of test environments 5

Create the test environment test design

7 Create the test design and abstract test cases for test environment testing (TEM.4), based on test environment requirements specification

6

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

17-17 test environment requirements specification 1,3,6 04-11 test environment design 1,3

17-13 Test design specification 2 08-38 test environment configuration plan (draft) 2

17-16 Test data specification 2 06-09 test environment operation manual (draft) 4

06-07 System documentation 2 08-41 test environment disassembly plan (draft) 5

17-30 test environment disassembly requirements specification 5 04-12 test environment test design 6

06-02 handling and storage guide 5

08-09 installation and maintenance plan 5

08-14 recovery plan 5

6.5.3 TEM.3 Test Environment Assembly

Process Reference

Process ID TEM.3

Process name Test Environment Assembly

Process purpose The purpose of the Test Environment Assembly Process is to establish the test environment with all needed testware and its readiness for test environment testing (TEM.4).

Process outcomes Outcome 1 The test environment configuration Plan is understood and agreed.

Outcome 2 Components as described in the test environment configuration plan are installed and configured correctly and completely.

Note 1: Potential parts of this outcome are:

� Network is installed according to test environment configuration plan


� All needed HW is set to correct position and the relevant operating

systems are installed and configured correctly.

� All needed SW programs, data bases and test data are installed

and configured in the test environment.

� All needed user access rights are configured and administrated

correctly.

Note 2: For the settlement of this purpose test environment user support

(TEM.6) could be requested

Outcome 3 The test environment is ready for test environment testing (TEM.4).

Outcome 4 The procedure for the handover of test objects (intake procedure) is defined (based on the used SW development model and agreements with the stakeholders)

Outcome 5 The test environment operation manual is updated.

Outcome 6 The test environment intake procedure is defined (draft), based on the test environment intake procedure requirements

Outcome 7 The test environment test case specification is available

Performance Indicators

Base Practices Activity ID Description Outcome reference

Ensure that the test environment configuration plan is understood and agreed

1 Make sure that the test environment configuration plan is fully understood and agreed.

1

Establish the test environment according to test environment configuration plan

2 Establish the test environment according to the test environment configuration plan.

Note: For the settlement of this purpose test

environment user support (TEM.6) could be

requested

2

Confirm readiness of test environment for TEM.4

3 Check and document the status of Test environment and communicate the readiness for test environment testing (TEM.4) in the test environment report

3

Establish a hand over procedure for the test objects

4 Create the intake procedure for test objects from the development organization to the test organization

4

Update test environment operation manual as needed

5 Make sure that the test environment operation manual is in line with the delivered test environment

5

Create test environment intake procedure

6 Create the test environment intake procedure, based on the test environment intake procedure requirements

6


Create test environment test case specification

7 Create the test environment test case specification (low level test cases) based on test environment test design

7

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

08-38 test environment configuration plan 1 06-09 test environment operation manual (draft, up-to-date) 5

04-11 test environment design 1 08-38 test environment configuration plan (final) 1

14-03 Hardware assets register 1 15-41 test environment report 3

19-07 software development methodology 4 10-08 Test objects intake procedure 4

01-07 test environment components 2 11-17 Test Environment 3

17-27 test environment intake procedure requirements 6 10-07 test environment intake procedure (draft) 6

04-12 test environment test design 7 17-31 test environment test case specification 7

6.5.4 TEM.4 Test Environment Testing

Process Reference

Process ID TEM.4

Process name Test Environment Testing

Process purpose The Test Environment Testing process verifies that the test environment meets its requirements and is ready to use.

Process outcomes Outcome 1 The test of the assembled test environment is planned and executed.

Outcome 2 The relevant open issues, incidents and problems of the test environment are documented, analyzed, decided and cleared

Outcome 3 The user documentation of the test environment (i.e. the test environment operation manual) is up-to-date and contains all necessary workarounds

Outcome 4 The test environment is ready to use.


Performance Indicators

Base Practices Activity ID Description Outcome reference

Ensure that test environment intake procedure is understood and agreed.

1 Make sure that the test environment intake procedure is fully understood and agreed.

1

Do component and integration testing of the test environment

2 1) Execute the specified test environment test cases.

2) Assure that the components of the test environment are tested and integrated properly and ensure that the test environment is ready for the real testing

3) Log the tests

1,2

Deal with the found issues

3 1) Analyze and document found issues

2) Communicate Issues

Note: Use mechanism provided by (or implemented

as defined in) problem resolution management.

3) Execute confirmation tests to verify resolved issues.

2

Update the test environment operation manual

4 Document the changes of the test environment and the necessary workarounds, if applicable

3

Hand over the test environment to the test environment owner

5 Hand over the test environment to the test environment owner and document the status “test environment is ready to use” in the test environment Release Protocol

4

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

17-17 test environment requirements specification 1,2 10-07 test environment intake procedure (final) 1

10-07 test environment intake procedure (draft) 1 14-04 Test log (of test environment testing) 2

11-17 Test Environment 2, 4 06-09 test environment Operation Manual (final) 3

06-09 test environment Operation Manual (draft) 3 13-36 test environment release protocol 4

17-31 test environment test case specification 1 11-17 Test Environment 4


6.5.5 TEM.5 Test Environment Operation

Process Reference

Process ID TEM.5

Process name Test Environment Operation

Process purpose The purpose of the test environment operation process is to ensure the correct and efficient operation of the test environment and the test object (product to be tested) for the duration of intended test activities.

Process outcomes Outcome 1 The test environment Operational Level Agreement (OLA) according to test environment requirements specification and test environment OLA requirements is prepared and finalized in detail and agreed with all Stakeholders.

Outcome 2 The product to be tested is operated in its intended test environment according to requirements and intake procedure for test objects

Note: If necessary, continuous deployment shall be performed in all intended test environments

Outcome 3 Continuous Configuration Management is performed for the Test Environment

Performance Indicators

Base Practices Activity ID Description Outcome reference

Create and communicate the test environment Operational Level Agreement

1 Finalize and communicate the test environment Operational Level Agreement (OLA) to the stakeholders and establish the agreement.

1

Ensure continuous monitoring of the risks

2 Monitor continuously the identified risks and react on occurring problems

2

Deploy and operate the product in the planned test environment

3 Deploy and operate the product to be tested in its intended test environment and in the specified way.

Note:

� If necessary deploy and operate the product

to be tested continuously in the intended

test environment

� If necessary deploy and operate the product

to be tested in multiple Test Environments.

2

Monitor the use of the test environment with the defined criteria

4 Provide the capability to monitor operational service of the test environment on a regular basis, where appropriate against defined criteria.

2

Keep track of test environment

5 Manage the configuration of the test environment (including the needed test data) during the test configuration phase.

Add a record for each deployment of a product to be tested to one of the components of the test environment and keep track of changes.

Note: A record should contain at least:

� Deployment requested by

� Request date

� Affected environment component

� Product to be tested

� Version of the product to be tested

� Prerequisites for the deployment

� Required workarounds for the deployment

3
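The deployment record described in the note can be kept as a simple structured entry. The Python sketch below mirrors the listed minimum content; the function name and the sample values are assumptions.

    # Illustrative sketch (not part of the PAM): recording the deployment of a
    # product to be tested into a test environment component. Values are assumed.

    from datetime import date

    deployment_log = []

    def record_deployment(requested_by, component, product, version,
                          prerequisites=None, workarounds=None):
        entry = {
            "deployment_requested_by": requested_by,
            "request_date": date.today().isoformat(),
            "affected_environment_component": component,
            "product_to_be_tested": product,
            "version_of_product": version,
            "prerequisites": prerequisites or [],
            "required_workarounds": workarounds or [],
        }
        deployment_log.append(entry)
        return entry

    record_deployment("test manager", "test-srv-01", "billing application", "2.4.1",
                      prerequisites=["database schema 2.4"],
                      workarounds=["restart message broker after deployment"])
    print(deployment_log[-1])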

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

07-08 Service level measure 1 07-08 Service level measure 2

17-28 test environment Operation Level Agreement (OLA) requirements 1,2 02-02 test environment Operational Level Agreement (OLA) - final 1

17-17 test environment requirements specification 1 14-17 test environment reservation schedule 2

14-17 test environment reservation schedule 2

6.5.6 TEM.6 Test Environment User Support

Process Reference

Process ID TEM.6

Process name Test Environment User Support

Process purpose The purpose of the test environment user support process is to establish and maintain an acceptable level of service through assistance and consultation to the tester to support effective test of the test objects (product).

Process outcomes Outcome 1 The Service Level Agreement (SLA) for the test environment user support according to the test environment SLA requirements is prepared, finalized in detail and agreed with all stakeholders

Outcome 2 The service needs for tester support are identified and monitored on an on-going basis

Outcome 3 Testers have defined access to the test environment according to plans.

Note: It might be required that the access to the test environment is controlled. In this case the following aspects have to be taken into consideration:

� User registration

� Privilege Management

� User Password Management

� Review of User Access Right

� Removal of access Right

Outcome 4 The test environment user support assists the users of the test environment (e.g. testers or developers) by handling inquiries and requests and by resolving operational problems of the test environment.

Note: Examples of inquiries, requests and operational problems are:

� Inquiries regarding handling of test environment

� Deployment of new SW version

� Create/restore back-up of images or data bases in case of finding

defects

Outcome 5 The tester support needs are met through delivery of appropriate services

Outcome 6 The tester satisfaction with the support services being provided and the product itself is evaluated on an on-going basis

Performance Indicators

Base Practices Activity ID Description Outcome reference

Create and communicate the SLA regarding test environment User Support

1 Prepare, finalize and agree a Service Level Agreement (SLA) for the test environment user support with all Stakeholders.

Communicate the SLA to all affected parties.

Note: The test environment user support agreement also defines whether a ticketing tool is needed for service requests. If a ticketing tool is defined for service requests, then the workflow for using this tool shall also be defined in the SLA.

1

Establish a test support

2 Establish a service by which the tester can raise problems and questions encountered in use of the test environment and receive help in resolving them

2, 3, 4, 5

Identify the support needs of the testers

3 Identify the support needs of the testers and provide trainings, documentations and other support activities so the test environment can be used effectively

2, 5

Monitor the performance of the support and the changes of the support needs

4 Monitor the performance of the test environment support and the changes in the support needs of the testers

6


Determine the satisfaction with the test environment

5 Determine the level of tester satisfaction with the test environment

6

Determine the satisfaction with the support

6 Determine the level of tester satisfaction with the support service

6

Deal with the support request of the testers

7 Deal with the support request of the testers 3, 4

Take actions to solve problems in the test environment

8 1) Monitor the performance of the test environment to identify problems

2) Solve problems and defects in the test environment

2, 5, 6

Evaluate the service of the test environment support for adequacy

9 Evaluate the service of the test environment support for adequacy with respect to the problems that occurred

6

Communicate satisfaction and performance

10 Communicate the satisfaction of the testers to the appropriate people

6

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

17-29 Detailed requirements for the Service Level (test environment SLA requirements) 1 02-03 Finalized test environment service level agreements (SLA) 1

17-17 test environment requirements specification 1 03-08 Tester satisfaction data 2, 6

07-08 Service level measure 1 07-09 Tester satisfaction survey 5, 6

10-05 Test support procedure 3 13-07 Problem record 3, 4

13-07 Problem record 2 15-25 Tester satisfaction report 6

13-28 Support request 2 15-20 Service Level Performance 6

17-10 Service requirements 2

14-17 Test environment reservation schedule 2,3


6.5.7 TEM.7 Test Environment Disassembly

Process Reference

Process ID TEM.7

Process name Test Environment Disassembly

Process purpose The purpose of the Test Environment Disassembly Process is to disassemble the whole test environment

Process outcomes Outcome 1 The test environment disassembly plan is finalized in detail and agreed with all stakeholders, based on the test environment disassembly requirements specification and the test environment disassembly plan (draft).

Outcome 2 The test environment is disassembled according to the test environment disassembly plan.

Outcome 3 The needed trace data and information are saved or archived

Outcome 4 The disassembled components are ready for subsequent use

Outcome 5 A reassembly plan is available

Outcome 6 The complete and correct disassembly of the test environment is verified and confirmed

Performance Indicators

Base Practices Activity ID Description Outcome reference

Create the test environment disassembly plan

1 Prepare, finalize and agree the test environment disassembly plan with all stakeholders, based on the test environment disassembly requirements specification and test environment disassembly plan (draft).

1

Communicate the intention to disassemble the test environment

2 Communicate the intention to disassemble the test environment to all involved stakeholders

2

Disassemble the test environment

3 Disassemble the test environment according to the test environment disassembly plan

2

Save or archive the trace data and information

4 Save or archive all needed trace data or relevant information

3

Prepare components for subsequent use

5 Prepare the components of the test environment for subsequent use.

Store and register components so that they are ready to be assembled in other test environments; handle licences that were used in order to set up and run the test environment.

Organize and register the disposal of components that will not be reused

4

Plan reassembly 6 Plan reassembly if the test environment might be needed again in the future

Decide if reuse of the environment might be needed in the future

Make sure that special components are available in the future

Note: These components might consist of

� Special hardware

� Licenses

� Tools

5

Confirm disassembly

7 Verify that the disassembly was performed according to the test environment disassembly plan and confirm complete and correct disassembly.

6

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

08-38 test environment Configuration Plan 1 08-41 test environment Disassembly Plan (final) 1

08-41 test environment Disassembly Plan (draft) 1 15-42 test environment disassembly report 2,3,4,5,6

17-30 test environment disassembly requirements specification 1

6.6 Test Data Management Process Group (TDM)

The processes of this process group belong to the technical life cycle processes (see 5.3).

The Test Data Management Process Group contains processes and practices necessary to provide accurate and sufficient test data as needed.

Note: As multiple test stages and test tasks might be relevant, this process might have more than one

instance to check (e.g. test data for performance testing might differ from test data for unit testing).


6.6.1 TDM.1 Test Data Requirements Management

Process Reference

Process ID TDM.1

Process name Test Data Requirements Management

Process purpose The purpose of the Test Data Requirements Management Process is to collect and synthesize all requirements that have influence on the provision of test data e.g. test requirements, product requirements, legal and technical constraints, test stage specific needs, and test archive needs.

Process outcomes Outcome 1 The need for test data is identified and understood

Outcome 2 The legal constraints for test data are identified and understood

Outcome 3 The technical constraints for test data are identified and understood

Note: those constraints might result from decisions from the test

environment design and assembly process

Outcome 4 Requirements raised by test automation needs are identified

Outcome 5 Requirements raised by safety and/or security needs are identified

Outcome 6 Requirements for global, project specific, test stage specific and test item specific test data are identified

Outcome 7 Requirements for effective and efficient provision (e.g. Tools) are identified

Outcome 8 The feasibility and acceptability of potential solutions is evaluated

Outcome 9 Potential solutions are prioritized according to evaluation results

Performance Indicators

Base Practices Activity ID Description Outcome reference

Identify the need for test data

1 Identify the need for test data e.g. by performing a test data needs assessment

Note 1: This practice might apply at strategic level during project planning but also at operational level during test planning and the transformation from logical test cases to physical test cases.

Note 2: The needs document might also be an input for the Test Environment Requirements Analysis process, as the need for test data might also contain a need for hardware sizing.

1

Identify the legal constraints

2 Identify legal constraints such as the security of personal data and business secrets.

Note: Legal constraints might prevent an organization

from just using a copy of the production data.

2


Analyze the legal constraints and their consequences

3 Analyze the consequences of legal constraints for test data.

These consequences might be

1) Usage of unanonymized production data is forbidden

2) The mandatory definition of a level of anonymization of test data

3) The complete prohibition of the usage of production data

4) Documentation obligations

5) Backup and restore obligations

2
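The following is a minimal, illustrative sketch (in Python, not part of TestSPICE) of how such legal consequences might be handled technically, e.g. by replacing personal data with deterministic pseudonyms before production-like records are reused as test data. The field names "name", "email" and "iban" are assumptions made for the example.

# Illustrative only: a minimal anonymization step for deriving test data from
# production-like records. Field names are assumptions for this sketch.
import hashlib

def _pseudonym(value: str, salt: str = "test-data-salt") -> str:
    """Deterministic pseudonym so referential integrity across tables is kept."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:10]

def anonymize_record(record: dict) -> dict:
    """Replace personal data with pseudonyms; keep non-personal fields unchanged."""
    masked = dict(record)
    if "name" in masked:
        masked["name"] = "user_" + _pseudonym(masked["name"])
    if "email" in masked:
        masked["email"] = _pseudonym(masked["email"]) + "@example.invalid"
    if "iban" in masked:
        masked["iban"] = "XX00" + _pseudonym(masked["iban"]).upper()
    return masked

if __name__ == "__main__":
    prod_row = {"id": 42, "name": "Jane Doe", "email": "jane@corp.example", "iban": "DE89370400440532013000"}
    print(anonymize_record(prod_row))

Deterministic hashing keeps referential integrity between tables while removing the original personal data; whether this level of anonymization is sufficient remains a legal decision.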

Identify technical constraints

4 Identify the technical constraints for test data provision.

Identify constraints originating from

1) The physical data model

2) Used software releases

3) Internal and external interfaces

4) Applications relevant for test data provision

5) The technical representation of data (e.g. database tables, message queues, internal status flags, …)

6) The test environment design

3

Analyze the technical constraints and their consequences

5 Analyze the technical constraints and their impact on potential test data provision approaches, e.g. applications to be simulated

3

Identify the test data needs for test automation

6 Check the test strategy and/or test planning (including level test plans) to determine whether automated regression tests are required.

Check the planned test automation approaches and their impact on test data provision.

Note: Test automation approaches might be different in different test stages/test levels.

Transform the needs into test data requirements

4

Identify the impact of safety and security issues

7 Analyze given safety and security regulations to determine whether they contain additional requirements for test data, e.g. storage, archiving or maintenance of test data.

5

Analyze the need for and the impact of test data provision tools

8 Analyze options and constraints of test data provision tools available in the organization

Note: Also analyze tools that are in a late phase of the acquisition life cycle

7

Identify potential test data provision approaches

9 Identify potential solutions for the provisioning of test data

1

Identify the organizational test data structure

10 Identify the structure of test data

Note 1: This includes the identification of

• Global test data (e.g. ZIP codes)

• Test stage specific test data

• Project specific test data

• Test item specific test data

• Test case specific test data

Note 2: This might raise additional constraints, as it might be forbidden or technically impossible to effectively change global test data at test item level.

6

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

08-31 Test Plan 1,2,3,4,5,6,7,8,9 17-16 Test data specification 1,2,3,4,5,6,7,8,9

17-14 Test case specification 1,6,

04-06 System architecture design

3,8

06-07 System documentation 3,5,

6.6.2 TDM.2 Test Data Provision Planning

Process Reference

Process ID TDM.2

Process name Test Data Provision Planning

Process purpose The purpose of the Test Data Provision Planning Process is to define, approve and communicate a feasible and acceptable solution for test data provision.

Note: This solution meets the given legal, organizational and technical constraints of test

data provision.

Process outcomes Outcome 1 The feasibility and acceptability of potential solutions for test data provision is evaluated

Outcome 2 Potential solutions for test data provision are prioritized according to evaluation results

Outcome 3 The test data provisioning is planned in detail

Outcome 4 The availability of tools and services necessary to provide needed test data is planned

Outcome 5 The test data provision approach and the related plans are approved and communicated to all affected parties


Performance Indicators

Base Practices Activity ID Description Outcome reference

Identify potential solutions for test data provisioning

1 Identify potential solutions. Components might be

• Synthetic test data

• Production data

• Test data import via load tool

• Test data provisioning using user interfaces

• Manual / automated test data provisioning

• Test data aging

Note: The solution shall contain information about

• The tools to be used

• The implementation approach

• The need for manuals and training.

1

Evaluate the feasibility of the solution

2 For each identified solution, evaluate the feasibility of implementation.

Check whether identified constraints are violated

1

Evaluate the acceptability of the solution

3 Check whether a feasible solution fits the needs of the intended test stage/test level.

Check whether the solution supports test automation, if needed

Check the time and cost consumption of the solution

1

Prioritize the potential solutions

4 Potential solutions are excluded from the solution space or ranked according to given criteria.

Commitment is achieved for detailed planning of the solution with the highest priority.

2

Plan the provision of the test data

5 Develop a technical plan for the test data provision including

• The used type of test data

• The provisioning approach

• Needed tools and services

• Expected effort

• Test data maintenance approach

• Backup and restore

• Documentation

Synchronize Test Data Provision planning with test cycle planning

Note: Test data maintenance also includes data aging

if needed

3

Assure the availability of needed tools and services

6 Make sure that tools and services necessary to provide the test data are available when test data provision is needed.

Define and negotiate needed SLAs with test data service providers.

4

Approve the test data provision approach

7 Approve the test data provision approach 5

Communicate the test data provision approach

8 Communicate the test data provision approach using the predefined communication channels

5

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

17-16 Test Data Specification 1,2,3,4,5 08-42 Test Data Provision Plan

1,2,3,4,5

08-31 Test Plan 1,2,3,4,5

14-17 Test environment reservation schedule

3,5

6.6.3 TDM.3 Test Data Set Up

Process Reference

Process ID TDM.3

Process name Test Data Set Up

Process purpose The purpose of the Test Data Set Up process is to provide ready-to-use test data.

Note 1: The maintenance is addressed in the Testware Maintenance Process

Note 2: Any type of environment support is addressed in the TEM process group

Note 3: If specification-oriented test design techniques are utilized, the TDM.3 process also delivers the transfer from test cases to physical test data.

Process outcomes Outcome 1 Tools and services needed to provide test data are available

Outcome 2 Test data sets are prepared according to plan

Outcome 3 Procedures to provide test data are implemented and tested

Outcome 4 Operational readiness status of test data is approved and communicated

Outcome 5 Test data operation and support teams are available as needed

Outcome 6 Training regarding the operation and usage of test data is available


Performance Indicators

Base Practices Activity ID Description Outcome reference

Provide tools for test data provision

1 Provide the tools for test data provision as planned.

1

Provide services for test data provision

2 Request the services for test data provision according to the agreed SLAs and provide them to the test project/team.

Note: The choice of a provider and the SLA negotiation are subject to the test service acquisition process

1

Prepare test data 3 Prepare test data using planned tools and services.

Note 1: This can also include a test data catalogue for

manual test data editing.

Note 2: This might include

• Specification of synthetic test data

• Production samples (if allowed)

• Generated mass data

• Data aging

2
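As an illustration of two of the preparation options listed above (synthetic test data and data aging), the following Python sketch generates non-personal customer records and shifts their date fields; the field names, value pools and the 30-day shift are assumptions made for the example, not prescribed by this process.

# Illustrative only: synthetic test data generation and simple "data aging"
# so that time-dependent test cases stay valid over time.
import random
from datetime import date, timedelta

FIRST_NAMES = ["Ada", "Grace", "Linus", "Edsger"]
CITIES = ["Vienna", "Graz", "Linz"]

def synthetic_customer(customer_id: int, today: date) -> dict:
    """Create one synthetic, non-personal customer record."""
    return {
        "id": customer_id,
        "name": f"{random.choice(FIRST_NAMES)} Test{customer_id:04d}",
        "city": random.choice(CITIES),
        "contract_start": today - timedelta(days=random.randint(30, 720)),
    }

def age_test_data(records: list[dict], days: int) -> list[dict]:
    """Shift all date fields forward by `days` (a minimal form of test data aging)."""
    aged = []
    for rec in records:
        rec = dict(rec)
        for key, value in rec.items():
            if isinstance(value, date):
                rec[key] = value + timedelta(days=days)
        aged.append(rec)
    return aged

if __name__ == "__main__":
    data = [synthetic_customer(i, date.today()) for i in range(3)]
    print(age_test_data(data, days=30))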

Implement test data provision procedures

4 Implement test data provision procedures as planned.

Note: Procedures might contain load tools, capture/replay tools or manual test data editing

3
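A minimal sketch of a load-tool style provision procedure, assuming a CSV file prepared beforehand and an SQLite test database; the table and column names (id, name, city) are invented for the example.

# Illustrative only: import prepared CSV test data into a test database table.
import csv
import sqlite3

def load_test_data(csv_path: str, db_path: str = "testdb.sqlite") -> int:
    """Load customer test data from a CSV file (columns: id, name, city); return row count."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)"
        )
        with open(csv_path, newline="", encoding="utf-8") as handle:
            rows = [(r["id"], r["name"], r["city"]) for r in csv.DictReader(handle)]
        conn.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)", rows)
        conn.commit()
        return len(rows)
    finally:
        conn.close()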

Test the test data provision procedures

5 Test the test data provision procedures.

Make sure that all technical and organizational interfaces work properly.

Note: This might include checking whether test team members tasked with preparing test data manually have sufficient skills to do so.

3

Approve the operational readiness of test data

6 Formally approve the operational readiness of test data

4

Communicate operational readiness of test data

7 Communicate operational readiness of test data according to communication plan.

4

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

08-42 Test data provision plan 1,2,3,4,5,6 13-41 Test data provision record 1,2,3,4,5,6

15-44 Test Data Provision Report

1,2,3,4,5,6

6.7 Test Automation Process Group (TAU)

The processes of this process group belong to the technical life cycle processes (see 5.3).

6.7.1 TAU.1 Test Automation Needs & Requirements Elicitation

Process Reference

Process ID TAU.1

Process name Test Automation Needs & Requirements Elicitation

Process purpose The purpose of the Test Automation Needs & Requirements Elicitation Process is to establish a common understanding of test automation needs and constraints and to elicit detailed requirements for test automation.

Process outcomes Outcome 1 Sponsors and stakeholders for the organizational test automation approach are identified

Outcome 2 Organizational needs are discussed and agreed with the identified stakeholders

Outcome 3 Project needs are collected from project stakeholders

Outcome 4 Requirements for test automation are elicited and analyzed

Outcome 5 Requirements for test automation are mapped to test stages and test types

Note: Test types are, e.g., performance or security testing.

Outcome 6 Business cases for test automation are derived at organizational, project and test stage level.

Outcome 7 A baseline for test automation requirements is developed and approved

Outcome 8 A test automation architecture is developed and approved

Outcome 9 Changes that affect the test automation requirements are controlled

Performance Indicators

Base Practices Activity ID Description Outcome reference

Identify organizational needs for test automation

1 Identify organizational needs for test automation 2


Identify project needs for test automation

2 Identify project needs for test automation 3

Derive test automation requirements from organizational and project needs

3 Derive test automation requirements from organizational and project needs

4

Analyze business case for test automation

4 Analyze business case for test automation 6

Map test automation requirements to test stages

5 Map test automation requirements to test stages 5

Baseline test automation requirements catalogue

6 Baseline test automation requirements catalogue 7

Approve test automation requirements catalogue

7 Approve test automation requirements catalogue 9

Approve test automation architecture design

8 Approve test automation architecture design 8

Manage changing needs and requirements

9 Manage changing needs and requirements 9

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

09-05 Test policy 1, 2 19-19 Test strategy 1, 2

19-18 Test handbook / Test manual

1, 2 04-13 Test Automation Architecture Design

8

08-31 Test plan or Test automation plan

3 - 7, 9

17-19 Test automation requirements

4 – 7, 9


6.7.2 TAU.2 Test Automation Design

Process Reference

Process ID TAU.2

Process name Test Automation Design

Process purpose The purpose of the Test Automation Design Process is to design the test automation approach

Process outcomes Outcome 1 Potential test automation solutions are identified for each affected test stage

Outcome 2 Potential test automation solutions are identified for each affected test type

Outcome 3 Requirements for test automation tools are derived from test automation architecture and requirements

Outcome 4 Constraints for test automation solutions are identified and analyzed

Outcome 5 Potential solutions are prioritized and decided on

Outcome 6 A detailed design for the solution to be implemented is available

Outcome 7 A proof of concept for the solution to be implemented is available based on the agreed design

Performance Indicators

Base Practices Activity ID Description Outcome reference

Define potential test automation solutions for different test types and test stages

1 Define potential test automation solutions for different test types and test stages

1,2

Derive requirements for test automation tools

2 Derive requirements for test automation tools 3

Analyze constraints

3 Analyze constraints

Note: Those constraints might originate from

• Test environment

• External interfaces

• Development environment (like web server, GUI, …)

4

Decide on solutions

4 Decide on solutions

Note: This might contain the

• Technical decision

• Commercial decision (business case)

5

Develop the detailed design for test automation solutions

5 Develop the detailed design for test automation solutions

Note: This might also include support for test results

analysis

6

Perform proof of concept check of designed solution

6 Perform proof of concept check of designed solution 7

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

04-13 Test automation architecture design

3 08-31 Test plan or Test automation plan

3, 4

17-19 Test automation requirements

3 14-15 List of tools 5

08-31 Test plan or Test automation plan

1, 2 04-14 Test automation design 6

15-43 Test automation proof of concept report

7

6.7.3 TAU.3 Test Automation Implementation

Process Reference

Process ID TAU.3

Process name Test Automation Implementation

Process purpose The purpose of the Test Automation Implementation Process is to deliver the test automation solution according to the agreed design.

Process outcomes Outcome 1 The test automation solution is implemented according to the agreed design

Outcome 2 The correct implementation of the test automation solution is verified

Outcome 3 The test automation solution is validated

Outcome 4 The test automation solution is optimized according to verification and validation results

Outcome 5 The accepted solution is delivered to the test project

Outcome 6 A procedure to maintain the solution is implemented


Performance Indicators

Base Practices Activity ID Description Outcome reference

Implement the defined solutions using the defined tools

1 Implement the defined solutions using the defined tools

Note: Solution = tool + domain-specific business modules (like keywords, data generation, GUI mapping, technical interfaces and scripts)

1
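As an illustration of what such domain-specific business modules might look like, the following Python sketch shows a small keyword library on top of a hypothetical GUI/API driver (`app`); the keywords, element names and the driver interface are assumptions made for the example, not part of TestSPICE.

# Illustrative only: a tiny keyword-driven "business module" (keywords + a
# GUI/interface mapping). The `app` driver and its methods are placeholders.
class KeywordLibrary:
    """Maps business-level keywords to technical actions on the system under test."""

    def __init__(self, app):
        self.app = app  # e.g. a GUI or API driver provided by the chosen tool

    def login(self, user: str, password: str) -> None:
        self.app.open("login_page")
        self.app.fill("user_field", user)
        self.app.fill("password_field", password)
        self.app.click("login_button")

    def create_order(self, product: str, quantity: int) -> str:
        self.app.click("new_order_button")
        self.app.fill("product_field", product)
        self.app.fill("quantity_field", str(quantity))
        self.app.click("submit_button")
        return self.app.read("order_id_label")

def run_keyword_script(library: KeywordLibrary, steps: list[tuple]) -> None:
    """Execute a data-driven list of (keyword, *args) steps."""
    for keyword, *args in steps:
        getattr(library, keyword)(*args)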

Test the defined Solution

2 Test the defined solution

Note: Testing the solution might include technical, business and integration testing.

2

Fine-tune and deliver the solution

3 Optimize and deliver the solution. 2

Maintain the solution

4 Maintain the solution

Note: For details, see the Testware Maintenance process.

Maintenance includes technical maintenance as well as change request management

3

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

04-14 Test automation design 1 11-12 Test Automation solution

1 - 5

04-13 Test automation architecture design

1 06-11 Test automation maintenance handbook

6

6.7.4 TAU.4 Test Case Implementation

Process Reference

Process ID TAU.4

Process name Test Case Implementation

Process purpose The purpose of the Test Case Implementation process is to translate test cases into scripts that can be used for automated test execution.

Process outcomes Outcome 1 The test procedure is defined

Outcome 2 Test data necessary to run the test automation solution is defined

Outcome 3 Test scripts are developed and tested


Performance Indicators

Base Practices Activity ID Description Outcome reference

Define test procedures

1 Define test procedures 1

Define Test data 2 Define Test data

Note: Details of test data provision are described in

the TDM process group

2

Produce test scripts

3 Produce test scripts

Note: This contains the translation of the test case into a script and the debugging of the script

2
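A minimal sketch of this translation step, assuming Python's unittest as the scripting framework; the test case ID, the driver and the assertions are invented for the example and would in practice come from the test case specification and the test automation solution.

# Illustrative only: one logical test case ("a registered user can log in")
# translated into an executable script. The driver is mocked so the sketch
# stays self-contained and runnable.
import unittest
from unittest import mock

class TestLogin(unittest.TestCase):
    """Physical test script derived from hypothetical test case TC-LOGIN-001."""

    def setUp(self):
        # Precondition from the test case: a registered user exists in the test data.
        self.user = {"name": "user_0001", "password": "secret"}

    def test_registered_user_can_log_in(self):
        driver = mock.Mock()              # stands in for the test automation solution
        driver.login.return_value = True
        self.assertTrue(driver.login(self.user["name"], self.user["password"]))
        driver.login.assert_called_once_with("user_0001", "secret")

if __name__ == "__main__":
    unittest.main()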

Review test scripts

4 Review the produced test scripts against business requirements and test cases

3

Maintain test scripts

5 Maintain test scripts

Note 1: This covers business maintenance in order to assure consistency between test scripts and test cases

Note 2: This also covers technical maintenance caused by changes in the test environment and the software under test (SUT)

3

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

04-14 Test automation design 1 10-02 Test procedure 1

11-12 Test automation solution 1 03-07 Test data 2

17-14 Test case specification 3 01-08 Test script 3

08-31 Test plan or Test automation plan

4

6.7.5 TAU.5 Test Automation Usage

Process Reference

Process ID TAU.5

Process name Test Automation Usage

Process purpose The purpose of the test automation usage process is to utilize automated test cases for test execution.

Process outcomes Outcome 1 The delivered solution is integrated in the test execution approach on project and test stage level

Outcome 2 The automated test cases are executed

Outcome 3 Results of automated test execution are analyzed

Outcome 4 Incidents regarding test data, test environment, tools and test scripts are classified and separated from incidents regarding the test item

Outcome 5 Incidents regarding test data, test environment and test scripts are reported to the responsible service provider

Performance Indicators

Base Practices Activity ID Description Outcome reference

Integrate test automation solutions in the defined projects and test stages

1 Integrate test automation solutions in the defined projects and test stages

Note: Control of scripts will invoke the configuration

management process

1

Run automated tests

2 Run automated tests 2

Interpret and classify Test run results

3 Interpret and classify Test run results

Note: As automated tests normally run unattended, an analysis practice is needed to identify potential incidents. A failed test run might be caused by a failure in the preparation of test data or by technical problems in the test environment.

2
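The following Python sketch illustrates one possible way to implement such a classification, separating environment and test data incidents from genuine test item failures; the result structure and the error-message markers are assumptions made for the example.

# Illustrative only: classify unattended test run results so that incidents
# caused by the environment or test data are routed to the responsible
# service provider, while SUT defects go into the normal testing workflow.
ENVIRONMENT_MARKERS = ("connection refused", "timeout", "environment down")
TEST_DATA_MARKERS = ("no such row", "missing test data", "constraint violation")

def classify_result(result: dict) -> str:
    """Return 'passed', 'environment', 'test_data' or 'sut_defect' for one test result."""
    if result.get("status") == "passed":
        return "passed"
    message = result.get("message", "").lower()
    if any(marker in message for marker in ENVIRONMENT_MARKERS):
        return "environment"   # report to the test environment service provider
    if any(marker in message for marker in TEST_DATA_MARKERS):
        return "test_data"     # report to the test data service provider
    return "sut_defect"        # handled in the normal testing workflow (TST.4)

def summarize(results: list[dict]) -> dict:
    """Count results per category as input for the test report."""
    summary: dict = {}
    for result in results:
        category = classify_result(result)
        summary[category] = summary.get(category, 0) + 1
    return summary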

Report test run results

4 Report test run results

Note: Incidents regarding the test item are analyzed

and reported in the normal testing workflow as

described in TST.4

3

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

11-12 Test automation solution 1 14-04 Test log 2

01-08 Test scripts 1, 2 15-37 Test report 3, 4, 5

03-07 Test data 1, 2


6.7.6 TAU.6 Test Automation Process Monitoring

Process Reference

Process ID TAU.6

Process name Test Automation Process Monitoring

Process purpose The purpose of the Test Automation Monitoring Process is to constantly check whether the test automation solution is delivered and used as expected and whether its usage validates the estimated business case.

Process outcomes Outcome 1 The delivery of test automation solutions is monitored

Outcome 2 The delivery of test automation support and the fulfillment of related SLA are monitored

Outcome 3 The usage of the test automation solution is monitored

Outcome 4 The fulfillment of the estimated business case for test automation is monitored

Performance Indicators

Base Practices Activity ID Description Outcome reference

Monitor technical implementation

1 Monitor technical implementation 1

Monitor expected business case

2 Monitor expected business case 2

Monitor usage 3 Monitor usage 2

Monitor Support 4 Monitor Support 3, 4

Work Products

Inputs Outputs

Result type Outcome reference Result type Outcome reference

04-14 Test automation design 1 15-38 Test controlling report 1,2,3,4

17-14 Test case specification 2, 4 15-16 Improvement opportunity

4

11-12 Test automation solution 1, 3 15-33 Experience report 3,4

01-08 Test scripts 1, 2, 4

14-04 Test log 3


15-37 Test report 3


7 Process Capability Indicators (level 1 to 5)

Process capability indicators are the means of achieving the capabilities addressed by the considered process attributes. Evidence of process capability indicators supports the judgment of the degree of achievement of the process attribute.

The capability dimension of the process assessment model consists of six capability levels matching the capability levels defined in ISO/IEC 15504-2.

The process capability indicators for the 9 process attributes included in the capability dimension for level 1 to 5 are described.

Level 0 does not include any type of indicators, as it reflects a process that is either not implemented or that fails to even partially achieve any of its outcomes.

Note: Process attribute definitions and attribute outcomes are duplicated from ISO/IEC 15504-2 in italic font.

7.1 Level 1: Performed process

7.1.1 PA 1.1 Process performance attribute.

The process performance attribute is a measure of the extent to which the process purpose is achieved.

As a result of full achievement of this attribute:

a) the process achieves its defined outcomes.

7.1.1.1 Generic Practices for PA 1.1

GP 1.1.1 Achieve the process outcomes

� Perform the intent of the base practices.

� Produce work products that evidence the process outcomes.

Note 1: The assessment of a performed process is based on process performance indicators, which are

defined in Clause 5 of this document.

Note 2: Generic resources and Generic work products do not exist for the assessment of the PA 1.1

attribute.

7.1.1.2 Generic Resources for PA 1.1

� Resources are used to perform the intent of process specific base practices. [PA 1.1 Achievement a]

7.2 Level 2: Managed process

The previously described Performed process is now implemented in a managed fashion (planned, monitored and adjusted) and its work products are appropriately established, controlled and maintained.

The following attributes of the process demonstrate the achievement of this level:


7.2.1 PA 2.1 Performance management attribute

The performance management attribute is a measure of the extent to which the performance of the process is managed.

As a result of full achievement of this attribute:

a) objectives for the performance of the process are identified;

b) performance of the process is planned and monitored;

c) performance of the process is adjusted to meet plans;

d) responsibilities and authorities for performing the process are defined, assigned and communicated;

e) resources and information necessary for performing the process are identified, made available, allocated and used;

f) interfaces between the involved parties are managed to ensure both effective communication and also clear assignment of responsibility.

7.2.1.1 Generic Practices for PA 2.1

GP 2.1.1 Identify the objectives for the performance of the process.

Note: Performance objectives may include – (1) quality of the artefacts produced, (2) process cycle time

or frequency (3) resource usage and (4) boundaries of the process.

� Performance objectives are identified based on process requirements.

� The scope of the process performance is defined.

� Assumptions and constraints are considered when identifying the performance objectives.

GP 2.1.2 Plan and monitor the performance of the process to fulfill the identified objectives.

� Plan(s) for the performance of the process are developed.

� The process performance cycle is defined.

� Key milestones for the performance of the process are established.

� Estimates for process performance attributes are determined and maintained.

� Process activities and tasks are defined.

� Schedule is defined and aligned with the approach to performing the process.

� Process work product reviews are planned.

� The process is performed according to the plan(s).

� Process performance is monitored to ensure planned results are achieved.

GP 2.1.3 Adjust the performance of the process.

� Process performance issues are identified.

� Appropriate actions are taken when planned results and objectives are not achieved.

� The plan(s) are adjusted, as necessary.

� Rescheduling is performed as necessary.

GP 2.1.4 Define responsibilities and authorities for performing the process.

� Responsibilities, commitments and authorities to perform the process are defined, assigned and communicated.

� Responsibilities and authorities to verify process work products are defined and assigned.

� The needs for process performance experience, knowledge and skills are defined.

GP 2.1.5 Identify and make available resources to perform the process according to plan.


• The human and infrastructure resources necessary for performing the process are identified, made available, allocated and used.

• The information necessary to perform the process is identified and made available.

• The necessary infrastructure and facilities are identified and made available.

GP 2.1.6 Manage the interfaces between involved parties.

� The individuals and groups involved in the process performance are determined.

� Responsibilities of the involved parties are assigned.

� Interfaces between the involved parties are managed.

� Communication is assured between the involved parties.

� Communication between the involved parties is effective.

7.2.1.2 Generic Resources for PA 2.1

� Human resources with identified objectives, responsibilities and authorities; [PA 2.1 Achievement a, d, e, f]

� Facilities and infrastructure resources; [PA 2.1 Achievement a, d, e, f]

� Project planning, management and control tools, including time and cost reporting; [PA 2.1 Achievement b, c]

� Workflow management system; [PA 2.1 Achievement d, f]

� Email and/or other communication mechanisms; [PA 2.1 Achievement d, f]

� Information and/or experience repository; [PA 2.1 Achievement b, e]

� Problem and issues management mechanisms. [PA 2.1 Achievement c]

7.2.2 PA 2.2 Work product management attribute

The work product management attribute is a measure of the extent to which the work products produced by the process are appropriately managed.

As a result of full achievement of this attribute:

a) requirements for the work products of the process are defined;

b) requirements for documentation and control of the work products are defined;

c) work products are appropriately identified, documented, and controlled;

d) work products are reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements.

Note 1: Requirements for documentation and control of work products may include requirements for the

identification of changes and revision status, approval and re-approval of work products, and the creation

of relevant versions of applicable work products available at points of use.

Note 2: The work products referred to in this clause are those that result from the achievement of the

process outcomes.

7.2.2.1 Generic Practices for PA 2.2

GP 2.2.1 Define the requirements for the work products.

� The requirements for the work products to be produced are defined. Requirements may include defining contents and structure.

� Quality criteria of the work products are identified.

� Appropriate review and approval criteria for the work products are defined.


GP 2.2.2 Define the requirements for documentation and control of the work products.

• Requirements for the documentation and control of the work products are defined. Such requirements may include requirements for (1) distribution, (2) identification of work products and their components, (3) traceability.

• Dependencies between work products are identified and understood.

• Requirements for the approval of work products to be controlled are defined.

GP 2.2.3 Identify, document and control the work products.

� The work products to be controlled are identified.

� Change control is established for work products.

� The work products are documented and controlled in accordance with requirements.

� Versions of work products are assigned to product configurations as applicable.

� The work products are made available through appropriate access mechanisms.

� The revision status of the work products may readily be ascertained.

GP 2.2.4 Review and adjust work products to meet the defined requirements.

� Work products are reviewed against the defined requirements in accordance with planned arrangements.

� Issues arising from work product reviews are resolved.

7.2.2.2 Generic Resources for PA 2.2

� Requirement management method/toolset; [PA 2.2 Achievement a, b, c]

� Configuration management system; [PA 2.2 Achievement b, c]

� Documentation elaboration and support tool; [PA 2.2 Achievement b, c]

� Document identification and control procedure; [PA 2.2 Achievement b, c]

� Work product review methods and experiences; [PA 2.2 Achievement d]

� Review management method/toolset; [PA 2.2 Achievement d]

� Intranets, extranets and/or other communication mechanisms; [PA 2.2 Achievement b, c]

� Problem and issue management mechanisms. [PA 2.2 Achievement d]

7.3 Level 3: Established process

The previously described Managed process is now implemented using a defined process capable of achieving its process outcomes.

The following attributes of the process demonstrate the achievement of this level:

7.3.1 PA 3.1 Process definition attribute

The process definition attribute is a measure of the extent to which a standard process is maintained to support the deployment of the defined process.

As a result of full achievement of this attribute:

a) a standard process, including appropriate tailoring guidelines, is defined that describes the fundamental elements that must be incorporated into a defined process;

b) the sequence and interaction of the standard process with other processes are determined;


c) required competencies and roles for performing a process are identified as part of the standard process;

d) required infrastructure and work environment for performing a process are identified as part of the standard process;

e) suitable methods for monitoring the effectiveness and suitability of the process are determined.

Note A standard process may be used as-is when deploying a defined process, in which case tailoring

guidelines would not be necessary.

7.3.1.1 Generic Practices for PA 3.1

GP 3.1.1 Define the standard process that will support the deployment of the defined process.

� A standard process is developed that includes the fundamental process elements.

� The standard process identifies the deployment needs and deployment context.

� Guidance and/or procedures are provided to support implementation of the process as needed.

� Appropriate tailoring guideline(s) are available as needed.

GP 3.1.2 Determine the sequence and interaction between processes so that they work as an integrated system of processes.

� The standard process’s sequence and interaction with other processes are determined.

� Deployment of the standard process as a defined process maintains integrity of processes.

GP 3.1.3 Identify the roles and competencies for performing the standard process.

� Process performance roles are identified

� Competencies for performing the process are identified.

GP 3.1.4 Identify the required infrastructure and work environment for performing the standard process.

� Process infrastructure components are identified (facilities, tools, networks, methods, etc.).

� Work environment requirements are identified.

GP 3.1.5 Determine suitable methods to monitor the effectiveness and suitability of the standard process.

� Methods for monitoring the effectiveness and suitability of the process are determined.

� Appropriate criteria and data needed to monitor the effectiveness and suitability of the process are defined.

� The need to establish the characteristics of the process is considered.

� The need to conduct internal audit and management review is established.

� Process changes are implemented to maintain the standard process.

7.3.1.2 Generic Resources for PA 3.1

� Process modelling methods/tools; [PA 3.1 Achievement a, b, c, d]

� Training material and courses. [PA 3.1 Achievement a, b, c]

� Resource management system. [PA 3.1 Achievement b, c]

� Process infrastructure. [PA 3.1 Achievement a, b]

� Audit and trend analysis tools. [PA 3.1 Achievement e]

� Process monitoring method. [PA 3.1 Achievement e]


7.3.2 PA 3.2 Process deployment attribute

The process deployment attribute is a measure of the extent to which the standard process is effectively deployed as a defined process to achieve its process outcomes.

As a result of full achievement of this attribute:

a) a defined process is deployed based upon an appropriately selected and/or tailored standard process;

b) required roles, responsibilities and authorities for performing the defined process are assigned and communicated;

c) personnel performing the defined process are competent on the basis of appropriate education, training, and experience;

d) required resources and information necessary for performing the defined process are made available, allocated and used;

e) required infrastructure and work environment for performing the defined process are made available, managed and maintained;

f) appropriate data are collected and analysed as a basis for understanding the behaviour of, and to demonstrate the suitability and effectiveness of the process, and to evaluate where continuous improvement of the process can be made.

Note Competency results from a combination of knowledge, skills and personal attributes that are

gained through education, training and experience.

7.3.2.1 Generic Practices for PA 3.2

GP 3.2.1 Deploy a defined process that satisfies the context specific requirements of the use of the standard process.

� The defined process is appropriately selected and/or tailored from the standard process.

� Conformance of defined process with standard process requirements is verified.

GP 3.2.2 Assign and communicate roles, responsibilities and authorities for performing the defined process.

� The roles for performing the defined process are assigned and communicated.

� The responsibilities and authorities for performing the defined process are assigned and communicated.

GP 3.2.3 Ensure necessary competencies for performing the defined process.

� Appropriate competencies for assigned personnel are identified.

� Suitable training is available for those deploying the defined process.

GP 3.2.4 Provide resources and information to support the performance of the defined process.

� Required human resources are made available, allocated and used.

� Required information to perform the process is made available, allocated and used.

GP 3.2.5 Provide adequate process infrastructure to support the performance of the defined process.

� Required infrastructure and work environment is available.

� Organizational support to effectively manage and maintain the infrastructure and work environment is available.

� Infrastructure and work environment is used and maintained.

GP 3.2.6 Collect and analyse data about performance of the process to demonstrate its suitability and effectiveness.


� Data required to understand the behaviour, suitability and effectiveness of the defined process are identified.

� Data are collected and analysed to understand the behaviour, suitability and effectiveness of the defined process.

� Results of the analysis are used to identify where continual improvement of the standard and/or defined process can be made.

7.3.2.2 Generic Resources for PA 3.2

• Feedback mechanisms (customer, staff, other stakeholders); [PA 3.2 Achievement f]

• Process repository; [PA 3.2 Achievement a, b]

• Resource management system; [PA 3.2 Achievement b, c, d]

• Knowledge management system; [PA 3.2 Achievement d]

• Problem and change management system; [PA 3.2 Achievement f]

• Working environment and infrastructure; [PA 3.2 Achievement e]

• Data collection analysis system; [PA 3.2 Achievement f]

• Process assessment framework; [PA 3.2 Achievement f]

• Audit/review system. [PA 3.2 Achievement f]

7.4 Level 4: Predictable process

The previously described Established process now operates within defined limits to achieve its process outcomes.

The following attributes of the process demonstrate the achievement of this level.

7.4.1 PA 4.1 Process measurement attribute

The process measurement attribute is a measure of the extent to which measurement results are used to ensure that performance of the process supports the achievement of relevant process performance objectives in support of defined business goals.

As a result of full achievement of this attribute:

a) process information needs in support of relevant business goals are established;

b) process measurement objectives are derived from identified process information needs;

c) quantitative objectives for process performance in support of relevant business goals are established;

d) measures and frequency of measurement are identified and defined in line with process measurement objectives and quantitative objectives for process performance;

e) results of measurement are collected, analysed and reported in order to monitor the extent to which the quantitative objectives for process performance are met;

f) measurement results are used to characterise process performance.

Note 1 Information needs may typically reflect management, technical, project, process or product

needs.

Note 2 Measures may be either process measures or product measures or both.

7.4.1.1 Generic Practices for PA 4.1

GP 4.1.1 Identify process information needs, in relation with business goals.


� Business goals relevant to establishing quantitative process measurement objectives for the process are identified.

� Process stakeholders are identified and their information needs are defined.

� Information needs support the relevant business goals.

GP 4.1.2 Derive process measurement objectives from process information needs.

� Process measurement objectives to satisfy defined process information needs are defined.

GP 4.1.3 Establish quantitative objectives for the performance of the defined process, according to the alignment of the process with the business goals.

� Process performance objectives are defined to explicitly reflect the business goals.

� Process performance objectives are verified with organizational management and process owner(s) to be realistic and useful.

GP 4.1.4 Identify product and process measures that support the achievement of the quantitative objectives for process performance.

� Detailed measures are defined to support monitoring, analysis and verification needs of process and product goals.

� Measures to satisfy process measurement and performance objectives are defined.

� Frequency of data collection is defined.

� Algorithms and methods to create derived measurement results from base measures are defined, as appropriate.

� Verification mechanism for base and derived measures is defined.

GP 4.1.5 Collect product and process measurement results through performing the defined process.

� Data collection mechanism is created for all identified measures.

� Required data is collected in an effective and reliable manner.

� Measurement results are created from the collected data within defined frequency.

� Analysis of measurement results is performed within defined frequency.

� Measurement results are reported to those responsible for monitoring the extent to which quantitative objectives are met.

GP 4.1.6 Use the results of the defined measurement to monitor and verify the achievement of the process performance objectives.

• Statistical or similar techniques are used to quantitatively understand process performance and capability within defined control limits.

• Trends of process behaviour are identified.

7.4.1.2 Generic Resources for PA 4.1

• Management information (cost, time, reliability, profitability, customer benefits, risks etc.); [PA 4.1 Achievement a, c, d, e, f]

• Applicable measurement techniques; [PA 4.1 Achievement d]

• Product and process measurement tools and results databases; [PA 4.1 Achievement d, e, f]

• Process measurement framework; [PA 4.1 Achievement d, e, f]

• Tools for data analysis and measurement. [PA 4.1 Achievement b, c, d, e]


7.4.2 PA 4.2 Process control attribute

The process control attribute is a measure of the extent to which the process is quantitatively managed to produce a process that is stable, capable, and predictable within defined limits.

As a result of full achievement of this attribute:

a) suitable analysis and control techniques where applicable, are determined and applied;

b) control limits of variation are established for normal process performance;

c) measurement data are analysed for special causes of variation;

d) corrective actions are taken to address special causes of variation;

e) control limits are re-established (as necessary) following corrective action.

7.4.2.1 Generic Practices for PA 4.2

GP 4.2.1 Determine analysis and control techniques, appropriate to control the process performance.

� Process control analysis methods and techniques are defined.

� Selected techniques are validated against process control objectives.

GP 4.2.2 Define parameters suitable to control the process performance.

� Standard process definition is modified to include selection of parameters for process control.

� Control limits for selected base and derived measurement results are defined.
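As an illustration of defining control limits for a selected measure, the following Python sketch derives a centre line and upper/lower control limits from historical measurement results using the common mean plus/minus three standard deviations convention; the chosen measure (test cases executed per day) and the sample values are invented for the example.

# Illustrative only: compute control limits for a process performance measure.
from statistics import mean, pstdev

def control_limits(samples: list[float]) -> tuple[float, float, float]:
    """Return (centre line, lower control limit, upper control limit)."""
    centre = mean(samples)
    sigma = pstdev(samples)
    return centre, centre - 3 * sigma, centre + 3 * sigma

if __name__ == "__main__":
    executed_per_day = [42, 38, 45, 40, 44, 39, 41, 43]
    cl, lcl, ucl = control_limits(executed_per_day)
    print(f"CL={cl:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}")
    # A day outside [LCL, UCL] would be analysed for special causes (see GP 4.2.3).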

GP 4.2.3 Analyse process and product measurement results to identify variations in process performance.

� Measures are used to analyse process performance.

� All situations are recorded when defined control limits are exceeded.

� Each out-of-control case is analysed to identify potential cause(s) of variation.

� Special causes of variation in performance are determined.

� Results are provided to those responsible for taking action.

GP 4.2.4 Identify and implement corrective actions to address assignable causes.

� Corrective actions are determined to address each assignable cause.

� Corrective actions are implemented to address assignable causes of variation.

� Corrective action results are monitored.

� Corrective actions are evaluated to determine their effectiveness.

GP 4.2.5 Re-establish control limits following corrective action.

� Process control limits are re-calculated (as necessary) to reflect process changes and corrective actions.

7.4.2.2 Generic Resources for PA 4.2

� Process control and analysis techniques; [PA 4.2 Achievement a, c]

� Statistical analysis tools/applications; [PA 4.2 Achievement b, c, e]

� Process control tools/applications. [PA 4.2 Achievement d, e]


7.5 Level 5: Optimizing process

The previously described Predictable process is continuously improved to meet relevant current and projected business goals.

The following attributes of the process demonstrate the achievement of this level.

7.5.1 PA 5.1 Process innovation attribute

The process innovation attribute is a measure of the extent to which changes to the process are identified from analysis of common causes of variation in performance, and from investigations of innovative approaches to the definition and deployment of the process.

As a result of full achievement of this attribute:

a) process improvement objectives for the process are defined that support the relevant business goals;

b) appropriate data are analysed to identify common causes of variations in process performance;

c) appropriate data are analysed to identify opportunities for best practice and innovation;

d) improvement opportunities derived from new technologies and process concepts are identified;

e) an implementation strategy is established to achieve the process improvement objectives.

7.5.1.1 Generic Practices for PA 5.1

GP 5.1.1 Define the process improvement objectives for the process that support the relevant business goals.

� Directions to process innovation are set.

� New business visions and goals are analysed to give guidance for new process objectives and potential areas of process change.

� Quantitative and qualitative process improvement objectives are defined and documented.

GP 5.1.2 Analyse measurement data of the process to identify real and potential variations in the process performance.

� Measurement data are analysed and made available.

� Causes of variation in process performance are identified and classified.

� Common causes of variation are analysed to get quantitative understanding of their impact.

GP 5.1.3 Identify improvement opportunities of the process based on innovation and best practices.

� Industry best practices are identified and evaluated.

� Feedback on opportunities for improvement is actively sought.

� Improvement opportunities are identified.

GP 5.1.4 Derive improvement opportunities of the process from new technologies and process concepts.

• Impact of new technologies on process performance is identified and evaluated.

• Impact of new process concepts is identified and evaluated.

• Improvement opportunities are identified.

• Emergent risks are considered in identifying improvement opportunities.

GP 5.1.5 Define an implementation strategy based on long-term improvement vision and objectives.

� Commitment to improvement is demonstrated by organizational management and process owner(s).

� Proposed process changes are evaluated and piloted to determine their benefits and expected impact on defined business objectives.


� Changes are classified and prioritized based on their impact on defined improvement objectives.

� Measures that validate the results of process changes are defined to determine expected effectiveness of the process change.

� Implementation of the approved change(s) is planned as an integrated program or project.

� Implementation plan and impact on business goals are discussed and reviewed by organizational management.

7.5.1.2 Generic Resources for PA 5.1

� Process improvement framework; [PA 5.1 Achievement a, d, e]

� Process feedback and analysis system (measurement data, causal analysis results etc.); [PA 5.1 Achievement b, c]

� Piloting and trialling mechanism. [PA 5.1 Achievement c, d]

7.5.2 PA 5.2 Process optimization attribute

The process optimization attribute is a measure of the extent to which changes to the definition, management and performance of the process result in effective impact that achieves the relevant process improvement objectives.

As a result of full achievement of this attribute:

a) impact of all proposed changes is assessed against the objectives of the defined process and standard process;

b) implementation of all agreed changes is managed to ensure that any disruption to the process performance is understood and acted upon;

c) effectiveness of process change on the basis of actual performance is evaluated against the defined product requirements and process objectives to determine whether results are due to common or special causes.

7.5.2.1 Generic Practices for PA 5.2

GP 5.2.1 Assess the impact of each proposed change against the objectives of the defined and standard process.

� Objective priorities for process improvement are established.

� Specified changes are assessed against product quality and process performance requirements and goals.

� Impact of changes to other defined and standard processes is considered.

GP 5.2.2 Manage the implementation of agreed changes to selected areas of the defined and standard process according to the implementation strategy.

• A mechanism is established for incorporating accepted changes into the defined and standard process(es) effectively and completely.

• The factors that impact the effectiveness and full deployment of the process change are identified and managed, such as:

• Economic factors (productivity, profit, growth, efficiency, quality, competition, resources, and capacity);

• Human factors (job satisfaction, motivation, morale, conflict/cohesion, goal consensus, participation, training, span of control);

• Management factors (skills, commitment, leadership, knowledge, ability, organizational culture and risks);


• Technology factors (sophistication of system, technical expertise, development methodology, need of new technologies).

• Training is provided to users of the process.

• Process changes are effectively communicated to all affected parties.

• Records of the change implementation are maintained.

GP 5.2.3 Evaluate the effectiveness of process change on the basis of actual performance against process performance and capability objectives and business goals.

� Performance and capability of the changed process are measured and compared with historical data.

� A mechanism is available for documenting and reporting analysis results to management and owners of standard and defined process.

� Measures are analysed to determine whether results are due to common or special causes.

� Other feedback is recorded, such as opportunities for further improvement of the standard process.

7.5.2.2 Generic Resources for PA 5.2

� Change management system; [PA 5.2 Achievement a, b, c]

� Process evaluation system (impact analysis, etc.). [PA 5.2 Achievement a, c]


8 Annex A: Conformity of the Process Assessment Model

Introduction

For ease of reference, the requirements from Clause 6.3 of ISO/IEC 15504-2 are embedded verbatim in the text.

The Process Assessment Model is constructed to be an elaboration of the TestSPICE Process Reference Model, and the full set of the measurement framework from ISO/IEC 15504-2 that is elaborated in ISO/IEC 15504-5.

Requirements for Process Assessment Models (from ISO/IEC 15504-2)

Introduction

In order to assure that assessment results are translatable into an ISO/IEC 15504 process profile in a repeatable and reliable manner, Process Assessment Models shall adhere to certain requirements. A Process Assessment Model shall contain a definition of its purpose, scope and elements; its mapping to the Measurement Framework and specified Process Reference Model(s); and a mechanism for consistent expression of results.

A Process Assessment Model is considered suitable for the purpose of assessing process capability by conforming to 6.3.2, 6.3.3, and 6.3.4. [ISO/IEC 15504-2, 6.3.1]

The purpose of this Process Assessment Model is to support assessment of process capability in the Test domain in accordance with the requirements of ISO/IEC 15504-2.

Process Assessment Model scope

1) A Process Assessment Model shall relate to at least one process from the specified Process Reference Model(s).

2) A Process Assessment Model shall address, for a given process, all, or a continuous subset, of the levels (starting at level 1) of the Measurement Framework for process capability for each of the processes within its scope. Note: It would be permissible for a model, for example, to address solely level 1, or to address levels 1, 2 and 3, but it would not be permissible to address levels 2 and 3 without level 1.

3) A Process Assessment Model shall declare its scope of coverage in the terms of: a) the selected Process Reference Model(s); b) the selected processes taken from the Process Reference Model(s); c) the capability levels selected from the Measurement Framework. [ISO/IEC 15504-2, 6.3.2]
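
Note (illustrative only): requirement 2) above means that the declared capability levels must form a continuous range starting at level 1. A minimal sketch of such a check, with an invented function name:

# Illustrative sketch: a declared capability-level scope is valid only if it is
# a continuous subset of the Measurement Framework levels starting at level 1
# (e.g. [1] or [1, 2, 3] are valid; [2, 3] is not).
def is_valid_level_scope(levels):
    levels = sorted(set(levels))
    return bool(levels) and levels == list(range(1, levels[-1] + 1))

assert is_valid_level_scope([1])
assert is_valid_level_scope([1, 2, 3])
assert not is_valid_level_scope([2, 3])   # missing level 1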

The Process Assessment Model is based upon the TestSPICE Process Reference Model.

A statement of compliance of the TestSPICE Process Reference Model is provided in the TestSPICE Process Reference Model document. In the process dimension of the Process Assessment Model, the model provides coverage of all the processes in the TestSPICE Process Reference Model.

In the capability dimension of this Process Assessment Model, the model addresses all of the capability levels defined in the Measurement Framework in ISO/IEC 15504-2.

Process Assessment Model elements and indicators

A Process Assessment Model shall be based on a set of indicators that explicitly addresses the purposes and outcomes, as defined in the selected Process Reference Model, of all the processes within the scope of the Process Assessment Model; and that demonstrates the achievement of the process attributes within the capability level scope of the Process Assessment Model. The indicators focus attention on the implementation of the processes in the scope of the model. [ISO/IEC 15504-2, 6.3.3]


The Process Assessment Model provides a two-dimensional view of process capability for the processes in the Process Reference Model, through the inclusion of assessment indicators as shown in Figure 5.

The Assessment Indicators used are:

� base practices and work products and

� generic practices and generic resources

They support the judgment of the performance and capability of an implemented process.

Mapping Process Assessment Models to Process Reference Models

A Process Assessment Model shall provide an explicit mapping from the relevant elements of the model to the processes of the selected Process Reference Model and to the relevant process attributes of the Measurement Framework. The mapping shall be complete, clear and unambiguous. The mapping of the indicators within the Process Assessment Model shall be to:

a) the purposes and outcomes of the processes in the specified Process Reference Model;

b) the process attributes (including all of the results of achievements listed for each process attribute) in the Measurement Framework.

This enables Process Assessment Models that are structurally different to be related to the same Process Reference Model.

[ISO/IEC 15504-2, 6.3.4]

Each of the Processes in this Process Assessment Model is identical in scope to the Process defined in the Process Reference Model. Each Base Practice and Work Product is cross-referenced to the Process Outcomes it addresses.

All Work Products relate as Outputs to the Process as a whole. Each of the Process Attributes in this Process Assessment Model is identical to the Process Attribute defined in the Measurement Framework in ISO/IEC 15504-2. The Generic Practices address the characteristics from each Process Attribute. The Generic Resources relate to the Process Attribute as a whole.

Table A.1 lists the mappings of the GPs to the achievements associated with each Process Attribute.

Table A. 1 – Mapping of Generic Practices

GP Practice Name Maps To

PA 1.1: Process performance attribute

GP 1.1.1 Achieve the process outcomes. PA.1.1.a

PA 2.1: Performance management attribute

GP 2.1.1 Identify the objectives for the performance of the process. PA.2.1.a

GP 2.1.2 Plan and monitor the performance of the process to fulfil the identified objectives. PA.2.1.b

GP 2.1.3 Control the performance of the process. PA.2.1.c

GP 2.1.4 Define responsibilities and authorities for performing the process. PA.2.1.d

GP 2.1.5 Identify and make available resources to perform the process according to plan. PA.2.1.e

GP 2.1.6 Manage the interfaces between involved parties. PA.2.1.f

PA 2.2: Work product management attribute


GP 2.2.1 Define the requirements for the work products. PA.2.2.a

GP 2.2.2 Define the requirements for documentation and control of the work products. PA.2.2.b

GP 2.2.3 Identify, document and control the work products. PA.2.2.c

GP 2.2.4 Review and adjust work products to meet the defined requirements. PA.2.2.d

PA 3.1: Process definition attribute

GP 3.1.1 Define the standard process that will support the deployment of the defined process. PA.3.1.a

GP 3.1.2 Determine the sequence and interaction between processes so that they work as an integrated system of processes. PA.3.1.b

GP 3.1.3 Identify the roles and competencies for performing the process. PA.3.1.c

GP 3.1.4 Identify the required infrastructure and work environment for performing the process. PA.3.1.d

GP 3.1.5 Determine suitable methods to monitor the effectiveness and suitability of the process. PA.3.1.e

PA 3.2: Process deployment attribute

GP 3.2.1 Deploy a defined process that satisfies the context specific requirements of the use of the standard process. PA.3.2.a

GP 3.2.2 Assign and communicate roles, responsibilities and authorities for performing the defined process. PA.3.2.b

GP 3.2.3 Ensure necessary competencies for performing the defined process. PA.3.2.c

GP 3.2.4 Provide resources and information to support the performance of the defined process. PA.3.2.d

GP 3.2.5 Provide process infrastructure to support the performance of the defined process. PA.3.2.e

GP 3.2.6 Collect and analyse data about performance of the process to demonstrate its suitability and effectiveness. PA.3.2.f

PA 4.1 Process measurement attribute

GP 4.1.1 Identify process information needs, in relation with business goals. PA.4.1.a

GP 4.1.2 Derive process measurement objectives from process information needs. PA.4.1.b


GP 4.1.3 Establish quantitative objectives for the performance of the defined process, according to the alignment of the process with the business goals. PA.4.1.c

GP 4.1.4 Identify product and process measures that support the achievement of the quantitative objectives for process performance. PA.4.1.d

GP 4.1.5 Collect product and process measurement results through performing the defined process. PA.4.1.e

GP 4.1.6 Use the results of the defined measurement to monitor and verify the achievement of the process performance objectives. PA.4.1.f

PA 4.2 Process control attribute

GP 4.2.1 Determine analysis and control techniques, appropriate to control the process performance. PA.4.2.a

GP 4.2.2 Define parameters suitable to control the process performance. PA.4.2.b

GP 4.2.3 Analyse process and product measurement results to identify variations in process performance. PA.4.2.c

GP 4.2.4 Identify and implement corrective actions to address assignable causes. PA.4.2.d

GP 4.2.5 Re-establish control limits following corrective action. PA.4.2.e

PA 5.1 Process innovation attribute

GP 5.1.1 Define the process improvement objectives for the process that support the relevant business goals. PA.5.1.a

GP 5.1.2 Analyse measurement data of the process to identify real and potential variations in the process performance. PA.5.1.b

GP 5.1.3 Identify improvement opportunities of the process based on innovation and best practices. PA.5.1.c

GP 5.1.4 Derive improvement opportunities from new technologies and process concepts. PA.5.1.d

GP 5.1.5 Define an implementation strategy based on long-term improvement vision and objectives. PA.5.1.e

PA 5.2 Process optimization attribute

GP 5.2.1 Assess the impact of each proposed change against the objectives of the defined and standard process. PA.5.2.a

GP 5.2.2 Manage the implementation of agreed changes according to the implementation strategy. PA.5.2.b


GP 5.2.3 Evaluate the effectiveness of process change on the basis of actual performance against process objectives and business goals. PA.5.2.c
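
Note (illustrative only, not part of Table A.1): for tooling purposes the mapping above can be held in a simple data structure and checked for completeness. The dictionary below shows only the PA 5.2 excerpt; the structure and the check are assumptions for illustration.

# Illustrative sketch: an excerpt of the Table A.1 mapping held as a dictionary,
# with a check that every achievement of a process attribute is addressed by
# at least one generic practice.
GP_TO_ACHIEVEMENT = {
    "GP 5.2.1": "PA.5.2.a",
    "GP 5.2.2": "PA.5.2.b",
    "GP 5.2.3": "PA.5.2.c",
}

PA_ACHIEVEMENTS = {"PA 5.2": ["PA.5.2.a", "PA.5.2.b", "PA.5.2.c"]}

def uncovered_achievements(attribute):
    covered = set(GP_TO_ACHIEVEMENT.values())
    return [a for a in PA_ACHIEVEMENTS[attribute] if a not in covered]

print(uncovered_achievements("PA 5.2"))  # -> [] : all achievements are mapped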

Expression of assessment results

A Process Assessment Model shall provide a formal and verifiable mechanism for representing the results of an assessment as a set of process attribute ratings for each process selected from the specified Process Reference Model(s).

Note: The expression of results may involve a direct translation of Process Assessment Model ratings into a process profile as defined in this international standard, or the conversion of the data collected during the assessment (with the possible inclusion of additional information) through further judgment on the part of the assessor.

[ISO/IEC 15504-2, 6.3.5]

The processes in this Process Assessment Model are identical to those defined in the TestSPICE Process Reference Model. The Process Attributes and the Process Attribute Ratings in this Process Assessment Model are identical to those defined in the Measurement Framework in ISO/IEC 15504-2. As a consequence, results of assessments based upon this Process Assessment Model are expressed directly as a set of process attribute ratings for each process within the scope of the assessment. No form of translation or conversion is required.
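
Note (illustrative only): a process profile produced under this model is simply a set of process attribute ratings per assessed process, using the N/P/L/F rating scale of ISO/IEC 15504-2. The sketch below uses invented process names and ratings.

# Illustrative sketch of a process profile: one set of process attribute
# ratings (N = not, P = partially, L = largely, F = fully achieved) per process.
process_profile = {
    "Example test process A": {"PA 1.1": "F", "PA 2.1": "L", "PA 2.2": "P"},
    "Example test process B": {"PA 1.1": "L", "PA 2.1": "P", "PA 2.2": "N"},
}

for process, ratings in process_profile.items():
    print(process, ratings)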

9 Annex B: Work product characteristics

Work product characteristics listed in this Annex can be used when reviewing potential outputs of process implementation. The characteristics are provided as guidance for the attributes to look for, in a particular sample work product, to provide objective evidence supporting the assessment of a particular process. Work products are defined using the schema in Table B.1.

Table B. 1 – Work product identification

Work product identifier # An identifier number for the work product which is used to reference the work product.

Work product name Provides an example of a typical name associated with the work product characteristics. This name is provided as an identifier of the type of work product the practice or process might produce. Organizations may call these work products by different names. The name of the work product in the organization is not significant. Similarly, organizations may have several equivalent work products which contain the characteristics defined in one work product type. The formats for the work products can vary. It is up to the assessor and the organizational unit coordinator to map the actual work products produced in their organization to the examples given here.

Work product characteristics Provides examples of the potential characteristics associated with the work product types. The assessor may look for these in the samples provided by the organizational unit.
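
Note (illustrative only): the Table B.1 schema can be represented as a simple record when mapping sampled work products to catalogue entries during an assessment; the type and field names below are assumptions, not defined by the PAM.

# Illustrative sketch of the Table B.1 work product identification schema.
from dataclasses import dataclass

@dataclass
class WorkProductType:
    identifier: str        # e.g. "01-05"
    name: str              # e.g. "Test cases"
    characteristics: list  # attributes the assessor may look for in samples

wp = WorkProductType("01-05", "Test cases",
                     ["set of input values", "execution preconditions",
                      "expected results", "execution post conditions"])
print(wp.identifier, wp.name)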

Work Products (with the ID nn-00) are sets of characteristics that would be expected to be evident in work products of generic types as a result of achievement of an attribute. The generic work products form the basis for the classification of specific work products defined as process performance indicators.


Specific work product types are typically created by process owners and applied by process deployers in order to satisfy an outcome of a particular process purpose.

Note: The work product catalogue is subject to further review and revision. Currently the catalogue is based on ISO/IEC 15504 Part 5 enriched with typical test artefacts. Further revisions might lead to:

� Deletion of work products

� Deletion of attributes of work products

� Deletion of Notes

� Change of attributes of work products

� Merging of work products

� Insertion of work products

� Insertion of attributes of work products

� Insertion of Notes

Note: Where the catalogue contains gaps in the numbering compared to the generic catalogue, work products were intentionally deleted from the catalogue because they are not used in the PAM.

The catalogue contains generic and specific work products.

WP ID WP Name WP Characteristics

01-00 Configuration item � Item which is maintained under configuration control

� may include modules, subsystems, libraries, test cases, compilers, data, documentation, physical media, and external interfaces

� Version identification is maintained

� Description of the item is available including:

� type of item

� associated configuration management library, file, system

� responsible owner

� date when placed under configuration control

� status information (i.e., development, baselined, released)

� relationship to lower level configured items

� identifies the change control records

� identifies change history

� relationship to previous versions and/or baselines (for recovery, if necessary)

� approval status information (i.e., development, baselined, released)

� revision status information (i.e., checked in, checked out, read only)


02-00 Contract � Defines what is to be purchased or delivered

� Identifies time frame for delivery or contracted service dates

� Identifies any statutory requirements

� Identifies monetary considerations

� Identifies any warranty information

� Identifies any copyright and licensing information (patent, copyright, confidentiality, proprietary, usage, ownership, warranty and licensing rights associated with all relevant work products)

� Identifies any customer service requirements

� Identifies service level requirements

� References to any performance and quality expectations / constraints / monitoring

� Standards and procedures to be used

� Evidence of review and approval by authorized signatories

� As appropriate to the contract the following are considered:

� references to any acceptance criteria

� references to any special customer needs (i.e., confidentiality requirements, security, hardware, etc.)

� references to any change management and problem resolution procedures

� identifies any interfaces to independent agents and subcontractors

� identifies customer’s role in the development and maintenance process

� identifies resources to be provided by the customer

03-00 Data � Result of applying a measure

� Available to those who need to know within defined timeframe

04-00 Design � Describes the overall product / system structure

� Identifies the required product / system elements

� Identifies the relationship between elements

� Consideration is given to:

� any required performance characteristics

� any required interfaces

� any required security characteristics


05-00 Goals � Identifies:

� the objective to be achieved

� who is expected to achieve the goal

� any incremental supporting goals

� any conditions / constraints

� the timeframe for achievement

� Are reasonable and achievable within the resources allocated

� Are current, established for current project, organization

� Are optimized to support known performance criteria and plans

06-00 User documentation � Identifies:

� external documents

� internal documents

� current site distribution and maintenance list maintained

� Documentation kept synchronized with latest product release

� Addresses technical issues

07-00 Measure � Quantitative or qualitative attribute for a product or process

� Defines the method for collecting data

� Understood by those expected to use them

� Provides value to the organization / project

� References any relevant goals

� Non-disruptive to the work flow

� Appropriate to the process, life cycle model, organization

� Has appropriate analysis and commentary to allow meaningful interpretation by Users


08-00 Plan

(As appropriate to the application and purpose)

� Identifies the plan owner

� Includes:

� the objective and scope of what is to be accomplished

� assumptions made

� constraints

� risks

� tasks to be accomplished

� schedules, milestones and target dates

� critical dependencies

� maintenance disposition for the plan

� Method / approach to accomplish plan

� Identifies:

� task ownership, including tasks performed by other parties (e.g. supplier, customer)

� quality criteria

� required work products

� Includes resources to accomplish plan objectives:

� time

� staff (key roles and authorities e.g. sponsor)

� materials / equipment

� budget

� Includes contingency plan for non-completed tasks

� Plan is approved

09-00 Policy � Authorized

� Available to all personnel impacted by the policy

� Establishes practices / rules to be adhered to


10-00 Process description � A detailed description of the process / procedure which includes:

� tailoring of the standard process (if applicable)

� purpose of the process

� outcomes of the process

� task and activities to be performed and ordering of tasks

� critical dependencies between task activities

� expected time required to execute task

� input / output work products

� links between input and output work products

� Identifies:

� process entry and exit criteria

� internal and external interfaces to the process

� process measures

� quality expectations

� functional roles and responsibilities

� Approved by authorized personnel

11-00 Product � Is a result / deliverable of the execution of a process, includes services, systems (software and hardware) and processed materials

� Has elements that satisfy one or more aspects of a process purpose

� May be represented on various media (tangible and intangible)

12-00 Proposal � Defines the proposed solution

� Defines the proposed schedule

� Identifies the coverage identification of initial proposal:

� the requirements that would be satisfied

� the requirements that could not be satisfied, and provides a justification of variants

� Identifies conditions (e.g. time, location) that affect the validity of the proposal

� Identifies obligations of the acquirer and the consequences of these not being met

� Defines the estimated price of proposed development, product, or service

13-00 Record � Work product stating results achieved or providing evidence of activities performed in a process

� An item that is part of a set of identifiable and retrievable data


14-00 Register � A register is a compilation of data or information captured in a defined sequence to enable:

� an overall view of evidence of activities that have taken place

� monitoring and analyses

� provides evidence of performance of a process over time

15-00 Report � A work product describing a situation that:

� includes results and status

� identifies applicable / associated information

� identifies considerations / constraints

� provides evidence / verification

16-00 Repository � Repository for components

� Storage and retrieval capabilities

� Ability to browse content

� Listing of contents with description of attributes

� Sharing and transfer of components between affected groups

� Effective controls over access

� Maintain component descriptions

� Recovery of archive versions of components

� Ability to report component status

� Changes to components are tracked to change / user requests

17-00 Requirement specification

� Each requirement is identified

� Each requirement is unique

� Each requirement is verifiable or can be assessed

� Includes statutory and regulatory requirements

� Includes issues / requirements from (contract) review

18-00 Standard � Identifies who / what they apply to

� Expectations for conformance are identified

� Conformance to requirements can be demonstrated

� Provisions for tailoring or exception to the requirements are included

19-00 Strategy � Identifies what needs and objectives or goals there are to be satisfied

� Establishes the options and approach for satisfying the needs, objectives, or goals

� Establishes the evaluation criteria against which the strategic options are evaluated

� Identifies any constraints / risks and how these will be addressed


20-00 Template � Defines the attributes associated with a work product to be created as a consequence of a process execution

� Identifies technical elements typically associated with this product type

� Defines expected form and style

21-00 Work product � Defines the attributes associated with an artefact from a process execution:

� key elements to be represented in the work product

B.2 Generic and specific work products

Specific work product types are typically created by process owners and applied by process deployers in order to satisfy an outcome of a particular process purpose.

Note: Generic work product types are included in the list for completeness.

WP ID WP Name WP Characteristics

01-00 Configuration item � Item which is maintained under configuration control:

� may include modules, subsystems, libraries, test cases, compilers, data, documentation, physical media, and external interfaces

� Version identification is maintained

� Description of the item is available including:

� type of item

� associated configuration management library, file, system

� responsible owner

� date when placed under configuration control

� status information (i.e., development, baselined, released)

� relationship to lower level configured items

� identifies the change control records

� identifies change history

� relationship to previous versions and/or baselines (for recovery, if necessary)

� approval status information (i.e., development, baselined, released)

� revision status information (i.e., checked in, checked out, read only)

01-01 Product configuration � Overview of the system's configuration

� Defines:

� each element and their position in the architecture of the system

� the key system interfaces

� any network considerations

� the hardware configuration


� any system performance / parameter settings

01-02 Reusable object � Developed to be:

� reliable

� data encapsulated

� An asset and elementary component

� Designed for interoperability

� Versions traceable to point of usage

� Contains status and classification

� Modification controlled

� Modifications are downward compatible

� Specification for usage defined

� Specification for tailoring defined

01-03 Software item � Integrated software consisting of:

� source code

� software elements

� executable code

� configuration files

� Documentation, which describes:

� and identifies source code

� and identifies software elements

� and identifies configuration files

� and identifies executable code

� software life-cycle status

� archive and release criteria

� compilation of software units

� building of software item

01-04 Knowledge item � Explicit piece of experience:

� documented for sharing

� controlled and maintained

01-05 Test cases � Consisting of

� A set of input values

� execution preconditions

� expected results

� execution post conditions,

� developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement
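
Note (illustrative only): the characteristics of work product 01-05 can be captured in a simple structure such as the one sketched below; the field names and example values are assumptions, not prescribed by the PAM.

# Illustrative sketch: a test case with input values, execution preconditions,
# expected results, execution post conditions and the objective it was
# developed for (cf. work product 01-05).
from dataclasses import dataclass, field

@dataclass
class TestCase:
    identifier: str
    objective: str                          # e.g. requirement or program path to exercise
    inputs: dict = field(default_factory=dict)
    preconditions: list = field(default_factory=list)
    expected_results: list = field(default_factory=list)
    postconditions: list = field(default_factory=list)

tc = TestCase(
    identifier="TC-001",
    objective="Verify compliance with an illustrative requirement REQ-42",
    inputs={"amount": 100},
    preconditions=["user account exists"],
    expected_results=["transfer accepted"],
    postconditions=["audit record written"],
)
print(tc.identifier, tc.objective)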

01-06 Test configuration � Overview of the test configuration


� Defines each element and their position in the architecture of the system

� Defines the key system interfaces used in test

� Defines any network considerations

� Defines the hardware configuration

� Defines any system performance / parameter settings

01-07 Test environment component

� Every component that can be used to assemble a test environment

01-08 Test script Postponed to Version 4.0

02-00 Contract � Defines what is to be purchased or delivered

� Identifies time frame for delivery or contracted service dates

� Identifies any statutory requirements

� Identifies monetary considerations

� Identifies any warranty information

� Identifies any copyright and licensing information (patent, copyright, confidentiality, proprietary, usage, ownership, warranty and licensing rights associated with all relevant work products)

� Identifies any customer service requirements

� Identifies service level requirements

� References to any performance and quality expectations / constraints / monitoring

� Standards and procedures to be used

� Evidence of review and approval by authorised signatories

� As appropriate to the contract the following are considered references:

� to any acceptance criteria

� to any special customer needs (i.e., confidentiality requirements, security, hardware, etc.)

� to any change management and problem resolution procedures

� identifies any interfaces to independent agents and subcontractors

� identifies customer's role in the development and maintenance process

� identifies resources to be provided by the customer

02-01 Commitment / agreement � Signed off by all parties involved in the commitment / agreement

� Establishes what the commitment is for

� Establishes the resources required to fulfil the commitment, such as:


� time

� people

� budget

� equipment

� facilities

� Demonstrates sponsorship and acceptance of the resulting change

02-02 Operational level agreement (OLA)

� Contract which defines subcontracted support services

Note: In the context of TestSPICE “support services” typically addresses “test services” or “operation of test environment”

02-03 Service level agreement (SLA)

� Agreement (contract) specifying the provision and support of services

Note: In the context of TestSPICE “services” typically addresses “test services” or “operation of test environment”

03-00 Data � Result of applying a measure

� Available to those who need to know within defined timeframe

03-01 Assessment data � Identifies the objective evidence gathered

� Rationale for the attribute achievement ratings

� The set of process profiles resulting from the assessment (i.e. one profile for each process assessed with attributes ratings)

� The identification of any additional information collected during the assessment that was identified in the assessment input to support process improvement or process capability determination

03-02 Asset use data � Identifies used times and dates

� Identifies the description of the asset, name of the asset or a unique identifier

03-03 Benchmarking data � Results of measurement of current performance that allow comparison against historical or target values

� Relates to key goals / process / product / market need criteria and information to be benchmarked

03-04 Customer satisfaction data

� Relates to levels of customer satisfaction with products and services

� Results of applying field measures

� Results of customer satisfaction survey

� Interview Notes

� Meeting minutes from customer meetings

03-06 Process performance data

� Appropriate to compare process performance against expected values

� May include records, such as:

� meeting minutes


� change records

� quality records

� May include data on:

� resource usage

� process adherence

� extent to which quality criteria are met

� extent to which task completion criteria are met

03-07 Test data � Set of input values that are associated with a set of test cases or test procedures

� Indicate scope of validity

� Identifies associated test cases

03-08 Tester satisfaction data � Relates to levels of tester satisfaction with products and services

� Results of applying field measures

� Results of tester satisfaction survey

� Interview Notes

� Meeting minutes from meetings with testers

03-09 Test asset use data � Identifies used times and dates

� Identifies the description of the test asset, name of the test asset or a unique identifier

03-10 Test performance data � Appropriate to compare test performance against expected values

� May include records, such as:

� meeting minutes

� change records

� quality records

� May include data on:

� resource usage

� process adherence

� extent to which quality criteria are met

� extent to which test completion criteria are met

04-00 Design � Describes the overall product / system structure

� Identifies the required product / system elements

� Identifies the relationship between elements

� Consideration is given to:

� any required performance characteristics

� any required interfaces


� any required security characteristics

04-01 Database design � Definition of design characteristics:

� database management system used

� type of system (relational, hierarchical, object oriented, networked)

� format of records, tables, objects

� database access mode

� associated software (programs, user screen formats, reports)

� supported database language

� Definition of logical and physical views, models:

� records (data layouts, fields, tables, structures)

� field names and definitions

� data definitions, classes, structure, etc.

� entity / relationships

� classes, inheritance scheme

� Definition of user views:

� screen layouts

� field access

� data access

� commands

� Input / output interface considerations

� Database usage information (contents, application systems, usage restrictions, etc.)

� Identifies constraints:

� security considerations

� data access considerations

� back-up and recovery considerations

� system restart considerations

� system generations considerations

� performance considerations

04-04 High level software design

� Describes the overall software structure

� Identifies the required software elements

� Identifies the relationship between software elements

� Consideration is given to:

� any required software performance characteristics

� any required software interfaces

� any required security characteristics


� any database design requirements

� any required error handling and recovery attributes

04-05 Low level software design

� Provides detailed design (could be represented as a prototype, flow chart, entity relationship diagram, pseudo code, etc.)

� Provides format of input / output data

� Provides specification of data storage needs

� Establishes required data naming conventions

� Defines the format of required data structures

� Defines the data fields and purpose of each required data element

� Provides the specifications of the program structure

04-06 System architecture design

� Provides an overview of all system design

� Describes the interrelationship between system elements

� Describes the relationship between the system elements and the software

� Specifies the design for each required system element, consideration is given to things like:

� memory / capacity requirements

� hardware interfaces requirements

� user interfaces requirements

� external system interface requirements

� performance requirements

� commands structures

� security / data protection characteristics

� system parameter settings

� manual operations

� reusable components

� Mapping of requirements to system elements

04-07 Organizational structure � Describes an organization:

� structure

� roles

� responsibilities

04-08 Architectural model � Provides an overview of all system design

� Describes the interrelationship between system elements

� Describes the relationship between the system elements and the software

� Specifies the design for each required system element

04-09 Role model � Provides an overview of all positions in the test team


� Consists of detailed job descriptions for each role in the team

04-10 Job description � Provides an overview of the tasks and responsibilities of a defined role in the test team

� Defines the scope of work for each role

04-11 Test environment design � Provides the design of the test environment

Postponed to Version 4.0

04-12 Test environment test design

� Provides the design of the test of the test environment

Postponed to Version 4.0

04-13 Test automation architecture design

Postponed to Version 4.0

04-14 Test automation design Postponed to Version 4.0

05-00 Goals � Identifies the objective to be achieved

� Identifies who is expected to achieve the goal

� Identifies any incremental supporting goals

� Identifies any conditions / constraints

� Identifies the timeframe for achievement

� Are reasonable and achievable within the resources allocated

� Are current, established for current project, organization

� Are optimized to support known performance criteria and plans

05-01 Assessment goals � No characteristics additional to Goals (Generic)

05-02 Business goals � Contains a description of the goal

� Identifies a requirement specification for the business need

� Identifies association and interfaces to other goals

� Identifies the level of degree of the need and effect on the business of not having that need.

05-03 Core values statement � Defines the values that govern the relationships between internal and external stakeholders

� Is authorized at the highest level

05-04 Mission statement � Identifies the reasons for the existence of the enterprise

� Informs the development of the core values and vision statement

� Is authorised at the highest level

05-05 Vision statement � Identifies the main objectives to be achieved

� Provides information on the overall strategy for the organizational unit, organization, or business

� Is authorized at the highest level

05-06 Quality goals � Establishes goals related with:

� project / process effectiveness,


� customer satisfaction

� product quality

� people satisfaction

05-07 Test Goals � Establishes goals traceable to business needs and objectives:

� validate products for 'fit-for-use'

� prevent defects from occurring in operation

� verify compliance to external standards

� provide visibility regarding product quality

� short test execution lead-time

06-00 User documentation � Identifies:

� external documents

� internal documents

� current site distribution and maintenance list maintained

� Documentation kept synchronized with latest product release

� Addresses technical issues

06-01 Customer manual � Takes account of:

� audience and task profiles

� the environment in which the information will be used

� convenience to users

� the range of technical facilities, including resources and the product, available for developing and delivering on-screen documentation

� information characteristics

� cost of delivery and maintainability

� Includes information needed for operation of the system, including but not limited to:

� product and version information

� instructions for handling the system

� initial familiarisation information

� non-trivial examples of the use

� structured reference material, particularly for advanced features of the software

� checklists

� guides to use input devices

06-02 Handling and storage guide

� Defines the tasks to perform in handling and storing products including:

� providing for master copies of code and documentation

� disaster recovery


� addressing appropriate critical safety and security issues

� Provides a description of how to store the product including:

� storage environment required

� the protection media to use

� packing materials required

� what items need to be stored

� assessments to be done on stored product

� Provides retrieval instructions

06-03 Installation guide � Tasks for loading / installing the product, sequentially ordered by execution requirements:

� downloading of software from delivery files

� uploading of the appropriate software to files, folders, libraries, etc.

� partial or upgrade installation instructions, where applicable

� initialization procedures

� conversion procedures

� customization / configuration procedures

� verification procedures

� bring-up procedures

� operations instructions

� Installation requirements identified:

� associated hardware, software, customer documentation

� conversion programs and instructions

� initialization programs, system generation information

� components and descriptions

� minimum configuration of hardware / software required

� backup / recovery instructions

� validation programs

� configuration parameters (e.g. size requirements, memory)

� Customer / technical support contacts

� Troubleshooting guide

� Rollback plan

06-04 Training material � Updated and available for new releases

� Coverage of system, application, operations, maintenance as appropriate to the application

� Courses listings and availability

06-05 Product operation guide � Criteria for operational use


� Provides a description of how to operate the product including:

� operational environment required

� supporting tools and material (e.g. user manuals) required

� possible safety warnings

� start-up preparations and sequence

� frequently asked questions (FAQ)

� sources of further information and help to operate the product

� Certification and safety approvals

� Warranty and replacement instructions

06-06 Handbook � Includes information needed for operation of the test system, including but not limited to:

� product and version information

� instructions for handling the system

� initial familiarisation information

� non-trivial examples of the use

� structured reference material, particularly for advanced features of the software

� checklists

� guides to use input devices

� guides to implement the test services

06-07 System documentation � Consists of all documents relating the system and the interfaces

� handbooks

� design documents

� interface documents

06-08 Training documentation � Includes descriptions of the used test tools

� Includes examples of using the tools

� Includes sample configurations

06-09 Test environment operation manual

� Consists of all documents relating to the test environment and its interfaces

� handbooks

� design documents

� interface documents

06-10 Test object documentation

� Describes the content and the implementation of a test object

06-11 Test automation maintenance handbook

Postponed to Version 4.0


07-00 Measure � Quantitative or qualitative attribute for a product or process

� Defines the method for collecting data

� Understood by those expected to use them

� Provides value to the organization / project

� References any relevant goals

� Non-disruptive to the work flow

� Appropriate to the process, life cycle model, organization

� Has appropriate analysis and commentary to allow meaningful interpretation by users

07-01 Customer satisfaction survey

� Mechanism to collect data on customer satisfaction:

� Identifies customers to be contacted

� Identifies the data to be collected from the customer

� Target date for responses

� Identifies products/services under investigation

� Methods to analyse feedback

07-02 Field measure � Identifies attributes of system's operation at field locations, such as:

� field defects

� performance against defined service level measures

� system ability to meet defined customer requirements

� support time required

� user complaints (may be third party users)

� customer requests for help

� performance trends

� problem reports

� enhancements requested

07-04 Process measure � Includes measures related to the performance of a process, such as:

� size and number of work products produced

� adherence to the process

� time needed to perform process

� effort needed to perform process

� number of defects related to the process

� Measures the impact of process change

� Measures the efficiency of the process

07-05 Project measure � Appropriate to monitor key processes and critical tasks of a project

� Includes measures related to the project on:

� project performance against established plan


� resource utilization against established plan

� time schedule against established plan

� process quality against quality expectations and/or criteria

� product quality against quality expectations and/or criteria

� highlight product performance problems, trends

� amount of work scheduled

� actual cost against tasks completed

07-06 Quality measure � Measures quality attributes of the work products defined, such as:

� functionality

� reliability

� usability

� efficiency

� maintainability

� portability

� Measures the "end customer" product quality and reliability

Note: Refer to ISO/IEC 9126 for detailed information on measurement of product quality.

07-07 Risk measure � Identifies the probability of risk occurring

� Identifies the impact of risk occurring

� Identifies the change in the risk state

07-08 Service level measure � Real time measures taken while a system is operational, it measures the system's performance or expected service level

� Identifies things like:

� capacity

� throughput

� operational performance

� operational service

� service outage time

� up time

� job run time

07-09 Tester satisfaction survey � Mechanism to collect data on tester satisfaction:

� Identifies testers to be contacted

� Identifies the data to be collected from the tester

� Target date for responses

� Identifies products/services under investigation

� Methods to analyse feedback


07-10 Activity-based measures � Provides an activity-based overview of the test progress by identifying the following attributes:

� percentage of work done in test planning

� percentage of work done in test case preparation

� percentage of work done in preparing the test environment

07-11 Test case-based measures

� Provides a test case-based overview of the test progress by identifying the following attributes:

� number of test cases run/not run

� test cases passed/failed

07-12 Error-based measures � Provides an error-based overview of the test progress by identifying the following attributes:

� defect density

� defects found and fixed

� failure rate

� retest results
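
Note (illustrative only): two of the error-based measures listed above are often calculated as sketched below; the formulas are commonly used conventions and are not mandated by the PAM.

# Illustrative sketch: defect density and failure rate.
def defect_density(defects_found, size_kloc):
    """Defects per thousand lines of code (or any other agreed size unit)."""
    return defects_found / size_kloc

def failure_rate(failures, execution_hours):
    """Failures per hour of test execution."""
    return failures / execution_hours

print(defect_density(18, 12.0))   # -> 1.5 defects per KLOC
print(failure_rate(3, 40.0))      # -> 0.075 failures per hour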

07-13 Test object-based measures

� Provides a test object-based overview of the test progress by identifying the following attributes:

� test coverage of requirements, risks or code

07-14 Cost-based measures � Provides a cost-based overview of the test progress by identifying the following attributes:

� test costs, including the cost compared to the benefit of finding the next defect or of running the next test

08-00 Plan � Identifies the plan owner

� Includes:

� the objective and scope of what is to be accomplished

� assumptions made

� constraints

� risks

� tasks to be accomplished

� schedules, milestones and target dates

� critical dependencies

� maintenance disposition for the plan

� Method / approach to accomplish plan

� Identifies:

� task ownership, including tasks performed by other parties (e.g. supplier, customer)

� quality criteria

� required work products

� Includes resources to accomplish plan objectives:


� time

� staff (key roles and authorities e.g. sponsor)

� materials / equipment

� budget

� Includes contingency plan for non-completed tasks

� Plan is approved

08-01 Acceptance test plan � Identified activities to be performed to test "deliverable" end customer product

� Identifies who has responsibility for performance of acceptance test activities (supplier or customer)

� Identifies the system configuration requirements for site

� Identifies the installation requirements for site

� Identifies how to validate installation activities were performed correctly

� Identifies how to validate that the deliverables (hardware / software / product) satisfied the customer requirements

� Identifies associated test scripts / test cases

� Identifies actions to be taken upon acceptance of the product

� Refers to Quality plan

08-02 Acquisition plan � Identifies what needs to be acquired

� Establishes the approach for acquiring the product or service; options might include:

� off-the-shelf

� develop internally

� develop through contract

� enhance existing product or combination of these

� Establishes the evaluation and supplier selection criteria

� Acceptance strategy

08-03 Process assessment plan

� The identity of the sponsor of the assessment and the sponsor's relationship to the organizational unit being assessed

� The assessment purpose including alignment with business goals

� The assessment scope including:

� the processes to be investigated within the organizational unit

� the highest capability level to be investigated for each process within the assessment scope

� the organizational unit that deploys these processes

� The context which, as a minimum, includes:

� the size of the organizational unit


� the demographics of the organizational unit

� the application domain of the products or services of the organizational unit

� the size, criticality and complexity of the products or services

� the quality characteristics of the products

� The assessment constraints which may include:

� availability of key resources

� the maximum amount of time to be used for the assessment

� specific processes or organizational units to be excluded from the assessment

� the minimum, maximum or specific sample size or coverage that is desired for the assessment

� the ownership of the assessment outputs and any restrictions on their use

� controls on information resulting from a confidentiality agreement

� The identity of the model(s) used within the assessment

� The assessment approach or methodology

� The identity of the assessors, including the competent assessor with specific responsibilities for the assessment

� the criteria for competence of the assessor who is responsible for the assessment

� The identity of assesses and support staff with specific responsibilities for the assessment

� Any additional information to be collected during the assessment to support process improvement or process capability determination

08-04 Configuration management plan

� Defines or references the procedures to control changes to configuration items

� Defines measurements used to determine the status of the configuration management activities

� Defines configuration management audit criteria

� Approved by the configuration management function

� Identifies configuration library tools or mechanism

� Includes management records and status reports that show the status and history of controlled items

� Specifies the location and access mechanisms for the configuration management library

� Storage, handling and delivery (including archival and retrieval) mechanisms specified

08-05 Development environment plan � Floor plan

� Environmental safety considerations

� Regulatory requirements

� Contractual requirements

� Security considerations

� Facility configuration

� Special environmental requirements (e.g. air conditioning, raised floor, power)

� Individual workspace needs defined

� Workstations requirements

� Supporting hardware / software / product

� Tools

� Communication equipment

� Disaster recovery plan

08-06 Project activity network � A graphic illustration of a project as a network diagram showing all of the project's activities, their attributes, and the relationships between them; the most common form is the PERT chart

� Activity attributes include:

� activity name

� estimated duration

� planned and actual start date

� planned and actual completion date

� resource requirements

� The relationships between the activities may include:

� predecessor activities

� successor activities

� dependency delays
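
Note: the following is a minimal, illustrative Python sketch (not part of the PAM) of how an activity network with the attributes listed above could be represented and its earliest start dates derived from predecessor relationships and dependency delays; all names and figures are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Activity:
    # Attributes listed for WP 08-06: name, estimated duration, predecessors, delays.
    name: str
    duration: int                      # estimated duration in days
    predecessors: list = field(default_factory=list)
    delay: int = 0                     # dependency delay after predecessors finish

def earliest_start(activities):
    """Compute the earliest start day of each activity from its predecessors."""
    start = {}
    def resolve(a):
        if a.name in start:
            return start[a.name]
        preds = [resolve(p) + p.duration for p in a.predecessors]
        start[a.name] = (max(preds) if preds else 0) + a.delay
        return start[a.name]
    for a in activities:
        resolve(a)
    return start

design = Activity("design", duration=5)
build = Activity("build", duration=10, predecessors=[design])
test = Activity("test", duration=4, predecessors=[build], delay=1)
print(earliest_start([design, build, test]))   # {'design': 0, 'build': 5, 'test': 16}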

08-07 System integration test plan

� Purpose of integration defined:

� validation of the integrated elements of the system

� validation of the integration of the system elements (hardware, support equipment, interfaced system)

08-08 Human Resource Management plan

� Human resource objectives / goals / policies

� Satisfaction of human resource needs:

� required skills identified

� required competencies identified

� skills acquisition and retention strategy

� staff availability and project allocation

� Human resource management:

� statutory and regulatory requirements

� conditions and benefits

� organization reporting and communication structure

� staff development

� performance evaluation criteria

08-09 Installation and maintenance plan

� Identifies impacted site locations

� Identifies the required components for the installation with appropriate version information (consideration given to at least the following):

� released software

� type of media

� required maintenance fixes

� support software required (conversion programs, validation routines, associated system interfaces, data base management system)

� required customer documentation

� installation instructions

� required hardware and peripheral equipment

� Identifies supporting information or materials required:

� parameter information

� operation and maintenance information

� pre-conversion information, materials or installed equipment

� Type of installation (new vs. conversion of existing system, maintenance)

� Custody of master and backup copies

� Identifies go / no-go decision criteria

� Identifies verification process:

� of the tasks required to prepare the required deliverables

� of components required at site

� of installation procedures

� of pre-installation construction or conversion activities

� of system integration, release builds, etc.

� Identifies customer acceptance requirements

� Identifies any safety and security requirements

08-10 Software integration test plan

� Purpose of integration defined:

� validation of a subset of the system (all programs required to make a sub-system work, a feature work, etc.)

� validation of the integration of software to other system elements (hardware, support equipment, interfaced system)

08-11 Logistics maintenance plan

� Identifies:

� impacted site locations

� backup and recovery procedures

� customer contacts and technical support personnel

� customer acceptance requirements

� any safety and security requirements

08-12 Project plan � Defines:

� work products to be developed

� life cycle model and methodology to be used

� customer requirements related to project management

� tasks to be accomplished

� task ownership

� project resources

� schedules, milestones and target dates

� estimates

� quality criteria

� Identifies:

� critical dependencies

� required work products

� project risks and risk mitigation plan

� contingency actions for non-completed tasks

08-13 Quality plan � Objectives / goal for quality

� Defines the activities / tasks required to ensure quality

� References related work products

� Method of assessment / assuring quality

� References any regulatory requirements, standards, customer requirements

� Identifies the expected quality criteria

� Specifies the monitoring timeframe and quality checkpoints for the defined life cycle and associated activities planned

� Target timeframe to achieve desired quality

� Method to achieve goals:

� tasks to be performed

� ownership for tasks

� audit to be performed

� resource commitments

� Identifies the quality criteria for work products and process tasks

� Specifies the threshold / tolerance level allowed prior to requiring corrective actions

� Defines quality measurements and benchmark data

� Defines the quality record collection mechanism and timing of the collection

� Specifies mechanism to feed collected quality record back into process impacted by poor quality

� Approved by the quality responsible organization / function

08-14 Recovery plan � Identifies what is to be recovered:

� procedures / methods to perform the recovery

� schedule for recovery

� time required for the recovery

� critical dependencies

� resources required for the recovery

� list of backups maintained

� staff responsible for recovery and roles assigned

� special materials required

� required work products

� required equipment

� required documentation

� locations and storage of backups

� procedure for retrieving backup media

� contact information on who to notify about the recovery

� verification procedures

� cost estimation for recovery

08-15 Regression test plan � Plan for validating that existing systems / features-functions have not been impacted by a change

� Plan for validating that change has not impacted working elements of the system (interfaces, operations, etc.)

� Plan for validating that change is compatible with existing system requirements (downward compatible)

� Identifies the requirements for system element not changed

� Identifies what system elements are to be regression tested (i.e., features, functions, interfaces, fixes)

� Identifies the changes made

� Identifies the regression test cases to be executed

� Conditions for execution of regression testing
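
Note: a hypothetical Python sketch of the selection step implied above, mapping changed system elements to the regression test cases to be executed; the identifiers are illustrative only.

# Hypothetical mapping of system elements to the regression test cases that cover them.
coverage = {
    "login":   ["TC-001", "TC-002"],
    "billing": ["TC-010", "TC-011", "TC-012"],
    "reports": ["TC-020"],
}

def select_regression_tests(changed_elements):
    """Return the ordered, de-duplicated list of test cases to re-execute."""
    selected = []
    for element in changed_elements:
        for tc in coverage.get(element, []):
            if tc not in selected:
                selected.append(tc)
    return selected

print(select_regression_tests(["billing", "login"]))
# ['TC-010', 'TC-011', 'TC-012', 'TC-001', 'TC-002']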

08-16 Release plan � Identifies the functionality to be included in each release

� Identifies the associated elements required (i.e., hardware, software, documentation etc.)

� Mapping of the customer requests, requirements satisfied to particular releases of the product

08-17 Reuse plan � Defines the policy about what items to be reused

� Defines standards for construction of reusable objects:

� defines the attributes of reusable components

� quality / reliability expectations

� standard naming conventions

� Defines the reuse repository (library, CASE tool, file, data base, etc.)

� Identifies reusable components:

� directory of components

� description of components

� applicability of their use

� method to retrieve and use them

� restrictions for modifications and usage

� Method for using reusable components

� Establishes goal for reusable components

08-18 Review plan � Defines:

� what to be reviewed

� roles and responsibilities of reviewers

� criteria for review (check-lists, requirements, standards)

� expected preparation time

� schedule for reviews

� Identifies:

� procedures for conducting review

� review inputs and outputs

� expertise expected at each review

� review records to keep

� review measurements to keep

� resources, tools allocated to the review

08-19 Risk management plan � Project risks identified and prioritized

� Mechanism to track the risk

� Threshold criteria to identify when corrective action required

� Proposed ways to mitigate risks:

� risk mitigator

� work around

� corrective actions activities / tasks

� monitoring criteria

� mechanisms to measure risk
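
Note: a small illustrative Python sketch (hypothetical values) of the threshold mechanism described above, taking risk exposure as probability times impact and comparing it against a corrective-action threshold.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float   # 0.0 .. 1.0
    impact: int          # e.g. cost of occurrence in person-days

THRESHOLD = 5.0          # exposure above which corrective action is required

def exposure(risk):
    return risk.probability * risk.impact

risks = [Risk("late test environment", 0.4, 20), Risk("tool licence delay", 0.1, 10)]
for r in sorted(risks, key=exposure, reverse=True):          # prioritized risk list
    action = "corrective action required" if exposure(r) > THRESHOLD else "monitor"
    print(f"{r.name}: exposure={exposure(r):.1f} -> {action}")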

08-20 Risk mitigation plan � Planned risk treatment activities and tasks:

� describes the specifics of the risk treatment selected for a risk or combination of risks found to be unacceptable

� describes any difficulties that may be found in implementing the treatment

� Treatment schedule

� Treatment resources and their allocation

� Responsibilities and authority:

� describes who is responsible for ensuring that the treatment is being implemented and their authority

� Treatment control measures:

� defines the measures that will be used to evaluate the effectiveness of the risk treatment

� Treatment cost

� Interfaces among parties involved:

� describes any coordination among stakeholders or with the project's master plan that must occur for the treatment to be properly implemented

� Environment / infrastructure:

� describes any environmental or infrastructure requirements or impacts (e.g., safety or security impacts that the treatment may have)

� Risk treatment plan change procedures and history

08-21 Software test plan � Identifies strategy for verifying that features and/or functions operate as defined in the requirements

08-22 System test plan � Identifies strategy for verifying the integration of system elements as defined in the system architecture specification

� Identifies compliance criteria for system requirements

� Provides test coverage for all elements of the system:

� software

� hardware

� interfaces

� customer documentation

� installation activities

� initialization

� conversion programs

08-23 Validation test plan � Identifies the approach to performing the test

� Identifies elements to be tested

� Identifies aggregates and sequence for testing

� Identifies urgent releases

� Identifies required system configuration (software, hardware, interface elements)

� Identifies the associated development owner for elements to be tested

� Identifies associated test scripts / test cases

� Sequence ordering of how testing will be executed

� Identifies requirements which will be validated by tests (i.e., customer requirements, regulatory requirements and system requirements)

� Identifies the problem reporting mechanism

� Identifies the test tools and resources required (test channels, analysers, test emulators, etc.)

� Identifies the test schedule

� Identifies the test completion criteria

� Identifies audits to be performed

� Official source libraries and versions of hardware / software / product defined

08-24 Training plan � Defines current staff capabilities

� Defines the skills required

� Outlines means available to achieve training goals

08-25 Unit test plan � Identifies the strategy for verifying the functionality and non-functional requirements of a unit (i.e., a program, a block, a module, a routine) against the requirements and design

� Specifies how requirements will be verified

08-26 Documentation plan � Identifies documents to be produced

� Defines the documentation activities during the life cycle of the software product or service

� Identifies any applicable standards and templates

� Defines requirements for documents

� Review and authorization practices

� Document update/review/acceptance time constraints

� Distribution of the documents

� Maintenance and disposal of the documents

08-27 Problem management plan

� Defines problem resolution activities including identification, recording, description and classification

� Problem resolution approach: evaluation and correction of the problem

� Defines problem tracking

� Any timing constraints

� Mechanism to collect and distribute problem resolutions

08-28 Change management plan

� Defines change management activities including identification, recording, description, analysis and implementation

� Defines approach to track status of change requests

� Defines verification and validation activities

� Change approval and implication review

08-29 Improvement plan � Improvement objectives derived from organizational business goals

� Organizational scope

� Process scope, the processes to be improved

� Key roles and responsibilities

� Appropriate milestones, review points and reporting mechanisms

� Activities to be performed to keep all those affected by the improvement program informed of progress

08-30 Verification plan � Presents how verification activities will be conducted based on the verification strategy.

08-31 Test Plan � Identifies the approach, scope, resources and schedule of intended test activities

� Identifies the test items

� Identifies the features to be tested

� Identifies the test tasks and who will do each task

� Identifies the degree of tester independence

� Identifies the test environment

� Identifies the test design techniques

� Identifies the entry and exit criteria used and the rationale for their choice

� Identifies any risks requiring contingency planning

� Identifies milestones

� Is a record of the test planning process
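
Note: the entry and exit criteria named above can be expressed as evaluable conditions; the following Python fragment is a hypothetical sketch of such a check, not a prescribed implementation.

# Hypothetical entry/exit criteria for a test level, evaluated against measured values.
entry_criteria = {
    "test environment accepted": lambda m: m["environment_ready"],
    "test items delivered":      lambda m: m["items_delivered"],
}
exit_criteria = {
    "all planned tests executed": lambda m: m["executed"] >= m["planned"],
    "no open critical defects":   lambda m: m["open_critical_defects"] == 0,
}

def evaluate(criteria, measurements):
    """Return the names of criteria that are not (yet) fulfilled."""
    return [name for name, check in criteria.items() if not check(measurements)]

status = {"environment_ready": True, "items_delivered": True,
          "executed": 180, "planned": 200, "open_critical_defects": 1}
print("unmet exit criteria:", evaluate(exit_criteria, status))
# unmet exit criteria: ['all planned tests executed', 'no open critical defects']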

08-32 Project process plan � Defines the implemented project processes

08-33 Test infrastructure plan � Identifies the procedures, standards, tools and techniques that the test infrastructure process should support:

� possible test approaches

� standards (IEEE)

� test design techniques

� descriptions of hardware and software needs for the test execution

� use of stubs or drivers

� use of tools

� Identifies the test infrastructure requirements regarding:

� security

� throughput and data sharing requirements

� backup and recovery

� remote access facility

� physical workspace and equipment

� user support requirements

� maintenance requirements

08-34 Stakeholder plan � Identifies relevant stakeholders

� Identifies how relevant information is brought to the stakeholders

� Identifies how needed commitment is achieved

08-35 Integration plan � Describes the implementation and integration of the test infrastructure requirements in the test environment

� Ensures an effective environment that supports implementation of the test processes

08-36 Communication plan � Defines communication channels , e.g. between tester and developer, tester and project management, tester and user

� Gives a structure for regular meetings within the test team and with other involved parties

08-37 Quality assurance plan � Specifies the minimum standards of quality assurance

� Includes company quality standards

� Includes best practices

� Includes industry standards

� Includes legal standards

� Includes quality management standards

� Includes test standards

� Specifies how the standards are implemented in the organization

08-38 Test environment configuration plan

� Requirements

� Regulatory requirements

� Contractual requirements

� Security considerations

� Rooms

� Floor plan

� Environmental safety considerations

� Facility configuration

� Special environmental requirements (e.g. air conditioning, raised floor, power)

� Workplace

� Individual workspace needs defined

� Workstations requirements

� Components

� Network

� Hardware

� Software (including stubs, test drivers, test automation tools and test utilities)

� test environment operating tools (e.g. monitoring)

� Communication equipment

� User Management

� User access rights (user registration, privilege management, user password management, review of user access rights, removal of access rights)

� Misc

� Disaster recovery plan

08-40 Staffing plan � Defines the staff necessary to run a test project

08-41 Test environment disassembly plan

� Defines the disassembly and the archiving of a test environment

� Network Components

� Hardware Components

� Software Components

� Test data

� User Access Rights

� Archiving

� Reassembly

� Defines reuse or disposal of components

08-42 Test data provision plan � Defines the provision of test data

� Content

� Security requirements

� Anonymization approach

� Test data type (production, produced in environment, synthetic)

� Test data format (e.g. script, SQL, load …)

� Delivery procedure

� Constraints

� Target environments
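
Note: a minimal Python sketch of the anonymization approach mentioned above, assuming production-like records are masked before delivery to a target environment; the field names and the hashing choice are illustrative assumptions.

import hashlib

def anonymize(record, sensitive_fields=("name", "email")):
    """Replace sensitive field values with a stable, non-reversible token."""
    masked = dict(record)
    for f in sensitive_fields:
        if f in masked:
            digest = hashlib.sha256(str(masked[f]).encode()).hexdigest()[:10]
            masked[f] = f"anon-{digest}"
    return masked

production_row = {"id": 42, "name": "Jane Doe", "email": "jane@example.org", "balance": 120.5}
print(anonymize(production_row))   # id and balance unchanged, name/email masked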

09-00 Policy � Authorized

� Available to all personnel impacted by the policy

� Establishes practices / rules to be adhered to

09-01 Personnel policy � Defines recruitment policy

� Defines training policy

� Defines career opportunities for individuals in the organization

� Defines team building strategy

� Defines reward and recognition strategy

� Defines performance appraisal strategy

09-02 Quality policy � Established by the top management

� Appropriate to the organization

� Addresses product and process quality goals

� Supports the establishment and review of quality objectives

� Commitment to comply with requirements

� Commitment to improve the effectiveness of the quality management system

09-03 Reuse policy � Identifies reuse requirements

� Establishes the rules of reuse

� Documents the reuse adoption strategy including goals and objectives

� Identifies the reuse program

� Identifies the name of the reuse sponsor

� Identifies the reuse program participants

� Identifies the reuse steering function

� Identifies reuse program support functions

09-04 Supplier selection policy � Establishes practices / rules to be adhered to when evaluating and selecting subcontractors on the basis of their ability to meet subcontract requirements

� Defines the type and extent of control exercised over suppliers

� Establishes the need for, and maintenance requirements of, records associated with supplier selection

09-05 Test policy � Describes the organization's philosophy towards testing

� Includes a definition of testing

� Includes a definition of debugging (fault localization and repair)

� Includes basic viewpoints regarding testing and the testing profession

� Describes the objectives and added value of testing

� Defines quality levels to be achieved

� Defines the level of independence of the test organization

� Includes a high level test process definition

� Identifies the key responsibilities of testing

� Describes the organizational approach to and objectives of test process improvement

10-00 Process description � A detailed description of the process / procedure which includes:

� tailoring of the standard process (if applicable)

� purpose of the process

� outcomes of the process

� tasks and activities to be performed and ordering of tasks

� critical dependencies between tasks and activities

� expected time required to execute tasks

� input / output work products

� links between input and output work products

� Identifies process entry and exit criteria

� Identifies internal and external interfaces to the process

� Identifies process measures

� Identifies quality expectations

� Identifies functional roles and responsibilities

� Approved by authorised personnel

10-01 Life cycle model � High level description of activities performed at each life cycle phase

� Sequencing of the life cycle phases

� Identifies critical life cycle phase dependencies

� Identifies required inputs, outputs to each life cycle phase

� Identifies the key decision points (milestones) in the model

� Identifies the quality control points in the model

10-02 Test procedure � Identifies:

� test name

� test description

� test completion date

� Identifies potential implementation issues

� Identifies the person who completed the test procedure

� Identifies prerequisites

� Identifies procedure steps including the step number, the required action by the tester and the expected results

� Used in testing related to:

� software and system installation

� software integration

� software

� system integration

� system
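
Note: a hypothetical Python sketch of the step structure required above (step number, required tester action, expected result); it illustrates how such a procedure could be recorded, not how it must be implemented.

from dataclasses import dataclass

@dataclass
class ProcedureStep:
    number: int
    action: str            # required action by the tester
    expected_result: str

test_procedure = {
    "name": "TP-LOGIN-01",
    "prerequisites": ["test user account exists", "system under test reachable"],
    "steps": [
        ProcedureStep(1, "Open the login page", "Login form is displayed"),
        ProcedureStep(2, "Enter valid credentials and submit", "Home page is displayed"),
    ],
}

for step in test_procedure["steps"]:
    print(f"Step {step.number}: {step.action} -> expect: {step.expected_result}")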

10-03 Customer support procedure

� Defines the availability and coverage of the support provided:

� hot-line number

� hours of availability

� appropriate expertise

� cost

� Defines a schema for classification of customer requests and/or problems:

� definition of request type

� definition of priority / severity

� definition of response time expectations, by type and severity

� Standards for what information to retain from a customer, such as:

� company and location

� contact information details

� description of the request

� reference to supporting information sent (dumps, files)

� customer system site configuration information (product, release, version, last update)

� impacted system(s)

� impact to operations of existing systems

� criticality of the request

� expected customer response / closure requirements

� Definition of customer escalation procedures

� Identifies customer support tools available and procedures for using them, such as:

� mechanism used to record customer requests

� status reports

� ability to reproduce the customer's hardware / software / product environment

� ability to reproduce problems, including available systems

� test emulators

� test scripts

� telecommunication connections

� dump analysis tools

10-04 Quality manual � Provides consistent information, both internally and externally, about the organization's quality management system

� Scope of the quality management system, including details of and justification for any exclusions

� Identifies the documented procedures established for the quality management system, or reference to them

� Description of the interaction between the processes of the quality management system

10-05 Test support procedure � Defines the availability and coverage of the support provided:

� hot-line number

� hours of availability

� appropriate expertise

� cost

� Defines a schema for classification of support requests and/or problems:

� definition of request type

� definition of priority / severity

� definition of response time expectations, by type and severity

� Standards for what information to retain from a tester, such as:

� contact information details

� description of the request

� reference to supporting information sent (dumps, files)

� test system configuration information (product, release, version, last update)

� impacted system(s)

� impact to operations of existing systems

� criticality of the request

� expected tester response / closure requirements

� Definition of escalation procedures

� Identifies support tools available and procedures for using them, such as:

� mechanism used to record support requests

� status reports

� ability to reproduce test hardware / software / product environment

� ability to reproduce problems, including available systems

� test emulators

� test scripts

� telecommunication connections

� dump analysis tools

10-06 Test environment intake test procedure

� Defines the procedure to check whether the test environment is ready for testing

� Defines a list of checks to be carried out during the intake test of the test environment

� Consists of the test environment intake checklist and any other information needed for performing the test environment intake test

� Is documented in the Test environment intake test procedure specification

10-07 Test environment intake procedure

� Defines the intake of the test environment

Postponed to Version 4.0

10-08 Test object intake procedure

� Defines the intake of test objects

Postponed to Version 4.0

11-00 Product � Is a result / deliverable of the execution of a process, includes services, systems (software and hardware) and processed materials

� Has elements that satisfy one or more aspects of a process purpose

� May be represented on various media (tangible and intangible)

11-01 Software product � An aggregate of software items

� A set of computer programs, procedures, and possibly associated documentation and data

11-03 Product release information

� Coverage for key elements (as appropriate to the application):

� Description of what is new or changed (including features removed)

� System information and requirements

� Identifies conversion programs and instructions

� Release numbering implementation may include:

� the major release number

� the feature release number

� the defect repair number

� the alpha or beta release; and the iteration within the alpha or beta release

� Identifies the component list (version identification included):

� hardware / software / product elements, libraries, etc.

� associated documentation list

� New / changed parameter information and/or commands

� Backup and recovery information

� List of known problems, faults, warning information, etc.

� Identifies verification and diagnostic procedures

� Technical support information

� Copyright and license information

� The release note may include an introduction, the environmental requirements, installation procedures, product invocation, new feature identification and a list of defect resolutions, known defects and workarounds
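
Note: the release numbering scheme listed above (major release, feature release, defect repair, optional alpha/beta iteration) can be made explicit as in this hypothetical Python sketch.

import re

# e.g. "3.2.1" or "3.3.0-beta.4" (major.feature.defect[-alpha|beta.iteration])
RELEASE_PATTERN = re.compile(r"^(\d+)\.(\d+)\.(\d+)(?:-(alpha|beta)\.(\d+))?$")

def parse_release(version):
    """Split a release identifier into the numbering elements named in WP 11-03."""
    m = RELEASE_PATTERN.match(version)
    if not m:
        raise ValueError(f"not a valid release identifier: {version}")
    major, feature, defect, stage, iteration = m.groups()
    return {"major": int(major), "feature": int(feature), "defect_repair": int(defect),
            "stage": stage or "final", "iteration": int(iteration) if iteration else None}

print(parse_release("3.2.1"))
print(parse_release("3.3.0-beta.4"))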

11-04 Product release package � Includes the hardware / software / product

� Includes any associated release elements such as:

� system hardware / software / product elements

� associated customer documentation

� parameter definitions defined

� command language defined

� installation instructions

� release letter

11-05 Software unit � Follows established coding standards (as appropriate to the language and application)

� Follows data definition standards (as appropriate to the language and application):

� Entity relationships defined

� Data base layouts are defined

� File structures and blocking are defined

� Data structures are defined

� Algorithms are defined

� Functional interfaces defined

11-06 System � All elements of the product release are included

� Any required hardware

� Integrated product

� Customer documentation

� Fully configured set of the system elements:

� parameters defined

� commands defined

� data loaded or converted

11-07 Temporary solution � Problem identification

� Release and system information

� Temporary solution, target date for actual fix identified

� Description of the solution:

� limitations, restriction on usage

� additional operational requirements

� special procedures

� applicable releases

� Backup / recovery information

� Verification procedures

� Temporary installation instructions

11-08 System element � A discrete part of a system

� Implemented to fulfil specified requirements

� May include software items, hardware items, manual operations, and other systems, as necessary

11-09 Release information � Coverage for key elements (as appropriate to the application):

� Description of what is new or changed (including test features removed)

� Test information and requirements

� Identifies conversion programs and instructions

� Release numbering implementation may include:

� the major release number

� the feature release number

� the defect repair number

� Identifies the component list (version identification included):

� test elements, libraries, etc.

� associated test documentation list

� Backup and recovery information

� List of known problems, faults, warning information, etc.

� Technical support information

� Copyright and license information

� The release Note may include an introduction, the environmental requirements, installation procedures, product invocation, new feature

11-10 Release package � Includes the testware

� Includes any associated release elements such as:

� associated test documentation

� parameter definitions defined

� installation instructions

� release letter

11-11 Testware � Produced during the test process required to plan, design, and execute tests

� Includes documentation, scripts, inputs, expected results, set-up and clear-up procedures, files, databases, environment, and any additional software or utilities used in testing.

� Review material

� Test artefacts

11-12 Test automation solution

� Consists of:

� source code

� software elements

� executable code

� configuration files

� Documentation, which:

� describes the handling of the script

� describes the automated test cases

11-13 Test documentation � Consists of all documents generated through the test process

� test plan

� test schedule

� test case specifications

� test logs

� test reports

� Contains also all information to test infrastructure and test environment

� test configuration

� list of tools

11-14 Test service � Consists of planning, preparation, execution and analysis of the test process

11-15 Test product � Deliverable assembled testware

11-16 Test object � Container for different types of testware that enables

� Test planning

� Test design

� Test execution

11-17 Test Environment Postponed to Version 4.0

12-00 Proposal � Defines the proposed solution

� Defines the proposed schedule

� Identifies the coverage of the initial proposal:

� the requirements that would be satisfied

� the requirements that could not be satisfied, and provides a justification of variants

� Identifies conditions (e.g. time, location) that affect the validity of the proposal

� Identifies obligations of the acquirer and the consequences of these not being met

� Defines the estimated price of proposed development, product, or service

12-01 Request for proposal � Reference to the requirements specifications

� Identifies supplier selection criteria

� Identifies desired characteristics, such as:

� system architecture, configuration requirements or the requirements for service (consultants, maintenance, etc.)

� quality criteria or requirements

� project schedule requirements

� expected delivery / service dates

� cost / price expectations

� regulatory standards / requirements

� Identifies submission constraints:

� date for resubmission of the response

� requirements with regard to the format of response

12-02 Retirement request � Identifies the name of the component / project for retirement

� Identifies a basic description

� Identifies the proposed date of retirement

� Identifies the duration of the life of the component / project

� Identifies the person who will approve the retirement

12-03 Reuse proposal � Identifies the project name

� Identifies the project contact

� Identifies the reuse goals and objectives

� Identifies the list of reuse assets

� Identifies the issues / risks of reusing the component including specific requirements (hardware, software, resource and other reuse components)

� Identifies the person who will be approving the reuse proposal

12-04 Supplier proposal response

� Defines the supplier's proposed solution

� Defines the supplier's proposed delivery schedule

� Identifies the coverage of the initial proposal:

� identifies the requirements that would be satisfied

� identifies the requirements that could not be satisfied, and provides a justification of variants

� Defines the estimated price of proposed development, product, or service

13-00 Record � Work product stating results achieved or providing evidence of activities performed in a process

� An item that is part of a set of identifiable and retrievable data

13-01 Acceptance record � Record of the receipt of the delivery

� Identifies the date received

� Identifies the delivered components

� Records the verification of any customer acceptance criteria defined

� Signed by receiving customer

13-03 Back-up / recovery record

� Date of back-up

� Listing of what was backed-up with associated versions

� Listing of where it was backed-up to

� Identifies associated system attributes and configuration at time of back-up

� Identifies associated recovery procedures

13-04 Communication record � All forms of interpersonal communication, including:

� letters

� faxes

� e-mails

� voice recordings

� telexes

13-05 Contract review record � Scope of contract and requirements

� Possible contingencies or risks

� Alignment of the contract with the strategic business plan of the organization

� Protection of proprietary information

� Requirements which differ from those in the original documentation

� Capability to meet contractual requirements

� Responsibility for subcontracted work

� Terminology

� Customer ability to meet contractual obligations.

13-06 Delivery record � Record of items shipped / delivered electronically to customer

� Identifies:

� who it was sent to

� address where delivered

� the date delivered

� Record receipt of delivered product

13-07 Problem record � Includes a description of the problem

� Describes a defect (a non-fulfilment of a requirement related to an intended or specified use)

� Identifies:

� the name of the submitter and associated contact details

� the group / person(s) responsible for providing a fix

� classification of the problem (criticality, urgency, relevance etc.)

� the severity of the problem (critical, major, minor)

� the status of the reported problem

� the target release(s) problem will be fixed in

� the date "opened"

� the expected closure date

� any closure criteria

� re-inspection actions

� system configuration information (such as: release versions, system software, hardware configuration, etc.)

� any associated defect reports, customer requests, duplicate problems, associated fixes

� the components of the product affected

� any associated support information (dumps, files, etc.)

� the applicable software product release and version information

13-08 Installation record � Record of what was installed

� Release and system configuration information recorded

� Special site specific information recorded

� Identifies any acceptance testing performed

� Installation performance information captured:

� number of faults found after the installation or conversion

� time to install

� Ability to bring up system after installation conversion

� Record of customer approval

13-09 Meeting support record � Agenda and minutes that are records that define:

� purpose of meeting

� attendees

� date, place held

� reference to previous minutes

� what was accomplished

� identifies issues raised

� any open issues

� next meeting, if any

13-10 Configuration management record

� Status of the work products / items and modifications

� Identifies items under configuration control

� Identifies activities performed e.g. backup, storage, archiving, handling and delivery of configured items

� Supports consistency of the product

13-11 Personnel performance review record

� Relevant information about personnel including:

� appraisal history

� describes achievements, or lack thereof

� disciplinary history

13-12 Personnel record � Relevant information about personnel including:

� name, address, date of birth, marital status

� grade, and pay

� qualifications

� education

� skills

� experience

� training

13-14 Progress status record � Record of the status of a plan(s) (actual against planned) such as:

� status of actual tasks against planned tasks

� status of actual results against established objectives / goals

� status of actual resource allocation against planned resources

� status of actual cost against budget estimates

� status of actual time against planned schedule

� status of actual quality against planned quality

� Record of any deviations from planned activities and reason why
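
Note: an illustrative Python sketch of the actual-versus-planned comparison this record captures; the tracked figures are hypothetical.

# Planned vs. actual values for one reporting period (hypothetical figures).
planned = {"tasks_completed": 30, "cost": 100_000, "defects_closed": 50}
actual  = {"tasks_completed": 26, "cost": 112_000, "defects_closed": 55}

def deviations(planned, actual):
    """Relative deviation of each tracked figure from plan (actual minus planned, over planned)."""
    return {key: (actual[key] - planned[key]) / planned[key] for key in planned}

for key, dev in deviations(planned, actual).items():
    print(f"{key}: {dev:+.1%}")     # e.g. tasks_completed: -13.3%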

13-15 Proposal review record � Scope of proposal and requirements

� Possible contingencies or risks

� Alignment of the proposal with the strategic business plan of the organization

� Protection of proprietary information

� Requirements which differ from those in the original documentation

� Capability to meet contractual requirements

� Responsibility for subcontracted work

� Terminology

� Supplier ability to meet obligations

� Approved

13-16 Change request � Identifies purpose of change

� Identifies request status (new, accepted, rejected)

� Identifies requester contact information

� Impacted system(s)

� Impact to operations of existing system(s) defined

� Impact to associated documentation defined

� Criticality of the request, date needed by

13-17 Customer request � Identifies request purpose, such as:

� new development

� enhancement

� internal customer

� operations

� documentation

� informational

� Identifies request status information, such as:

� date opened

� current status

� date assigned and responsible owner

� date verified

� date closed

� Identifies priority / severity of the request

� Identifies customer information, such as:

� company / person initiating the request

� contact information and details

� system site configuration information

� impacted system(s)

� impact to operations of existing systems

� criticality of the request

� expected customer response / closure requirements

� Identifies needed requirements / standards

� Identifies information sent with request (i.e., RFPs, dumps, etc.)

13-18 Quality record � Defines what information to keep

� Defines what tasks / activities / processes produce the information

� Defines when the data was collected

� Defines source of any associated data

� Identifies the associated quality criteria

� Identifies any associated measurements using the information

� Identifies any requirements to be adhered to create the record, or satisfied by the record

13-19 Review record � Provides the context information about the review:

� what was reviewed

� lists reviewers who attended

� status of the review

� Provides information about the coverage of the review:

� check-lists

� review criteria

� requirements

� compliance to standards

� Records information about:

� the readiness for the review

� preparation time spent for the review

� time spent in the review

� reviewers, roles and expertise

� Identifies the required corrective actions:

� risk identification

� prioritized list of deviations and problems discovered

� the actions, tasks to be performed to fix the problem

� ownership for corrective action

� status and target closure dates for identified problems

13-20 Risk action request � Date of initiation

� Scope

� Subject

� Request originator

� Risk management process context:

� this section may be provided once, and then referenced in subsequent action requests if no changes have occurred

� process scope

� stakeholder perspective

� risk categories

� risk thresholds

� project objectives

� project assumptions

� project constraints

� Risks:

� this section may cover one risk or many, as the user chooses

� where all the information above applies to the whole set of risks, one action request may suffice

� where the information varies, each request may cover the risk or risks that share common information

� risk description(s)

� risk probability

� risk consequences

� expected timing of risk

� Risk treatment alternatives:

� alternative descriptions

� recommended alternative(s)

� justifications

� Risk action request disposition:

� each request should be annotated as to whether it is accepted, rejected, or modified, and the rationale provided for whichever decision is taken

13-21 Change control record � Used as a mechanism to control change to baselined products / products in official project release libraries

� Record of the change requested and made to a baselined product (work products, software, customer documentation, etc.):

� identifies system, documents impacted with change

� identifies change requester

� identifies party responsible for the change

� identifies status of the change

� Linkage to associated customer requests, internal change requests, etc.

� Appropriate approvals

� Duplicate requests are identified and grouped

13-22 Traceability record � Identifies requirements to be traced

� Identifies a mapping of requirements to life cycle work products

� Provides the linkage of requirements to work product decomposition (i.e., requirement, design, code, test, deliverables, etc.)

� Provides forward and backwards mapping of requirements to associated work products throughout all phases of the life cycle

Note: This may be included as a function of another defined work product (example: a CASE tool for design decomposition may have a mapping ability as part of its features)
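
Note: the forward and backward mapping described above is essentially a bidirectional index; the following Python sketch with hypothetical identifiers shows one possible representation.

from collections import defaultdict

# Forward trace: requirement -> work products that realize or verify it (hypothetical IDs).
forward = {
    "REQ-1": ["DES-3", "CODE-7", "TC-12"],
    "REQ-2": ["DES-4", "TC-13"],
}

# Backward trace derived automatically: work product -> requirements it traces to.
backward = defaultdict(list)
for requirement, work_products in forward.items():
    for wp in work_products:
        backward[wp].append(requirement)

print(forward["REQ-1"])      # ['DES-3', 'CODE-7', 'TC-12']
print(backward["TC-13"])     # ['REQ-2']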

13-23 Training record � Record of employee's training

� Identifies employee's name

� Identifies any courses taken (date, hours, course title)

� Identifies current skills / capabilities / experience level, lists:

� formal education

� certifications / examination results

� in-house training

� mentoring

� Identifies future training needs

� Identifies current status of training requests

13-24 Validation results � Validation check-list

� Passed items of validation

� Failed items of validation

� Pending items of validation

� Problems identified during validation

� Risk analysis

� Recommendation of actions

� Conclusions of validation

� Signature of validation

13-25 Verification results � Verification check-list

� Passed items of verification

� Failed items of verification

� Pending items of verification

� Problems identified during verification

� Risk analysis

� Recommendation of actions

� Conclusions of verification

� Signature of verification

13-26 Assessment record � Identifies:

� the date of the assessment

� the assessment input

� the objective evidence gathered

� the assessment approach used

� the set of process profiles resulting from the assessment

� any additional information collected during the assessment that was identified in the assessment input to support process improvement or process capability determination

13-27 Retirement notification � Notifies customers and users of plans to retire products

13-28 Support request � Identifies request purpose, such as:

� new development

� enhancement

� operations

� documentation

� informational

� Identifies request status information, such as:

� date opened

� current status

� date assigned and responsible owner

� date verified

� date closed

� Identifies priority / severity of the request

� Identifies test information, such as:

� person initiating the request

� contact information and details

� test configuration information

� impacted system(s)

� impact to operations of existing systems

� criticality of the request

� expected response / closure requirements

� Identifies needed requirements / standards

� Identifies information sent with request (i.e., RFPs, dumps, etc.)

13-29 Release approval record � Content information of what is to be shipped or delivered

� Identifies:

� who it is intended for

� the address where it is to be delivered

� the date released

� Record of supplier approval

13-30 Error Note � Includes the date when the error occurred and the author.

� Includes expected and actual results.

� Includes identification of the test item (configuration item) and environment.

� Includes software or system life cycle process in which the error was observed.

� Includes description of the error to enable reproduction and resolution, including logs, database dumps or screenshots.

� Includes scope or degree of impact on stakeholder(s) interests.

� Includes severity of the impact on the system.

� Includes urgency/priority to fix.

� Includes status of the incident (e.g. open, deferred, duplicate, waiting to be fixed, fixed awaiting retest, closed).

� Includes conclusions, recommendations and approvals.

� Includes global issues, such as other areas that may be affected by a change resulting from the error

� Includes change history, such as the sequence of actions taken by project team members with respect to the error to isolate, repair, and confirm it as fixed.

� Includes references, including the identity of the test case specification that revealed the problem
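
Note: a compact, hypothetical Python sketch of an error note record carrying the status values listed above; it only illustrates the fields, it does not define a tool interface.

from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    OPEN = "open"
    DEFERRED = "deferred"
    DUPLICATE = "duplicate"
    WAITING_FIX = "waiting to be fixed"
    FIXED_AWAITING_RETEST = "fixed awaiting retest"
    CLOSED = "closed"

@dataclass
class ErrorNote:
    date: str
    author: str
    test_item: str                 # configuration item and environment
    expected: str
    actual: str
    severity: str = "minor"
    status: Status = Status.OPEN
    history: list = field(default_factory=list)   # sequence of actions taken

note = ErrorNote("2014-10-10", "tester A", "billing v3.2.1 / test env B",
                 expected="invoice total 100.00", actual="invoice total 0.00",
                 severity="critical")
note.history.append("assigned to development")
print(note.status.value, "-", note.actual)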

13-31 Issue record � Includes the date when the issue occurred and the author.

� Includes expected and actual results.

� Includes identification of the test item (configuration item) and environment.

� Includes software or system life cycle process in which the issue was observed.

� Includes description of the issue to enable reproduction and resolution, including logs, database dumps or screenshots.

� Includes scope or degree of impact on stakeholder(s) interests.

� Includes severity of the impact on the system.

� Includes urgency/priority to fix.

� Includes status of the incident (e.g. open, deferred, duplicate, waiting to be fixed, fixed awaiting retest, closed).

� Includes conclusions, recommendations and approvals.

� Includes global issues, such as other areas that may be affected by a change resulting from the issue

� Includes change history, such as the sequence of actions taken by project team members with respect to the issue to isolate, repair, and confirm it as fixed.

� Includes references, including the identity of the test case specification that revealed the problem

13-32 Test environment intake checklist

� Consists of a list of test cases to ensure that the test environment is ready for starting the test

� Ensures the compliance of the test environment with the entry criteria in the test plan

13-33 Test object intake checklist

� Consists of a list of test cases to ensure that the test objects are ready for test

� Ensures the compliance of the test objects with the entry criteria in the test plan

13-35 Staff profile � Relevant information about personnel skills and experience:

� qualifications like testing certificates

� test skills

� development skills

� test management skills

� test experience

� training

13-36 Test environment release protocol

� Documents the delivery of a release of the test environment, showing the delivered configuration of hardware and software

13-40 Personnel profile � Relevant information about personnel skills and experience:

� qualifications like testing certificates

� test skills

� development skills

� test management skills

� test experience

� training

13-41 Test Data Provision Record

� Documents provision of test data according to test data provision plan

14-00 Register � A register is a compilation of data or information captured in a defined sequence to enable:

� an overall view of evidence of activities that have taken place

� monitoring and analyses

� evidence of performance of a process over time

14-01 Change history � Historical records of all changes made to an object (document, file, software module, etc.):

� description of change

� version information about changed object

� date of change

� change requester information

� change control record information

14-02 Corrective action register � Identifies the initial problem

� Identifies the ownership for completion of defined action

� Defines a solution (series of actions to fix problem)

� Identifies the open date and target closure date

� Contains a status indicator

� Indicates follow up audit actions

14-03 Hardware assets register � Identifies the key characteristics and identification of the hardware items, including:

� description of the equipment

� unique means of identification (e.g. serial number)

� make (i.e. manufacturer)

� model details

� when acquired

� product configuration details

� storage or deployment location

� acquisition cost

� equipment status (e.g. in service, retired)

14-04 Test log � Register of test results over the software product life cycle

� Identifies what elements were tested

� Identifies date tests were executed

� Identifies responsible person for the test results

14-05 Preferred suppliers register

� Subcontractor or supplier history

� List of potential subcontractor / suppliers

� Qualification information

� Identifies their qualifications

� Past history information when it exists

14-06 Schedule � Identifies the tasks to be performed

� Identifies the expected and actual start and completion date for required tasks

� Allows for the identification of critical tasks and task dependencies

� Identifies task completion status vs. planned date

� Has a mapping to scheduled resource data
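
Note: the comparison of task completion status against the planned date can be computed mechanically, as in this hypothetical Python sketch.

from datetime import date

# Hypothetical schedule entries: planned vs. actual completion dates (None = not finished).
tasks = [
    {"task": "prepare test environment", "planned": date(2014, 9, 1),  "actual": date(2014, 9, 3)},
    {"task": "execute system test",      "planned": date(2014, 9, 20), "actual": None},
]

today = date(2014, 9, 25)
for t in tasks:
    if t["actual"] is not None:
        slip = (t["actual"] - t["planned"]).days
        print(f"{t['task']}: completed, {slip:+d} days against plan")
    else:
        overdue = (today - t["planned"]).days
        status = f"overdue by {overdue} days" if overdue > 0 else "on track"
        print(f"{t['task']}: open, {status}")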

14-08 Tracking system � Ability to record customer and process owner information

� Ability to record related system configuration information

� Ability to record information about problem or action needed:

� date opened and target closure date

� severity / criticality of item

� status of any problem or actions needed

� information about the problem

� the action owners

� priority of problem resolution

� Ability to record proposed resolution or action plan

� Ability to provide management status information

� Information is available to all with a need to know

� Integrated change control system(s) / records

14-09 Work breakdown structure

� Defines tasks to be performed, and their amendments

� Documents ownership for tasks

� Documents critical dependencies between tasks

� Documents inputs and output work products

� Documents the critical dependencies between defined work products

14-10 Work product distribution register

• Current list of receivers and their delivery addresses

• Identifies the media expected for delivery (manual, CD-ROM, email, etc.)

14-11 Work product list � Identifies:

� name of work product

� work product reference ID

� work product revision

� when updated

� work product status

� when approved

� reference to approval source

� file reference

14-12 Test environment intake test log

� Register of test results over the test environment intake process

� Identifies what elements were tested

� Identifies date tests were executed

� Identifies responsible person for the test results

14-13 Risk logbook � Documents actual risks and risk histories

� Includes the actual risks and their changes

� Includes a list of actual problems

� Includes a risk history

14-14 Test schedule � Identifies the test tasks to be performed

� Identifies the expected and actual start and completion date for required test tasks

� Allows the identification of critical test tasks and task dependencies

� Identifies test task completion status, vs. planned date

� Has a mapping to scheduled resource data

� Identifies baselines for test milestones

14-15 List of tools � Identifies tools supporting the test

� Includes the name of the tool

� Includes a description of the tool

� Includes a configuration description

� Determines who is using the tool

14-16 Test asset register � Identifies the key characteristics and identification of the testware, including:

� testware type (e.g. Test Case)

� testware title

� testware version

� test documentation - number of items

� storage location

� test status (active, retired)

14-17 Test environment reservation schedule

� Identifies the test tasks to be performed on the test environment

� Identifies specific test environments components needed and reserved for a period

� Identifies the person or group who makes a reservation

14-18 Test set � Defines a set of several test cases for a component or system under test

14-19 Test suite • Consists of test sets where the postcondition of one test is often used as the precondition for the next one.

15-00 Report � A work product describing a situation that:

� includes results and status

� identifies applicable / associated information

� identifies considerations / constraints

� provides evidence / verification

15-01 Analysis report � What was analyzed

� Who did the analysis

� The analysis criteria used:

� selection criteria or prioritization scheme used

� decision criteria

� quality criteria

� Records the results:

� what was decided / selected

� reason for the selection

� assumptions made

� potential risks

� Aspects of correctness to analyse include:

� completeness

� understandability

� testability

� verifiability

� feasibility

� validity

� consistency

� adequacy of content

15-03 Configuration status report

� Identifies:

� the number of items under configuration management

� risks associated to configuration management

� the number of configuration management items lost and reason for their loss

� problem and issues related to configuration management

� receiving parties

� baselines made

15-04 Market analysis report � Contains information about:

� what was analysed

� the selection criteria and prioritization scheme used

� the analysis criteria used

� Records the results which identify the:

� market opportunities and market window

� business drivers

� cost / benefit

� potential customers and their profiles information

� any assumptions made

� alternate solutions considered and/or rejected

� risks and/or constraints (regulatory issues)

� Defines the product offering and target release / launch date

15-05 Evaluation report � States the purpose of evaluation

� Method used for evaluation

� Requirements used for the evaluation

� Assumptions and limitations

� Identifies the context and scope information required:

� date of evaluation

� parties involved

� context details

� evaluation instrument (check-list, tool) used

� Records the result:

� data

� identifies the required corrective and preventive actions

� improvement opportunities, as appropriate

15-06 Project status report � A report of the current status of the project

� Schedule:

� planned progress

� actual progress

� reasons for variance from planned progress

� threats to continued progress

� contingency plans to maintain progress

� Budget:

� planned expenditure

� actual expenditure

� reasons for variance between planned and actual expenditure

� expected future expenditure

� contingency plans to achieve budget goals

� Quality goals:

� actual quality measures

� reasons for variance from planned quality measures

� contingency plans to achieve quality goals

� Project issues:

� issues which may affect the ability of the project to achieve its goals.

� contingency plans to overcome threats to project goals

15-07 Reuse evaluation report � Identifies reuse opportunities

� Identifies investment in Reuse

� Identifies current skills and experience

� Identifies reuse infrastructure

� The evaluation report must represent current status in implementation of the reuse program

15-08 Risk analysis report � Identifies the risks analysed

� Records the results of the analysis:

� potential ways to mitigate the risk

� assumptions made

� constraints

15-09 Risk status report � Identifies the status of an identified risk:

� related project or activity

� risk statement

� condition

� consequence

� changes in priority

� duration of mitigation, when started

� risk mitigation activities in progress

� responsibility

� constraints

15-10 Test incident report � Identifies:

� a summary of the report

� the originator

� the date originated

� the status

� the severity

� the Application

� the Function in which the defect was found

� the Build the defect was found in

� the related test procedure

� an analysis including a description, person assigned to the analysis and the complexity

� a resolution including a description, person assigned to the resolution, the complexity and the date expected for resolution

� verification including a description, person assigned to the verification, expected verified date (retest date)

15-11 Defect report � Identifies the defects

� A summary of each defect

� Identifies the tester who found each defect

� Identifies the severity for each defect

� Identifies the affected function(s) for each defect

� Identifies the date when each defect originated

� Identifies the date when each defect was resolved

� Identifies the person who resolved each defect

15-12 Problem status report � Presents a summary of problem records:

� by problem categories / classification

� Status of problem solving:

� development of solved vs. open problems

15-13 Assessment report � States the purpose of assessment

� Method used for assessment

� Requirements used for the assessment

� Assumptions and limitations

� Identifies the context and scope information required:

� date of assessment

� organizational unit assessed

� sponsor information

� assessment team

� attendees

� scope / coverage

• assessee information

� assessment Instrument (check-list, tool) used

� Records the result:

� data

� identifies the required corrective actions

� improvement opportunities

15-14 Customer satisfaction report

� States the purpose of customer satisfaction evaluation

� Method used for evaluation

� Requirements used for the evaluation

� Assumptions and limitations

� Identifies the context and scope information required:

� date(s) / period of the evaluation

� organizational unit assessed

� scope / coverage

� customer data

� evaluation instrument (check-list, tool) used

� Records the result:

� data

� identifies the required corrective actions

� improvement opportunities

15-15 Human resource needs analysis

� Definition of the need:

� skills and competencies in organization and projects needed

� responsibilities to be performed

� requirements to be satisfied

� Constraints:

� cost limitations

� date / schedule requirements

15-16 Improvement opportunity � Identifies what the problem is

� Identifies what the cause of a problem is

• Suggests what could be done to fix the problem

• Identifies the value (expected benefit) in performing the improvement

� Identifies the penalty for not making the improvement

15-17 Personnel performance evaluation

� No characteristics additional to Evaluation report (Generic)

15-18 Process performance report

� No characteristics additional to Evaluation report (Generic)

15-20 Service level performance

� No characteristics additional to Evaluation report (Generic)

15-21 Supplier evaluation report

� No characteristics additional to Evaluation report (Generic)

15-22 Training evaluation report � Training effectiveness surveys

� Training program performance assessment

� Analysis of training evaluation forms

15-23 Test item transmittal report

� Identifies the report

� Transmitted items

� Location of items

� Status of items

� Approvals

15-24 Audit report � States the purpose of audit

� Method used for audit

� Requirements that are the basis for the audit

� Assumptions and limitations

� Identifies the context and scope information required:

� date of audit

� organizational unit audited

� sponsor information

� audit team

� attendees

� scope / coverage

� participant's information

� audit instrument (checklist, tool) used

� Records the result:

� the non-conformances identified

� the required corrective actions

15-25 Tester satisfaction report � States the purpose of tester satisfaction evaluation

� Method used for evaluation

� Requirements used for the evaluation

� Assumptions and limitations

� Identifies the context and scope information required:

� date(s) / period of the evaluation

� organizational unit assessed

� scope / coverage

� tester data

� evaluation instrument (check-list, tool) used

� Records the result:

� data

� identifies the required corrective actions

� improvement opportunities

15-26 Needs assessment report • Definition of the need:

• reason for acquiring, developing or enhancing a test tool, a test service or testware, or for outsourcing test activities

� features and functions desired

� requirements to be satisfied

� Constraints:

� cost limitations

� date / schedule requirements

� specific support hardware / software / service required

� interfaces requirements

� associated equipment required

� regulatory standards and/or requirements

� operational impacts

� patent, copyright and licensing issues

� Business case:

� expected benefit

� expected cost (including projected installation, conversion and/or maintenance) vs. profit expectations

� market window, target delivery dates

15-27 Review report � States the purpose of review

� Method used for review

� Requirements used for the review

� Assumptions and limitations

� Identifies the context and scope information required:

� date of review

� parties involved

� context details

� review instrument (check-list, tool) used

� Records the result:

� data

� identifies the required corrective actions

� improvement opportunities

15-28 Review results • Includes data

• Includes required corrective actions

15-29 Acceptance report � Based on test closure report

� Includes Go-/No-Go Decision

� Includes the reasons for the acceptance decision

15-30 Test Summary Report � Summarizes all test activities and test results

� Contains an evaluation of the corresponding test items against exit criteria

� Contains an evaluation of the test process

15-31 Analysis report • Identifies the result of a review, assessment or evaluation

• Identifies the criteria used for the analysis

• Identifies the required corrective actions

• Identifies improvement opportunities

15-32 Test closure report • Includes a list of delivered work products

• Includes a statement on the completeness and correctness of the delivered work products

• Includes a statement on whether the test end criteria have been reached

• Includes the test coverage finally achieved

15-33 Experience report • Contains the experience gained in the test project

• Contains the experience gained with the test process

• Identifies improvement opportunities

15-34 Correction report • Contains information about corrected defects

• Includes the following data:

� defect ID

� defect description

� defect priority

� correction date

� person in charge

� retest results

15-35 CCB meeting report � Includes a decision log regarding test incidents

� Includes a decision log regarding test environment incidents

15-36 Issue report � Identifies the issues

� Contains a summary of each issue

� Identifies the priority of each issue

� Includes the status of each issue

� Includes status-changes of each issue

� Includes the decisions of the CCB towards each issue

15-37 Test report • Identifies test cases that have to be adapted or revised

• Identifies test activities that have to be replicated

• Delivers a judgment on the test object with reference to the acceptance criteria

15-38 Test Controlling Report � Contains information on the test status based on the used and defined measures

� Contains a list of actual issues and problems

� Includes corrective actions if necessary

15-39 Test Object Delivery Report

� Delivered by development to test

� Identifies test items, including:

� Configuration

� current status

� other necessary delivery information

15-40 Result of impact analysis • Identifies components and functions affected by software or hardware changes

• Identifies the test cases belonging to these components or functions

• Gives a statement about the regression test volume
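
As an illustration only (TestSPICE does not prescribe any implementation), such an impact analysis can be sketched as a lookup from changed components to the test cases traced to them. All identifiers and data below are invented for the example.

    # Invented traceability data: component -> test cases covering it
    coverage = {
        "billing": ["TC-101", "TC-102", "TC-210"],
        "login": ["TC-001", "TC-002"],
        "reporting": ["TC-300"],
    }

    def impact_analysis(changed_components: list[str]) -> dict:
        """Return affected components, the test cases traced to them,
        and a statement about the regression test volume."""
        affected = [c for c in changed_components if c in coverage]
        test_cases = sorted({tc for c in affected for tc in coverage[c]})
        total = sum(len(tcs) for tcs in coverage.values())
        return {
            "affected_components": affected,
            "regression_test_cases": test_cases,
            "regression_volume": f"{len(test_cases)} of {total} test cases",
        }

    print(impact_analysis(["billing", "login"]))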

15-41 Test environment report • Includes an overview of the components of the test environment

• Identifies configurations in the test environment

• Identifies the software used in the test environment, including software versions

• Documents problems in the test environment

• Documents necessary workarounds

15-42 Test environment disassembly report

This report includes e.g. the following items:

• Registration of components and licences which are ready to be assembled or used in other test environments

• Machine cleaning report

• HW and network infrastructure disassembly report

15-43 Test automation proof of concept report

Postponed to Version 4.0

15-44 Test data provision report • Documents the delivery of test data

• Documents discrepancies between the test data provision plan and the delivered test data

• Approval of the provision record

16-00 Repository � Repository for components

� Storage and retrieval capabilities

� Ability to browse content

� Listing of contents with description of attributes

� Sharing and transfer of components between affected groups

� Effective controls over access

� Maintain component descriptions

� Recovery of archive versions of components

� Ability to report component status

� Changes to components are tracked to change / user requests

16-01 Assessment results repository

� Storage and location of work products for each assessment is unique

� Effective controls over access

� Ability to report status of work products applied to each assessment

� Recovery of archive versions of assessment records

16-02 Asset repository � Effective controls over access

� Type of asset maintained

� Supporting hardware, software and product applications

� Identifies appropriate version control

� Ability to identify where the asset has been used for traceability issues

16-03 Configuration management library

� Correct creation of products from the library

� Can recreate any release or test configuration

� Ability to report configuration status

16-04 Knowledge repository � Classification of knowledge

� Contains definitions of knowledge items

� Search mechanism to find knowledge

� Ability to identify where the asset has been used

16-05 Reuse library � Ability to identify associated system information:

� type of object maintained

� supported hardware / software / product applications

� associated hardware / software / product configuration information

� required parameter information

16-06 Process repository � Contains process descriptions

� Supports multiple presentations of process assets

16-07 Test asset repository � Effective controls over access

� Type of test asset maintained

� Supporting hardware, software and product applications

� Identifies appropriate version control

� Ability to identify where the test asset has been used for traceability issues

16-08 Test archive / Archive of testware

� Effective controls over access

� Type of testware maintained

� Supporting hardware, software and product applications

� Identifies appropriate version control

� Ability to identify where the testware has been used for traceability issues

17-00 Requirement specification

� Each requirement is identified

� Each requirement is unique

� Each requirement is verifiable or can be assessed

� Includes statutory and regulatory requirements

� Includes issues / requirements from (contract) review

17-01 Asset specification � Identifies unique, reusable solutions

� Identifies the context (e.g. development, run-time)

� Identifies the work products constituting the asset (e.g. requirements, designs, code, test cases, models)

� Rules and instructions for use

17-02 Build list � Identifies aggregates of the software application system

� Identifies required system elements (parameter settings, macro libraries, data bases, job control languages, etc.)

� Necessary sequence ordering identified for compiling the software release

� Input and output source libraries identified

17-03 Customer requirements � Purpose / objectives defined

� Includes issues / requirements from (contract) review

� Identifies any:

� time schedule / constraints

� required feature and functional characteristics

� necessary performance considerations / constraints

� necessary internal / external interface considerations / constraints

� required system characteristics / constraints

� human engineering considerations / constraints

� security considerations / constraints

� environmental considerations / constraints

� operational considerations / constraints

� maintenance considerations / constraints

� installation considerations / constraints

� support considerations / constraints

� design constraints

� safety / reliability considerations / constraints

� quality requirements / expectations

17-04 Delivery instructions � Delivery requirements

� Sequential ordering of tasks to be performed

� Applicable releases identified

� Identifies all delivered components with version information

� Identifies any necessary backup and recovery procedures

17-05 Documentation requirements

� Purpose / objectives defined

� Proposed contents (coverage) defined

� Intended audience defined

� Identifies supported hardware / software / product release, system information

� Identifies associated hardware / software / product requirements and designs

� satisfied by document

� Identifies style, format, media standards expected

� Includes definition of the intended distribution requirement

� Includes storage requirements

17-06 Domain interface specification • Identifies:

� domain assets

� subset domains

� domain interface requirements

� domain dependencies

� interfaced domains

� functions, features, properties and capabilities of the domain

� domain vocabulary

� domain architecture

17-08 Interface requirements � Defines:

� relationships between two products, processes or process tasks

� criteria and format for what is common to both

� critical timing dependencies or sequence ordering

17-09 Product requirements � Identifies any:

� required feature and functional characteristics

� necessary performance considerations / constraints

� necessary internal / external interface considerations / constraints

� required system characteristics / constraints

� human engineering considerations / constraints

� security considerations / constraints

� environmental considerations / constraints

� operational considerations / constraints

� maintenance considerations / constraints

� associated documentation considerations / constraints

� installation considerations / constraints

� support considerations / constraints

� design constraints

� safety / reliability considerations / constraints

� quality requirements / expectations

� Includes storage requirements (products)

17-10 Service requirements � Identifies any:

� performance expectations

� time schedule / constraints

� tasks to be performed

� responsibilities

� Identify the method of communication, project reporting expected

� quality expectations / controls

17-11 Software requirements � Identifies standards to be used

� Identifies any software structure considerations / constraints

� Identifies the required software elements

� Identifies the relationship between software elements

� Consideration is given to:

� any required software performance characteristics

� any required software interfaces

� any required security characteristics

� any database design requirements

� any required error handling and recovery attributes

17-12 System requirements • System requirements include: functions and capabilities of the system; business, organizational and user requirements; safety, security, human-factors engineering (ergonomics), interface, operations, and maintenance requirements; design constraints and qualification requirements (ISO/IEC 12207)

� Identifies the required system overview

� Identifies any interrelationship considerations / constraints between system elements

� Identifies any relationship considerations / constraints between the system elements and the software

� Identifies any design considerations / constraints for each required system element, including:

� memory / capacity requirements

� hardware interfaces requirements

� user interfaces requirements

� external system interface requirements

� performance requirements

� commands structures

� security / data protection characteristics

� system parameter settings

� manual operations

� reusable components

17-13 Test design specification � Identifies the test design

� Features to be tested

� Approach refinements

� Test identification

� Feature pass/fail criteria

17-14 Test case specification � Identifies the test case

� Test items

� Input specifications

� Output specifications

� Environmental needs

� Special procedural requirements

� Intercase dependencies
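
Purely as a sketch of how these characteristics might map onto a machine-readable test case record, assuming hypothetical field names and example values that are not part of the PAM:

    from dataclasses import dataclass, field

    @dataclass
    class TestCaseSpecification:
        test_case_id: str                  # identifies the test case
        test_items: list[str]              # items / features under test
        input_specification: dict          # inputs and preconditions
        output_specification: dict         # expected results / postconditions
        environmental_needs: list[str] = field(default_factory=list)
        special_procedural_requirements: list[str] = field(default_factory=list)
        intercase_dependencies: list[str] = field(default_factory=list)  # IDs of prerequisite test cases

    # Example instance (illustrative values only)
    tc = TestCaseSpecification(
        test_case_id="TC-042",
        test_items=["password reset"],
        input_specification={"user": "registered", "action": "request reset link"},
        output_specification={"email_sent": True, "link_valid_hours": 24},
        intercase_dependencies=["TC-001"],
    )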

17-15 Test asset specification � Identifies unique, reusable solutions

� Identifies the context

� Identifies the test work products constituting the test asset (test cases)

� Rules and instructions for use

17-16 Test data specification � Identifies criteria for finding suitable test data

� Identifies specific test data needs for tests items

17-17 Test environment requirements specification • Test environment requirements include: functions and capabilities of the test environment; business, organizational and tester requirements; safety, security, human-factors engineering (ergonomics), interface, operations, and maintenance requirements; design constraints and qualification requirements (ISO/IEC 12207)

� Identifies the required test environment overview

� Identifies any interrelationship considerations / constraints between environment elements

� Identifies any relationship considerations / constraints between the system elements and the software

� Identifies any design considerations / constraints for each required system element, including:

� memory / capacity requirements

� hardware interfaces requirements

� user interfaces requirements

� external system interface requirements

� performance requirements

� commands structures

� security / data protection characteristics

� system parameter settings

� manual operations

� reusable components

17-18 Test procedure specification / Test scenario

� Specifies a sequence of actions for the execution of a test.
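
A test procedure or test scenario of this kind is essentially an ordered sequence of actions with expected results. A minimal, purely illustrative sketch (the step content and identifiers are invented):

    # Each step pairs an action with its expected result; the order matters.
    test_procedure = {
        "id": "TP-007",
        "objective": "Verify login with valid credentials",
        "steps": [
            {"action": "Open the login page", "expected": "Login form is displayed"},
            {"action": "Enter a valid user name and password", "expected": "Credentials are accepted"},
            {"action": "Press 'Login'", "expected": "Dashboard is shown"},
        ],
    }

    for number, step in enumerate(test_procedure["steps"], start=1):
        print(f"Step {number}: {step['action']} -> expected: {step['expected']}")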

17-19 Requirement specification for test automation/ Requirements for test automation

� Identifies:

� required tools

� necessary performance considerations / constraints

� environmental considerations / constraints

� quality requirements / expectations regarding the test objects

17-20 Test infrastructure requirements

� Test Infrastructure requirements may include:

� security

� throughput and data sharing requirements

� backup and recovery

� remote access facility

� physical workspace and equipment

� test support requirements

� maintenance requirements

17-21 Test environment intake test procedure specification

• Specifies the sequence of tests that have to be performed to ensure the readiness of the test environment for testing

17-22 Requirement specification for test tools

� Identifies any

� required feature and functional characteristics

� necessary performance considerations / constraints

� security considerations / constraints

� environmental considerations / constraints

� operational considerations / constraints

� maintenance considerations / constraints

� associated documentation considerations / constraints

� installation considerations / constraints

� support considerations / constraints

� design constraints

� safety / reliability considerations / constraints

� quality requirements / expectations

17-24 Requirements documentation

� Consists of :

� customer requirements

� environmental requirements

� infrastructural requirements

17-25 Requirements for regression tests

� Identifies:

� required tools

� necessary performance considerations / constraints

� environmental considerations / constraints

� quality requirements / expectations

17-26 Test Requirements Specification

• Contains requirements for the test process, tools and/or results:

• legal requirements

• contractual requirements

• external standards

• internal standards

17-27 test environment Intake procedure Requirements

� Includes the detailed requirements for the test environment Intake Procedure

17-28 test environment OLA Requirements

� Contains requirements for the test environment Operational Level Agreement (OLA)

17-29 test environment SLA Requirements

� Contains Requirements for the test environment Service Level Agreement (SLA)

17-30 test environment disassembly requirements specification

� Contains Requirements for the test environment disassembly

Postponed to Version 4.0

17-31 test environment test case specification

Postponed to Version 4.0

18-00 Standard � Identifies who / what they apply to

� Expectations for conformance are identified

� Conformance to requirements can be demonstrated

� Provisions for tailoring or exception to the requirements are included

18-01 Acceptance criteria � Defines:

� interfaces

� schedule

� messages

� documents

� meetings

� joint review

18-02 Assessment method standard

� Overview of activities

� Initiating the assessment

� Planning the assessment

� Team briefing

� Data collection

� Data validation

� Process attribute ratings

� Reporting the results

18-03 Coding standard

� Coverage for software includes, but is not limited to (as appropriate to the application):

� data naming conventions

� defines required languages, compilers, data base management systems, etc.

� format of code, structure, comments required

� standard data structures, types, classes

� best practices

� required usage of tools: data dictionaries

� associated CASE tools

� compatibility requirement for existing software and/or hardware

� security considerations

� performance considerations

� standard error messages, codes

� Interface standards:

� human - machine interfaces

� external system interfaces

� peripheral equipment, hardware

� Storage and retrieval of source code and object modules

� Quality and reliability standards

18-05 Personnel performance criteria

� Defines expectations for personnel performance:

• establishes what adequate work performance is (required deliverables, completeness expected, accuracy, quality, etc.)

� identifies what constitutes the completeness of the defined tasks

� Establishes personnel reliability attributes

18-07 Quality criteria � Defines expectations for quality:

� establishes what is an adequate work product (required elements, completeness expected, accuracy, etc.)

� identifies what constitutes the completeness of the defined tasks

� establishes life cycle transition criteria and the entry and exit requirements for each process and/or activity defined

� establishes expected performance attributes

� establishes product reliability attributes

18-08 Supplier selection criteria � Defines expectations for a supplier:

� supplier profile

� supplier capability,

� logistics

� development approach

� in-house development or subcontracting

� compliance to acquisition requirements

18-09 Release criteria � Defines expectations for test service release:

� release type and status

� required elements of the release

� product completeness including documentation

� adequacy and coverage of testing

� limit for open defects

� change control status
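
A minimal sketch of how such release criteria could be checked automatically, assuming hypothetical threshold names and values that are not defined by TestSPICE:

    def release_criteria_met(open_defects: int, test_coverage: float,
                             docs_complete: bool, change_control_closed: bool,
                             max_open_defects: int = 0,
                             min_coverage: float = 0.9) -> bool:
        """Check a release candidate against criteria of the kind listed above."""
        return (open_defects <= max_open_defects
                and test_coverage >= min_coverage
                and docs_complete
                and change_control_closed)

    # Example: 2 open defects against a limit of 5, 95 % coverage against a 90 % minimum
    print(release_criteria_met(open_defects=2, test_coverage=0.95,
                               docs_complete=True, change_control_closed=True,
                               max_open_defects=5))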

18-10 Engineering standards � Standards given by development e.g. coding standards

18-11 Evaluation criteria • Based on rating levels for selected measures, e.g.:

� rating levels for the weighting of a requirement

� rating levels for the coverage grades of a requirement

18-12 Evaluation methods � Overview of activities

� Initiating the evaluation

� Planning the evaluation

� Team briefing

� Data collection

� Data validation

� Process attribute ratings

� Reporting the results

18-13 Laws • Legal requirements that must be complied with

19-00 Strategy � Identifies what needs and objectives or goals there are to be satisfied

� Establishes the options and approach for satisfying the needs, objectives, or goals

� Establishes the evaluation criteria against which the strategic options are evaluated

� Identifies any constraints / risks and how these will be addressed

19-01 Asset management strategy • Asset storage and retrieval mechanism

� Asset classification scheme

� Criteria for asset acceptance

� Communication mechanism

� Criteria for certification

� Criteria for retirement

� Identifies the scope of assets by name and brief description

� Identifies the assets configuration management activities

� Identifies the asset provision activities

� Identifies the asset maintenance activities

19-02 Process strategy � Describes process deployment in the organizational unit

� Identifies goals for process definition, implementation and improvement

� Determines required support to enable the strategy

19-03 Knowledge management strategy

� The attributes for Strategy (Generic) apply

19-05 Reuse strategy � Identifies the goals for reuse

� Identifies the commitment for creating reusable components

� Determines which product lines and type of artefacts should be supported with reuse

� Identifies system and hardware / software / product elements which can be reused within the organization

� Identifies the reuse repository and tools

19-06 Maintenance strategy � The attributes for Strategy (Generic) apply

19-07 Software development methodology • Identifies the approach / method used to develop software

� Identifies the life cycle model (waterfall, spiral, serial build, agile, etc.) used to develop software

� Provides a high level description of the process, activities, and controls

19-08 Training strategy � Establishes the options (acquisition, development) and approach for satisfying training needs

� Establishes the evaluation criteria against which the strategic options are evaluated

� Identifies any constraints / risks and how these will be addressed

19-10 Verification strategy � Verification methods, techniques, and tools

� Work product or processes under verification

� Degrees of independence for verification

� Schedule for performing the above activities

� Identifies what needs there are to be satisfied

� Establishes the options and approach for satisfying the need

� Establishes the evaluation criteria against which the strategic options are evaluated

� Identifies any constraints / risks and how these will be addressed

19-11 Validation strategy � Validation methods, techniques, and tools

� Work products under validation

� Degrees of independence for validation

� Schedule for performing the above activities

� Identifies what needs there are to be satisfied

� Establishes the options and approach for satisfying the need

� Establishes the evaluation criteria against which the strategic options are evaluated

� Identifies any constraints / risks and how these will be addressed

19-12 Audit strategy � Purpose

� Scope

� Milestones

� Audit criteria

� Audit team

� Identifies what needs there are to be satisfied

� Establishes the options and approach for satisfying the need

• Establishes the evaluation criteria against which the strategic options are evaluated

� Identifies any constraints / risks and how these will be addressed

19-13 Test Asset management strategy

� Test asset storage and retrieval mechanism

� Test asset classification scheme

� Criteria for test asset acceptance

� Communication mechanism

� Criteria for certification

� Criteria for retirement

� Identifies the scope of test assets by name and brief description

� Identifies the test assets configuration management activities

� Identifies the test asset provision activities

� Identifies the test asset maintenance activities

19-14 Release strategy � The attributes for Strategy (Generic) apply

19-15 Test methodology � Defines a standard set of test processes

� Includes the purpose of each process and interactions between them

19-16 Risk management strategies

• Determines strategies for risk identification for project and product risks

• Determines strategies for risk analysis

• Identifies criteria for risk analysis, such as impact and probability of occurrence

• Determines strategies for risk mitigation based on the analysis results:

• preventive actions to reduce likelihood or impact

• prepare contingency plans

• define escalation paths to transfer risks

• Defines a “trouble-shooting” infrastructure

• Establishes an early warning system for specific risks, including:

• risk assessments

• risk monitoring

• risk checks

19-17 Regression test strategy • Defines the focus of the regression testing, e.g. which items and/or features

• Identifies test cases to be selected and executed

• Identifies the type of testing to be performed

• Determines whether manual testing or test automation tools are used

19-18 Test handbook / Test manual

• Defines the test levels used

� Defines the test activities in each test level, including:

� test entry and exit criteria

� test procedure

• independence of testing

� standards

� test environment

� test design techniques

� regression test strategy

� level of test automation

� level of reuse

� test process

� used measures

� issue Management

� Describes the risks covered by the tests

19-19 Test strategy � Defines the goals of the test process

� Determines the test stages

� Defines the goals, responsibilities and main activities of each test stage

� Determines test case design techniques to be used at each test level

� Defines test types to be carried out

• Determines the overall test model

� Identifies the legal requirements

� Identifies the organizational requirements

20-00 Template � Defines the attributes associated with a work product to be created as a consequence of a process execution

� Identifies technical elements typically associated with this product type

� Defines expected form and style

20-01 Risk Template • List containing all known risks, with the following elements:

• risk description

• reason for the risk

• risk impact description

• level of probability of occurrence (Low, Medium, High)

• level of impact (Low, Medium, High)

• possible actions
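
Risk exposure is commonly derived from such an entry by combining the probability and impact levels; the following sketch assumes a simple numeric mapping that is not prescribed by the PAM, and all entry values are invented:

    LEVEL = {"Low": 1, "Medium": 2, "High": 3}   # assumed numeric mapping of the levels

    def risk_exposure(probability: str, impact: str) -> int:
        """Combine probability of occurrence and impact into a simple exposure score."""
        return LEVEL[probability] * LEVEL[impact]

    risk = {
        "description": "Test environment delivered late",
        "reason": "Shared infrastructure team is overloaded",
        "impact_description": "Start of test execution is delayed",
        "probability": "Medium",
        "impact": "High",
        "possible_actions": ["Reserve the environment early", "Prepare a virtualized fallback"],
    }

    print("Exposure:", risk_exposure(risk["probability"], risk["impact"]))   # 2 * 3 = 6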

21-00 Work product � Defines the attributes associated with an artefact from a process execution:

� key elements to be represented in the work product

21-01 Test Work Products � Defines the attributes associated with a test artefact from a process execution:

� key elements to be represented in the test work product

10 Annex C: Terminology

TestSPICE uses the terminology defined in the following standards, considered in this order of precedence:

1) ISO/IEC 29119 Part 1

2) ISTQB Glossary

3) ISO/IEC 33001 “Concepts & Terminology”

4) IEEE610 Standard Glossary of Software Engineering Terminology

Terms that are not defined in these standards are listed here:

Term Standard Description

Component

BS7925-1 A minimal software or hardware item for which a separate specification is available. One of the parts that make up a system. A component may be hardware or software and may be subdivided into other components.

Note: Example: A steering unit might be a component if you test the function of a complete machine, but it is itself constructed of other components that have to be tested if you test the steering unit.

Embedded software BS7925-1 Software in an embedded system, dedicated to controlling the specific hardware of the system

Embedded system BS7925-1 A system that interacts with the real physical world using actuators and sensors

Functional requirement BS7925-1 The required functional behaviour of a system

HiL Hardware in the loop BS7925-1 A test level where real hardware is used and tested in a simulated environment

Integration BS7925-1 A process combining components into larger assemblies

MiL Model in the loop BS7925-1 A test level where the simulation model of the system is tested dynamically in a simulated environment

Model based development BS7925-1 A development method where the system is first described as a model. The code is then generated automatically from the models.

Rapid prototyping BS7925-1 A test level where a simulated embedded system is tested while connected to the real environment

Simulator BS7925-1 A device, or computer program, or system used during software verification that behaves or operates like a given system when provided with a set of controlled inputs

SiL Software in the loop BS7925-1 A test level where the real software is used and tested in a simulated environment or with experimental hardware

System testing BS7925-1 A process of testing an integrated system to verify that it meets specified requirements

Note: This annex is subject to change in further versions of TestSPICE. In these versions the Standard Glossary of Software Testing Terms will be included.

11 Annex D: Key Concepts Schematic

The following schematic has been developed to indicate key concepts in terms of processes and work products that flow through the testing processes within the TestSPICE PRM. It relates to the terminology described in Annex C Terminology.

The schematic highlights that a system may consist of hardware, software and mechanics. This has to be taken into consideration during test planning, test specification and test execution.

Software may consist of a number of software components. A software component has a component specification. The lowest level of a software component that is not sub-divided into other components is termed a software unit. Software units are integrated into software items to form the software to be tested. Software may be integrated with hardware and mechanics to form the system to be tested.

12 Annex E: Bidirectional Traceability

Note 1: In the context of TestSPICE, bidirectional traceability means traceability between requirements, test case specifications, test data and test results. This relationship must be discernible in either direction (i.e., to and from a test case specification, to and from a requirement, to and from test data, and to and from a test result).

Note 2: The Concept is subject to further development.
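
A minimal sketch of such bidirectional traceability, assuming an in-memory mapping with invented identifiers (TestSPICE does not prescribe any particular representation):

    from collections import defaultdict

    # Forward links: requirement -> test case specification -> test data -> test results
    forward = {
        "REQ-1": {"TC-10": {"TD-10": ["RES-10-run1"]}},
        "REQ-2": {"TC-20": {"TD-20": ["RES-20-run1", "RES-20-run2"]}},
    }

    # Derive the backward direction from the same data so both views stay consistent.
    backward = defaultdict(list)
    for req, cases in forward.items():
        for tc, data in cases.items():
            backward[tc].append(("requirement", req))
            for td, results in data.items():
                backward[td].append(("test case", tc))
                for res in results:
                    backward[res].append(("test data", td))

    print(backward["RES-20-run2"])   # -> [('test data', 'TD-20')]
    print(forward["REQ-1"])          # a requirement traced down to its test results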

13 Annex F: Reference Standards and Relevant Documents

Annex F provides a list of reference standards and guidelines that support implementation of the processes in the PAM (including the PRM).

Standard/Document Description

IEEE 1233-1998 Guide for Developing System Requirements Specifications

IEEE 1471-2000 Recommended Practice for Architectural Description

IEEE 830-1998 Recommended Practice for Software Requirements Specifications

IEEE 829-1998 Standard for Software Test Documentation (under revision - to be published as Standard for software and system test documentation)

IEEE 1058-1998 Standard for Software Project Management Plans

IEEE 610.12-1998 Standard Glossary of Software Engineering Terminology

IEEE 828-1998 Standard for Software Configuration Management Plans

IEEE 730-1998 Standard for Software Quality Assurance Plans

IEEE 1016-1998 Recommended Practice for Software Design Descriptions

ISO/IEC 15504-1: 2004 Software Engineering - Process assessment – Part 1: Concepts and Vocabulary

ISO/IEC 15504-2: 2003 Information Technology - Process assessment – Part 2: Performing an Assessment

BS 7925-1 Glossary of Terms used in Software Testing

ISO/IEC 29119 Part 1 Definitions & Vocabulary

ISO/IEC 29119 Part 2 Test Process

ISO/IEC 29119 Part 3 Test Documentation

ISO/IEC 29119 Part 4 Test Techniques

ISO/IEC 33001 Information technology -- Process assessment -- Concepts and terminology

ISTQB CT EL – Syllabus 2011

ISTQB Certified Tester Expert Level Syllabus Test Management (Managing Testing, Testers, and Test Stakeholders) Version 2011

ISTQB CT AL TM – Syllabus 2012

ISTQB Certified Tester Advanced Level Syllabus Test Manager Version 2012

ISTQB CT AL TA – Syllabus 2012

ISTQB Certified Tester Advanced Level Syllabus Test Analyst Version 2012

ISTQB CT AL TTA – Syllabus 2012

ISTQB Certified Tester Advanced Level Syllabus Technical Test Analyst Version 2012

ISTQB CT EL ITP – Syllabus 2011

ISTQB Certified Tester Expert Level Syllabus Improving the Testing Process (Implementing Improvement and Change) Version 2011

ISTQB CT FL – Syllabus 2011

ISTQB Certified Tester Foundation Level Syllabus Version 2011

14 Annex G: Usage of TestSPICE in multiple test stages

TestSPICE does not contain any test stage or test method as a process. In an assessment this will be an issue for the definition of the assessment scope and the assessment planning.

As the quality of test plans and of the available test documentation may vary in every test stage, it has to be decided whether a process can be assessed using one instance or whether several instances have to be assessed to discover variances in process capability and the related risk.

Recommendations for scoping:

Improvement Context

Evaluate the improvement context. According to ISO/IEC 15504 part 4 it is crucial to know the drivers of the improvement and to first develop a target (“as should”) capability profile before assessing the capabilities as they are. This procedure also helps to understand the assessment needs and effort. If a process does not have an expected level for a certain test stage, it might not be relevant and does not need to be assessed; but if it is relevant for the whole organization, or as an instance for a defined test stage / test step, the assessment effort is justified.

EXAMPLE: It makes a difference whether an improvement is internally or customer driven. Customers are more likely to want a complete view of the processes. Internally driven improvements are more likely to focus on processes that are relevant for the expected business value. It also makes a difference whether all work is done in-house or whether it is outsourced. Even though a test policy and a test strategy have to be in place in both cases (but with different content), we see core testing processes on the one hand and test service acquisition processes on the other hand.

Key Questions: Who wants the improvement project? Who are the sponsor and the relevant stakeholders? Who wants and who sponsors the assessment? Is a target (“as should”) capability profile available? Who are the relevant stakeholders that have to help develop this profile? Are there other external or internal drivers for the improvement and/or assessment project? Are there binding governmental, public or internal standards and procedures that have to be taken into account?

Develop a profile with the expected capabilities. Develop an org chart that shows sponsorship for the improvement and the assessment as well as the organizational position of the most relevant stakeholders.

Test Strategy / Test Policy / Glossary

Evaluate the test strategy, the test policy and the glossary of the organization that undergoes the assessment. It is important to know the test stages / test steps defined in the test strategy and to enhance the understanding of the organization's view on testing by analyzing the terminology used.

EXAMPLE: It makes a difference whether an organization talks about functional test, integration test, E2E test, user acceptance test, system integration test, unit test, security test, performance test or usability test merely as methods or as real test stages. If you do not understand the approach of the assessed organization, scoping and planning of the assessment will fail.

Key Questions: What are the test stages mentioned in the test strategy? What do they mean? To what organizational units do the test stages belong? Are the owners of these test stages under organizational or contractual control of the improvement or the assessment sponsor? Will these stakeholders contribute to upcoming improvement projects?

Organizational processes

Evaluate the organizational framework for the intended improvement of the test process and for the planned assessment.

EXAMPLE: It makes a difference whether an organization does the complete testing itself or whether the testing work is done by various organizations. In the first scenario you need the organizational processes only once; in the second you might need information about the capability of all involved organizations in order to implement the necessary improvements.

Key Questions: Who is responsible for the test policy and the test strategy? Who is responsible for test planning? Are there organizational or contractual barriers between the responsible persons?

Decide on assessment of organizational processes

Technical Processes

Evaluate the organizational and technical framework for the intended improvement of the test process and for the planned assessment.

EXAMPLE: It makes a difference whether an organization runs all test and test automation environments itself or whether the technical responsibilities for the testing work are spread across various organizations. In the first scenario you need the technical processes only once; in the second you might need information about the capability of all involved organizations in order to implement the necessary improvements.

Key Questions: Who is responsible for test environment management, test data management and test automation? Who is responsible for test planning? Are there organizational or contractual barriers between the responsible persons?

Decide on assessment of technical processes

Test Stages/Test Steps

As TestSPICE does not depend on test stages / test steps, it is recommended to assess the core testing processes at each test stage.

Example: It makes a difference whether you describe test cases for a unit test or for a functional or an E2E test. There is also a difference between functional and non-functional (e.g. performance) testing.

Key Questions: Which test stages / steps are mentioned in the test strategy at organization or project level? Are glossaries / naming conventions available to ensure that consistent terminology is used by all involved parties?

Decide which test stages will be assessed and whether the complete core processes need to be assessed in order to support the purpose of the intended improvement project.
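EXAMPLE (illustrative only, not part of the PAM): a scoping decision like this can be captured as a simple matrix of test stages versus the processes selected for assessment. The stage and process names in the following Python sketch are assumptions, not TestSPICE identifiers.

# Illustrative sketch: an assessment scope matrix mapping test stages to the
# processes selected for assessment. All names are assumptions, not TestSPICE IDs.
assessment_scope = {
    "unit test":               ["test case design", "test execution"],
    "system integration test": ["test case design", "test execution", "defect management"],
    "user acceptance test":    ["test planning", "test execution"],
}

for stage, processes in assessment_scope.items():
    print(f"{stage}: {', '.join(processes)}")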

Test Planning

Test planning might be done in various instances, depending on the test stages, their owners and their stakeholders.

EXAMPLE: It makes a difference if the test plan is centrally written, executed and maintained by a central test manager, or if tests are planned by the development unit (unit tests), an outsourcing partner, and an internal integration and user acceptance team.

Key Questions: Is there one detailed test plan in place, or is there a master plan containing central test and QA milestones, with detailed plans developed by the owners of the test stages?

Develop a proper understanding of the planning approach.

The concept of planning, execution and control based on the different levels and stages is shown in the following figure:


Figure 10 - Concept of stages

Test Methods / Test Tools

TestSPICE does not depend on specific test methods or test tools. Methods and tools show how to write test cases or how to automate tests. TestSPICE addresses the question of what has to be achieved (Process Purpose, Outcomes, Process Attributes) or what should be done (Practices) in order to perform a successful test.

Assessment delivery

Be aware of the organization and the number of team members. Avoid asking the same person the same question again and again; each interview or evidence check must have a real chance to deliver additional information. Also take into account that, especially in immature organizations, a workshop might be more effective and efficient than a series of interviews.

It makes a difference whether there are one or more dedicated test teams, a cross-functional team with a test expert in it, or a team where each person does everything. Your goal is to gather evidence as efficiently as possible.

Who has valid information about the test process? Who has detailed information about the test strategy, test planning, test stages, technical testing issues, or other information seen as relevant for the intended assessment and improvement? Do all information providers understand process assessment and improvement? Is the organizational culture more familiar with individual or with team work?

Having checked all these topics, an experienced lead assessor (competent or principal status) will be able to develop a valid assessment plan.


15 Annex H: TestSPICE Support for agile projects

TestSPICE is independent of the software development life cycle model (SDLM). However, it cannot be ignored that more and more organizations change their management attitude from classical project management to a somewhat agile style. These organizations also need guidance to improve. The problem is the absence of a commonly accepted agile maturity framework.

In order to support agile organizations this annex provides a basic set of agile management processes.

The Agile Management Process Group (AMP) deals with processes necessary to implement agile management in an organization. It consists of the following processes:

AMP.1 Backlog Management

AMP.2 Impediment Management

AMP.3 Service Class and WIP Limit Management

AMP.4 Technical debt Management

AMP.5 Knowledge debt Management

AMP.6 Definition of Done (DoD) Management

AMP.7 Organizational Capacity Management

15.1.1 AMP.1 Backlog Management

Process ID AMP.1

Process name Backlog management

Process purpose The purpose of the Backlog Management process is to make sure that relevant items, like requirements, are collected, described, properly stored, prioritized and estimated, and that solutions are delivered.

Process outcomes Outcome 1 The backlog is visible at a defined storage place

Outcome 2 The items in the backlog are agreed between the relevant stakeholders

Outcome 3 The backlog contains the complete set of items

Note: no shadow backlogs are used

Outcome 4 The items in the backlog are prioritized

Note: the classification of prioritizing shall be clear. It shall be defined in which manner the backlog items are to be prioritized (e.g. regarding business value, complexity, etc.)

Outcome 5 The items in the backlog are estimated

Outcome 6 The assignment of items to an iteration (or increment) is visible

Outcome 7 The delivery of items is visible

Base Practices (activity ID, name, description, outcome reference):
1. Collection of backlog items: Collect the items for the backlog from all relevant stakeholders. (Outcomes 1, 2)
2. Understanding of the items: Derive a common understanding of the items as a basis for solution delivery. (Outcome 2)
3. Acceptance criteria: Agree on acceptance criteria and document them. (Outcome 2)
4. Uniqueness of backlog: Assure the uniqueness of the backlog. (Outcome 3)
5. Prioritization of items: Prioritize backlog items. (Outcome 4)
6. Estimation of items: Estimate backlog items. (Outcome 5)
7. Assignment of the items to iterations: Assign backlog items to iterations (or increments). (Outcome 6)
8. Documentation of prioritization, assignment and delivery of items: Document priorities, estimates, assignments and delivery in the backlog. (Outcome 7)
9. Visualization of backlog: Visualize the backlog. (Outcome 1)

Work Products
Inputs: Backlog; Backlog Items
Outputs (result type, outcome reference):
Backlog (Outcomes 1, 2, 3)
Documentation of acceptance criteria (Outcome 2)
Agreement for acceptance criteria (Outcome 2)
Backlog Items (Outcomes 4, 5, 6)
Item delivery record (Outcome 7)
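EXAMPLE (illustrative only, not part of the PAM): a minimal sketch, assuming hypothetical field names, of a backlog item record that could make the information required by the outcomes above visible.

# Illustrative sketch (hypothetical field names): a backlog item record carrying
# the information made visible by the outcomes of AMP.1.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class BacklogItem:
    item_id: str                                      # unique identification within the backlog
    description: str
    acceptance_criteria: List[str] = field(default_factory=list)  # agreed and documented (Outcome 2)
    priority: Optional[int] = None                    # e.g. by business value (Outcome 4)
    estimate: Optional[float] = None                  # e.g. story points (Outcome 5)
    iteration: Optional[str] = None                   # assignment to an iteration or increment (Outcome 6)
    delivered: bool = False                           # delivery is visible (Outcome 7)

# One backlog kept at a single, defined storage place (Outcomes 1 and 3):
backlog: List[BacklogItem] = [
    BacklogItem("BL-1", "Login via corporate SSO",
                acceptance_criteria=["user can log in with the corporate account"],
                priority=1, estimate=5),
]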


15.1.2 AMP.2 Impediment Management

Process ID AMP.2

Process name Impediment Management

Process purpose The purpose of the Impediment Management process is to improve the performance of a development or test team by constantly and consistently removing impediments that hamper the progress or the performance of an agile team.

Process outcomes Outcome 1 Impediments are uniquely identified

Outcome 2 The impact of an impediment is analyzed

Outcome 3 The root cause of an impediment is analyzed

Outcome 4 Potential solutions for the impediment are derived, prioritized and agreed with the team

Outcome 5 The stakeholders needed to implement a solution are identified

Outcome 6 A solution for the impediment is agreed with the relevant stakeholders

Outcome 7 The agreed solution is implemented

Outcome 8 The impediment is tracked to closure

Base Practices (activity ID, name, description, outcome reference):
1. Impediment identification: Identify the impediments. (Outcome 1)
2. Communication in team: Communicate the impediment with the team. (Outcome 1)
3. Detailed description of impediment: Develop a detailed impediment description. (Outcomes 1, 2)
4. Impact analysis: Analyze the impact of the impediment. (Outcome 2)
5. Analysis of impediment causes: Analyze the potential root causes of the impediment. (Outcome 3)
6. Evaluation of feasible solutions: Develop suitable and feasible solutions for the impediment and prioritize these solutions. (Outcome 4)
7. Consideration of stakeholder needs: Identify the stakeholders needed to implement the solution. (Outcome 5)
8. Negotiation of agreement: Negotiate a mutual solution agreement with the relevant stakeholders. (Outcome 6)
9. Implementation of the solution: Implement the agreed solution in the team. (Outcome 7)
10. Closure of impediment: Track the impediment to closure. (Outcome 8)
11. Communication of the status: Communicate the impediment status. (Outcome 8)

Work Products
Inputs: Impediment Log
Outputs (result type, outcome reference):
Impediment Log (Outcomes 1, 2, 3)
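EXAMPLE (illustrative only, not part of the PAM): a minimal sketch of an impediment log entry that supports tracking an impediment from identification to closure; all field names are assumptions.

# Illustrative sketch (hypothetical field names): an impediment log entry that
# tracks an impediment from identification to closure (Outcomes 1 to 8 of AMP.2).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Impediment:
    impediment_id: str                                   # unique identification (Outcome 1)
    description: str
    impact: str = ""                                     # analyzed impact (Outcome 2)
    root_cause: str = ""                                 # analyzed root cause (Outcome 3)
    candidate_solutions: List[str] = field(default_factory=list)  # derived and prioritized (Outcome 4)
    stakeholders: List[str] = field(default_factory=list)         # needed for implementation (Outcome 5)
    agreed_solution: str = ""                            # agreed with the relevant stakeholders (Outcome 6)
    status: str = "open"                                 # e.g. open / in progress / closed (Outcomes 7, 8)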


15.1.3 AMP.3 Service Class and WIP Limit Management

Process ID AMP.3

Process name Service Class and WIP Limit management

Process purpose The purpose of the Service Class and WIP (work in progress) Limit Management process is to identify and structure the workload of a team and to identify actual as well as potential bottlenecks in the development process.

Process outcomes Outcome 1 The services of a team are identified

Outcome 2 The services of a team are analyzed and structured

Outcome 3 The capacity of the team is split across agreed service classes

Outcome 4 WIP limit for each service class is defined

Outcome 5 The handling of emergency requests is defined

Outcome 6 Queuing mechanisms for the service classes of a team are defined

Outcome 7 Requests, work in progress and delivered items are properly visualized

Outcome 8 Utilization of team capacity is readjusted as needed

Base Practices (activity ID, name, description, outcome reference):
1. Identification of team services: Identify the services a team is working on, together with the team. (Outcome 1)
2. Analysis of the services: Analyze whether the services of a team deliver value to the organization. (Outcome 2)
3. Agreement of services: Agree on the services with the team. (Outcome 3)
4. Assignment of the services to service classes: Assign team services to service classes. Note: service classes are, e.g., Analyze, Implement, Test and Done. (Outcome 3)
5. Definition of capacity and priority of services: Define priority, capacity and WIP limit for each service class (WIP limits are generally used by the Kanban agile method). (Outcome 4)
6. Emergency requests: Define the handling of emergency requests. (Outcome 5)
7. Queuing mechanisms: Define the queuing mechanisms for each service class. (Outcome 6)
8. Visualization: Visualize requests, queues, work in progress and delivered items. (Outcome 7)
9. Tracking of services: Track service utilization. (Outcome 7)
10. Readjusting of services: Readjust team services if needed. (Outcome 8)

Work Products
Inputs: Service Class description; WIP limit
Outputs (result type, outcome reference):
Service Class description (Outcomes 1, 2, 3)
WIP limit (Outcome 4)
Definition of handling for emergency requests (Outcome 5)
Definition of queuing mechanisms for each service class (Outcome 6)
Visualization of requests, queues, WIP and delivered items (Outcome 7)
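EXAMPLE (illustrative only, not part of the PAM): a minimal sketch of how service classes with capacity shares and WIP limits could be represented and checked; class names and numbers are assumptions.

# Illustrative sketch (hypothetical class names and numbers): service classes with
# capacity shares and WIP limits, plus a simple WIP limit check.
service_classes = {
    "standard":   {"capacity_share": 0.6, "wip_limit": 4},
    "expedite":   {"capacity_share": 0.1, "wip_limit": 1},  # handling of emergency requests (Outcome 5)
    "fixed_date": {"capacity_share": 0.2, "wip_limit": 2},
    "intangible": {"capacity_share": 0.1, "wip_limit": 2},
}

def within_wip_limit(service_class: str, items_in_progress: int) -> bool:
    """Return True if the current work in progress respects the WIP limit of the class."""
    return items_in_progress <= service_classes[service_class]["wip_limit"]

print(within_wip_limit("standard", 5))  # False: the WIP limit of 4 is exceeded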


15.1.4 AMP.4 Technical debt Management

Process ID AMP.4

Process name Technical debt Management

Process purpose The purpose of the Technical debt Management process is to identify technical problems that create a business risk and to implement solutions to deal with technical debt.

Process outcomes Outcome 1 Technical debts are identified

Outcome 2 The impact of technical debts is analyzed

Outcome 3 Root causes for technical debts are analyzed

Outcome 4 Corrective actions are identified and prioritized

Outcome 5 Preventive actions are identified and prioritized

Outcome 6 Technical debts and their impacts as well as corrective and preventive actions and their benefits are effectively communicated

Outcome 7 Corrective and preventive actions and their impacts on team capacity are agreed with the relevant stakeholders (e.g. development team, product owner etc.)

Outcome 8 Corrective actions are implemented as agreed

Outcome 9 Preventive actions are implemented as agreed

Outcome 10 Technical debts are tracked to closure

Base Practices (activity ID, name, description, outcome reference):
1. Identification of technical debts: Identify technical debt in the team and with the relevant stakeholders. (Outcome 1)
2. Impact analysis: Analyze the business impact of the technical debt. (Outcome 2)
3. Communication: Communicate the technical debt and its business impact. (Outcome 2)
4. Analysis of the root causes of the debts: Analyze the root cause of the technical debt. (Outcome 3)
5. Definition of corrective actions: Derive corrective actions. (Outcome 4)
6. Prioritization of corrective actions: Prioritize corrective actions according to their business value. (Outcome 4)
7. Definition of preventive actions: Derive preventive actions. (Outcome 5)
8. Prioritization of preventive actions: Prioritize preventive actions according to their business value. (Outcome 5)
9. Communication: Communicate possible corrective and preventive actions. (Outcome 6)
10. Agreement for actions: Agree on corrective and/or preventive actions with the relevant stakeholders. (Outcome 7)
11. Implementation of corrective actions: Implement corrective actions. (Outcome 8)
12. Implementation of preventive actions: Implement preventive actions. (Outcome 9)
13. Tracking of technical debts: Track technical debt to closure. (Outcome 10)
14. Monitoring: Monitor the effectiveness of preventive actions. (Outcome 10)

Work Products (result type, outcome reference):
Identification of technical debts (Outcome 1)
Analysis of the impact and root causes of technical debts (Outcomes 2, 3)
Identification and prioritization of corrective actions (Outcome 4)
Identification and prioritization of preventive actions (Outcome 5)
Communication of technical debts and their impacts (Outcome 6)
Agreement for corrective and preventive actions with the stakeholders (Outcome 7)
Implementation of corrective and preventive actions (Outcomes 8, 9)
Closure of technical debts (Outcome 10)
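EXAMPLE (illustrative only, not part of the PAM): a minimal sketch of a technical debt register entry covering identification, impact, root cause, actions and closure; all field names are assumptions.

# Illustrative sketch (hypothetical field names): a technical debt register entry
# covering identification, impact, root cause, actions and closure (AMP.4).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Action:
    description: str
    priority: int                  # prioritized by business value (Outcomes 4, 5)
    implemented: bool = False      # implemented as agreed (Outcomes 8, 9)

@dataclass
class TechnicalDebt:
    debt_id: str
    description: str                                        # identified debt (Outcome 1)
    business_impact: str = ""                               # analyzed impact (Outcome 2)
    root_cause: str = ""                                    # analyzed root cause (Outcome 3)
    corrective_actions: List[Action] = field(default_factory=list)
    preventive_actions: List[Action] = field(default_factory=list)
    agreed_with: List[str] = field(default_factory=list)    # relevant stakeholders (Outcome 7)
    closed: bool = False                                    # tracked to closure (Outcome 10)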

15.1.5 AMP.5 Knowledge debt Management

Process ID AMP.5

Process name Knowledge debt Management

Process purpose The purpose of the knowledge debt Management process is to assure that “just enough documentation” is produced and that relevant knowledge is properly shared.


Process outcomes Outcome 1 The needs for documentation and knowledge sharing are identified

Outcome 2 Gaps are identified

Outcome 3 The impact of the gaps is analyzed

Outcome 4 Potential knowledge bottlenecks are identified

Outcome 5 The need for corrective actions is identified

Outcome 6 Corrective actions are agreed with all stakeholders

Outcome 7 Gaps are tracked to closure

Base Practices (activity ID, name, description, outcome reference):
1. Documentation: Identify the needs for documentation. (Outcome 1)
2. Knowledge sharing: Identify the needs for knowledge sharing. (Outcome 1)
3. Current situation: Analyze the current situation. (Outcome 2)
4. Gaps: Identify gaps. (Outcome 2)
Impact analysis: Analyze the impact of the gaps. (Outcome 3)
5. Information bottlenecks: Identify teams or individuals that tend to become information bottlenecks. (Outcome 4)
6. Corrective actions: Identify the need for corrective actions. (Outcome 5)
7. Agreement for corrective actions: Agree on corrective actions. (Outcome 6)
8. Tracking of the gaps: Track the gaps to closure. (Outcome 7)

Work Products (result type, outcome reference):
Identification of needs for documentation and knowledge sharing (Outcome 1)
Identification of gaps (Outcome 2)
Analysis of the impact of the gaps (Outcome 3)
Identification of potential knowledge bottlenecks (Outcome 4)
Identification of the need for corrective actions (Outcome 5)
Agreement for corrective actions with all stakeholders (Outcome 6)
Closure of the gaps (Outcome 7)
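EXAMPLE (illustrative only, not part of the PAM): a minimal sketch of how documentation gaps and potential knowledge bottlenecks could be derived from a simple topic inventory; topic names and owners are assumptions.

# Illustrative sketch (hypothetical topics and owners): deriving documentation gaps
# and potential knowledge bottlenecks from a simple topic inventory (AMP.5).
needed_topics     = {"test environment setup", "deployment pipeline", "domain rules"}
documented_topics = {"deployment pipeline"}
knowledge_owners = {
    "test environment setup": ["alice"],          # single owner: potential bottleneck (Outcome 4)
    "deployment pipeline":    ["alice", "bob"],
    "domain rules":           ["carol"],          # single owner: potential bottleneck (Outcome 4)
}

gaps = needed_topics - documented_topics                                                  # Outcome 2
bottlenecks = [topic for topic, owners in knowledge_owners.items() if len(owners) == 1]   # Outcome 4

print("documentation gaps:", sorted(gaps))
print("potential knowledge bottlenecks:", sorted(bottlenecks))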


15.1.6 AMP.6 Definition of Done (DoD) Management

Process ID AMP.6

Process name Definition of Done (DoD) Management

Process purpose The purpose of the Definition of Done Management process is to assure that the delivered functionalities and activities are really ready to ship.

Process outcomes Outcome 1 It is defined for which of the following levels the DoD is to be developed:

- DoD for a functionality or an activity (a user story or a product backlog item)
- DoD for a sprint (a list of features collected for the sprint)
- DoD for a release version

Outcome 2 The team members and stakeholders have provided the relevant DoD

Outcome 3 The list of DoD for the relevant level is developed and communicated in the team and with the stakeholders

Outcome 4 Criteria for Done are analyzed, collected and communicated in the team

Outcome 5 The effort caused by each DoD is estimated

Outcome 6 The DoD are agreed between the team, the individuals and the stakeholders

Outcome 7 The DoD are integrated in the delivery and capacity planning

Base Practices (activity ID, name, description, outcome reference):
1. Level for DoD: Identify for which level the DoD shall be defined. (Outcome 1)
2. Stakeholders: Identify stakeholders that might provide criteria for done. (Outcome 2)
3. List of DoD: The team members and stakeholders provide the relevant list of DoD. (Outcome 2)
   The list of DoD could include, e.g., the following items:
   - Test cases
   - Implementation
   - Database implementation
   - Unit testing / automated testing
   - Regression test
   - Integration test
   - Load and performance test
   - Acceptance test
   - Documentation
   - Deployment
   - Communication
4. Communication: Collect and communicate the list of DoD in the team and with the stakeholders. (Outcome 3)
5. Discussion about DoD: Discuss potential DoD with all involved parties. (Outcome 4)
6. Criteria for Done: Collect criteria for done. (Outcome 4)
7. DoD: Develop the DoD from the collected criteria. (Outcome 4)
8. Agreement for DoD: Agree on the DoD in the team and with the stakeholders. (Outcome 5)
9. Effort: Analyze and estimate the effort caused by the DoD. (Outcome 5)
10. Additional competencies: Analyze the need for additional competencies on team and individual level. (Outcomes 4, 6)
11. Integration of DoD in the iteration: Integrate the DoD in the iteration planning approach. (Outcome 6)

Work Products (result type, outcome reference):
Definition of the level for DoD (Outcome 1)
Providing of the list of DoD by team and stakeholders (Outcomes 2, 3)
Definition of criteria for Done (Outcome 4)
Estimation of effort for each DoD (Outcome 5)
Agreement for the list of DoD (Outcome 6)
Integration of DoD in the delivery and capacity planning (Outcome 7)
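EXAMPLE (illustrative only, not part of the PAM): a minimal sketch of a DoD checklist evaluated per backlog item before it is considered ready to ship; criteria and status values are assumptions.

# Illustrative sketch (hypothetical criteria): a DoD checklist evaluated per backlog
# item; the item is ready to ship only if every agreed criterion is fulfilled (AMP.6).
story_dod = {
    "unit tests automated and passing": True,
    "regression test executed":         True,
    "documentation updated":            False,
    "acceptance criteria verified":     True,
}

def is_done(dod: dict) -> bool:
    """Return True only if all agreed DoD criteria are fulfilled."""
    return all(dod.values())

print(is_done(story_dod))  # False: the documentation is still missing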


15.1.7 AMP.7 Organizational Capacity Management

Process ID AMP.7

Process name Organizational Capacity Management

Process purpose The purpose of the Organizational Capacity Management process is to align the available team capacity with the delivery needs of the organization in order to avoid micro management.

Process outcomes Outcome 1 The organizational need for delivered software is analyzed

Outcome 2 Global priorities are set by senior management

Outcome 3 The capacity of the available teams is analyzed based on their ability to deliver

Outcome 4 Gaps between the organizational need and the ability to deliver are identified

Outcome 5 Strategies to deal with these gaps are developed, agreed, communicated and implemented

Outcome 6 Micro management attempts are identified and worked upon

Outcome 7 Team utilization and frequent delivery are constantly monitored

Base Practices (activity ID, name, description, outcome reference):
1. Organizational need: Develop the organizational need for the software portfolio. (Outcome 1)
2. Need for changes: Forecast the need for changes and enhancements. (Outcome 1)
3. Prioritization of portfolio: Prioritize the organizational portfolio. (Outcome 2)
4. Analysis of team ability: Analyze the ability of the available teams based on historical data. (Outcome 3)
5. Capacity gaps: Identify capacity gaps. (Outcome 4)
6. Strategies to close the gaps: Identify strategies to close the identified gaps. Note: those strategies might contain additional training, additional resources, process streamlining or need prioritization. (Outcome 5)
7. Micro-management: Identify micro-management attempts. Note: micro-management hampers the ability of teams to frequently deliver shippable software. (Outcome 6)
8. Monitoring of team delivery: Monitor team utilization and delivery. (Outcome 7)


Work Products (result type, outcome reference):
Analysis of the organizational need (Outcome 1)
Prioritization of the needs (Outcome 2)
Analysis of the capacity of the teams (Outcome 3)
Identification of the gaps (Outcome 4)
Definition and implementation of strategies to deal with these gaps (Outcome 5)
Identification of micro-management attempts (Outcome 6)
Monitoring of frequent team delivery (Outcome 7)
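EXAMPLE (illustrative only, not part of the PAM): a minimal sketch of a capacity gap check that compares the prioritized organizational demand with the historical delivery ability of the teams; numbers and team names are assumptions.

# Illustrative sketch (hypothetical numbers and names): comparing the prioritized
# organizational demand with the historical delivery ability of the teams (AMP.7).
demand_next_quarter = {"portal relaunch": 60, "billing changes": 25, "security fixes": 15}  # e.g. story points
team_throughput     = {"team_a": 40, "team_b": 35}  # average points delivered per quarter (historical data)

total_demand   = sum(demand_next_quarter.values())   # Outcome 1
total_capacity = sum(team_throughput.values())       # Outcome 3
gap = total_demand - total_capacity                  # Outcome 4

print(f"demand={total_demand}, capacity={total_capacity}, gap={gap}")
# A positive gap means the organization must reprioritize, add capacity or streamline processes (Outcome 5).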


16 Annex I: Planning and delivering technical artifacts

This annex will be completed at the latest with the 4.0 PAM. It contains hints on the integration of the Test Strategy and Test Planning processes on the one hand and the technical test processes, as described in the TEM, TDM and TAU groups, on the other hand.

17 Annex J: Integrated Scoping

This annex will be completed at the latest with the 4.0 PAM. It contains hints on integrated scoping, e.g. with SPICE or with Automotive SPICE®.

Since the relaunch of ISO/IEC 15504 Part 5 (ISO/IEC 15504 Part 5:2012), it has become clear that many processes of the 2.0 PAM became obsolete.

As an additional driver, INTACS restructured the syllabus of the provisional assessor certification training. In the new syllabus, processes have to be explained in detail. So, from a training perspective, a decision to drop all processes that are invoked 1:1 from ISO/IEC 15504 Part 5:2008 is reasonable.

Last but not least there is a new upcoming ISO Standard for integrated scoping.

One of the most crucial points when planning and executing an assessment that is based on a combined scope is to adapt the processes to the context of the assessed organization, the project(s) and the assessment goal.

18 Annex K: Mapping from TestSPICE to ISO/IEC 29119

19 Annex L: Exemplar Test Strategies

20 Annex M: Interpretation of TestSPICE Practices and Outcomes for Domains

