
Deliverable B5.1 – Methodology & Technology Testing Report

Programme: Integrating and Strengthening the European Research
Strategic Objective: Networked businesses and governments
Integrated Project / Programme Title: Advanced Technologies for Interoperability of Heterogeneous Enterprise Networks and their Application
Acronym: ATHENA
Project No: 507849
ATHENA Project Name: Piloting including Technology Testing Coordination and Pilot Infrastructure
ATHENA Project No: B5
Deliverable: B5.1 Methodology & Technology Testing Report
Work package: B5.1
Leading Partner: CRF
Security Classification: Confidential (CO)
October 2004
Version 1.0


Deliverable process schedule

1. Initial planning of the process, including identification of individual contributors and peers, and detailed planning of the timeline. Responsible: CRF. Timing: start 01.10.2004. Comments: [Contributor I] Maria Anastasiou; [Contributor II] Paolo Paganelli; [Contributor III] Helge Grenager Solheim; [Internal Peer I] Claudia Guglielmina; [Internal Peer II] Frank Lillehagen; [External Peer] Nenad Ivezic; [Delegate] Klaus Fischer; [TC Member] Rainer Ruggaber; [Timing confirmed] –

2. Initial drafting of the Deliverable, including structure, comments and first basic content, to be sent to the prospective contributors. Responsible: CRF. Timing: 04.10.2004.

3. First round of contributions: work package members and others contribute the first iteration to the owner of the Deliverable. Responsible: FORMULA, INTRACOM, TXT, COMPUTAS. Timing: 08.10.2004.

4. Owner to consolidate the first-round input and distribute it to the contributors. Responsible: CRF. Timing: 11.10.2004. Internal peers and AL involved/copied: [Internal Peer I] Ricardo Goncalves; [Internal Peer II] Frank Lillehagen; [Action Line Coordinator] Claudia Guglielmina.

5. Final round of contributions to be sent to the owner. Responsible: FORMULA, INTRACOM, TXT, COMPUTAS, SAP. Timing: 15.10.2004.

6. Final consolidation of the input and finalisation of the “technical” document to be sent to quality check. Responsible: CRF. Timing: 19.10.2004.

7. Quality check, e.g. peer review. Responsible: COMPUTAS, TXT, NIST. Timing: 22.10.2004, 27.10.2004, 29.11.2004, 13.01.2005, 19.01.2005. Comments: quality assurance by “internal peers”, including cross-reading by ATHENA-external but partner-company-internal peers, particularly at the beginning of the programme: [Internal Peer I] Claudia Guglielmina; [Internal Peer II] Frank Lillehagen; [External Peer] Nenad Ivezic.

8. Final editing. Responsible: Programme office. Timing: 22.02.2005. Comments: final editing, particularly formatting and, if necessary (particularly with important Deliverables), proofreading by an English mother-tongue speaker.

9. Final approval from the PCC Delegate and one member of the Technology Council. Responsible: PCC Delegate and TC member. Timing: 11.03.2005. Comments: formal approval (note: this is not another revision of the content of the document; as it is distributed in PDF format, comments will be external to the document, and any substantive comments lead to a rejection of the document). [PCC Delegate] Klaus Fischer; [TC Member] Rainer Ruggaber.

10. Submission to the Commission. Responsible: Programme Committee. Timing: 12.03.2005. Final stage of the process.


Table of contents

1 Summary
2 Introduction
2.1 Objectives
2.2 Method of work
2.3 Structure of the document
3 Testing Methodology
3.1 Test Case and Test Scenario definition
3.2 Test Procedure
3.3 Validation Methodology and Evaluation Criteria
3.3.1 Technical Validation Methodology
3.3.2 Business Validation Methodology
3.3.3 NIST Content-Level Conformance Testing Methodology
3.4 Test and analysis of results
4 Piloting
4.1 Plan
4.2 Pilot Setting
4.3 Training
4.4 Test execution
4.5 Results Analysis
4.6 Reporting
5 Pilots
5.1 Overview on domain
5.2 Objectives
5.3 Plan
5.4 Organization
5.5 Architectures, tools and services
5.6 Laboratories
5.7 Test formalization - Contents
6 Conclusion
7 Glossary and References


List of Figures

Figure 1 Test Scenarios definition process
Figure 2 Requirements gathering process
Table - Example of Test Results
Figure 3 ISO 9126 Quality Model
Figure 4 Business Validation: Solution evolution and Acceptability Area
Figure 5 Business evaluation table – Method 2
Figure 6 Reference Functional Architecture Diagram for a Content-Level Conformant Application (Source: NIST)
Figure 7 Diagram Showing Focus of the BOD Validation Test Phase (Source: NIST)
Figure 8 Diagram Showing Focus of the BOD Input Mapping Test Phase (Source: NIST)
Figure 9 Diagram Showing Focus of the BOD Output Mapping Test Phase (Source: NIST)
Figure 10 Diagram Showing Focus of the Behavioral Test Phase (Source: NIST)
Figure 11 Overview on testing phases
Figure 12 Activities Plan
Figure 12 Plan overview
Figure 13 First phase – functional test
Figure 14 Second phase – Test Scenarios execution
Figure 15 Internal and general results integration
Table - Synopsis


1 Summary

The “Methodology and Technology Testing Report” aims to provide guidelines to test, validate and pilot interoperability in all aspects of business, and to prove the overall benefits.

After an essential introduction, chapter 3 outlines the Validation methodology. Two methods are proposed: the “Technical Method” concentrates on the evaluation of the relevant characteristics provided by the Solution, whereas the “Business Method” looks at the impact on the overall business process. The former is process independent; the latter obviously depends on the specific process. Two further chapters deal with Piloting, covering the implementation of use case Solutions in the real context of industrial users.

The interoperability needs described in the To-Be interoperability scenarios, translated and elicited by activity B4 and the AL A projects (services, knowledge and software components), will provide the basis on which to build the solutions.

The solutions will be tested, evaluated and then validated by means of a series of activities devoted to demonstrating that the project objectives materialising in the solutions have been reached.

In order to evaluate and then validate the solutions, a Technology Testing activity, based on a network of coordinated laboratories, has to be planned. The Technology Testing consists of a series of planned tasks that will define the organisation necessary to perform the tests, together with a series of methods and procedures to evaluate the test pilots.

A Validation methodology will lead and support the stakeholders during the test execution and evaluation phases.

The involved partners, who will check the achieved output against the expected results, will monitor all the activities. The results will be formalised in several predefined reports.

Demonstrating the achieved interoperability via pilot implementations has several meanings. For an industrial company it means increasing competitiveness, not only by acting on the typical business variables of time, cost and quality, but also by stimulating innovation and creativity. It is in this light that part of the industrial validation of the interoperability results should be understood.


2 Introduction

The ease with which functional information circulates determines the readiness, agility and quality of a decision-making process. When these three characteristics are in place in an automated context, it is possible to achieve maximum productivity and maximum value. Agile processes reinforce the ability to react promptly to changing market conditions and customer needs, while at the same time ensuring conformity to existing regulations. Optimised processes increase business performance, conferring a competitive advantage. Business Process Reengineering, with its methods and tools, helps to understand the problems and to find the right, feasible solution. Nevertheless, some problems remain and constitute a common factor, as revealed by the studies undertaken and disclosed in the as-is scenarios by the industrial partners participating in ATHENA (AIDIMA, EADS, INTRACOM, CRF).

Many processes are still carried out manually, which makes their integration and standardisation difficult, as well as their adaptation to variable conditions. Many complex processes are based on contents derived from systems that are not strictly interconnected and are unable to share information. All of this limits productivity, response times and working cycles, hinders the observance of norms and makes it difficult to identify how process management is performed.

The research activity to be carried out has among its objectives to generate interoperability awareness, to develop a new approach and to consolidate in a sound way a solution that can fill the process management gaps within an organisation and among organisations.

The research results will be validated, refined and further elaborated through pilots and testing activities in concrete industrial contexts. This means that industrial companies, having the objective of solving their own process deficiencies, after providing the user requirements and guiding the ATHENA RTD activities, will validate the research results via pilot implementations.

Interoperability will be appropriately assessed and validated using concrete business scenarios where process deficiencies and business issues1 related to interoperability (software and infrastructure, operative applications used, operations etc.) are present.

The interoperability solutions will then be validated in the scenarios identified and proposed by the AL B projects, following the specifications included in the relevant profiles, defining and implementing a suite of software services to cover the whole life cycle of the integrated interoperability paradigm, and validating it against the requirements defined in the interoperability scenarios of AL B.

To validate the interoperability solutions, an ad hoc methodology has to be defined, in order to cover all the above-mentioned aspects and to address the needs that, in a company, have implications at the operative, tactical and strategic levels.

The main objective of the ATHENA Piloting Methodology is to meet the needs of the industrial end users with a consistent validation process.

This methodology represents an approach to industrial piloting. It will be coupled with the solution development and customisation methodologies from Action Line A to support the full lifecycle of industrial interoperability. In this context, validation methods and data from validation projects will help companies to:

• Assess the benefits of an approach, methodology and/or configuration of technologies;

• Compare and select among different solutions;

• Identify conditions for success and pitfalls they may encounter when implementing the solution;

• Learn how they may apply various solutions through practical examples and success stories.

This overall methodology will be made available and further extended through the Enterprise Interoperability Centre (EIC). Validation results will guide the definition of criteria for assessing the interoperability of enterprise applications and tools. This will contribute to the following key ATHENA results: the Interoperability Impact Analysis Model, and the Guidelines and Best Practices. Ongoing validation will also guide the further industrial development of Action Line A prototypes into commercial products and services.

The aim of this document is to provide principles and guidelines for an inherent and effective validation process.

1 By “Business issues” we mean a collection of industrial needs that, in the B4 project, are also called specific requirements; they represent one of the most tangible results regarding the achieved Interoperability.


The process should start with a prescribed validation plan, supported by a methodology able to cover all the steps needed to ensure that processes and products conform to all the specifications necessary to run the business.

The validation methodology covers all types of processes, smart process controls, gap analysis, network communications and higher-level business management systems.

At the beginning, two kinds of Validation were foreseen. The first is the validation of the technical requirements for the solutions, work that will be undertaken within the AL A projects themselves; the second, to which the scope of this document is addressed, is the process of demonstrating that the solutions developed meet the end users’ needs.

2.1 Objectives

The “Methodology & Technology Testing Report” concerns a set of methods, processes and related activities whose aim is to coordinate, pilot, experiment, verify and validate interoperability in all aspects of business, and to prove the overall benefits.

The Interoperability piloting and test phase will start after the development activity and will culminate in the validation process, articulated in several piloting phases made up of different steps, providing signals on whether and how well the Interoperability objectives have been achieved.

In several meetings the issue of validation has been raised, in order to identify what exactly has to be piloted in B5 and how the pilots have to be conducted with reference to the integration of the prototypes in the end users’ environment.

The related activities carried out in B4 (Dynamic Requirements Definition and To-Be interoperability scenarios) and in the Action Line A projects (generic solution prototypes) provide the input for the development of an interoperability solution.

In fact, some of the results coming out of the AL A projects will include generic solution prototypes that will be executable and put into service. These prototypes will be deployed for the pilots in B5. The generic solution prototypes will require a certain degree of customisation in order to be piloted in the industrial environment.

The goal of the piloting is to prove that the solutions meet the specific requirements defined in the scenarios. The various Test Scenarios defined for piloting will constitute the basis of the piloting evaluation criteria.

In order to conduct and coordinate the Interoperability piloting, a sequence of activities has been planned:

• Definition of the methods and methodologies to use during the technology testing

• Determination of the organisation necessary to perform the technology testing

• Identification of laboratory networks for piloting

Since this task will be active for a long period, all these activities will follow an evolutionary, cyclic refinement approach as much as the programme schedule allows.

In order to coordinate the test network, during the execution of the test activities each test team (§ 4.3) will follow the guideline of Verification and Validation, where by Validation we mean “confirmation by examination and provision of objective evidence that particular requirements for a specific intended use are fulfilled” (ISO/IEC 9126). With validation, the focus moves to assessing whether the final interoperability Solution meets the supposed needs when placed in its expected environment.

2.2 Method of work

The work reported in this document is the result of discussions and interactions among the partners involved in activity B4 (the industrial users), and has been improved thanks to the collaboration with representatives of AL A and AL B, to ensure content sharing, to increase the common understanding of the activities, and to reach an agreement on the results and on how to prove that these results will cover the expectations.

With regard to the area of interoperability and liaisons with important organisations that have previously dealt with this issue, a co-operation activity has started between NIST and ATHENA B5. The exchange of information mainly concerns the technology testing.

NIST has shared its experience and provided contributions regarding Technology Testing, on the basis of the results of another project: the Content-Level Conformance Testing Methodology developed in support of Inventory Visibility and Interoperability.

The contributions of NIST that enrich the Technology Testing and the project contents are reported in section 3.3.3.

Having ascertained the objectives of B5.1 (planning the Technology Testing, establishing a laboratory network for the tests and a common method of work, and monitoring the progress of the activities), the aspect of validation and its handling takes on particular importance. It is fundamental to emphasise that the AL A technical validation results will feed the validation activities performed in the user scenarios (B4) by means of the B5 activities.

2.3 Structure of the document

The present deliverable, for which a repeated, planned edition is foreseen (M09 – M18), contains in this issue a preliminary approach for Technology Testing; the next version will be enriched with tangible validation results, considerations, improvements and recovery actions if needed.

The document (DB5.1 M09) is articulated in three main parts.

The first part is dedicated to the description of the methodological aspects regarding the validation of the initial and final solutions. It consists of a series of methods considering technical and business aspects, both relevant from an industrial perspective. Moreover, this part deals with the methodology to support the testing, the organisation needed to execute the activities, and the preparation of the test procedures and of the evaluation criteria for the solution.

The second part concerns the definition and setting of resources, planning and results management regarding the piloting activities.

The third section provides tools to support the test execution and piloting activities.


3 Testing Methodology

This section describes the methodology for carrying out the testing activities, the appropriate organisation, the preparation of the test procedures and the analysis of the results.

The testing methodology applies to the solution as well as to the pilots, which are described in the next chapter.

Based on available best practices and on previous approaches coming out of other projects, we have defined the first assumptions for validating the results.

On the basis of the ISO 9126 guideline (Technical and Business Validation Methodology), we will define a specific procedure in order to validate the Interoperability Solutions (referred to above simply as the “Solution”).

3.1 Test Case and Test Scenario definition

The first activity that the industrial user should perform is the definition of a Test Case. The Test Case definition process (Fig. 1) starts from a Business Scenario (Automotive, Aeronautics, Furniture, Telecom), with the aim of identifying the scenario useful for instantiating the specific Test Scenarios on which the whole testing process will be conducted. The overall process concerns the validation activities, starting from the TO-BE Scenario definition, and involves interrelated activities of the AL A and AL B projects.

Figure 1 Test Scenarios definition process

As depicted in Figure 1, during the project each industrial end user has to define its own Test Case, starting from an As-Is process (AS-IS Scenario) and then the To-Be process (TO-BE Scenario). From these scenarios, specific requirements will be gathered; these will be consolidated and elaborated together with the generic requirements (AL A) into a final set (Project B4). Finally, the Test Scenario will be built (Figure 2), taking into account a defined To-Be Scenario. This means that a Test Scenario (WDB5.5) is tightly linked to a To-Be Scenario but will not necessarily include all of the processes described:

As-Is Scenario → To-Be Scenario → Test Case → Test Scenario

The Test Scenario represents a real situation on which we will implement the Pilot and perform the Validation process.


The following step in the test scenario definition is the specification of the metric and the target, according to the Test Scenario; the objective of the testing scenario is, in fact, to perform an assessment of a specific issue versus one or more solutions.

In order to define a metric, the following attributes have to be specified:

• SCALE: the scale on which the TEST will be measured (0÷5, 0÷19, ...); 0÷5 by default

• DATE: the date on which the test is performed

• MEASURE: the measurement to be used for the assessment

• TARGET: the expected level on the SCALE

• VALUE: the value coming out of the test execution

The metric described will be instantiated for each test scenario, and the judgements will be used during the validation phase. For example, for the following test scenario: “the schemas, as generated by the modelling tool, have to be recognised by the AIF execution engines”, a useful metric would be:

Scale | Measure | Target | Value
0÷100 | Percentage of errors generated by the AIF execution engines | 5 | –
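To make the metric definition concrete, the following sketch (an illustration on our part, not tooling prescribed by this deliverable) models the five attributes as a small record and checks the VALUE against the TARGET; the class name, the lower-is-better convention for error percentages and the sample figures are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Metric:
    """One test-scenario metric: SCALE, DATE, MEASURE, TARGET, VALUE."""
    scale: tuple[int, int]         # e.g. (0, 100)
    test_date: Optional[str]       # date on which the test is performed
    measure: str                   # the measurement used for the assessment
    target: float                  # expected level on the scale
    value: Optional[float] = None  # filled in by the test execution

    def met(self, lower_is_better: bool = True) -> Optional[bool]:
        """Compare VALUE against TARGET once the test has run.

        For an error percentage a lower value is better; for other
        measures pass lower_is_better=False."""
        if self.value is None:
            return None
        return self.value <= self.target if lower_is_better else self.value >= self.target

# Example: the AIF schema-recognition metric from the table above.
m = Metric(scale=(0, 100), test_date=None,
           measure="Percentage of errors generated by the AIF execution engines",
           target=5)
m.value = 3.2      # hypothetical result of one test run
print(m.met())     # True: 3.2% errors is within the 5% target
```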

This assessment objective is part of the mapping approach defined in the B4.6 activity. The result of the testing scenario will be used to feed a value back into the Interoperability Issues vs. Solutions matrix, and in particular to evaluate, for example, an Interoperability Issue against one or more solutions (see the example below).

[Example tables: a “Test Scenario Assessment Criteria and Target” table records a target and, for each test procedure TP1–TP5, a value for every metric (Metric 1 ... Metric n) of Test Scenario 1; an “Interoperability Issues vs. Solutions” matrix records a score for each issue (Issue1 ... Issue10) against each solution (Solution1 ... Solution10).]


More in detail:

• Each Test Scenario should be as realistic as possible or, in other words, should represent a real work situation.

• The Test Scenarios as a whole should allow the assessment of the user requirements specified in the projects of Action Lines A and B.

Figure 2 Requirements gathering process

[Figure 2 diagram: processes P1 Select Specific Requirements, P2 Select Generic Requirements, P3 Elaborate ATHENA Requirements, P4 Search for Generic Solutions, P5 Develop Generic Solutions, P6 Develop Generic IT Products and P7 Develop Specific IT Products link the Scenarios, the External world and the Market to the Specific, Generic and ATHENA Requirements, the list of potential generic solutions, the Generic Solutions and the Test Scenarios.]

As an example, the Test Scenario could be a real situation in which a collaborative process is modelled using an Enterprise Modelling and Ontology tool to generate the detailed cross-organisational business processes and the process ontology schemas. After this activity, the data schemas and processes generated by the modelling activities will be loaded into the AIF [ATHENA Interoperability Framework] execution engine for the validation activity:

• The schemas, as generated by the modelling tool, have to be recognised by the AIF execution engines;

• From the schemas the suitable e-Services will be discovered;

• Two different e-Services that need to co-operate by exchanging information and commands will be capable of reconciling different formats, structures and organisations;

• Processes and e-Services will be rendered seamlessly interoperable for the given scenario.

3.2 Test Procedure

To assess an instance of a test scenario, a group of test procedures has to be defined. A test procedure is the step-by-step process needed in order to fill in the metric assigned to that procedure in the test scenario context.


The first step is to assign one of the targets already specified with the Test Scenario; the next step is to define all the steps needed to reach the final target. The actions are sequential, that is, the result of one action is the input of the next.

The figure below shows how to correlate the Test Scenarios with the Interoperability Issues, in order to guarantee that the piloting activities address all the requirements/issues raised during the scenario analysis, and the relationship between test scenarios and test procedures.

The correlation for Test Case TC1 is recorded in a matrix that marks with an “x” which Interoperability Issues (Issue1, Issue2, ..., Issuen) are addressed by which Test Scenarios (TS1, TS2, ..., TSm). Each Test Scenario is then decomposed into its test procedures:

TS1: TP1, TP2, TP3, TP4
TS2: TP1, TP5, TP6
TSm: TP2, TP7, TP8, TP9

For each test procedure a final report will be produced, in order to give appropriate feedback concerning the result of each test in the context of a test scenario.

TEST SCENARIO: TS<i>

Test procedure | Test result
TP1 | <Metric> <Value>
TP2 | <Metric> <Value>
TPn | <Metric> <Value>

100% of the test procedures have been performed with a satisfactory result.

Table - Example of Test Results
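As an illustration of how such a per-scenario report could be assembled, the sketch below assumes that each test procedure reduces to a pass/fail outcome; the helper name and the output format are ours, not part of the deliverable.

```python
def scenario_report(results: dict[str, bool]) -> str:
    """Summarise one test scenario; results maps a test-procedure id
    (e.g. 'TP1') to a pass/fail outcome."""
    passed = sum(results.values())
    pct = 100.0 * passed / len(results)
    lines = [f"{tp}: {'satisfactory' if ok else 'unsatisfactory'}"
             for tp, ok in sorted(results.items())]
    lines.append(f"{pct:.0f}% of the test procedures were performed "
                 f"with a satisfactory result.")
    return "\n".join(lines)

print(scenario_report({"TP1": True, "TP2": True, "TP3": True}))
```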

3.3 Validation Methodology and Evaluation Criteria

In this section, the criteria against which the solution is to be evaluated are identified and described.

Qualitative metrics will be used in order to facilitate the validation task by the end users for the first solution prototype. Quantitative metrics (with associated qualitative meanings) shall instead be adopted for the evaluation of the intermediate and final solutions (supported by the Business Validation Methodology).

To evaluate the preliminary, intermediate and final solutions, we propose a set of different methodologies combining methods, procedures and evaluation criteria. With these methodologies we intend to satisfy the basic characteristics of a method (clearness, completeness, coherence, impartiality, replicability) and to look at two complementary aspects: the technical and the business aspects. Both aspects are relevant from an industrial perspective and will constitute the basis for a complete validation activity supported by two distinct methodologies.

3.3.1 Technical Validation Methodology

The methodology used for technical validation will draw from the ISO/IEC 9126 methodology "Information technology - Software Product Evaluation - Quality characteristics and guidelines for their use".

The objective of this standard is to provide a framework for the evaluation of software quality. ISO/IEC 9126 does not provide requirements for software, but it defines a quality model that is applicable to every kind of software. It defines six product quality characteristics and provides a suggestion of quality sub-characteristics (Figure 3).


Figure 3 ISO 9126 Quality Model


The sub-characteristics adopted by ISO/IEC 9126 are the following:

Functionality
• Suitability: attributes of the product that bear on the presence and appropriateness of a set of functions for specified tasks.
• Accurateness: attributes of the product that bear on the provision of right or agreed results or effects.
• Interoperability: attributes of the product that bear on its ability to interact with specified systems.
• Compliance: attributes of the product that make the software adhere to application-related standards, conventions or regulations in laws and similar prescriptions.
• Security: attributes of the product that bear on its ability to prevent unauthorised access, whether accidental or deliberate, to programs or data.

Reliability
• Maturity: attributes of the product that bear on the frequency of failure by faults in the software.
• Fault tolerance: attributes of the product that bear on its ability to maintain a specified level of performance in case of software faults or of infringement of its specified interface.
• Recoverability: attributes of the product that bear on the capability to re-establish its level of performance and recover the data directly affected in case of a failure, and on the time and effort needed for this.

Usability
• Understandability: attributes of the product that bear on the users’ effort for recognising the logical concept and its applicability.
• Learnability: attributes of the product that bear on the users’ effort for learning its application.
• Operability: attributes of the product that bear on the users’ effort for operation and operation control.

Efficiency
• Time behaviour: attributes of the product that bear on response and processing times and on throughput rates in performing its function.
• Resource behaviour: attributes of the product that bear on the amount of resources used and the duration of such use in performing its function.

Maintainability
• Analysability: attributes of the product that bear on the effort needed for diagnosis of deficiencies or causes of failures, or for identification of parts to be modified.
• Changeability: attributes of the product that bear on the effort needed for modification, fault removal or environmental change.
• Stability: attributes of the product that bear on the risk of unexpected effects of modifications.
• Testability: attributes of the product that bear on the effort needed for validating the modified software.

Portability
• Adaptability: attributes of the product that bear on the opportunity for its adaptation to different specified environments without applying other actions or means than those provided for this purpose for the software considered.
• Installability: attributes of the product that bear on the effort needed to install the software in a specified environment.
• Conformance: attributes of the product that make the software adhere to standards or conventions relating to portability.
• Replaceability: attributes of the product that bear on the opportunity and effort of using it in place of specified other software in the environment of that software.

In ATHENA such sub-characteristics should be interpreted with respect to the identified reference architecture and the ATHENA Interoperability Framework (D.A4.1 Requirements for Interoperability Framework, product-based and process-based Interoperability Infrastructures, Interoperability Life-cycle Services) and not just within the specific pilot. In particular, we would like to take into account the assessment of solutions in terms of architecture, knowledge, business modelling and ontology.

The criterion used to evaluate the solution is based on the use of qualitative metrics, in order to facilitate the validation. The characteristics constitute the base on which the testing scenarios for the technical validation will be executed and relate to five different levels: Business, Knowledge, Application, Data and Quality.

The following characteristics will constitute the base for the metrics that will be used for the test scenarios in the technical validation context.

Business Level
• Decision Model: attributes of the solution that bear on the ability to support decisions and the degree of responsibility of each operating unit, role and position.
• Business Model: attributes of the solution that bear on the ability to describe the commercial relationship between a business enterprise and the products and/or services it provides in the market.
• Business Process: attributes of the solution that bear on the ability to support the modelling of business processes as sets of activities that represent the end-to-end flow of materials, information and business commitments implementing an Enterprise Business Model.
• Semantic concepts: attributes of the solution that bear on the ability to support semantic concepts and relationships at Business level, defined by the calling function, that shall be recognised and managed by the called functions in order to properly allow interoperability among them.

Knowledge Level
• Org. – People: attributes of the solution that bear on the ability to support a structured representation of the enterprise Business Units, Departments, Roles and Positions from a hierarchical and functional viewpoint.
• Skill – Competencies: attributes of the solution that bear on the capability to support a structured representation of an organisation’s skills and competencies.
• Knowledge Assets: attributes of the solution that bear on the capability to model the capital of an organisation formalised in terms of procedures, norms, rules and references.
• Semantic concepts: attributes of the solution that bear on the capability to represent semantic concepts and relationships at Knowledge Level, defined by the calling functions, in order to be recognised and managed by the called functions to allow interoperability among them.

Application Level
• Understandability: attributes of the solution that bear on the users’ effort for recognising the logical concept and its applicability.
• Learnability: attributes of the solution that bear on the users’ effort for learning its application.
• Operability: attributes of the solution that bear on the users’ effort for operation and operation control.
• Semantic concepts: attributes of the solution that bear on semantic concepts at application level, defined by the calling functions, in order to be recognised and managed by the called functions to allow interoperability among them.

Data Level
• Time behaviour: attributes of the solution that bear on response and processing times and on throughput rates in performing its function.
• Resource behaviour: attributes of the solution that bear on the amount of resources used and the duration of such use in performing its function.
• Semantic concepts: attributes of the solution that bear on a shared understanding of concepts and inter-relationships concerned with the data issues (product, process, commercial, knowledge).
• Relationships concepts: attributes of the solution that bear on relationships at Data Level between the calling and called functions, in order to be recognised and managed by the called functions to allow interoperability among them.
• Multi-lingual concepts: attributes of the solution that bear on multi-lingual and multi-cultural issues.

Quality Level
• Security: attributes of the solution that bear on the capability to protect data and transfer it securely, and the capability to protect private processes.
• Performance: attributes of the solution that bear on response and processing times and on throughput rates in performing its function.
• Availability: attributes of the solution that bear on the capability of providing an always available and accessible solution.
• Portability: attributes of the solution that bear on the capability to be used on different hardware platforms, operating systems and runtime environments with few changes to the solution.
• Scalability: attributes of the solution that bear on the capability to serve a large number of users, to execute a large number of processes in parallel with effective scheduling granting processes access to enterprise resources, and to process a large amount of data.

The metrics are the necessary means for an objective assessment of the solution. The metrics used in the testing scenario definition phase give concreteness to the target to be achieved by means of a measurable judgement.

Each test team (§ 4.3) should provide a Report Summary for each test scenario, following the table presented below.

For each test scenario, the test team records a judgement for every characteristic and sub-characteristic; the Report Summary table therefore has one column per test scenario (Test Scenario 1, Test Scenario 2, ..., Test Scenario n) and one row per sub-characteristic:

• Business Level: Decision Model, Business Model, Business Process, Semantic concepts

• Knowledge Level: Org. – People, Skill – Competencies, Knowledge Assets, Semantic concepts

• Application Level: Understandability, Learnability, Operability, Semantic concepts

• Data Level: Time behaviour, Resource behaviour, Semantic concepts, Relationships concepts, Multi-lingual concepts

• Quality Level: Security, Performance, Availability, Portability, Scalability

In order to evaluate the qualitative data, a weighted average method will be used, according to the following scheme:

Importance item | Importance value
Not interesting | 0
Low importance | 1
Medium importance | 2
High importance | 3

Judgements scale

Judgement | Associated value
Insufficient | 0
Sufficient | 1
Good | 2
Very good | 3
Excellent | 4
Superlative | 5

1. Select the set of sub-characteristics related to the testing scenario to be evaluated.
2. Assign a predefined importance to each selected sub-characteristic.
3. Evaluate each selected sub-characteristic with a predefined judgement.
4. Weight the items, clustered in attributes, in an appropriate way, and likewise weight their relative importance within the proper cluster.
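A minimal sketch of the weighted average computation implied by this scheme follows; the exact aggregation formula (an importance-weighted mean of the judgements) is our assumption, since the deliverable does not spell it out.

```python
# Assumption: overall score = sum(importance_i * judgement_i) / sum(importance_i).
IMPORTANCE = {"not interesting": 0, "low": 1, "medium": 2, "high": 3}
JUDGEMENT = {"insufficient": 0, "sufficient": 1, "good": 2,
             "very good": 3, "excellent": 4, "superlative": 5}

def weighted_score(items: list[tuple[str, str]]) -> float:
    """items: one (importance label, judgement label) pair per selected
    sub-characteristic. 'Not interesting' items get weight 0 and so
    drop out of the average."""
    num = sum(IMPORTANCE[imp] * JUDGEMENT[jud] for imp, jud in items)
    den = sum(IMPORTANCE[imp] for imp, _ in items)
    return num / den if den else 0.0

# Example: three sub-characteristics of one test scenario.
score = weighted_score([("high", "good"), ("medium", "very good"), ("low", "sufficient")])
print(f"{score:.2f}")  # 2.17 on the 0-5 judgement scale
```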


Assessment and acceptability criteria

The evaluation scale used, ranging for example from 0 to 5, should have a minimum acceptable value around 3, a critical value around 2 and an optimum value around 4.

The evaluation and the comparison with the expected results are the most important items of this phase, because they provide bi-directional indications during the development evolution.

At the end of the first round of the piloting activity, a final test report will be issued, containing all the information needed to evaluate and sum up the piloting reports coming from the different test teams. The table below shows the structure of the final results.

For each characteristic and sub-characteristic, the final results table records the assigned Importance, the Value obtained and the Expected value:

• Business Level: Decision Model, Business Model, Business Process, Semantic concepts

• Knowledge Level: Org. – People, Skill – Competencies, Knowledge Assets, Semantic concepts

• Application Level: Understandability, Learnability, Operability, Semantic concepts

• Data Level: Time behaviour, Resource behaviour, Semantic concepts, Relationships concepts, Multi-lingual concepts

• Quality Level: Security, Performance, Availability, Portability, Scalability
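Under the acceptability thresholds above (critical around 2, minimum acceptable around 3, optimum around 4), a recorded Value can be classified with a small helper; the band names below are our assumption.

```python
def classify(value: float, critical: float = 2.0,
             minimum: float = 3.0, optimum: float = 4.0) -> str:
    """Classify a 0-5 evaluation against the acceptability thresholds."""
    if value < critical:
        return "critical"
    if value < minimum:
        return "below acceptance"
    if value < optimum:
        return "acceptable"
    return "optimal"

print(classify(2.17))  # 'below acceptance' for the weighted score computed earlier
```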



3.3.2 Business Validation Methodology

The business validation methodology is based on a business perspective; its main goal is to prove that interoperability has been reached, and to identify and evaluate the real benefits deriving from the introduction of a solution in the enterprise.

Within and among enterprises, for example, turning data into actionable information2 is a must for reducing costs, improving the quality of decision-making and reducing time to market. Interoperability, then, is instrumental for process automation and for streamlining the operations to:

• Reduce costs

• Increase flexibility

• Enable traceability

• Establish early warning systems

• Improve responsiveness to market signals

Besides, Interoperability should allow:

• Quick, easy access to all information required to make decisions

• Access to information from every system within and among collaborative enterprises

• Seamless data exchange among all systems/applications

• Application of business logics between systems/applications

• Quick, cost-effective addition of new applications or new requirements to the information system

• Easy, safe sharing of relevant information in relative-time with key customers and suppliers

• Fast adaptation to changes in business requirements or demands

• Enlarged and improved innovation and creativity

Whereas it is relatively easy to identify the costs and evaluate an investment in IT, this is not the case with the identification and evaluation of the benefits that an organisation derives from the introduction of a new solution able to cover and solve interoperability issues.

From a high-level perspective, the aspects to be considered are:

a) Consistency: is the solution (system, products, services and tools) consistent with the enterprise strategy?

b) Impact: what is the impact on enterprise behaviour?

c) Benefits: what is the economic impact of the solution on product/process/business performance?

It is difficult to evaluate the benefits deriving from the introduction of a new solution, because many functionalities are innovative and the impact on the processes could be pervasive. For this reason, the business methodology should be tailored to the specific situation, in order to verify and assess the impact of an interoperability solution on the processes involved, as described by the industrial users, and, if necessary, to extend the analysis to related processes.

The suggested methods to evaluate interoperability from a business perspective, covering several aspects at different levels, are:

• Method 1: Solution oriented (identification of real applicability areas)

• Method 2: Process performance oriented (monitoring and evaluation on process performance indicators)

2 By “actionable information” we mean an enterprise context in which all information regarding processes, HR, products, etc., managed among different systems and applications, is automatically exchanged, updated and made available and usable.


Method 1

Rationale

The first method aims at identifying the applicability range of solutions. A solution can be usefully applied to one or more application scenarios (Supply Chain Management, Collaborative Product Development, e-Procurement, Portfolio Management) and/or to one or more processes (e.g. Target Setting, Sourcing, ...) associated with the specific scenario (Collaborative Product Development and Supply Chain Management).

Objectives

Assess the solutions vs. the interoperability issues by means of test scenarios. In detail:

• Identify whether a Solution (S1, S2, ..., Sn) covers an issue (i_n) associated with a scenario process belonging to an industrial end user

• Evaluate how well an issue (i_n) is satisfied by a Solution

Procedure

A list of open interoperability issues, representing problems that have an impact on business processes, will be gathered and used for the test scenario definition. During the testing activity, when the selected solution is tested, a predefined matrix should be used.

By means of test scenarios and test procedures, we will assess whether the proposed solution covers a specific interoperability issue (related to a scenario and the processes concerned, for a given enterprise) and evaluate how well this solution satisfies the issue.

An example of this matrix is the Interoperability Issues vs. Solutions table shown in section 3.1.

Metric

The test should provide two kinds of information:

• the coverage level of the solution with respect to the scenario;

• the coverage level of a set of issues, comparing several solutions.

Two indexes are proposed to capture both types of information. The two indexes consolidate the elementary judgements described in what follows. Overall, the results will provide the following indications:

• On which issues the solution has an impact;


• On which scenario the solution has an impact;

• On which process the solution has an impact;

• How many processes and business aspects have been covered by the solution;

• To what extent the solution is suitable from an industrial point of view;

• Feedback regarding the current solution and indication for further improvement.

Test and analysis of results
The solutions will be tested in several steps. The industrial end users will record the results of their tests in the matrix. The data recorded is a judgment that measures the degree of relationship between the Solution and the issues.

The "coverage degree" will be calculated as follows:

$$CD = \frac{1}{T}\sum_{i=1}^{T} C_i \qquad C_i = \begin{cases} 0 & \text{if } V_i = 0 \\ 1 & \text{if } V_i \neq 0 \end{cases}$$

where
Ci = binary value stating whether a relation exists between the solution and interoperability issue i
Vi = integer value (in the range [0, 3]) expressing how well the solution deals with issue i
T = number of issues considered
CD = coverage degree index
i = issue index

The "coverage quality" will be calculated as follows:

$$CQ = \frac{1}{3T}\sum_{i=1}^{T} V_i$$

where
Vi = integer value (in the range [0, 3]) expressing how well the solution deals with issue i
i = issue index
T = number of issues considered
CQ = coverage quality index
The factor 3 in the denominator is the maximum value of Vi, so that CQ is also normalised to [0, 1].

The results coming out of the analysis activity should allow validating the solution. Since the CD index is normalised, it can be interpreted as a percentage value. For each percentage value reported in the table below, a quality judgement is proposed. An example scale is the following:


COVERAGE DEGREE

| Percentage of solved issues | Judgment |
|---|---|
| 50% | Insufficient |
| 60% | Sufficient |
| 70% | Good |
| 80% | Very good |
| 90% | Excellent |
| 100% | Superlative |

COVERAGE QUALITY

| Kind of relation between solution and issues | Level of relation |
|---|---|
| No relation | 0 |
| Weak relation | 1 |
| Medium relation | 2 |
| Strong relation | 3 |

Once the variability spectrum of the numerical results has been defined and the corresponding judgements have been proposed, we have to define the criteria that allow us to validate the solution. A criterion can consist of a suitable threshold choice:
• Coverage degree threshold: e.g. 0.7
• Coverage quality threshold: e.g. 0.7
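Computing the two indexes from the recorded judgments is straightforward. The following is a minimal sketch (the function names and the example data are illustrative only; the 0.7 thresholds match the example values above):

```python
# Minimal sketch: coverage degree (CD) and coverage quality (CQ) computed
# from the per-issue judgments V_i in [0, 3]. Names are illustrative.

def coverage_degree(judgments: list[int]) -> float:
    """CD = (1/T) * sum(C_i), where C_i = 1 when V_i != 0."""
    return sum(1 for v in judgments if v != 0) / len(judgments)

def coverage_quality(judgments: list[int]) -> float:
    """CQ = (1/(3T)) * sum(V_i); 3 is the maximum judgment value."""
    return sum(judgments) / (3 * len(judgments))

def solution_accepted(judgments: list[int],
                      cd_threshold: float = 0.7,
                      cq_threshold: float = 0.7) -> bool:
    """The solution is acceptable when both indexes reach their thresholds."""
    return (coverage_degree(judgments) >= cd_threshold
            and coverage_quality(judgments) >= cq_threshold)

# Example: five issues judged from 0 (no relation) to 3 (strong relation)
print(solution_accepted([3, 2, 0, 3, 1]))  # CD = 0.8, CQ = 0.6 -> False
```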

The Solution has to be compared during its evolution, and a large spectrum of variables should be considered. A more detailed analysis should be performed if the solution is applicable to more than one process.
An example that takes into account these two aspects is summarized in the graph below.

Figure 4 Business Validation: Solution evolution and Acceptability Area


Method 2
Rationale
This method aims at assessing the impact that the introduction of an interoperability solution will have on business processes, by evaluating appropriate business performance indicators. The method uses qualitative indicators and applies to the whole test case, whereas the previous one (Method 1) applies only to the test scenario.

Objectives
• To identify whether the introduction of an interoperability solution has improved the relevant business aspects (see the "Activity" and related "Relevant Requirement" columns in Figure 5)
• To monitor the evolution of the key performance indicators related to these aspects
• To measure how well the envisaged interoperability solution has improved a process
It is important to estimate how much the performance of the selected business process will be improved by the proposed interoperability solution: a new solution might be useful only for a particular scenario and/or suitable only for a particular process.

The results deriving from this activity can be useful for the technology partners, providing signals and directions on multiple aspects:

• Application Scenario;

• Solution improvement.

Procedure
A set of variables will be chosen in order to describe in detail the distinct business aspects that should be analysed and subsequently evaluated in order to validate the solution. The steps that will be observed are:
1. Identify the business relevant aspects for the involved process
2. Identify the key value-generating activity in the given process
3. Report the process of reference for this activity
4. Gather the relevant requirement for the activity (traceable functional requirement)
5. Identify the driver for the activity
6. Identify variables or parameters to quantify the driver
7. Establish a target value for the parameter
8. Formalize a target indicator
9. Identify the issues that must be considered as mandatory

A matrix will be used to report the company scenario, the involved processes and the business issues, in order to gather all the information necessary to perform the business evaluation.
The activities to be performed consist of comparing the solution (and the provided functionalities) against the issues associated with the test case, by means of the key performance indicators.

In this way we are able to identify:
• On which issues the solution has an impact

• On which scenario the solution has an impact

Metric
Two qualitative indicators will be used:
• Trend indicator: an arrow indicating the trend at the time of testing
• Judgment: a number telling whether the target has been reached, whether we are moving away from the target (worsening), or whether the situation is constant. A colour will be associated with each item, in order to identify how many objectives have passed the test.

An optional judgment associated with each requirement–scenario/process pair can be used to identify the importance of each requirement.

Test and analysis of results
The testing will provide trend indications concerning the variables to be measured, relative to the target values. To give an immediate reading of the results, symbols (arrows) will be used to signal the trend relative to the target indicators. A target indicator declares the expected behaviour for the measured value (e.g. in Figure 5 the 'n. of SSTS3/month' should decrease). At the date of the last test, the following aspects should be analysed:
a) Target indicator: in order to identify whether the target has been reached
b) Number of objectives achieved
c) The comparison with the mandatory issues

Parameters of validation
The Business Validation will provide a successful result if:
a) Condition 1: e.g. all mandatory objectives have been reached
b) Condition 2: e.g. at least 70% of the objectives have been reached
An example of the testing form that will be used during the Business Validation activity is reported below.

Figure 5 Business evaluation table – Method 2

3 SSTS : “Sub-System Technical Specification”, term used in Request for quotation.

Company: CRF    Business Analyst: D.L.    Date: 22/03/2005

| Item | Process | Activity | Relevant Requirement | Driver | Parameter | Measured Value | Target Value | Trend Indicator | Target Indicator | Business Relevant Aspect | Judgement | Mandatory |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Target Setting | SSTS issue | The new version of SSTS is automatically notified to Engineering Dep. | number of SSTS | n. SSTS/month | 2000 | 1600 | (arrow) | decrease | Time reduction | 0 | Y |
| 2 | Sourcing | RFQ management | The RFQ should be automatically sent to 1st/2nd Tier observing detailed specif. | number of RFQ | n. RFQ/project | 3200 | 2100 | (arrow) | decrease | Cost reduction, Time reduction | 1 | N |
| 3 | Collaborative Design | PSI - VTS updating | The 1st Tier should update product performance data | n. of updated product objectives on PSI | n. rev./system | 200 | 300 | (arrow) | increase | Quality, Knowledge | -1 | N |
| n | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |

Indicators legend: trend arrows denote increase / constant / decrease; issue values: Target Reached = 1, Constant = 0, Out of target = -1.
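To illustrate how the judgment values and the two validation conditions above can be evaluated mechanically, here is a minimal sketch. The record layout mirrors the columns of Figure 5; the field names, and the use of a baseline value to distinguish a constant situation from a worsening one, are assumptions of this sketch rather than part of the methodology:

```python
# Minimal sketch of the Method 2 evaluation: derive the judgment per row
# (1 = target reached, 0 = constant, -1 = out of target), then check the
# two validation conditions. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class Row:
    requirement: str
    measured: float
    target: float
    baseline: float     # value at the start of the observation (assumed field)
    decreasing: bool    # True when the target indicator expects a decrease
    mandatory: bool

def judgment(row: Row) -> int:
    reached = row.measured <= row.target if row.decreasing else row.measured >= row.target
    if reached:
        return 1
    worsening = row.measured > row.baseline if row.decreasing else row.measured < row.baseline
    return -1 if worsening else 0

def business_validation(rows: list[Row], quota: float = 0.70) -> bool:
    """Condition 1: all mandatory objectives reached.
    Condition 2: at least `quota` of all objectives reached."""
    cond1 = all(judgment(r) == 1 for r in rows if r.mandatory)
    cond2 = sum(judgment(r) == 1 for r in rows) >= quota * len(rows)
    return cond1 and cond2
```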


3.3.3 NIST Content-Level Conformance Testing Methodology

This section outlines the NIST Content-Level Conformance Testing methodology developed in support of the Inventory Visibility and Interoperability (IV&I) project at the Automotive Industry Action Group (AIAG). The methodology, however, is general and addresses conformance of an application implementation to a data interchange standard message.

The purpose of this section is to identify scope, activities, roles, and commitments for the content-level conformance testing participants. The content-level conformance testing is ‘a preparation process’ to achieve the interoperability testing goal. While this methodology is focused on content-level conformance, other methodologies will address other parts of the ‘interoperability stack’ such as messaging, business process, security, etc.

This section will describe in considerable detail each of the conformance testing phases designed to assess validity of a data interchange standard message (referred to as BOD, for Business Object Document) in the context of a target interoperable data exchange among vendor tools. (See [NIST-1] for additional definitions and an illustration of this methodology application.) The following is an outline for the discussion of the testing methodology: • We start by describing a reference functional architecture for a conformant tool – this helps us

explain the scope of each conformance test phase.

• Next, we describe a format to describe details of each conformance test phase: the test focus diagram, the test claim, the test assumptions, the message exchange diagram, the participants’ responsibility list, and the test report form.

• Then, we describe each of the conformance test phases in the above format.

3.3.3.1 A reference functional architecture

We introduce a reference functional architecture for an application tested for content-level conformance, to explain the scope of each conformance test phase. Figure 6 shows one such reference functional architecture. This conceptual reference architecture states that an interoperable application is capable of supporting the following functions:
• Receive a message (over a transport) and retrieve a BOD from the message envelope
• Validate the BOD content
• Create a Confirmation Message (i.e., ConfirmBOD) instance

• Store a BOD instance

• Render a BOD instance for viewing via a User Interface (UI)

• Generate the same BOD instance from locally stored data

• Prepare the BOD instance for transmission and forward the instance within a message envelope
As indicated in the figure by dotted lines, the first and the last functions are not of concern for content-level conformance testing; both are assumed correct.
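To make the division of responsibilities concrete, the reference architecture can be sketched as an abstract interface. This is purely illustrative and is not part of the NIST methodology; the method names simply mirror the functions listed above:

```python
# Illustrative sketch only: the reference functional architecture expressed
# as an abstract interface. Method names mirror the bullet list above.
from abc import ABC, abstractmethod

class ConformantApplication(ABC):
    # Assumed correct for content-level conformance testing (dotted in Figure 6)
    def receive_message(self, envelope: bytes) -> str: ...  # returns the BOD
    def prepare_and_send(self, bod: str) -> None: ...

    # Capabilities exercised by the four conformance test phases
    @abstractmethod
    def validate(self, bod: str) -> bool: ...
    @abstractmethod
    def confirm(self, bod: str, valid: bool) -> str: ...    # a ConfirmBOD instance
    @abstractmethod
    def store(self, bod: str) -> None: ...
    @abstractmethod
    def render(self, bod: str) -> None: ...                 # display via the UI
    @abstractmethod
    def generate(self) -> str: ...                          # rebuild the BOD from local data
```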


Figure 6 : Reference Functional Architecture Diagram for a Content-Level Conformant Application (Source NIST)

3.3.3.2 Presentation Format for Each Conformance Testing Phase

Each conformance testing phase is presented next with the following collection of information:

• A positive test claim for this test phase

• Assumptions specific for this testing phase

• A message exchange diagram showing interaction among participant roles;

• Responsibilities for each participant with reference to the message exchange diagram

• An example test report form detailing the test procedure

3.3.3.3 Conformance Test Phase 1: BOD Validation Test Phase

This test phase is performed as a prerequisite for the subsequent test phases. Its purpose is to ensure that the candidate system can generate valid BODs and has implemented the message data fields in conformance with the "occurrence in test" specification (for the definition of the occurrence in test specification, see Figure 6). As such, this step factors out BOD generation mismatches as a non-conformance factor in the subsequent test phases:

(A) Test Focus Diagram

Figure 7: Diagram Showing Focus of the BOD Validation Test Phase (Source NIST)

[Figure content: the functional blocks of the reference architecture – Receive msg & Retrieve BOD, Validate BOD, Confirm BOD, Store BOD, Render BOD, Generate BOD, Prepare & Send msg – with a legend distinguishing the capability under test, capabilities assumed for the test, functions not required, and the transport.]


(B) Positive Test Claim
Upon a successful completion of the BOD Validation Test Phase, the following claim may be stated: The BOD Validation Test Phase demonstrates that the BOD adequately supports application data sharing, as evidenced by
• <the participant tool> successfully generating a BOD instance based on some internally correctly stored application data, and
• <the participant tool> successfully preparing and sending the message carrying the BOD instance
such that

• the BOD instance supports the target Business Case for application data exchange,

• the BOD instance conforms to the agreed upon message interoperability specification for the application interoperability test phase,

• the BOD instance conforms to the selected external validation rules, and

• the BOD instance is valid when parsed by the <validating parser name> as evidenced by the record of test actions.

(C) Assumptions
Only a single BOD instance is tested for each test case, as generated for each participating tool.

(D) Message Exchange Diagram
Application --(1) BOD--> Testbed;  Testbed --(2) Report--> Application

(E) Participants' Responsibilities
Each application under test:

• Sends the BOD instance message to the Testbed node (1)

• Receives a report of BOD instance conformance to the BOD Schema and additional structural and syntactic constraints (e.g., encoded in Schematron rules) (2)

In turn, the testbed node:
• Receives the BOD instance message from each participant (1)

• Logs the message in a transaction repository

• Runs a validating parser on the message.

• Runs additional validation rules (e.g., encoded in Schematron)

• Issues a report of BOD instance conformance with validation rules (2)
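The testbed-side validation step can be sketched as follows. This is a minimal illustration assuming the BOD schema and the additional Schematron rules are available as local files; the file paths and function names are our own, not prescribed by the methodology:

```python
# Minimal sketch of the testbed validation step: structural validation against
# the BOD XML Schema plus the additional rules encoded in Schematron.
from lxml import etree
from lxml.isoschematron import Schematron

def validate_bod(bod_path: str, xsd_path: str, schematron_path: str) -> list[str]:
    """Return a list of validation issues (empty when the BOD instance is valid)."""
    issues: list[str] = []
    bod = etree.parse(bod_path)

    # Conformance to the BOD schema
    schema = etree.XMLSchema(etree.parse(xsd_path))
    if not schema.validate(bod):
        issues += [str(e) for e in schema.error_log]

    # Additional structural and syntactic constraints (Schematron rules)
    rules = Schematron(etree.parse(schematron_path), store_report=True)
    if not rules.validate(bod):
        issues.append(str(rules.validation_report))

    return issues
```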



(F) Test Report Form
The following is an example test report that may be used to track the conformance testing phase results.

Test Phase Name: BOD Validation
BOD Name: <name>
Application Standard Spec: <spec-id>
Participant Name: <participant>
Date & Time: <date/time>
External Validation Rules: <bod rules id>
Role Name: <interoperability role>
Business Case: <business case id>
Parser Used: <parser name>
Test Case: <number>
Transport Used: <WS or ebMSH or>

| Test Action | Time | Notes |
|---|---|---|
| Received BOD msg | <time> | Received BOD msg; retrieved BOD instance; logged at Transaction Store at <link> |
| BOD instance validation wrt. BOD schema in <spec-id> | <time> | Issue notes, for example: all required data present (yes/no); validation errors |
| BOD instance validation wrt. External Validation Rules | <time> | Validation Rules used <link>. Issue notes: <note 1> at <link>; ... |
| Validation Report sent/posted | <time> | Validation Report sent/posted at <link> |

Follow-on actions: <notes on follow-on actions>
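Where the report is captured electronically, a simple record structure suffices. The sketch below shows one possible shape; all field names are ours and simply mirror the form above:

```python
# One possible in-memory shape for the test report form; field names mirror
# the form above and are not prescribed by the methodology.
from dataclasses import dataclass, field

@dataclass
class TestAction:
    action: str      # e.g. "Received BOD msg"
    time: str        # timestamp of the test action
    notes: str = ""  # issue notes, links into the transaction store, etc.

@dataclass
class TestReport:
    test_phase: str  # e.g. "BOD Validation"
    bod_name: str
    standard_spec: str
    participant: str
    date_time: str
    validation_rules: str
    role: str
    business_case: str
    parser: str
    test_case: str
    transport: str   # e.g. "WS" or "ebMSH"
    actions: list[TestAction] = field(default_factory=list)
    follow_on: str = ""
```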

3.3.3.4 Conformance Test Phase 2: Information Input Mapping Test Phase

The purpose of this phase is to test the capability of a participating tool to correctly map an input BOD instance onto the tool’s local store:

(A) Test Focus Diagram

Figure 8: Diagram Showing Focus of the BOD Input Mapping Test Phase (Source NIST)

[Figure content: the reference architecture blocks, with the capabilities under test for this phase highlighted; legend as in Figure 7.]


(B) Positive Test Claim
Upon a successful completion of the BOD Input Mapping Test Phase, the following claim may be stated: The BOD Input Mapping Test Phase demonstrates that the BOD adequately supports interoperable application data sharing for a business case, as evidenced by
• <the participant tool> successfully mapping an input BOD instance based on a provided valid BOD instance that was (1) correctly validated by the tool, (2) correctly rendered by the tool User Interface, and (3) correctly interpreted by a technician responsible for filling the application form, and
• <the participating tool> successfully receiving the BOD message and retrieving the BOD instance
such that
• the BOD instance supports the target Business Case for BOD exchange,

• the BOD instance conforms to the agreed upon application specification for the test phase,

• the BOD instance conforms to the selected external validation rules, and

• the BOD instance is valid when parsed by the <validating parser name> as evidenced by the record of test actions.

(C) Assumptions
This phase is almost identical to Phase 3, with the following differences:
• A human (technician) is required in the testing loop; the human's task is to interpret the information presented in the application UI and to derive the BOD instance manually, based on the information presented in the GUI.
• Instead of the syntactic mapping, the semantic mapping of information onto the tool data structures is tested here; the point is that an interpretation process (performed by the human technician, in this case) needs to take place. This interpretation process is fundamental to validating the information mapping: without it, there could be no assurance that the tool correctly presents data to the user and, hence, that an application BOD meets the end-user requirements.

(D) Message Exchange Diagram
Testbed --(1) BOD--> Application (rendered via the UI);  technician fills the BOD Form;  Application --(2) BOD Form--> Testbed;  Testbed --(3) Report--> Application

(E) Participants' Responsibilities
Each application in the test:

• Receives a BOD instance message from the Testbed node (1)

• Stores the BOD instance locally



• Displays the BOD instance in an application UI or otherwise enables inspection of the BOD instance

The application user (i.e., technician):
• Uses an XML editor of choice to fill the supplied BOD Form

• Creates a BOD instance (in the BOD Form) based on the information rendered in the UI

• Sends the manually filled BOD instance back to the testbed node (2)

• Receives a report of the information mapping test (3)
In turn, the testbed node:

• Sends a test instance message (1)

• Receives the manually completed BOD instance from the participant (2)

• Logs the message in a transaction repository

• Compares the outgoing to the incoming instance for equality

• Issues a report of the information mapping test (3)
The BOD instance message sent from the testbed will be instantiated with all fields whose occurrence in test is identified as one or more (as defined in [NIST-1]). The candidate system will be tested with BOD instance messages for each test case required to fulfil a business case.
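The equality comparison the testbed performs between the outgoing and the incoming instance can be sketched as below. The comparison semantics (ignoring insignificant whitespace, comparing element text and attributes) are an assumption of this sketch; the methodology itself does not prescribe a canonicalization:

```python
# Minimal sketch: field-by-field comparison of the sent and the returned BOD
# instance, collecting the paths at which they differ.
import xml.etree.ElementTree as ET

def bod_diff(a: ET.Element, b: ET.Element, path: str = "", issues=None) -> list:
    """Recursively compare two BOD elements; return the list of differences."""
    issues = issues if issues is not None else []
    here = f"{path}/{a.tag}"
    if a.tag != b.tag:
        issues.append(f"{here}: tag {a.tag!r} != {b.tag!r}")
        return issues
    if (a.text or "").strip() != (b.text or "").strip():
        issues.append(f"{here}: text {a.text!r} != {b.text!r}")
    if a.attrib != b.attrib:
        issues.append(f"{here}: attributes differ")
    if len(a) != len(b):
        issues.append(f"{here}: child count {len(a)} != {len(b)}")
    for child_a, child_b in zip(a, b):
        bod_diff(child_a, child_b, here, issues)
    return issues

# Usage (file names illustrative):
# issues = bod_diff(ET.parse("sent.xml").getroot(), ET.parse("received.xml").getroot())
```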

(F) Test Report Form
The following is an example test report that may be used to track the conformance testing phase results.

Test Phase Name: BOD Input Mapping Test
BOD Name: <bod name>
Application Standard Spec: <spec-id>
Participant Name: <participant>
Date & Time: <date/time>
External Validation Rules: <bod rules id>
Role Name: <interoperability role>
Business Case: <business case id>
Parser Used: <parser name>
Test Case: <number>
Transport Used: <WS or ebMSH>

| Test Action | Time | Notes |
|---|---|---|
| Sent BOD msg | <time> | Sent BOD instance within a message; logged at Transaction Store at <link> |
| Received BOD instance filled by a technician | <time> | The BOD is stored at <link> |
| Comparison for equality of the received BOD and the original BOD instance | <time> | All fields compared between the form and the BOD instance (at above links). Issue notes: <BOD element/value – Form element/value> – non-equality issue; ... |
| Comparison report sent/posted | <time> | Comparison Report sent/posted at <link> |

Follow-on actions: <notes on follow-on actions>


3.3.3.5 Conformance Test Phase 3: Information Output Mapping Test Phase

The purpose of this phase is to test the capability of a participating tool to correctly transfer a received valid BOD instance:

(A) Test Focus Diagram

Figure 9: Diagram Showing Focus of the BOD Output Mapping Test Phase (Source NIST)

(B) Positive Test Claim
Upon a successful completion of the BOD Output Mapping Test Phase, the following claim may be stated: The BOD Output Mapping Test Phase demonstrates that the BOD adequately supports application data sharing, as evidenced by
• <the participant tool> successfully generating a BOD instance based on a provided valid BOD instance that was (1) correctly validated by the tool and (2) correctly mapped onto the tool data store, and
• <the participating tool> successfully (1) receiving the BOD message and retrieving the BOD instance and (2) preparing and sending the BOD message carrying the BOD instance
such that
• the BOD instance supports the target Business Case for BOD exchange,

• the BOD instance conforms to the agreed upon application interoperability specification for the test phase,

• the BOD instance conforms to the selected external validation rules, and

• the BOD instance is valid when parsed by the <validating parser name> as evidenced by the record of test actions.

(C) Assumptions
This test phase may be considered redundant, under the assumption that the two previous conformance tests imply a valid behaviour of the tools under this test. However, this test is designed to assess the complete process of consuming and automatically generating a BOD instance.

[Figure content: the reference architecture blocks, with the capabilities under test for this phase highlighted; legend as in Figure 7.]


(D) Message Exchange Diagram
Testbed --(1) BOD--> Application;  Application --(2) BOD--> Testbed;  Testbed --(3) Report--> Application

(E) Participants' Responsibilities
Each participating application under test in the test phase:

• Receives a BOD instance message from the Testbed node (1)

• Stores the BOD instance locally

• Generates the same BOD instance from the local store

• Sends the BOD instance back to the testbed node (2)

• Receives a report of the information output mapping test (3)
In turn, the testbed node:

• Sends a test BOD instance message (1)

• Receives the generated BOD instance message from the participant (2)

• Logs the message in a transaction repository

• Compares the outgoing to the incoming BOD instance for equality

• Issues a report of the information mapping test (3)

(F) Test Report Form
The following is an example test report that may be used to track the conformance testing phase results.

Test Phase Name: BOD Output Mapping Test
BOD Name: <bod name>
Application Standard Spec: <spec-id>
Participant Name: <participant>
Date & Time: <date/time>
External Validation Rules: <bod rules id>
Role Name: <interoperability role>
Business Case: <business case id>
Parser Used: <parser name>
Test Case: <number>
Transport Used: <WS or ebMSH>

| Test Action | Time | Notes |
|---|---|---|
| Sent BOD msg | <time> | Sent BOD instance within BOD message; logged at Transaction Store at <link> |
| Received BOD msg | <time> | Received BOD msg; retrieved BOD instance; logged at Transaction Store at <link> |
| Comparison for equality of the received and the original BOD instance | <time> | All fields compared between the form and the BOD instance (at above links). Issue notes: <BOD element/value – Form element/value> – non-equality issue; ... |
| Comparison report sent/posted | <time> | Comparison Report sent/posted at <link> |

Follow-on actions: <notes on follow-on actions>

3.3.3.6 Conformance Test Phase 4: Behavioral Test Phase

The purpose of this step is to test transactional behavior (i.e., request/response) of a participating tool (either middleware or transformer) with respect to the boundary BOD instances (i.e., when incorrectly specified elements are contained in the BOD):

(A) Test Focus Diagram

Figure 10: Diagram Showing Focus of the Behavioral Test Phase (Source NIST)

(B) Positive Test Claim
Upon a successful completion of the BOD Behavioral Test Phase, the following claim may be stated: The BOD Behavioral Test Phase demonstrates that the BOD adequately supports application data sharing, as evidenced by
• <the participant tool> successfully generating a ConfirmBOD instance based on a valid or invalid BOD instance that was correctly determined to be valid or invalid by the tool, and
• <the participating tool> successfully (1) receiving the BOD message and retrieving the BOD instance and (2) preparing and sending the ConfirmBOD message carrying the ConfirmBOD instance
such that
• the BOD instance supports the target Business Case for BOD exchange,

• the BOD instance conforms to the agreed upon interoperable application specification for the test phase,

• the BOD instance conforms to the selected external validation rules, and

• the BOD instance is valid when parsed by the <validating parser name> as evidenced by the record of test actions.

[Figure content: the reference architecture blocks; for this phase both validation outcomes (Validate OK / Validate NG) and the ConfirmBOD generation are under test; legend as in Figure 7.]


(C) Assumptions
The validation and the ConfirmBOD generation/issuing functions are together responsible for the result of this test phase.

(D) Message Exchange Diagram
Testbed --(1) BOD--> Application;  Application --(2) ConfirmBOD--> Testbed

(E) Participants' Responsibilities
Each application under test in this test phase:

• Receives a BOD instance that contains a boundary condition violation (i.e., a BOD instance with some element incorrectly specified) (1)

• Computes a response to the BOD instance, indicating either a correct BOD or not

• Sends a ConfirmBOD instance with proper attributes to the Testbed node (2)
In turn, the testbed node:

• Sends a BOD instance that contains a boundary violation (1)

• Receives the ConfirmBOD instance message from each participant (2)

• Logs the message in a transaction repository

• Checks the actual confirm attribute value with respect to the expected one

• Generates a summary conformance report.
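The testbed's check of the actual confirmation against the expected one can be sketched as follows. The ConfirmBOD layout used here (a status attribute on the root element) is an illustrative assumption; real ConfirmBOD documents follow the agreed message specification:

```python
# Minimal sketch of the behavioral check: compare the confirmation outcome
# reported by the tool with the expected outcome for a boundary-violating BOD.
import xml.etree.ElementTree as ET

def check_confirm(confirm_xml: str, expected_status: str) -> dict:
    """Return the expected vs. actual confirmation status and the verdict."""
    root = ET.fromstring(confirm_xml)
    actual = root.get("status")  # assumed attribute carrying accept/reject
    return {"expected": expected_status, "actual": actual,
            "passed": actual == expected_status}

# Usage: for an intentionally invalid BOD we would expect a rejection, e.g.
# check_confirm('<ConfirmBOD status="rejected"/>', "rejected")
```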

(F) Test Report Form
The following is an example test report that may be used to track the conformance testing phase results.

Test Phase Name: BOD Behavioral Test Phase
BOD Name: <bod name>
Application Standard Spec: <spec-id>
Participant Name: <participant>
Date & Time: <date/time>
External Validation Rules: <bod rules id>
Role Name: <interoperability role>
Business Case: <business case id>
Parser Used: <parser name>
Boundary field/condition: <field>
Transport Used: <WS or ebMSH>

| Test Action | Time | Notes |
|---|---|---|
| Sent BOD msg | <time> | Sent BOD instance within BOD message; logged at Transaction Store at <link> |
| Received ConfirmBOD | <time> | ConfirmBOD received and stored at <link>. Issue notes, for example: expected error notification, received ... |
| Test report sent/posted | <time> | Test report sent/posted at <link> |

Follow-on actions: <notes on follow-on actions>

3.3.3.7 Summary

The ultimate objective of any application interoperability project, whether a cutting-edge or an applied one, is to demonstrate effective and efficient interoperability of applications. The content-level conformance testing methodology4 described in this section is applicable to any such project, as it can 'measure' the extent of conformance that an application has achieved with respect to an interoperability specification (e.g., a data interchange message specification). In addition to providing a measurement, the methodology directly helps the application development teams 'achieve' interoperable capability for their respective applications during the development as well as the testing phases. We have been able to develop test plans for industrial interoperability efforts that follow this methodology and that provide detailed testing steps to the participants and test teams. By following such test plans, it is possible to execute complex testing procedures in complex, collaborative industrial situations.

3.4 Test and analysis of results
During the test phase, the Solution test team (§ 4.3) executes the tests specified in the Test Procedures and test plans. The results obtained during the test execution are documented, and any discrepancy is notified to the development partners. Corrections to the Solution will be handled by members of the development teams according to strict configuration management procedures previously established by those teams. Corrected Solutions will be re-tested by the Solution test teams (an overview of the simplified process is reported in Figure 11). During the Solution testing phase, the Solution development team should prepare all the documentation supporting the completed Solution. The Solution will be complete when all tests on the Solution have been passed or waived with the authorization of the industrial user. Near the end of the phase, the Solution and its documentation will be audited for completeness.

[Figure content: testing process loop – Preparation for Solution testing → Load Solution modules → Execute & re-execute Test Procedures on Test Scenarios (per the Solution Test Plan) → Solution test results → Analyse test results → Discrepancy reporting → Correct Solution errors → Update corrected Solution / Solution re-configuration → re-execution.]

Figure 11 Overview on testing phases

4 [NIST-1] Kulvatunyou, B., N. Ivezic, and A. Jones. Content-Level Conformance Testing: An Information Mapping Case Study. Submitted to The 17th IFIP International Conference on Testing of Communicating Systems.


4 Piloting

By Pilot we mean the "implementation of use case solutions in a real context of industrial users". The Piloting activity consists of the implementation of Test Cases, Test Scenarios and Test Procedures in a real context of industrial users. The purpose of this activity is to test, in a limited context, the soundness of a use case solution before extending it to the overall enterprise environment.

The Piloting concerns an environment where it is possible to:
• Acquire from the involved end users a real collaborative business process (business model export)
• Make e-Services available through the process
• Execute the collaboration using the ATHENA interoperability framework
• Show that processes and applications are seamlessly interoperable.

4.1 Plan
A set of established activities regarding use case definition, development of pilot testing and management of post-test evaluation has been planned, as reported in the figure.

[Figure content: the activities Use case definition, Develop and Test Pilot, and Post Test evaluation plotted against the project months M1–M36.]

Figure 12 Activities Plan

In order to carry out all the activities, a proposal for a Validation Plan has been issued and is reported in the next figure. Starting from four different business process scenarios, a set of Test Scenarios will be identified in order to define the features of the collaboration.

[Figure content: timeline 2004–2007 – first set of results available from Action Line A; EM – Collaborative Business Process Scenario; first set of services available from the initial user scenario; first pilots on a simulated working environment; first demo of a simple user interoperability scenario; second users' demo of a complex automotive interoperability scenario.]

Figure 12 Plan overview



4.2 Pilot Setting
All activities relating to the hw/sw environment setup will be carried out before starting the piloting activities, in order to have the solution ready for the test. System infrastructures, the collaborative net of laboratories, specifications and support guidelines should be made available. The planning of specific presentation and testing activities will be driven by the identified needs of the stakeholders.

4.3 Training
A training activity will be established and planned at each user validation site, in order to arrange a suitable validation environment in terms of human resources. The training activity will address the following professional roles, which will constitute the test team:
• Business Process Analyst

• System analyst

• Experimenter

4.4 Test execution
The test will start after arranging the test environment at each industrial site. After the training activity, the Test Scenarios will be carried out, following the established Validation Plan. The test execution will follow two ideal validation phases.
The first phase can be thought of as a functional test, in which all the developed modules are tried (Figure 13).

[Figure content: functional test loop – START → core modules → functionality test for each module → "Is the module OK?" → YES: validation report; NO: errors report → module owner (developer) fixes the bugs → new compiled version (version + 1) → module replacement → re-test → END. Each module must be tested.]

Figure 13 First phase – functional test

The second phase foresees a real execution of the Test Scenarios. During the test execution, further interaction with the developers may be activated. After the execution phase, the person who carried out the test has to compile a validation form, giving a judgement for each involved requirement (Figure 14).


[Figure content: for each scenario – START → scenario execution (with interaction with the developers) → requirement evaluation using metrics → scenario validation form → END. The validation form records, per step: actor, description, expected result, test result, passed (y/n), condition, location and branches; industrial end users: AIDIMA, CRF, EADS, INTRACOM.]

Figure 14 Second phase – Test Scenarios execution

Each end user executes its own Test Scenario. At the end of the Test Scenario execution, an evaluation form has to be compiled. After the Test Scenario execution, all the Test Scenario results will be integrated in a single form, in order to have a single list of requirements and related judgements. All the judgements related to all the requirements must then be integrated into a single judgement according to predefined criteria.
Starting from the several internal results integration forms, a global results evaluation form will be compiled and shared among all partners (Figure 15).

[Figure content: each industrial end user's Test Scenario forms (AIDIMA, CRF, EADS, INTRACOM) feed an internal results integration form; the internal integration forms then feed a single global results integration.]

Figure 15 Internal and general results integration

The results and the analysis regarding this activity will be enclosed in an established report.
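A minimal sketch of the integration step follows. It assumes each scenario form is reduced to per-requirement pass/fail judgements and that the predefined criterion is that a requirement passes globally only if it passed in every scenario that exercised it; both the criterion and the names are illustrative:

```python
# Minimal sketch of results integration: per-scenario pass/fail judgements
# merged into a single judgement per requirement. The "all scenarios must
# pass" criterion is an illustrative assumption, not a prescribed rule.
from collections import defaultdict

def integrate(scenario_results: dict[str, dict[str, bool]]) -> dict[str, bool]:
    """scenario_results maps scenario id -> {requirement id: passed}."""
    by_requirement = defaultdict(list)
    for judgements in scenario_results.values():
        for requirement, passed in judgements.items():
            by_requirement[requirement].append(passed)
    return {req: all(votes) for req, votes in by_requirement.items()}

# Example: two scenarios exercising overlapping requirements
global_form = integrate({
    "CRF_TC01_TS1": {"R1": True, "R2": False},
    "CRF_TC01_TS2": {"R1": True, "R3": True},
})
# -> {'R1': True, 'R2': False, 'R3': True}
```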

4.5 Results Analysis
The analysis has the scope of identifying if and how well:
1. the expected test objectives have been achieved
2. the ATHENA interoperability goal has been reached
During the test evaluation, a comparison between the expected quantified results and the test evaluation results will be carried out.


4.6 Reporting
The following reports are foreseen to capture/show the evaluation/validation results:

• Matrix Report: exhibiting the requirements coverage of Test Scenario by means of Test Procedures

• Test Scenarios Report: describing the steps performed during the test execution

• Evaluation Report: reporting the evaluation results

• Analysis Report: providing a comparison between expected results values and obtained results values

• Internal Report: Industrial End Users Final Validation Reports

• Global ATHENA Validation Report: Final ATHENA Validation Report


5 Pilots

This section proposes guidelines for describing the pilots in an articulated manner. Each industrial partner (AIDIMA, CRF, SIEMENS, INTRACOM, EADS) has to define its specific environment to execute its own Test Scenario.

5.1 Overview on domain
Each industrial partner has to provide a general description of the Test Case, in order to give an overview of the scenario.

5.2 Objectives
Each industrial partner has to state the objectives it wants to reach.

5.3 Plan
Each industrial partner has to observe the general plan and manage an internal plan for the test.

5.4 Organization
Each industrial partner has to identify the resources (roles, responsibilities) that will be involved in the test (§ 4.3).

5.5 Architectures, tools and services
Each industrial partner has to describe the structure and distinguishing features of the test environment in terms of architectures, tools and services. A visual representation of the internal architecture will help the partners involved in development to understand the environment, and the stakeholders to set up the pilot.

5.6 Laboratories
A coordinated net of laboratories should be planned in order to test interoperability across the companies involved.

5.7 Test formalization - Contents
A draft of the aspects that should be considered and formalized before and during the test execution, together with an example of the modules gathering these items, is proposed hereafter. The format has to be approved by the partners involved in the test.

Test Cases

1. Introduction [If necessary, write the introduction of the Test Cases. It should provide an overview of the entire document, and include the purpose, scope, definitions, acronyms, abbreviations, references and overview of this set of Test Cases.]

1.1 Definitions, Acronyms and Abbreviations [This subsection should provide the definitions of all terms, acronyms, and abbreviations required to interpret properly the Test Cases]

1.2 References [This subsection should provide a complete list of all documents referenced elsewhere in the Test Cases. Each document should be identified with a title, report number (if applicable), date, and publishing organization. Specify the sources from which the references can be obtained. This information may be provided by reference to an appendix or to another document.]

2. Test Environment [Include description of the Test environment]

| Solution Components | Version |
|---|---|

[Include comments on Test environment]

2.1 Glossary

| Term | Description |
|---|---|

2.2 Actors [Include description of actors involved in Test]

| Actor | Description |
|---|---|

2.3 Diagrams [Include if needed relevant diagrams of Use cases]

2.4 Assumptions
In order to simplify the description, facilitate the test of the application and allow the working group to concentrate on the relevant features of the delivered Solution, the following assumptions have been defined:
[Include if needed relevant assumptions]

| n. | Description |
|---|---|
| A1 | |
| A2 | |
| A3 | |

2.5 Set up information (Pre-conditions) [Enter the prerequisites to execute the test cases]


3. Test Scenario [Include list and titles of Test Scenarios]
Each Test Scenario is identified and coded as "Company name_Test Case of reference_Test Scenario", e.g. CRF_TC01_TS1.

TEST LIST

| TEST SCENARIO | DESCRIPTION | EXPECTED RESULTS | REAL RESULTS | TEST PASSED (Y/N) |
|---|---|---|---|---|
| CRF_TC01_TS1 | | | | |
| CRF_TC01_TS2 | | | | |
| CRF_TC01_TS3 | | | | |
| CRF_TC01_TS4 | | | | |
| CRF_TC02_TS1 | | | | |
| CRF_TC02_TS2 | | | | |
| CRF_TC03_TS1 | | | | |
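Since the coding convention above is mechanical, scenario identifiers can be generated and checked automatically. A small illustrative sketch follows (the regular expression is our own reading of the convention):

```python
# Illustrative sketch: parse the "Company_TestCase_TestScenario" coding
# convention, e.g. "CRF_TC01_TS1". The pattern is our own assumption.
import re

SCENARIO_ID = re.compile(r"^(?P<company>[A-Z]+)_(?P<test_case>TC\d+)_(?P<scenario>TS\d+)$")

def parse_scenario_id(scenario_id: str) -> dict:
    match = SCENARIO_ID.match(scenario_id)
    if not match:
        raise ValueError(f"not a valid Test Scenario id: {scenario_id}")
    return match.groupdict()

# parse_scenario_id("CRF_TC01_TS1")
# -> {'company': 'CRF', 'test_case': 'TC01', 'scenario': 'TS1'}
```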

3.1 Test Procedures List [Include a list of procedures that will be used during the test activity and associated to each test scenario]

3.2 EX. CRF_TC01_TS1
Goal of Test Scenario: [Include goal of Test Scenario]
Pre-conditions: [Include if needed pre-conditions of Test Scenario]
Test Creator: [Name - Role]
Creation Date: [Date]
Test Date: [Date]
Test Executor: [Name - Role]
Internal Test Assistant: [Name - Role]
External Test Assistant: [Company - Name - Role]
Success Post-Conditions: [List of objectives reached]
Failed Post-Conditions: [Include list of problems and errors encountered]
Result: [positive - negative]
Comments: [Include comments if needed]


3.3 Test Tab for CRF_TC01_TS1 [Test Scenario Report]

Test Scenario description: [Brief description of Test Scenario]
User: [Company 1] [Company 2] [Company 3] [Company 4]
Event description: [Action that starts the activity]
Target description: [Aim of the activity]

| N | Action Description | TP n. | Requirement | Plan | User Note | Reference | Judgment | Characteristics |
|---|---|---|---|---|---|---|---|---|
| 1 … 15 | | | | | | | | |

Test ID: TS1
Date: [yy/mm/dd]
Carried out by: [Name]
Supported by: [Name]
Results: Target Reached [Yes] [No]
Comments: [Comments]


6 Conclusion

The present deliverable (M09) contains, in this issue, a preliminary approach for Technology Testing and Piloting. It will be improved and enriched in the next version (M18) with tangible validation results and considerations. Moreover, further improvements are foreseen on the basis of the application of the adopted methodology (WDB5.5).
Summing up the overall work: the ATHENA stakeholders will perform several activities regarding detailed planning, environment setting, test scenario preparation, testing, piloting, and evaluation and validation, based on predefined methods and criteria and supported by two validation methodologies (Technical and Business).

| Methodology | Method | Related Aspects | Objectives | Test objects | Acceptability criteria |
|---|---|---|---|---|---|
| Validation | Technical | ISO 9126 – relevant characteristics: Business Level, Knowledge Level, Application Level, Data Level, Quality Level | Identify if, and how well, a Solution meets the requirements | General and Final Solutions | Acceptable value around "3", critical value around "2", optimal value around "4" |
| Validation | Business | Based on industrial experience; based on BPM; based on ABM | Demonstration of how well Solution and Scenario fit together: 1. identification of the real perimeter of applicability for the emerging solution; 2. identification of the Business Impact | General and Final Solutions | 1. Coverage degree threshold 0.7; coverage quality threshold 0.7. 2. All mandatory objectives reached; at least 70% of objectives reached |

Table – Synopsis


Methodology: Testing

Method: Test Scenarios
  Objectives: Identification of proper Test Scenarios
  Acceptability criteria: all predefined Test Scenarios should be ready and available before the Test activities; if needed, further Test Scenarios will be formalised during the Test activities

Method: Test Procedure
  Objectives: Identification of proper Test Procedures
  Acceptability criteria: all Test Procedures ready, instantiated for the Test Scenarios, and available before the Test activities

Method: Test
  Related Aspects: all activities regarding the Test
  Objectives: Test the Solutions in the preliminary phase, in the intermediate phase, in the final phase and in Piloting
  Test objects: General Solutions, Final Solutions, Pilots
  Acceptability criteria: 1. regarding the Testing results, at least 80% of the Test Procedures have been performed in one scenario with satisfactory results; 2. the evaluation on Pilots is similar to the evaluation on the Final Solution

Table – Synopsis (Testing methodology)
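The 80% criterion on Testing results likewise reduces to a simple proportion check. The sketch below assumes a hypothetical list of per-procedure outcomes for one Test Scenario; only the 0.8 threshold comes from the table:

```python
# Hedged sketch: checks the Testing acceptability criterion from the synopsis.
# The data layout is hypothetical; the 80% threshold comes from the table.

def scenario_satisfactory(procedure_outcomes: list[bool]) -> bool:
    """True if at least 80% of the Test Procedures performed in one
    scenario produced satisfactory results."""
    if not procedure_outcomes:
        return False
    return sum(procedure_outcomes) >= 0.8 * len(procedure_outcomes)

# Example: 13 of 15 procedures satisfactory -> about 87%, criterion met.
print(scenario_satisfactory([True] * 13 + [False] * 2))  # True
```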

The main partners involved in these activities (AIDIMA, CRF, EADS and INTRACOM) will provide information and evidence, by means of tangible results, on the objectives achieved through the introduction of the ATHENA Solution. Part of these results will also become the shared knowledge base of the ATHENA consortium and will enrich the overall project results.


7 Glossary and References

Assessment An action of applying specific documented assessment criteria to a specific software module, package or product for the purpose of determining acceptance or release of the software module, package or product. [ISO 9126: 1991, 3.1]

Customer Ultimate consumer, user, client, beneficiary or second party. [ISO 9004: 1987, 3.4]

Defect The non-fulfilment of intended usage requirements. [ISO 8402: 1986, 3.21]

Features Features are identified properties of a software product, which can be related to the quality characteristics. [ISO 9126: 1991, 3.2]

Functional Requirements A functional requirement specifies what the system must be able to do, i.e. the functions it should perform. Functional requirements are associated with specific functions, tasks or behaviours the system must support; the term is used at both the user requirements analysis and software requirements specification phases of the software life cycle. Functional requirements capture the intended behaviour of the system, which may be expressed as services, tasks or functions the system is required to perform. [ATHENA]

Inspection Activities such as measuring, examining, testing, gauging one or more characteristics of a product or service and comparing these with specified requirements to determine conformity. [ISO 8402: 1986, 3.14]

Level of performance The degree to which the needs are satisfied, represented by a specific set of values for the quality characteristics. [ISO 9126: 1991, 3.4]

Liability (product/service) A generic term used to describe the onus on a producer or others to make restitution for loss related to personal injury, property damage or other harm caused by a product or service. [ISO 8402: 1986, 3.19]

Measurement The action of applying a software quality metric to a specific software product. [ISO 9126: 1991, 3.5]

Methodology A set of instructions (provided through text, computer programs, tools, etc.) that is a step-by-step aid to the user. [ISO 15704: 1999]

NIST National Institute of Standards and Technology

[NIST-1] Kulvatunyou, B., N. Ivezic, and A. Jones. Content-Level Conformance Testing: An Information Mapping Case Study. Submitted to The 17th IFIP International Conference on Testing of Communicating Systems.

Nonconformity The non-fulfilment of specified requirements. NOTE -- The basic difference between `nonconformity' and `defect' is that specified requirements may differ from the requirements for the intended use. [ISO 8402: 1986, 3.20]

Non-Functional Requirement A non-functional requirement specifies an aspect of the system other than its capacity to do things. Examples of non-functional requirements include those relating to performance, accessibility, usability, etc. [ATHENA]

Pilot The implementation of use case solutions in a real context of industrial users. Pilots are employed to test, in a limited context, the soundness of a use case solution before it is extended to the whole enterprise. [ATHENA]

Piloting The activity that offers guideline approaches on how to solve, design and engineer applications and challenges that until now have been solved poorly or not at all. [ATHENA]

Prototype A model on which business enterprise models and services are patterned and developed. [ATHENA]

Prototyping An activity performed on a model, covering many aspects for many target groups of people. Prototyping can concern systems, modelling, model management, process analysis, work management methods and work execution. [ATHENA]

Rating The action of mapping the measured value to the appropriate rating level. Used to determine the rating level associated with the software for a specific quality characteristic. [ISO 9126: 1991, 3.7]

Rating level A range of values on a scale to allow software to be classified (rated) in accordance with the stated or implied needs. Appropriate rating levels may be associated with the different views of quality i.e. Users, Managers or Developers. These levels are called rating levels. [ISO 9126: 1991, 3.8]
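Read together, the Measurement, Rating and Rating level entries describe a two-step pipeline: a software quality metric yields a measured value, which is then mapped onto a rating level. A minimal sketch of that mapping follows; the level names and boundaries are illustrative, since the standard leaves their choice to the evaluator:

```python
# Hedged sketch: maps a measured metric value onto a rating level.
# Level names and lower boundaries are illustrative, not taken from ISO 9126.

def rate(measured_value: float, levels: dict[str, float]) -> str:
    """Return the highest rating level whose lower boundary the
    measured value meets or exceeds."""
    result = "unrated"
    for name, lower_bound in sorted(levels.items(), key=lambda kv: kv[1]):
        if measured_value >= lower_bound:
            result = name
    return result

# Example: a normalised metric value rated against three levels.
print(rate(0.75, {"poor": 0.0, "satisfactory": 0.6, "excellent": 0.9}))
# -> "satisfactory"
```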

Recoverability Attributes of software that bear on the capability to re-establish its level of performance and recover the data directly affected in case of a failure and on the time and effort needed for it. [ISO 9126: 1991, A.2.2.3]

Reliability The ability of an item to perform a required function under stated conditions for a stated period of time. The term `reliability' is also used as a reliability characteristic denoting a probability of success or a success ratio. [ISO 8402: 1986, 3.18]

Replaceability Attributes of software that bear on the opportunity and effort of using it in the place of specified other software in the environment of that software. [ISO 9126: 1991, A.2.6.4]

Resource behaviour Attributes of software that bear on the amount of resources used and the duration of such use in performing its function. [ISO 9126: 1991, A.2.4.2]

Scenario A description of a user interaction with a system; alternatively, it can be defined as a Use Case instance. A scenario is a set of detailed activities. [ATHENA]

Security Attributes of software that bear on its ability to prevent unauthorized access, whether accidental or deliberate, to programs and data. [ISO 9126: 1991, A.2.1.5]

Software quality The totality of features and characteristics of a software product that bear on its ability to satisfy stated or implied needs. [ISO 9126: 1991, 3.11]

Software quality assessment criteria The set of defined and documented rules and conditions which are used to decide whether the total quality of a specific software product is acceptable or not. The quality is represented by the set of rated levels associated with the software product. [ISO 9000-3: 1991, 3.12]

Software quality characteristics A set of attributes of a software product by which its quality is described and evaluated. A software quality characteristic may be refined into multiple levels of sub-characteristics. [ISO 9126: 1991, 3.13]

Software quality metric A quantitative scale and method that can be used to determine the value a feature takes for a specific software product. [ISO 9126: 1991, 3.14]

Stability Attributes of software that bear on the risk of unexpected effect of modifications. [ISO 9126: 1991, A.2.5.3]

Suitability Attribute of software that bears on the presence and appropriateness of a set of functions for specified tasks. [ISO 9126: 1991, A.2.1.1]

Testability Attributes of software that bear on the effort needed for validating the modified software. [ISO 9126: 1991, A.2.5.4]

Testing Activity performed on architectures, components and solutions at all three levels of the Enterprise Architecture. [ATHENA]

Test Scenario A set of inputs, execution preconditions, and expected outcomes developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement [ATHENA]

Test Procedure The step-by-step process needed to verify that the product meets all the established interoperability requirements. [ATHENA]

TO-BE Scenario Envisaged ideal situation in which interoperability problems do not exist. [ATHENA]

Understandability Attributes of software that bear on the users' effort for recognizing the logical concept and its applicability. [ISO 9126: 1991, A.2.3.1]

Usability A set of attributes that bear on the effort needed for use, and on the individual assessment of such use, by a stated or implied set of users. [ISO 9126: 1991, 4.3]

Use Case A collection of possible sequences of interactions between the system under discussion and its users (or actors in business processes), relating to a particular goal. A use case defines a goal-oriented set of interactions between external users and the system under consideration or development. Use cases allow the functional requirements for a business case to be captured. [ATHENA]

Validation [1]: Confirmation by examination and provision of objective evidence that specifications conform to user needs and intended uses, and that the particular requirements implemented can be fulfilled. The primary focus of validation is customer satisfaction. [ATHENA] [2]: Confirmation by examination and provision of objective evidence that particular requirements for a specific intended use are fulfilled. [ISO/IEC 9126]

