ACRuDA Deliverable D3


...

ACRuDA Project
DG VII RTD Programme

RA-96-SC.231

Deliverable D3: The proposed assessment and certification

methodology for digital architectures

Authors: All ACRuDA Partners

Document ID: WP2/D3/V2

Date: June 12, 1998

Type of document: Deliverable

Status: Proposed

Confidentiality: Public

Work Package Allocation: WP2

Distribution: All

Document Abstract: The deliverable D3 presents the results of the tasks achieved in the ACRuDA Work Package 2 and Work Package. This work consists mainly of the description of an assessment and certification schema, of assessment and certification procedures, and of assessment criteria.

List of Contributors (Alphabetical Order):

A. AMENDOLA (ANSALDO S.F., Italy)
P. BENOIT (MATRA T.I., France)
J.L. DUFOUR (MATRA T.I., France)
Ph. GABRIEL (MATRA T.I., France)
Ph. GRANDCLAUDON (SNCF, France)
Ph. KAPCRZAK (RATP, France)
M. El KOURSI (INRETS, France)
H. KREBS (TUV Rheinland, Germany)


G. LOISEAU (MATRA T.I., France)
J.F. LINDEBERG (SINTEF, Norway)
Ph. MEGANCK (INRETS, France)
S. MITRA (Lloyds Register, UK)
O. NORDLAND (SINTEF, Norway)
P. OZELLO (SNCF, France)
F. POLI (ANSALDO S.F., Italy)
G. SONNECK (SEIBERSDORF, Austria)
R. TOOZE (Lloyds Register, UK)
S. VALENCIA (RATP, France)

Project Sponsor

This report reflects work which is partially funded by the Commission of the European Communities (CEC) under Framework IV, in the area of the Specific RTD programme, project ACRuDA « Assessment and Certification Rules for Digital Architectures ».

CONTENT

Foreword for the HTML version of this document
1. INTRODUCTION
1.1. Objectives
1.2. Structure of the document
2. BACKGROUND
2.1. Introduction
2.2. The certification procedure
2.2.1. Licensing
2.2.2. Certification
2.2.3. The certificate
2.2.4. The certification body
2.2.5. Conclusion
3. MAIN CONCEPTS
3.1. Product, Safety Requirements Specification
3.2. Assessment process
3.3. Certification process
4. PRODUCT DEVELOPMENT PROCESS
4.1. Introduction
4.2. Life cycle of a product
4.2.1. Presentation
4.2.2. Paper study
4.2.3. Model
4.2.4. Prototype
4.2.5. Pre Production
4.2.6. Production
4.3. Development Life Cycle and documentation
4.3.1. Development Life Cycle
4.4. Safety life cycle and documentation
4.4.1. Safety life cycle
4.4.2. Safety Plan
4.4.3. Safety case
4.4.3.1. Introduction
4.4.3.2. General
4.4.3.3. Purpose of the safety case
4.4.3.4. Content of the product safety case
4.5. Quality Assurance provisions
4.5.1. Relationship of Quality Assurance to other Plans
5. ASSESSMENT AND CERTIFICATION PROCESS
5.1. Introduction


5.2. Description of the assessment and certification process
5.3. Role of the different bodies
5.3.1. Role of the European Union (EU)
5.3.2. Role of the authority of a EU member state
5.3.3. Role of the accreditation body of a state member
5.3.4. Role of the notified body
5.3.5. Role of the sponsor
5.3.6. Role of the assessors
5.3.7. Role of the supplier
5.4. Phases of the assessment and certification process
5.4.1. Phase I: preparation of the assessment
5.4.2. Phase II: assessment
5.4.3. Phase III: certification
5.5. Re-use of assessment results and product composed of assessed/certified components
5.6. Capitalisation of assessment work
5.7. Certification report and Certificate
5.8. Assessment inputs
5.9. Essential Quality Requirements for the assessment activities
5.9.1. The normative context for assessment requirements
5.9.2. Quality System of the Assessor
5.9.3. Quality Handbook
5.9.3.1. Background referential of the assessment
5.9.3.2. The quality requirements on methods for assessment
5.9.3.3. The quality requirements on tools for assessment
5.9.3.4. The safety audit
5.9.3.5. The Configuration management
5.9.3.6. The assessment reports and anomalies
5.9.4. Human issues
5.9.4.1. Competence and knowledge of the assessor
5.9.4.2. Organisation of the assessment team
5.9.4.3. Independence of judgement
5.9.4.4. Confidentiality of the developer's innovations
5.9.4.5. Publication of the results of the assessment
5.9.4.6. Subcontractors of the notified body
5.9.4.7. Environment organisation
5.10. General concepts for assessment and certification of software
5.10.1. Software design
5.10.2. Algorithms and formal methods
5.10.3. Verification and validation
5.10.4. Interfaces
6. HIGH LEVEL ASSESSMENT CRITERIA
6.1. Introduction
6.2. Assessment activities
6.2.1. Referential Examination
6.2.2. Safety Management Assessment
6.2.3. Quality Management Assessment
6.2.4. Organisation Assessment
6.2.5. Development Phase Assessment
6.2.6. Safety Plan Assessment
6.2.7. Safety Case Assessment
6.3. Structure of the criteria
6.4. Process/Project criteria
6.4.1. Contents
6.4.2. Basic Criteria
6.5. Requirements
6.5.1. Contents
6.5.2. Basic criteria
6.6. Design


6.6.1. Contents
6.6.2. Basic criteria
6.7. Validation and off line testing
6.7.1. Contents
6.7.2. Basic criteria
6.8. Fault and failure analyses
6.8.1. Contents
6.8.2. Basic criteria
6.9. Operation, Maintenance and Support
6.9.1. Contents
6.9.2. Basic criteria
6.10. Software Assessment Criteria
6.10.1. Software integrity level
6.10.2. Life cycle issues and documentation
6.11. Hardware Assessment Criteria
6.11.1. Life cycle issues and documentation
7. TERMINOLOGY
7.1. Introduction
7.2. Terminology, Definitions and Abbreviations
8. REFERENCES
8.1. European Council Directives
8.2. European Technical Specifications
8.3. Standards
8.4. ACRuDA Project
8.5. CASCADE Project
8.6. ERTMS project
8.7. Information Technology domain
8.8. Others
9. ANNEX I: STRUCTURE OF A SAFETY PLAN
10. ANNEX II: STRUCTURE OF A PRODUCT SAFETY CASE
11. ANNEX III: STRUCTURE OF AN ASSESSMENT PLAN
12. ANNEX IV: STRUCTURE OF A TECHNICAL ASSESSMENT REPORT
13. ANNEX V: STRUCTURE OF THE CERTIFICATION REPORT
14. ANNEX VI: STRUCTURE OF A CERTIFICATE

List of FIGURES

Figure 1: Product and Safety requirements specification
Figure 2: Assessment process
Figure 3: Certification process
Figure 4: Development life cycle
Figure 5: Relationships between categories and aspects
Figure 6: Equipment/Generic Product life cycle
Figure 7: Safety life cycle
Figure 8: Structure of Plans
Figure 9: Assessment and certification schema
Figure 10: Assessment inputs

Foreword for the HTML version of this document

The text presented here is the HTML version of the original document that was submitted to and accepted by the European Commission. The text has been reformatted to make it better suited to reading with a web browser, and this foreword has been inserted. All entries in the table of contents and the list of figures are links to the corresponding chapters and figures, and all chapter titles and figure captions are links back to the corresponding entry in the table of contents or list of figures. The final "To top of text" link at the very end of the document has been added.

Apart from that, only four typing errors in the original text have been corrected:


1. Chapter 4.5 contained a sub-chapter that was erroneously numbered 4.5.1.3 and not mentioned in the table of contents. In this text, the number has been corrected to 4.5.1 and the corresponding reference added to the table of contents.

2. The identification of the document referenced in [ACR02] in chapter 8.4 was erroneously typed as "24 February 97. Reference: ACRuDA/INRETS/MK-PM/WP1/D1/97.13/V2" (identical to the identification of [ACR01]). It has been corrected to "29 September 97. Reference: ACRuDA/INRETS/PM-MK/WP1/D2/97.39/V3".

3. Annex II, Chapter 3, section "Safety Plan", erroneously referred to chapter 3.4.2 of this document. The reference has been corrected to 4.4.2.

4. Annex III, Chapter 5, erroneously referred to chapter 3.4.3 of this document. The reference has been corrected to 4.4.3.4.

No other corrections have been made. Any other differences between the wording of the original document and this text are unintentional and unnoticed! Please inform me if you discover any.

Note: If you print this text, some of the images may not appear completely on the printed page. To get a complete print-out of such an image, right-click it and store it on your hard disk. (This works with both Netscape and Explorer.) Then print the image separately (e.g. by viewing the file with your browser and then printing it). You may have to adjust the margin settings in your page setup.

1. Introduction

The ACRuDA project aims to develop a methodology for the safety assessment of safety critical digital architectures. This methodology has to comply with the different requirements of the end-users and the suppliers. In priority order:

it has to be the minimum activity of the assessor to gain confidence in the safety of the architecture,
it has to use the best practices in assessment so that the end-users maintain or gain confidence in the automatisms supported by safety critical digital architectures,
it has to be unambiguous so that it cannot be misinterpreted by the different assessors; this contributes to the harmonisation of the European market,
it has to be cost effective so that the effort is well proportioned between the different activities of assessment and does not create gaps or deviations.

The minimum activity of the assessor depends on the complexity of the architecture and on the development process. In all cases, the activity consists of more than just conformity checking. The assessor has to perform safety studies and expert analyses to complete the assessment with an effectiveness evaluation of the protections, principles and specific mechanisms developed for the architecture, and to gain confidence in the safety under the proposed conditions of use.

The best practices in assessment in the European countries have been studied in [ACR02]. The ACRuDA project has used results from the CASCADE project, the European standards ([CEN01], [CEN02] and [CEN03]) and Directives ([DIN01], [DIN02] and [DIN03]), and the experience of the different ACRuDA partners. This has been formalised through high level assessment criteria. The set of criteria obtained hereafter is a basic set. This set of criteria cannot be applied in this form: the assessors must refine this basic set into a set of detailed criteria that can be applied to assess a digital architecture. This chapter has been updated once, after the ACRuDA case study results, and it must be updated regularly as practice evolves in Europe.

The principal aim of this document is to define the framework for the assessment method. The objective is to ensure that the safety digital architecture meets its safety requirements according to relevant standards and best practice, as well as any specific safety requirements contained in contractual or technical specifications for the equipment. The process of certification of safety critical products or systems can involve many different bodies, for example, the sponsor, the supplier, the assessors and the notified body. It is, therefore, essential that a harmonised, mutually agreed approach to the assessment of the products should be established which takes into account the needs of each partner.

1.1. Objectives

This document provides information on the way a process or product is to be assessed by a third party. The foundation of the assessors' work is this standard, which constitutes a « code of practice ». The philosophy of assessment of safety digital architectures in the railway sector is based on the product and the process. This document is principally aimed at the assessors who need to perform an assessment of a safety critical digital architecture based on the high level criteria. Each criterion defines a high level requirement that the item under assessment must fulfil. This is the top level document to be used during an assessment, and from it the assessors will develop the detailed criteria necessary to assess the specific architecture.

1.2. Structure of the document

This document is the key deliverable of the ACRuDA project and provides:

the definition of an assessment and certification process,
the definition of a set of assessment and certification procedures,
the definition of assessment criteria.

Chapter 1: this chapter.
Chapter 2: This chapter provides a brief description of the background to ACRuDA in terms of the certification and licensing requirements emerging from EU directives.
Chapter 3: In this chapter, the main concepts involved in the assessment and certification process are defined and some basic assessment procedures are given.
Chapter 4: This chapter presents the development process of a product.
Chapter 5: This chapter presents the assessment and certification processes that underpin the ACRuDA project.
Chapter 6: In this chapter, the assessment criteria are defined and described.
Chapter 7: This chapter gives the definition of the terms used in this document and in the general certification language.
Chapter 8: This chapter gives the references of the documents used for the ACRuDA project.
Annexes: Annexes I and II are related to documents to be produced during the development process by the supplier. ANNEX I describes the structure of a safety plan; ANNEX II, the structure of a safety case. Annexes III to VI are related to the documentation to be produced during an assessment. ANNEX III describes the structure of an assessment plan; ANNEX IV, the structure of a technical assessment report; ANNEX V, the structure of a certification report; ANNEX VI, the structure of a certificate.

2. Background

2.1. Introduction

Articles 129b to 129d of the amended EC Treaty established the intention of introducing trans-European networks in the areas of transport, telecommunication and energy infrastructures. The need for inter-operability and technical standardisation is stated.

The European Commission has expressed the urgent need for an effective, integrated transport system providing a high degree of inter-operability between rail, air, road and water transport systems. This is referred to as cross-modal transport. To achieve this, there must be efficient cross-border and cross-modal operations between all transport systems. The European Commission has recognised the need for an effective railway as part of this European transport system.

Furthermore, the European Commission is now, on behalf of the European Council, developing the necessary legislation, mainly in the form of Council Directives. This legislation will become part of Member State legislation.

2.2. The certification procedure

Council directive [DIN03], Article 1, contains definitions of:

Technical specifications,
Standards,
European standards,
Common technical specifications,
European technical approval.

For the railway, the following directives, standards, and technical specifications must be considered:

the three standards [CEN01], [CEN02] and [CEN03],
the Technical Specification for Interoperability [STI01],
the directives [DIN01] and [DIN02].

[DIN03] refers to the concept of an approval body which certifies that the product or system satisfies the essential requirements making it fit for use. Safety is the first of the essential requirements that the Council described in Annex III of [DIN01].

2.2.1. Licensing

[DIN01] and [DIN03] are examples of international legislation which member states are required to implement into their own national legislation. This will result in all member states having similar laws defining responsibilities and authorities with respect to various kinds of public transport.

In turn, regulations will be issued to implement such laws, and these will define who is authorised to grant a license (licensing authority) to operators of transport systems or parts thereof. This licensing authority may be a department within the government, or an external organisation.

The licensing authority will set up the rules to be applied for granting a license. The key requirement is that it must be demonstrated that a given system or product is safe to use in its intended application.

As many countries have already privatised transportation, the owner of a public transportation system may not necessarily be the operator of that system. For the purposes of licensing, this is mainly an issue of liability and does not affect the actual licensing procedure.

The owner/operator will order anything from individual constituents all the way up to a complete transportation system from one or more suppliers, who themselves will subcontract out to sub-suppliers, etc.

In order to get permission to use the system he has ordered, the owner/operator must convince the licensing authority that it fulfils the requirements that the licensing authority has defined. This evidence must be provided by an unbiased, independent body: the assessor. The assessor must be accepted by the licensing authority.

It is important here to remember that the owner/operator does not define the requirements: that is done by the licensing authority. The owner/operator will, however, identify which requirements he wants an assessment for, depending on the product or system being assessed and the intended application.

2.2.2. Certification

In order to avoid repeating the assessment process each time an existing product or system is deployed, it is desirable to perform some kind of generic assessment and to document the results in a form that is acceptable to all licensing authorities. This is the fundamental concept of certification!

Thus certification requires an unbiased, qualified assessment of the generic properties of a system or product. The term generic is significant: the actual properties of a given object depend on the way it is deployed. Therefore, the assessment can only evaluate those properties that are common to all reasonably expectable environments and deployments. However, for complex systems, embedded components can certainly be certified for the context of the system that they are embedded in.

This means that individual products can be certified for use in specific assemblies, which can be certified for use in specific subsystems that are certified for use in specific systems. For software, this is equivalent to certifying specific modules for use in specific programmes in specific programme systems.

[DIN01] defines in Article 2 the notified bodies as "the bodies which are responsible for assessing the ... suitability for use ... or for appraising ... verification ..." and in Annex VI refers to the "certificate from the notified body ...". In other words, assessment and certification are to be performed by the notified bodies.


The liability of the notified body is not clearly defined in the directive, but it is said that the notified body must take out civil liability insurance, except if this liability is covered by the State on the basis of national law or if the controls are carried out directly by the member State.

2.2.3. The certificate

From the above it is clear that the validity of a certification is heavily dependent on the context of the object being certified. It is therefore of paramount importance that the certificate clearly indicates the limits and conditions of validity.

The details of the assessment performed are to be contained in a certification report. This report must be explicitly identified on the certificate as an integral part. It must clearly state the conditions and limitations of the assessment and, in particular, identify the requirements against which the assessment has been performed.

Annex VI of the above-mentioned Council directive [DIN01] defines the "Contents of the EC declaration" as being:

the Directive references
the name and address of the supplier ...
description of ... constituent (make, type, etc.)
description of the procedure followed ...
all the relevant descriptions ... and in particular its conditions of use
name and address of notified body (bodies) ... together, where appropriate, with the duration and conditions of validity of the certificate
... reference to the European specification
identification of signatory ...

2.2.4. The certification body

As in the case of assessments performed for the purposes of obtaining a license from the licensing authority, certification must be performed by a person or body that is recognised and accepted by the licensing authority. In other words, the certification body must be certified!

It was pointed out earlier that assessment and certification are to be performed by the "notified body". Article 20 of Council directive [DIN01] states:

"2. Member States shall apply the criteria provided for in Annex VII for the assessment of the bodies to be notified. Bodies meeting the assessment criteria provided for in the relevant European standards shall be deemed to meet the said criteria."

where Annex VII identifies the minimum criteria which must be taken into account by the member states when notifying bodies.

These minimum criteria are very generic and refer to the independence and impartiality of the notified body's staff (they must not be involved in the design, development, manufacture, construction, marketing, maintenance or operation of the products and systems they assess). The technical qualification of the assessors must be "adequate". Thus, the licensing authority must define the detailed criteria that a certification body shall fulfil, just as it defines the criteria for certification. By harmonising the criteria for certification bodies and certification across borders, we will achieve a situation where certification bodies and certificates throughout Europe are recognised by all licensing authorities, as indicated in Article 20 (5) of Council directive [DIN01].

2.2.5. Conclusion

The three directives [DIN01], [DIN02] and [DIN03] are the basis for the definition of a new European model for assessment and certification in the railway field, but the directives give neither the detailed procedures for assessment/certification of a product or system nor the detailed criteria for assessment of a product or system. Safety digital architectures can be parts of sub-systems. One objective of ACRuDA is the definition of assessment/certification procedures and assessment criteria for safety digital architectures.


3. MAIN CONCEPTS

3.1. Product, Safety Requirements Specification

The [CEN03] standard (chapter 6.5.1, page 45) considers three different categories of programmable electronic systems:

generic product (independent of application): a generic product can be re-used for different independent applications,
generic application (for a class of applications): a generic application can be re-used for different applications of a class/type with common functions,
specific application (for a specific application): a specific application is used for only one particular installation.
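The three-way classification above can be sketched as a small enumeration. This is an illustrative model only; the Python names are an assumption, not part of [CEN03] or the ACRuDA text.

```python
from enum import Enum

class SystemCategory(Enum):
    """The three categories of programmable electronic systems
    described in [CEN03] (illustrative naming, not normative)."""
    GENERIC_PRODUCT = "re-usable for different independent applications"
    GENERIC_APPLICATION = "re-usable for a class of applications with common functions"
    SPECIFIC_APPLICATION = "used for only one particular installation"

# ACRuDA's basic architecture corresponds to the generic product category:
acruda_basic_architecture = SystemCategory.GENERIC_PRODUCT
```

Placing the basic architecture in the generic product category anticipates the next paragraph, where the [ACR01] definition of a basic architecture is equated with [CEN03]'s generic product.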

The deliverable [ACR01] of the ACRuDA project gives a definition for safety digital architecture, and describes the differences between the basic architecture and the application architecture. ACRuDA deals with the basic architecture. The basic architecture includes hardware and software, is railway generic and can be used in different railway applications.

The definition of ACRuDA's basic architecture is similar to the definition of generic product of [CEN03]. In this document, the use of the term « Product » is equivalent to « Generic product ».

[CEN03] also gives a definition of Product: « a collection of elements, interconnected to form a system, sub-system, or item of equipment, in a manner which meets the specified requirements ».

A product will be included in different system(s)/sub-system(s)/equipment(s). The supplier of the product can only have general and theoretical hypotheses on the operational environment. It is necessary for the end users and suppliers of system(s)/sub-system(s)/equipment(s) to verify that the hypotheses on the environment of the products used are consistent with the real environment.

A product can be composed of several different components. From the safety point of view, some of the components do not influence safety, while others contribute to it. The latter are called safety critical components.

Before the beginning of the assessment of a product, it is necessary to give two precise descriptions:

a description of the product in its totality,
a description of the safety requirements of the product; this description is called the safety requirements specification.

The basic definition of the safety requirements specification is taken from [CEN03] (see Annex A, sub-chapter A.2, page 52). This definition has been refined by the addition of new items.

For the purpose of assessing a product, the safety requirements specification should contain:

the safety functional requirements (the safety functions that the product is required to carry out) (comes from [CEN03]),
the safety requirements: a safety integrity level (SIL) and/or a numerical safety target that has been derived from a higher level. A digital architecture must be considered within the context of the overall railway system. This safety requirement comes from an allocation of safety given at a higher level (in general the system level) and apportioned to the different sub-levels (the sub-system and equipment levels and finally the architecture level, which is a sub-level of the equipment level) (comes from [CEN03] but modified for the ACRuDA project),
the applicable standards and rules (new item, not defined in [CEN03]),
the type of application considered (Interlocking, ATP, ATC, etc.) (new item, not defined in [CEN03]),
a description of the environment of the product (new item, not defined in [CEN03]).
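The content listed above can be captured as a simple record type. This is a minimal sketch: the field names and the example values are assumptions chosen for illustration, not terminology fixed by [CEN03] or ACRuDA.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SafetyRequirementsSpecification:
    """Illustrative model of the safety requirements specification
    content listed above; naming is hypothetical, not normative."""
    safety_functional_requirements: List[str]   # safety functions the product must carry out
    safety_integrity_level: int                 # SIL allocated from a higher (system) level
    numerical_safety_target: Optional[float]    # optional numerical target, if one was derived
    applicable_standards_and_rules: List[str]   # new item, not defined in [CEN03]
    application_type: str                       # e.g. "Interlocking", "ATP", "ATC"
    environment_description: str                # new item, not defined in [CEN03]

# Hypothetical example for an ATP-type product:
spec = SafetyRequirementsSpecification(
    safety_functional_requirements=["detect overspeed", "command emergency braking"],
    safety_integrity_level=4,
    numerical_safety_target=None,
    applicable_standards_and_rules=["CEN01", "CEN02", "CEN03"],
    application_type="ATP",
    environment_description="on-board equipment in a rolling stock environment",
)
```

Such a record mirrors the two descriptions required before assessment begins: the SIL and target carry the apportioned safety requirement, while the remaining fields carry the ACRuDA-specific additions (standards, application type, environment).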


Figure 1 shows the process to obtain the definition of the product and the safety requirements specification:

Figure 1: Product and Safety requirements specification

[CEN01], [CEN02] and [CEN03] define safety integrity and the safety integrity levels. Safety integrity is the probability of a safety critical system satisfactorily performing the required safety functions under all the stated conditions within a stated period of time. The Safety Integrity Level (SIL) of a safety requirements specification is one of four possible discrete levels for specifying the safety integrity requirements of the safety functions to be allocated to the safety related products/systems. Safety integrity level 4 has the highest level of safety integrity and safety integrity level 0 the lowest. The ACRuDA project only considered SIL4 products. In this document, the process, procedures and criteria are built for the SIL4 level. For SIL levels less than 4, the requirements will be lower, but the basic procedures will still be applicable.
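The ordering described above, with SIL 4 highest and SIL 0 lowest, can be sketched as an ordered enumeration. This is an illustration only, assuming the 0-4 scale as the text presents it; the class and member names are not taken from the standards.

```python
from enum import IntEnum

class SIL(IntEnum):
    """Safety Integrity Levels as described above: SIL4 carries the
    highest safety integrity, SIL0 the lowest (illustrative sketch)."""
    SIL0 = 0
    SIL1 = 1
    SIL2 = 2
    SIL3 = 3
    SIL4 = 4

# ACRuDA only considers SIL4 products; IntEnum gives the natural
# ordering, so a requirement at a lower level compares as weaker:
required = SIL.SIL4
assert required > SIL.SIL3
```

Using IntEnum rather than a plain Enum makes the "level 4 is higher than level 0" relation directly comparable, which matches the way requirements are apportioned down from the system level.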

3.2. Assessment process

The main objective of the assessment process is to prepare an impartial report giving enough information on the safety of the product to demonstrate that the product meets the safety requirements specification and to support the certification of the product.

The assessment process is described in Figure 2. The confidence in the safety is obtained by examination of the product, by examination of all its representations and by the understanding of its development process. An assessment is composed of a preliminary analysis of the product, observations, theoretical studies, and experimentation. The assessment of the product is based on the ACRuDA safety case (see also chapter 4.4.3).

The main organisations involved in the assessment process are: the sponsor, the supplier, the notified body and the assessor(s). The supplier builds and sells the product and is also responsible for establishing the safety case of the product. The sponsor requests and finances the assessment of the product. In most cases, the end user of the product or the supplier of the product can be the sponsor. The notified body assesses, with internal and/or external assessors, the product on the basis of the safety case. At the end of the assessment, the notified body certifies the product, or not, on the basis of the assessment results. In this document the term « Assessors » will cover the Internal and External Assessors.


Figure 2: Assessment process

There are two main concepts for the assessment: the assessment of conformity and the assessment of effectiveness.

The assessment of conformity consists of the assessment of the implementation of a product. It assesses the degree to which a given real product corresponds to its description.

The assessment of effectiveness consists of the assessment of the safety functions, mechanisms and measures chosen to satisfy the safety requirements specification of a product. The assessment is focused on the pertinence of the functions, mechanisms and measures; their cohesion (whether all the functions, mechanisms and measures operate together in a good way); the consequences of risks of hazardous failure (from the construction and operational points of view); the ease of use (installation, adaptation to the real environment, maintenance...); the safety plan (tools, methods, organisation of the supplier); and the search for remaining scenarios.

The apportionment of work between conformity and effectiveness assessment of a product depends upon the definition and description of the safety requirements specification and the product. If the safety requirements specification and the product are well described, defined and detailed, then the main part of the assessment of the product will involve conformity and only a small part will involve effectiveness.

The assessors need well defined criteria and procedures to assess a product. The safety of the products is achieved, for the main part, by procedural measures, like organisation controls, staff controls, staff training, and so on. But it is also necessary to make technical controls. The assessment criteria must be both procedure and product based.

These criteria will cover the aspects of effectiveness and conformity. The assessment criteria must list the assessment inputs necessary to achieve the assessment. The supplier is responsible for the delivery of the assessment inputs and must verify that all the assessment inputs given to the assessors satisfy the requirements on content and structure, and that all the assessment inputs give the proof, or help to establish the proof, of safety.

The ideal situation is to begin the assessment at the same time as the development of the product. It is necessary for the assessors to have a good understanding of the product, but the assessors must stay independent and must not influence the development of the product (sources [ITS01], [ITS02]).

It is the responsibility of the supplier to prove the safety of its product, with all proofs of safety contained in the safety case. The role of the assessors is to assess the methods used by the manufacturer for testing and safety analyses, and the results of the tests and safety analyses. When necessary, the assessors can perform further independent tests and safety analyses to verify the supplier's results or to complete a proof element, demonstration or safety analysis. It is recommended that such tests and safety analyses are performed using methods and tools different from those used by the supplier.

3.3 Certification process

Figure 3 shows the ideal set of processes for development, assessment and certification. Each circle is a process and each arrow represents a data flow between two processes. This is an ideal scheme, but in most cases the processes overlap. The assessment process should run in parallel with the development process.

Besides the supplier and assessor, there are four more bodies involved in the certification process: the European Union (EU), the authority of an EU member state, the accreditation body and the notified body. The European Union is the body that accepts or refuses the notified bodies appointed by the authority of a member state. The European Union is also in charge of defining a common European policy for assessment and certification. The accreditation body is responsible for issuing accreditation certificates ([EN01] to [EN07] standards) to the assessors and to the notified bodies. This accreditation is not explicitly required in the EU directives [DIN01], [DIN02] and [DIN03]. If a body becomes a notified body, this could be regarded as an accreditation in its own right.

Figure 3: Certification process

A product is developed during a development process. A product is assessed against well defined criteria during an assessment process. A certification presumes the validity of the assessment and confirms that an assessment was performed. The next stage is the integration of the product in a system or the installation of the product in its real environment. This is the approval stage. The approval process is the means to confirm that the use of a product in a particular environment for a particular goal is acceptable. In safe operational exploitation, a product is used according to specified procedures. In this case, changes made to the environment of the product can involve modification of the product, which can influence the development process (sources [ITS01], [ITS02]). A new assessment will be necessary if changes are made to the product (it can be only a simplified « delta assessment »). The definition of approval procedures and operational exploitation procedures is outside the scope of the ACRuDA project.
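The change-driven re-assessment decision described above can be sketched in code. This is a hypothetical illustration, not a procedure from the ACRuDA documents: the component names and the rule that a change to the safety requirements specification forces a full re-assessment are our assumptions.

```python
# Hypothetical sketch: after a product change, decide whether a full
# re-assessment or a simplified "delta assessment" of only the affected
# parts is needed. All names below are illustrative assumptions.

def plan_reassessment(changed_components, safety_related_components):
    """Return 'full', an empty set, or the set of parts to re-assess."""
    if "safety_requirements_specification" in changed_components:
        return "full"          # the baseline of the assessment changed
    affected = set(changed_components) & set(safety_related_components)
    if not affected:
        return set()           # no safety-related part was changed
    return affected            # delta assessment of affected parts only

# Example: only the power-supply board changed.
print(plan_reassessment({"power_supply"}, {"power_supply", "voter", "cpu"}))
# → {'power_supply'}
```

The sketch only captures the scoping decision; in practice the notified body would judge whether a delta assessment is acceptable.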

Certification is based on the results of the assessment. It is a formal declaration that confirms the results of the assessment and the fact that the assessment criteria were correctly applied and satisfied. All the bodies involved in the certification and assessment process must be recognised and competent for their role in that process. The certificate is issued by the notified body (sources [ITS01], [ITS02]).

The certification process requires the impartiality and independence of the assessment. The liability of the different bodies involved in the assessment and certification process is not clearly defined but will depend on national rules and laws.

The assessment criteria will be most effective if all countries have a common and harmonised certification process and apply the same set of criteria. The notified body in each country is responsible for the application of the criteria. It should be noted that these criteria are not defined only for assessors and notified bodies. They are also useful for suppliers: it would be very inefficient for a supplier to develop and build a product without considering the assessment criteria, as this can lead to an unsuccessful assessment.

Summary of the main concepts developed in this chapter, which form the basis of the document:

- digital architectures are considered as generic products,
- the boundaries, the environment, the conditions of use and the safety requirement specification of digital architectures shall be clearly and precisely defined,
- these generic products can be assessed and certified, but only SIL4 level will be considered in this document,
- the assessment and certification process shall be based on the new European directives, that define the requirements for notified bodies,
- the assessment and certification process shall be based on harmonised criteria and a well defined assessment methodology,
- the definition of approval procedures is outside the scope of this document.

4. PRODUCT DEVELOPMENT PROCESS

4.1. Introduction

Figure 2 shows the assessment process. The first step is to define the product and its safety requirements specification. Afterwards, the supplier develops the product and supplies enough information to the assessors. To assess a product, it is necessary to know how this product is specified, designed, built and tested. To achieve this, the supplier must develop according to a life cycle with well defined phases and clearly linked inputs and outputs for each phase. Some assessment criteria address this development life cycle. The assessment of conformity indeed deals with the development process and the development environment.

Some development life cycles are proposed in the European railway standards [CEN01], [CEN02], [CEN03] and in the [IEC01] standard.

[CEN01] and [CEN03] define a system life cycle in terms of:

- concept
- system definition and application condition
- risk analysis
- system requirements
- apportionment of system requirements
- design and implementation
- manufacture
- installation
- system validation (including safety acceptance and commissioning)
- system acceptance
- operation and maintenance
- decommissioning and disposal

and two other phases:

- performance monitoring
- modification and retrofit
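The ordered phases above lend themselves to a simple machine-readable representation. The following sketch is illustrative only: it lists the [CEN01]/[CEN03] phase names from the list above and adds a small helper of our own that checks whether deliverables were produced in life-cycle order.

```python
# Illustrative: the [CEN01]/[CEN03] system life cycle as an ordered list.
# The in_order helper is an assumption of ours, not part of the standards.

SYSTEM_LIFE_CYCLE = [
    "concept",
    "system definition and application condition",
    "risk analysis",
    "system requirements",
    "apportionment of system requirements",
    "design and implementation",
    "manufacture",
    "installation",
    "system validation",
    "system acceptance",
    "operation and maintenance",
    "decommissioning and disposal",
]

def in_order(phases_seen):
    """True if the given phases were traversed in life-cycle order."""
    indices = [SYSTEM_LIFE_CYCLE.index(p) for p in phases_seen]
    return indices == sorted(indices)

print(in_order(["concept", "risk analysis", "manufacture"]))   # → True
print(in_order(["manufacture", "risk analysis"]))              # → False
```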

[CEN02] (page 53) defines a life cycle for the development of software, together with a small part of the system life cycle:

- software requirement specification
- software architecture and design
- software module design
- software code
- software module testing
- software integration
- software validation

and other phases:

- software maintenance
- software planning

[IEC01] defines:

- safety requirements specification (safety functions requirements specification + safety integrity requirements specification)
- validation planning
- design and development
- integration
- operation and maintenance procedures
- safety validation

The life cycles proposed in [CEN01] and [CEN03] are essentially for the development of systems and do not give details on the life cycle of electronic equipment. The life cycle proposed in [CEN02] is essentially for the development of software, with only a few aspects of the development of systems (hardware/software integration). The life cycle proposed in [IEC01] is specifically focused on safety and does not give details on the different phases of the development life cycle. The life cycles proposed in this document (chapters 4.3.1 and 4.4.1) are a synthesis of the life cycles described in all the above mentioned standards, with some additional contributions given by the ACRuDA project members.

4.2. Life cycle of a product

The following description is derived from [AQC01].

4.2.1. Presentation

A digital architecture can be considered as a product with a life cycle composed of several stages:

- paper study
- model
- prototype
- pre-production
- production

The definition of the product is refined at each necessary stage. The life cycle for the product should be defined as clearly as possible.

Each stage (paper study, model, prototype, pre-production, production, etc.) should be clearly defined. These stages may vary in duration and generally overlap in practice.

A new architecture may be an upgrade of an existing architecture. Consequently, the life cycle must give consideration to existing designs.

4.2.2. Paper study

At the paper study stage, the feasibility and viability of the concepts considered for the product are examined. The essential output of this stage is documentation.

4.2.3. Model

The aim of this stage is to define a model. To achieve an accurate model, it is important to perform a user requirements analysis. A preliminary specification and a high level design must be produced to determine what functionality must be realised by the model, in order to demonstrate the feasibility of the product. It is essential to produce documentation (though not necessarily complete) and to control all changes formally, as there are a great number of iterations during the phases of design, realisation and test to refine the model.

4.2.4. Prototype

In this stage, the specification and design depend not only on the user requirements analysis but also on the results of the analyses of the model. The prototype takes into account (unlike the model) all the functionality of the definitive product, but in a provisional form. Testing of the prototype in real or simulated environments is a very important stage in the development of the product. All procedures and results must be formally documented.

4.2.5. Pre Production

This stage begins with a rigorous user requirements analysis and the development of a specification. The operation (real or simulated) of the prototype is likely to have modified the initial requirements. The design phase includes industrialisation, with documents on the definition, manufacture and control of the product in production. Tests of the pre-production product are necessary to verify the product before the start of production delivery.

4.2.6. Production

In this stage, the specification and design activities are essentially complete, with changes made only to correct errors, or if end users' demands lead to modifications or evolution of the product. Testing is less rigorous than for the pre-production product and utilises sampling, depending on the quantity of the product and the quality assurance level requirements.

4.3. Development Life Cycle and documentation

4.3.1. Development Life Cycle

Chapter 4.2 defines the stages of the life of a product. In each of these stages, the supplier should follow a defined development life cycle.

The development life cycle is normally represented as a « V ». In the descendant branch of the V, there is a succession of analysis and study activities. In the ascendant branch, the activities are all testing activities.
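The essence of the V shape is that each analysis/study phase on the descendant branch is verified by a testing phase on the ascendant branch. The pairing below is a minimal sketch consistent with that idea; the specific phase names are illustrative, not a reproduction of Figure 4.

```python
# Minimal sketch of a "V" life cycle: each descendant (analysis/study)
# phase is paired with the ascendant (testing) phase that verifies it.
# Phase names are illustrative assumptions.

V_MODEL = {
    # descendant branch              ascendant branch
    "system requirement spec":      "system validation",
    "system architecture & design": "system integration testing",
    "module design":                "module testing",
}

def verifying_phase(design_phase):
    """Return the testing phase that verifies a given design phase."""
    return V_MODEL[design_phase]

print(verifying_phase("module design"))   # → module testing
```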

Figure 4 shows a development life cycle:

Figure 4: Development life cycle

General abbreviation:

- Req. Spec. : Requirement Specification

- Arch : Architecture

Abbreviation:

- SDP : System Development Plan

- SQAP : System Quality Assurance Plan

- SCMP : System Configuration Management Plan

- SVP : System Validation Plan

- SveP : System Verification Plan

- SQP : System Qualification Plan

- SMP : System Maintenance Plan

- SUM : System User Manual

- SQR : System Qualification Record

- SMR : System Maintenance Record

- SRS : System Requirement Specification

- SRTS : System Requirement Test Specification

- SVR : System Validation Report

- SAD : System Architecture Description

- SDTS : System Design Test Specification

- SITR : System Integration Test Report

- SSRS : Sub System Requirement Specification

- SSRTS : Sub System Requirement Test Specification

- SSVR : Sub System Validation Report

- SSAD : Sub System Architecture Description

- SSDTS : Sub System Design Test Specification

- SSITR : Sub System Integration Test Report

The [CEN03] standard (chapter 6.5.1, page 45) considers three different categories of programmable electronic systems:

- generic product (independent of application): a generic product can be re-used for different independent applications,
- generic application (for a class of applications): a generic application can be re-used for different class/type applications with common functions,
- specific application (for a specific application): a specific application is used for only one particular installation.
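The three categories above can be modelled as an enumeration. The sketch below is illustrative: the enum and the `is_reusable` helper are our own way of encoding the reusability property each category implies, not an artefact of [CEN03].

```python
# Sketch: the three [CEN03] categories as an enumeration. The helper
# encoding "reusability" is an illustrative assumption of ours.

from enum import Enum

class Category(Enum):
    GENERIC_PRODUCT = "re-usable for different independent applications"
    GENERIC_APPLICATION = "re-usable for a class of applications with common functions"
    SPECIFIC_APPLICATION = "used for only one particular installation"

def is_reusable(category):
    """Only the specific application is tied to one installation."""
    return category is not Category.SPECIFIC_APPLICATION

print(is_reusable(Category.GENERIC_PRODUCT))       # → True
print(is_reusable(Category.SPECIFIC_APPLICATION))  # → False
```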

Figure 5 shows the different categories and the different aspects defined in [CEN03]:

Figure 5: Relationships between categories and aspects

The scope of the ACRuDA project is to consider the generic product. It is necessary to define a precise development life cycle for the product. A product generally comprises hardware and software. There are two life cycles: one for the software and one for the hardware, with interactions between the two cycles (for example, software may be modified because of problems with the design of the hardware).

An equipment life cycle is shown in Figure 6. The software life cycle is taken from [CEN02].

Figure 6: Equipment/Generic Product life cycle

Abbreviation:

- Hw : Hardware

- Sw : Software

- SSAD : Sub System Architecture Description

- ERS : Equipment Requirement Specification

- ERVR : Equipment Requirement Verification Report

- ERTS : Equipment Requirement Test Specification

- EVR : Equipment Validation Report

- EAD : Equipment Architecture Description

- EDTS : Equipment Design Test Specification

- EITR : Equipment Integration Test Report

- Sw. Req. Spec. : Software Requirement Specification

- SwRS : Software Requirement Specification

- SwRVR : Software Requirement Verification Report

- Sw. Arch. & Design : Software Architecture & Design

- SwAS : Software Architecture Specification

- SwDS : Software Design Specification

- SwADVR : Software Architect. & Design Verification Report

- SwDTS : Software Design Test Specification

- Sw. Mod. Design : Software Module Design

- SwMDS : Software Detailed Design Specification

- SwMVR : Software Module Verification Report

- SwMTS : Software Module Test Specification

- Sw. Code. : Software source Code

- SwSC&D : Software Source Code and Documentation

- SwSCVR : Software Source Code Verification Report

- Sw. Mod. Testing : Software Module Testing

- SwMTR : Software Module Test Report

- Sw. Integ. : Software Integration

- SwITR : Software Integration Test Report

- Sw. Valid. : Software Validation

- SwVR : Software Validation Report

- Hw. Req. Spec. : Hardware Requirement Specification

- HwRS : Hardware Requirement Specification

- HwRVR : Hardware Requirement Verification Report

- HwRTS : Hardware Requirement Test Specification

- Hw. Design : Hardware Design

- HwDS : Hardware Design Specification

- HwDVR : Hardware Design Verification Report

- HwMTS : Hardware Module Test Specification

- Hw. Manuf : Hardware Manufacture

- HwM : Hardware Module

- HwMVR : Hardware Module Verification Report

- Hw. Test : Hardware Test

- HwTR : Hardware Test Report

- Hw. Valid. : Hardware Validation

- HwVR : Hardware Validation Report

Other phases:

- Sw. Planning : Software planning

- SwDP : Software Development Plan

- SwQAP : Software Quality Assurance Plan

- SwCMP : Software Configuration Management Plan

- SwVP : Software Validation Plan

- SwITP : Software Integration Test Plan

- SwVeP : Software Verification Plan

- SwMP : Software Maintenance Plan

- DPP : Data Preparation Plan

- DTP : Data Test Plan

- Sw. Maintenance : Software Maintenance

- SwMR : Software Maintenance Record

- SwCR : Software Change Record

- Hw. Planning : Hardware planning

- HwDP : Hardware Development Plan

- HwQAP : Hardware Quality Assurance Plan

- HwCMP : Hardware Configuration Management Plan

- HwVP : Hardware Validation Plan

- HwITP : Hardware Integration Test Plan

- HwVeP : Hardware Verification Plan

- HwMP : Hardware Maintenance Plan

- Hw. Maintenance : Hardware Maintenance

- HwMR : Hardware Maintenance Record

- HwCR : Hardware Change Record

- EUM : Equipment Users Manuals

4.4. Safety life cycle and documentation

4.4.1. Safety life cycle

Figure 7 shows the safety life cycle and the associated documentation. This life cycle covers the system aspects, the sub-system aspects and the equipment aspects. As for the development life cycle, the ACRuDA project deals only with the equipment aspects (generic product category). All these phases involve a team independent from the development team.

Figure 7: Safety life cycle

Abbreviation:

- Hw : Hardware

- Sw : Software

- FPS : Functional Performance Specification

- PRAS : Preliminary Risk Analysis Specification

- SsaP : System Safety Plan

- SsaRS : System Safety Requirement Specification

- SsaR : System Safety Report

- SSSaP : Sub System Safety Plan

- SSSaRS : Sub System Safety Requirement Specification

- SSSaR : Sub System Safety Report

- EsaP : Equipment Safety Plan

- EsaRS : Equipment Safety Requirement Specification

- EsaR : Equipment Safety Report

- HwSaP : Hardware Safety Plan

- SwSaP : Software Safety Plan

- HwSaRS : Hardware Safety Requirement Specification

- SwSaRS : Software Safety Requirement Specification

- HwSaAR : Hardware Safety Analysis Report

- SwSaAR : Software Safety Analysis Report

4.4.2. Safety Plan

The safety plan defines the safety requirements for each phase of the system life cycle and covers the whole life cycle. The safety plan is produced by the supplier.

The Safety Plan is defined as follows in [CEN01]: « a documented set of time scheduled activities, resources and events serving to implement the organisational structure, responsibilities, procedures, activities, capabilities and resources that together ensure that an item will satisfy given safety requirements relevant to a given contract or project ».

In addition, [CEN03] specifies the development of the plan thus: "A Safety Plan shall be drawn up at the start of the life cycle. This plan shall identify the safety management structure, safety-related activities and approval mile-stones throughout the life-cycle and shall include the requirements for review of the Safety Plan at appropriate intervals. The Safety Plan shall be updated and reviewed if subsequent alterations or additions are made to the original system/sub-system/equipment. If any such change is made, the effect on safety shall be assessed, starting at the appropriate point in the life-cycle." The Safety Plan is a part of the requirements for the demonstration of the evidence of safety management.

The Safety Plan identifies the safety management structure, safety-related activities and approval milestones throughout the life-cycle. It also includes the requirements for review of the Safety Plan at appropriate intervals. The Safety Plan shall define all management and technical activities during the whole safety life-cycle which are necessary to ensure that the safety-related products and external risk reduction facilities achieve and maintain the required functional safety. The Safety Plan for a product, such as a digital architecture, is not mentioned explicitly in the relevant standards, but it is possible to derive a Product Safety Plan from the System Safety Plan provided by the standard.

A more precise description of the Safety Plan structure is presented in ANNEX I.

The Safety Plan also outlines the methods and techniques to be used to develop, validate and verify the digital architecture against the safety requirements. ANNEX VIII summarises the techniques and tools prescribed by relevant standards and proposed in various current practices.

The Safety Plan shall be implemented and functional safety internal audits initiated as required. All those involved in implementing the Safety Plan shall be informed of the responsibilities assigned to them under the plan.

4.4.3. Safety case

4.4.3.1. Introduction

This chapter describes the issues to be considered in developing the Safety Case for a product, and addresses the basic structure of the Safety Case.

Safety is defined as freedom from unacceptable risk of harm, where risk is defined as the probable rate of occurrence of a hazard causing harm, multiplied by the degree of severity of the harm. In general, the aim of a Safety Case is to provide the evidence to demonstrate that in all aspects of specified operation, the risk of harm is reduced to the lowest practicable level. This is achieved by demonstrating that:

- all the safety requirements have been identified,
- the safety requirements are achieved,
- the remaining risk of harm is acceptable or tolerable.

The supplier is ideally placed to develop the Safety Case and in practice it is generally his responsibility.

4.4.3.2. General

There are fundamental safety requirements which apply to all safety critical products. These are described below.

Any credible fault within any part of a safety critical product can be a potential source of a hazard. In developing a product, the bounds of its operation must be defined in terms of the application(s) in which the supplier envisages the product will be utilised. While the supplier may not know the exact nature of the hazards relating to any particular application, he should use his expertise of existing applications and his railway experience to identify a set of hazards common to the applications envisaged. The supplier must eliminate or mitigate these wherever possible. The completeness of any such analyses undertaken by the supplier must be demonstrated in the safety case.

All credible failures in a product must be assumed to be hazardous and, consequently, every effort must be made to eliminate or mitigate the hazard and its potential consequences to within acceptable and practicable limits, or evidence must be provided that the failures are not hazardous.

A safety critical product must perform vital operations including fault detection in a reliable and timely manner.

The safety case of a product should provide evidence that it meets the above requirements. It should be demonstrated that the SIL of the product is commensurate with that of the applications for which it has been developed. Alternatively, this demonstration may be performed for specific functions rather than the product as a whole.
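The commensurability check above, including the per-function alternative, can be sketched as a simple comparison. The function names and SIL values are invented for illustration; only the rule "demonstrated SIL must be at least the required SIL" comes from the text.

```python
# Sketch: a product (or each of its functions) must demonstrate a SIL
# at least equal to what the application requires. Names and values
# below are illustrative assumptions.

def sil_commensurate(demonstrated_sil, required_sil):
    return demonstrated_sil >= required_sil

# Per-function demonstration, as the text allows:
demonstrated = {"vital output control": 4, "event logging": 2}
required = {"vital output control": 4, "event logging": 1}

ok = all(sil_commensurate(demonstrated[f], required[f]) for f in required)
print(ok)   # → True
```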

The Safety Case must demonstrate, to the satisfaction of the notified body, the operator, the user of the product and the suppliers themselves, that these requirements have been satisfied.

It may be preferable to develop and issue the safety case in stages. These stages may be linked to life cycle phases, delivery milestones or design reviews. The advantage of this approach is that it gives the stakeholders in the product visibility of the development of the safety case, and thus provides an opportunity for early comment.

The proposed contents of the safety case at each stage, and an outline safety case describing the proposed format and contents, should be issued at a very early stage in the project.

4.4.3.3. Purpose of the safety case

In general, a Safety Case must provide a clear, comprehensive, convincing and defensible argument, supported by calculation, procedure and management, that a product will provide a framework within which a design may be realised and implemented, and will be acceptably safe throughout its life.

The safety case for a product will assist in the safe implementation of an application using the product, and will provide a major contribution to the application safety case.

In turn, the safety case developed for the application will be built into the safety cases of higher level systems, and finally into the overall railway system Safety Case. This will contain, or reference, Hazard/Error Logs, design decisions, a history of development and use, and concluding safety arguments for all the components of the system, including the safety critical products. In total this will provide the safety argument for the railway.

As well as aspects of product integration in the application, maintenance must be described in detail, defining what maintenance is required and by whom, where it will be performed, the training and spares required, and any maintenance aids needed, to ensure the level of safety of the product is upheld.

A requirement of [CEN03] is that the operational safety of a railway system must be monitored to ensure that the safety features of the design remain valid during use. This should include the monitoring of safety-related performance and its comparison with the performance predicted in design analyses, the assessment of failures and accidents to establish actual or potential failure trends, and the identification from these of changes required to improve safety performance. This is clearly the responsibility of the end user. However, it is this requirement which makes it essential that the owner or duty-holder responsible for the operation of the railway has access to the sources of any information needed to enable the assessment of safety performance and to propose and implement any changes needed. Clearly, therefore, support may be required from any of the suppliers, and it is essential that appropriate commercial agreements are in place to enable the necessary modifications to be made and any revisions to the safety arguments to be developed.

For details of the requirements for the Safety Case for software programmed into, or used directly for developing, safety critical elements of the architecture, refer to the CASCADE Generalised Assessment Method [GAM01]. [GAM01] should also be used to assess any software development tools or utilities which contribute directly to the integrity level of any aspect of the architecture.

4.4.3.4. Content of the product safety case

The Safety Case shall include sections on the following:

- Contents
- High-level documentation
- Safety Management Documentation
- Safety Objective
- Description of the Architecture
- Functional Elements
- Safety Studies
- Ownership & Responsibilities - Operation, Evolution, Modification
- User Support
- Conclusion - Safety Argument Summary

A more detailed description of the safety case is given in ANNEX II.

4.5. Quality Assurance provisions

Quality actions need to be carried out to ultimately obtain certification of a product. These shall be described, by the supplier, in a Quality Assurance Plan. The Quality Assurance Plan describes the practices, means and sequence of activities related to quality. The plan shall apply to all activities and to the whole product life cycle. The approach for the Quality Assurance Plan is specified in EN 29001.

The manufacturer shall be fully compliant with the requirements of EN 29001.

4.5.1. Relationship of Quality Assurance to other Plans

The Quality Assurance Plan is applicable to all the supplier's activities. These are described in the management plans that have been prepared to carry out these activities. The measures defined by the Quality Assurance Plan are, consequently, complementary to the management plans and can be detailed in all other plans derived from them.

The Quality Assurance programme is written based on the approach defined in the management plans. These plans are based on the development process described in chapter 4.3 of this document.

EN 29001 requires that the supplier sets up an organisation capable of assuring the design and production quality of the product.

To achieve this, the supplier will ensure that his plans are consistent with the safety and reliability requirements of the product and that these plans are correctly implemented.

Figure 8 shows the relation between the various plans:

Figure 8: Structure of Plans

The figure makes a distinction between:

1. The organisation documentation: this should describe the organisational structure under which the product has been developed, and define the roles, responsibilities and reporting structure of the personnel involved in management, development, safety, maintainability, reliability and user support. A named organisation chart showing the persons working on the project shall be kept available.

2. The development plan: this plan defines the development of the product in terms of development stages and establishes the criteria for demonstration and acceptance that each stage has been completed. This is a "living" document which must reflect not only the original plan, but also the actual life cycle of the development that took place.

3. The Quality plan: the Quality Plan defines the quality requirements that will be applied to all aspects of the work in developing the product. This includes the Quality Management System (QMS) used on the project, together with a traceable path to enable demonstration that the QMS is in accordance with EN 29001 and related standards. Separate Software and Hardware Quality Plans could be prepared in addition to the Quality Plan. The producers (internal entities or subcontractors) use these plans to prepare their own plans and include specific provisions in their activities.

4. The Safety plan: the Safety Plan defines the way in which the safety of the product is to be assured. Details of the techniques and processes to be used, at what stage they are to be used and how the findings of each analysis are to be addressed as part of the development process shall be described.

5. Configuration management plans: this document describes the principles and processes by which the product under consideration has been controlled throughout its life cycle, from conception through detailed specification, design, build and validation. The Configuration Management Plan should detail the timing of design reviews, configuration baselines, status reporting mechanisms and procedures for deviation from prescribed processes. This document is vital since traceability is a central requirement of a Safety Case, and rigorous traceability is only truly achievable when all evidence is from configured sources.

6. Verification & Validation (V&V) Plan: this document defines the objective and approach to be adopted in demonstrating that the requirements described in the Requirement Specification documentation, and the safety criteria drawn from the various safety analyses, have been met. Procedures for, and evidence of, traceability of specific requirements to particular test elements of V&V activities shall be briefly described, and appropriate detailed documentation should be referenced.

5. ASSESSMENT AND CERTIFICATION PROCESS

5.1. Introduction

This chapter contains a proposed assessment and certification process which is derived from the European directives [DIN01], [DIN02] and [DIN03].

The main objective of an assessment is to gain confidence that the product meets its safety requirements specification. The final objective is to obtain the certification of the product. The certification is based on the result of the assessment.

The main concepts for an assessment are repeatability, reproducibility and impartiality (sources [ITS01], [ITS02]). An assessment is repeatable if the repetition of the assessment of the same product, with the same safety requirements specification, evaluated by the same assessor, gives the same overall verdict as the first assessment (sources [ITS01], [ITS02]). An assessment is reproducible if the repetition of the assessment of the same product, with the same safety requirements specification, evaluated by another assessor, gives the same overall verdict as the first assessment (sources [ITS01], [ITS02]). An assessment is impartial if it is free from bias towards achieving any particular result (sources [ITS01], [ITS02]).

An assessment can be concurrent or consecutive. If the assessment is done after the development of the product, the assessment is consecutive. If the assessment is done in parallel with the development of the product, the assessment is concurrent. For a consecutive assessment, all of the assessment inputs (documentation, hardware, software, etc.) are available at the beginning of the assessment. For a concurrent assessment, the assessment inputs become available as the development progresses (sources [ITS01], [ITS02]). The recommended approach is the concurrent assessment, as it allows problems to be resolved at an early stage. Where existing products are utilised, a consecutive assessment is the only solution. For new designs, all assessments should be concurrent (sources [ITS01], [ITS02]).

The assessment criteria describe the elements of proof necessary for the assessment. The information on the product must be as clear and complete as possible, and the assessors should have a good understanding of the product, particularly the safety requirements specification. An assessment is based on preliminary analysis, observation, theory and experimentation.

The preliminary analysis of the product is very important. The assessment requires inputs from the supplier. These inputs should include a description of the product and a set of requirement specifications which provide a sufficient level of detail to undertake the assessment.

A product is composed of components and each of these may be composed of lower-level components. It is essential that the requirements at product level are broken down to all components and that each low-level requirement is traceable to a top-level requirement. Safety requirements should be clearly identified separately from other requirements.
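The traceability check described above can be sketched as a small program. This is an illustrative sketch only, not part of the ACRuDA methodology; all names (`Requirement`, `untraceable`, the example IDs) are hypothetical. Each requirement either is top level (no parent) or refines a parent, and the check flags any requirement whose parent chain does not reach a top-level requirement.

```python
# Illustrative sketch: verifying that each low-level requirement is traceable
# to a top-level product requirement. All names are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Requirement:
    req_id: str
    parent_id: Optional[str]   # None marks a top-level product requirement
    is_safety: bool = False    # safety requirements identified separately

def untraceable(requirements: dict[str, Requirement]) -> list[str]:
    """Return the IDs of requirements whose parent chain does not
    reach a top-level requirement (missing parent or cycle)."""
    bad = []
    for req in requirements.values():
        seen, cur = set(), req
        while cur.parent_id is not None:
            if cur.parent_id not in requirements or cur.req_id in seen:
                bad.append(req.req_id)
                break
            seen.add(cur.req_id)
            cur = requirements[cur.parent_id]
    return bad
```

For example, a component requirement pointing at a parent ID that does not exist would be reported as untraceable, which is exactly the evidence gap an assessor would have to raise.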

An assessment is successful if all the assessment criteria are satisfied. For each criterion assessed, there are three possible outcomes:

success: evidence was presented that satisfied the criterion,
fail: evidence was presented that showed the criterion has not been satisfied,
to be confirmed: there was insufficient evidence, time or resources to state whether the criterion passed or failed.

By the end of the assessment, every criterion classed with a "to be confirmed" verdict must be given a "success" or "fail" verdict.
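The verdict rules above can be expressed compactly. The following is a minimal sketch with hypothetical names (`Verdict`, `overall_verdict`), not a prescribed implementation: the overall result is a success only if every criterion succeeded, and an assessment with any unresolved "to be confirmed" verdict is not yet complete.

```python
# Illustrative sketch of the per-criterion verdicts and the closing rule.
from enum import Enum

class Verdict(Enum):
    SUCCESS = "success"
    FAIL = "fail"
    TO_BE_CONFIRMED = "to be confirmed"

def overall_verdict(criteria: dict[str, Verdict]) -> Verdict:
    """Overall verdict: success only if every criterion succeeded.
    Raises if any 'to be confirmed' verdict is still unresolved."""
    if any(v is Verdict.TO_BE_CONFIRMED for v in criteria.values()):
        raise ValueError("assessment incomplete: unresolved 'to be confirmed' verdicts")
    if all(v is Verdict.SUCCESS for v in criteria.values()):
        return Verdict.SUCCESS
    return Verdict.FAIL
```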

5.2. Description of the assessment and certification process

Figure 9 shows the overall assessment and certification schema with the bodies involved, the roles of the bodies and the data exchanged between them.

Figure 9: Assessment and certification schema

5.3. Role of the different bodies

5.3.1. Role of the European Union (EU)

The EU gives an identification number to each notified body and publishes all the information on the notified body in the Official Journal of the European Communities. The EU keeps the official list of notified bodies (sources [DIN01], [DIN02]). The EU is in charge of defining the directives (for example, the Interoperability Directive for the European Railway High Speed System) and the standards (for example: [CEN01], [CEN02], [CEN03], [EN01] to [EN07]).

The European Union can be helped by a committee made up of representatives of each country and chaired by the European Union. If it exists, this committee is in charge of defining the European policy for assessment and certification in Europe. It defines the procedures, methods, rules and criteria for assessment and certification for all the countries (sources [DIN01], [DIN02]).

5.3.2. Role of the authority of an EU member state

The authority appoints notified bodies. In order to maintain the notification, the authority must regularly monitor the competence and the independence of the notified bodies (sources [DIN01], [DIN02]).

Details of the notified bodies are published in the Official Journal of the European Communities (sources [DIN01], [DIN02]).

The authority establishes the national accreditation body (the accreditation body must function in conformance with the EN 45ACC and EN 45ASS project standards).

5.3.3. Role of the accreditation body of a member state

The accreditation body grants accreditation to [EN01] to the notified body and the assessors. The accreditation body regularly monitors that the notified body complies with the [EN01] standard.

5.3.4. Role of the notified body

The notified body provides a third-party assessment. It is competent to fulfil the tasks related to the assessment of conformity provided for in the European directives (sources [DIN01], [DIN02]).

The notified body is governed by the laws of the member state which notifies it (source [DIN01]).

The notified body can be accredited ([EN01] standard). Otherwise, the member state must justify to the European Union the competence (with respect to these standards) of the notified body (source [DIN02]).

The employees of the notified body must be independent (remuneration not proportional to the number of assessments performed) and are bound by professional secrecy (industrial property) (source [DIN01]).

The notified body must take out civil liability insurance (source [DIN01]).

The notified body leads the assessment and defines the procedures and the means to fulfil the assessment of the product. The notified body may use external assessors to perform some parts of the assessment work.

Upon satisfactory assessment results, the notified body will issue a certification report and a certificate for the product.

The notified body is responsible for the assessment technical report (source [DIN01]).

The notified body (sources [DIN01], [DIN02]) maintains and publishes the lists of:

assessment requests (past and in progress),
certificates refused,
certificates delivered.

5.3.5. Role of the sponsor

The sponsor is the person or body who requests the assessment to show that the product meets the safety requirements specification. The sponsor orders the assessment (it requests and finances the assessment). The sponsor is responsible for the appropriate utilisation of the certification report and the certificate (sources [ITS03], [ITS04]).

The sponsor may choose a notified body from any European country (source [DIN01]).

5.3.6. Role of the assessors

The assessors are bodies of proven integrity, independence (notably financial) and technical competence. They must be independent of the design, manufacture, marketing, maintenance and operation of the product (sources [DIN01], [DIN02]).

The employees of the assessors must be independent (i.e. remuneration must not be based on the number of certificates issued) and are subject to confidentiality requirements (sources [DIN01], [DIN02]).

The assessors must be accredited to [EN01].

5.3.7. Role of the supplier

The supplier designs, develops and validates the product according to current European standards (quality, safety, development, organisation, documentation: [CEN01], [CEN02], [CEN03], etc.) and directives.

The supplier demonstrates the safety of the product according to the defined level (in the case of ACRuDA, the level is SIL4). The evidence and argument that the product is safe are contained in the safety case.

5.4. Phases of the assessment and certification process

The assessment and certification process is divided into three main phases: preparation, assessment and certification.

5.4.1. Phase I: preparation of the assessment

The sponsor must give a precise description of the product and define the safety requirements specification and the boundaries of the assessment. When the sponsor is not the supplier, the participation of the supplier is strongly recommended.

The sponsor contacts a notified body and requests an assessment of the product (sources [ITS03], [ITS04]).

The notified body consults the assessors (internal and/or external assessors). A preliminary analysis of the product description and the safety requirements specification must be done by the notified body and the assessors, to check the completeness and the coherence of the two descriptions. On the basis of the product description and the safety requirements specification, the notified body and the assessors make an assessment feasibility study. The possible results of this study are:

a) The result of the feasibility study is positive: the notified body and the assessors define an assessment plan. The structure of the assessment plan is given in ANNEX III. This plan contains a detailed assessment work plan with a detailed assessment inputs delivery plan. The assessment plan is submitted for approval to the sponsor and the supplier. The sponsor and the supplier (if the sponsor is not the supplier) prepare an assessment case which contains:

the description and boundaries of the product,
the safety requirements specification,
the assessment work plan,
the assessment inputs delivery plan,
the confidentiality clauses (delivery of assessment inputs, etc.),
the identification of the notified body and the identification of the assessors.

The assessment case is transmitted to the notified body. The notified body can make remarks and comments on the assessment case (in particular, it can focus on points which could be problems for the delivery of the certificate).

The notified body draws up a contract for the assessment (based on the assessment case). The contract is signed by the notified body and the sponsor. The assessment is registered by the notified body in its list of assessments in progress.

The assessment can be divided into one or more work packages. One assessor can be in charge of all the work packages. If external assessors (subcontractors) are required, contracts are signed between the notified body and the subcontractors before the beginning of the assessment. These contracts must define the assessment work, the assessment inputs delivery plan and the financial terms.

The assessment plan can be annexed to the contract. It can have a draft status. This plan can be modified during the assessment (because of changes of documentation, tools, increase of information, etc.).

b) The result of the feasibility study is negative: the notified body asks the sponsor and the supplier to make the necessary changes.

5.4.2. Phase II: assessment

The notified body and the assessors proceed with the assessment. The supplier produces the assessment inputs according to the assessment plan. The sponsor is responsible for the delivery of the assessment inputs. The assessment inputs are delivered according to a delivery protocol and the confidentiality clauses established during the preparation phase of the assessment. Confidentiality clauses are applied: if the supplier wants to protect its industrial knowledge, the assessment inputs go directly from the supplier to the notified body, but the sponsor is informed of the delivery of the assessment inputs.

The notified body (with internal and/or external assessors) executes the assessment tasks defined in the assessment plan. For each task, reports are regularly produced to control the progress of the assessment tasks. The assessment inputs are analysed in conformance with the criteria. The assessors must verify the product according to the criteria. During the assessment, the notified body and the assessors must seek to understand the product which they investigate. In particular, they must seek to understand whether the product can behave in any way contrary to the safety requirements specification; in other words, they seek to discover potential risks of hazardous failure. It is recommended that the assessors build models and carry out experiments and observations on the product. All the assessment criteria must be verified.

During the assessment, some problems can be detected: non-delivery of an assessment input, refusal to correct a design, etc. These problems are called anomalies and they must be given specific treatment. They must be analysed to determine their consequences on the assessment. It is necessary to take these anomalies into account as early as possible in the assessment process. A special procedure to treat the anomalies must be defined, and all the bodies involved in the assessment must be informed. In general, there are two categories of anomalies: minor and major. Minor anomalies can be easily corrected and are registered in the assessment report. Major anomalies are recorded in anomaly reports. An anomaly report can contain:

the activities during which the anomaly was detected,
the description of the anomaly,

Each anomaly report is examined and validated by the notified body and the assessors. These reports are sent to the sponsor and to the supplier. All the anomalies must be treated. The supplier can dispute an anomaly, but it must have good arguments to convince the notified body and the assessors. The sponsor can also dispute an anomaly if it judges that the treatment of the anomaly can have important consequences on the assessment. In all cases, all the anomalies must be resolved by the end of the assessment, and the decision to close an anomaly must be taken with the agreement of all the partners involved in the assessment.
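The anomaly rules above lend themselves to a small model. This is an illustrative sketch under stated assumptions, not part of the deliverable: the `Anomaly` record, the partner names and the `close` helper are hypothetical. It captures the minor/major classification and the rule that an anomaly is only closed once every partner involved agrees.

```python
# Illustrative sketch of anomaly handling: minor/major severity, and
# closure only with the agreement of all partners. Names are hypothetical.
from dataclasses import dataclass, field
from enum import Enum

class Severity(Enum):
    MINOR = "minor"   # easily corrected, registered in the assessment report
    MAJOR = "major"   # recorded in a dedicated anomaly report

PARTNERS = {"notified body", "assessors", "sponsor", "supplier"}

@dataclass
class Anomaly:
    description: str
    detected_during: str
    severity: Severity
    agreements: set = field(default_factory=set)  # partners who agreed to close
    closed: bool = False

def close(anomaly: Anomaly, agreeing_partner: str) -> bool:
    """Record one partner's agreement; the anomaly closes once all agree."""
    anomaly.agreements.add(agreeing_partner)
    anomaly.closed = anomaly.agreements >= PARTNERS
    return anomaly.closed
```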

At the end of each assessment task, an assessment report, which contains the results of the assessment work, is written. Each assessor writes an assessment report for the notified body. Each assessment report is examined and internally approved by the notified body. The assessment reports can contain confidential information, so their diffusion must be controlled. These assessment reports are sent for approval to the sponsor and the supplier. Confidentiality clauses are applied: if the supplier wants to protect its industrial knowledge, the assessment reports are only sent to the supplier, but the sponsor is informed of the delivery of the reports. The assessment reports contain:

the objectives of the task,
the assessment inputs,
the criteria applied,
the description of the work achieved by the assessors,
the techniques, methods and tools used for the assessment work,
the results of the assessment,
a proposal for the verdict of the assessment,
a description of the anomalies detected,
the time and resources used for the task.

Where the work packages are apportioned among several assessors, it is necessary to make a synthesis of all the assessment reports in a final technical assessment report. The notified body is responsible for the constitution of the technical assessment report. The technical assessment report contains a description of all the work achieved by the assessors, all the results of the assessment and the conclusion of the assessment. Sometimes, restrictions on the use of the product can be mentioned in the report. All the references of the technical assessment report must be available.

When the technical assessment report is internally approved by the notified body, it is sent for approval to the supplier. The technical assessment report can contain confidential information, so its diffusion must be controlled. Confidentiality clauses are applied: if the supplier wants to protect its industrial knowledge, the technical assessment report is only sent to the supplier, but the sponsor is informed of the delivery of the report. The diffusion of the technical assessment report to entities not involved in the assessment is subject to the approval of the sponsor and the supplier.

An evaluation rating can be regarded as the assignment of a pass/fail verdict. A pass verdict is assigned if all criteria are satisfied and, in particular, no risk of hazardous failure has been found. A fail verdict is assigned if any error is found and not corrected, or if a risk of hazardous failure is found.

5.4.3. Phase III: certification

This is the final step. The technical assessment report, containing the results of the assessment, is approved by the notified body and the supplier. Confidentiality clauses will be applied. On the basis of the technical assessment report, the notified body summarises the conclusion in the certification report. The certification report is a public document. When end users use the product, they may only have access to this document. Consequently, the certification report must contain all the observations, measures and recommendations necessary for a safe use of the product.

When the certification report is approved by the notified body and the sponsor, the notified body delivers a certificate. The certificate is signed by the sponsor and by the notified body. The product is added to the list of certified products. The certification report and the certificate are published in official national and European documents.

The structure of the technical assessment report is given in ANNEX IV, the structure of the certification report in ANNEX V and the structure of the certificate in ANNEX VI.

5.5. Re-use of assessment results and products composed of assessed/certified components

An assessment is a complex process which demands a lot of time, significant resources and money, depending upon the complexity of the product and the integrity level. The certification report and the certificate for a product are valid only for the assessed version and configurations of the product. To limit the amount of work required for an assessment, and when it is possible, it can be interesting to re-use assessment results from a previous product assessment. There are two cases where the re-use of assessment results can be applied (sources [ITS02], [ITS04]):

a new version of a product,
a new product which uses assessed/certified components, tools, methods, principles, techniques, theoretical studies, etc. (all of these will be called assessed/certified components in this chapter).

The way of doing the assessment is different in the two cases.

In the first case, it is a new version of the product. The supplier must identify, by a clear and precise analysis, and must describe in a report, the modifications and the consequences of these modifications on the safety of the product. The supplier or the sponsor must submit this report to the notified body. The notified body analyses the report and decides whether or not it is necessary to re-assess the product. A re-assessment is identical to the assessment described in chapters 5.1 to 5.8, except that some results of the previous assessment of the product can be re-used. If the re-assessment is successful, the certification report is written and the certificate is delivered by the notified body. If no re-assessment is needed, the notified body extends the certificate to the new version of the product (sources [ITS02], [ITS04]).

In the second case, the situation is different because the product is new but it uses some assessed/certified components. The sponsor requests the assessment of this new product. This assessment is considered as a totally new assessment as described in chapters 5.1 to 5.8. As stated before, some parts of the product have been assessed/certified in a previous assessment, and it is possible to re-use some results of these previous assessments during the assessment of the new product. The assessors must carefully verify that the assessed/certified components are correctly used in the composed product (in particular: verification of the interfaces, and verification that the use of the assessed/certified components cannot degrade the safety of the composed product). If the assessment is successful, the certification report is written and the certificate is delivered by the notified body (sources [ITS02], [ITS04]).

In all cases, all the partners involved in the assessment and certification process must be careful in the re-use of previous assessment results or with products composed of assessed/certified components.

5.6. Capitalisation of assessment work

During an assessment, the assessors may encounter difficulties in the application of methods, techniques and tools. It is important to record these difficulties and their solutions in a document. The objective is to improve and facilitate future assessments. The capitalisation is focused on two subjects: the assessment methods and the development methods. The progress in the assessment and development methods is recorded in a capitalisation report. This report is the property of the notified body.

This report must cover all the methods, techniques and tools used during the assessment, and the lessons learned and benefits identified by the assessors.

This report must also give the opinion and judgement of the assessors on the methods, tools and techniques used by the supplier for the development of the product.

This report can contain some confidential information, but some of the results (evolution of the criteria, new methods, etc.) can be published to the overall community of assessors. The objective is to improve the global quality of assessment in the European Community.

5.7. Certification report and Certificate

The certificate attests that the assessment was achieved correctly, with impartiality and competence, in accordance with the criteria, procedures and schema.

The certificate is valid for the assessed version and configuration of the product. The safety of the product may reasonably be assumed for the correct use of the product in accordance with the recommendations of use contained in the certification report.

The certification report and the certificate are the property of the notified body. The reproduction and publication of the two documents are authorised only if they are reproduced in full.

The notified body can withdraw the certificate (for example, if it is discovered that the data supplied during the assessment were not accurate).

A certification report structure is proposed in ANNEX V and the certificate structure is proposed in ANNEX VI of this document.

5.8. Assessment inputs

The assessment inputs are all the data necessary to achieve the assessment (hardware, software, documentation, tools, standards, etc.). Figure 10 shows the assessment inputs needed to achieve an assessment: harmonised criteria (the basis of the assessment); regulations, rules, laws and standards; the product and the safety requirements specification; and a set of tools and methods.

Figure 10: Assessment inputs

The assessors are not concerned with the relationship between sponsor and supplier. It is recommended to define, before the beginning of the assessment, a complete list of the assessment inputs with the dates of delivery. The following points should be defined:

the medium and the format of the assessment inputs (computer medium, tape, paper, etc.),
the schedule for the delivery of the assessment inputs,
the number of each assessment input to deliver,
the policy for provisional assessment inputs,
the development environment,
access to the development site.

During the assessment, the assessors will have access to confidential information (industrial protection). All the bodies involved in the assessment process must have the assurance that all the information will stay confidential. This influences many aspects of the assessment process (reception, management, storage and restitution of the assessment inputs).

5.9. Essential Quality Requirements for the assessment activities

5.9.1. The normative context for assessment requirements

The normative references for the assessment activities on vital architectures are few:

- EN 29001 series
- [EN01] to [EN07] standards
- International standards
- National standards
- Standards on safety analysis methods

5.9.2. Quality System of the Assessor

The assessment phase is an essential phase in the life cycle of the product (see the [CEN01] life cycle). The aim of the assessment is to give the final users confidence in the safe use of the product. In this context, a quality system for the assessment activity is highly recommended. This quality system has to be described in a Quality Handbook.

Hereafter are the ACRuDA recommendations for the content of the Quality Handbook of the assessor.

5.9.3. Quality Handbook

The quality handbook can be written following the EN 29001 series, but some specific issues for the assessment activities are presented here.

5.9.3.1. Background referential of the assessment

The referential used by the assessor includes the best practices and the applicable norms, standards and regulations. The domain of railway safety architectures is subject to numerous regulations and norms. The assessor should therefore clearly specify, in the quality handbook, the norms that will be checked in its assessment.

The quality handbook should explain how the assessor makes sure that it always has the up-to-date referential.

The risk of having a bad referential is to produce a certification which will not be recognised by the other European partners.

5.9.3.2. The quality requirements on methods for assessment

The quality handbook should explain which provisions are made to identify and qualify the validity domain of the methods used for assessment.

These methods should be consistent with the referential of the assessor. The assessor should make sure that these methods have been previously tested in safety applications. The assessor should use methods that have been defined by national or international standards.

The "training" chapter of the handbook should explain how the assessor is able to use the methods.

It is recommended to use methods which lead to objective results as far as possible. This is the best assurance of objective and reproducible results. It is recommended to have a wide panel of methods available for the assessment, so that the verification is strengthened by a diversification of the studies and points of view taken by the assessor.

The risk of misusing the methods is to get an ineffective evaluation.

5.9.3.3. The quality requirements on tools for assessment

For the tools used for measurement and test, it is recommended to apply the requirements of [EN01] on equipment.

Qualification procedures should be described to qualify the tools of the assessor.

The automation of the tests and analyses is recommended as far as possible, to guarantee the reproducibility of the evaluation and to limit the human error factor.

The risk of not applying the quality requirements on tools is to obtain bad measurements.

5.9.3.4. The safety audit

A procedure for the safety audit activity should be referenced in the quality handbook. Some quality requirements for audit traceability must be ensured: an audit plan should be written and a list of documents reviewed should be produced.

5.9.3.5. The Configuration management

The Quality handbook should explain the configuration management applied to:

- internal documentation: quality procedures, referential, tools
- assessment reports, anomalies
- documentation of the developer

The general policy for the configuration management of the product under assessment (versions of the software, versions of the components of the architectures) should be presented too. This is a very important requirement for the assessor because the risks are major:

to assess a version of the product which is not the version in operation,
to have gaps in the evaluation of the development process,
to lose the traceability of the assessment.

Furthermore, in railways the life cycle duration of the products can be 30 years. This requires a configuration system to keep the safety documentation (including the assessment reports) valid over this duration. The quality handbook should state whether the assessor provides a service such as recording the different assessment reports made on a product.

5.9.3.6. The assessment reports and anomalies

The assessment activity will produce assessment reports and anomalies. We suggest referring to the [EN01] requirements on reports. The closure of the safety anomalies should be subject to strict conditions. The anomalies that remain open should be presented in the final report and should then imply restrictions in the certification or use of the architecture.

The risk of poor traceability of the reports and anomalies is to forget some important points and restrictions of the certification.

5.9.4. Human issues

5.9.4.1. Competence and knowledge of the assessor

The legitimacy of the assessor is mainly founded on this characteristic. This issue should therefore be specially detailed in the Quality Handbook. The competence and the knowledge of the assessors should be closely related to the technology, processes and methods used by the developer in railways. The "training" chapter should explain how the competencies and knowledge are kept up to date.

The use of experts in the assessment team should be organised too. Their activity should be subject to an assessment project review with the rest of the team, and their independence towards the methods and product evaluated should be demonstrated. Furthermore, it is good to keep some experts out of the assessment process as a potential resource in case of conflict.

The risk of a lack of competencies is to be unable to perform the investigations correctly or to judge the technical criteria. The risk of a lack of knowledge in the specialised domains is to overlook a problem, or to refuse a new technology even if it could bring some improvement to the safety process.

5.9.4.2. Organisation of the assessment team

The responsibility of the assessors for their results is important and it is recommended to have an organisation that deals with this specificity:

double control teams,
regular internal reviews,
nominative organisation,
description of the signature process and responsibilities of each member,
a project-oriented team.

The risk of a bad organisation is to lose time and money, and to put unnecessary stress on the members of the assessment team.

5.9.4.3. Independence of judgement

The legitimacy of the assessor is mainly founded on this characteristic. It is recommended to satisfy the requirements of impartiality of [EN01] and the Interoperability directive. The risks of a lack of independence of judgement are: to hide some anomalies, to reject the product of a competitor, or to accept weak justifications of the developer's process.

5.9.4.4. Confidentiality of the developer's innovations

A minimum set of procedures in the quality system should explain the protection applied to reduce the vulnerability of the information given by the sponsor and produced by the assessor. The level of security reached should be defined.

5.9.4.5. Publication of the results of the assessment

The assessor should define a procedure to make sure that the information needed by the notified body and the final user is not altered or changed in its reports.

5.9.4.6. Subcontractors of the notified body

The subcontractor should comply with the same quality requirements as the notified body itself. Furthermore, an acceptance procedure should be defined to accept the work of the subcontractor.

5.9.4.7. Environment organisation

The relationship between the assessor and other partners should be presented in the Quality Handbook.

5.10. General concepts for assessment and certification of software.

Software is always embedded in a more complex environment, and the interaction between the software and its environment determines the quality of the software product. Software is not subject to wear and tear, so it will not deteriorate in the course of time. Therefore, the specification of the software and the processes leading up to the generation of code will be considered in great detail, whereas the actual performance of the code will play a less important role.

WCS_AP v.03 Test report

3.11.2010 12:59

Page 43: ACRuDA Deliverable D3

5.10.1. Software design

Software assessment and certification is basically assessment and certification of design. Therefore, the methods and tools that are used in the design process must be assessed in order to determine if they lead to the desired quality. Here, quality encompasses error avoidance, error correction and error tolerance.

Error avoidance is clearly an activity that must be performed during the design process. It can be facilitated by the use of recognised design and control principles. Error correction is used here to mean an activity performed by the software (at run-time) to correct recognisable errors before they have any effect. Error tolerance is the ability of the software to function correctly even if certain boundary conditions are not fulfilled.

5.10.2. Algorithms and formal methods

Assessment and certification of software is thus also assessment and certification of algorithms. Not only the algorithms for performing the specified functions, but also the error correction algorithms must be assessed. This allows for the use of very formal proofs, provided a formal description of the algorithm is possible. Since software is not subject to wear, such proofs can be exceptionally generic.

5.10.3. Verification and validation

Verification is the process of demonstrating that the software truly fulfils the specified requirements; validation is the process of demonstrating that the requirements were correct. For hierarchically structured software, validation of requirements at a lower level consists of demonstrating that they correspond to at least part of a requirement at the next higher level. Then, only the top level requirements must be validated against the safety and reliability requirements of the encompassing system.

If all requirements at all lower levels can be validated against requirements at a higher level, then verifying the requirements at the bottom will very often also be a verification of the higher levels.

This must of course be confirmed, and such confirmation is part of the assessment and certification process. But when that can be done, the subsequent validation of the top level requirements becomes the remainder of the assessment and certification process. And that is where the encompassing hardware and its operational context must be considered.
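The hierarchical argument above can be made concrete as a traceability check. The sketch below is illustrative only (the requirement identifiers and levels are invented, not taken from ACRuDA): every requirement below the top level must trace to a requirement exactly one level up, so that only the top level remains to be validated against the encompassing system.

```python
# Minimal requirements-traceability check for hierarchically
# structured software. All identifiers here are hypothetical.

requirements = {
    # id: (level, parent id, or None for a top-level requirement)
    "SYS-1": (0, None),
    "SW-1":  (1, "SYS-1"),
    "SW-2":  (1, "SYS-1"),
    "MOD-7": (2, "SW-2"),
}

def unvalidated(reqs):
    """Return requirements that cannot be validated by tracing:
    non-top-level requirements with a missing or wrong-level parent."""
    bad = []
    for rid, (level, parent) in reqs.items():
        if parent is None:
            if level != 0:
                bad.append(rid)   # claims no parent but is not top level
        elif parent not in reqs or reqs[parent][0] != level - 1:
            bad.append(rid)       # dangling or level-skipping trace
    return bad

print(unvalidated(requirements))           # a complete trace: []
requirements["MOD-9"] = (2, "SYS-9")       # dangling parent
print(unvalidated(requirements))           # ['MOD-9']
```

Only requirements flagged by such a check need individual validation against the higher level; the rest are covered by the hierarchical argument, once the tracing itself has been confirmed.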

5.10.4. Interfaces

At the beginning of this section it was pointed out that software is always embedded in a more complex environment, and that the interaction between the software and its environment determines the quality of the software product. This interaction is defined through the interfaces between the software and its environment. Thus, the correct definition and implementation of the interfaces must be confirmed. Confirming the correct definition of the interfaces is part of the validation task; confirming their correct implementation is part of the verification task.

6. HIGH LEVEL ASSESSMENT CRITERIA

6.1. Introduction

Harmonised criteria are necessary to obtain the mutual recognition of the assessment results of safety critical digital architectures. The criteria presented hereafter are a first set of basic criteria to guide assessment. They can help an assessor to organise his assessment plan and assessment activities. Assessment of any safety critical digital architecture shall be based on a declared set of criteria derived by applying the basic criteria of this document. This set of basic criteria has to be used with great care. The assessment shall begin with the formulation of an assessment plan, detailing the scope of the assessment and its basis, such as the safety and reliability targets, integrity level and norms. The assessment plan and the detailed criteria shall be produced by the assessor.


The supplier of the architecture shall provide all the evidence required to demonstrate compliance with the detailed criteria. The evidence should be organised in accordance with a Safety Case Structure (see chapter 4.4.3), and shall be readily available for audit, walk-through, review and detailed examination.

The assessment should be based on the judgement resulting from the verification of a set of criteria on the following properties:

the adequacy of the safety requirements specification of the product,
the effectiveness of the solution proposed by the supplier,
the conformity of the solution implemented by the supplier.

The assessment should focus mainly on the conformity and effectiveness of the techniques and measures.

The assessment of Effectiveness is a judgement about the abstraction of the product, the safety principles or the method. Effectiveness characterises how effective the techniques and measures are in identifying and eliminating or mitigating the hazards.

Effectiveness includes:

suitability of the safety principles and mechanisms, standards, safety functions, methods and tools used to construct a safe product,
cohesion of the set of safety principles and safety critical functions,
cohesion of the set of tasks described in the safety plan.

Conformity deals with the completeness of the implementation and the accuracy of the representation of the specification. Conformity characterises how accurately the techniques and measures are implemented and how well they are explained in the supplied documentation.

Conformity can be established through answering the following questions:

does the implementation contain all the requirements that are stated in the specification?
does the implementation contain no more than the requirements stated in the specification?
is the implementation an accurate representation of the specification?
are the methods planned in the safety plan used and applied?

Where necessary, a single criterion can be broken down into several lower-level criteria in order to carry out the assessment. Each criterion shall be applied according to current best practice and experience. In addition, the assessor shall assess the design of the architecture independently, for example by carrying out, as a minimum, an independent hazard analysis.

The assessor shall provide an assessment report which should summarise the approach, findings and criteria, and provide detailed reasons why the elements of the architecture passed or failed the criteria.

The assessor shall make judgements about the evidence presented by the supplier. The assessment criteria must cover all the techniques and procedures used by the developer to achieve the integrity of the architecture. According to [CEN01], the means to achieve railway dependability relate to controlling the factors which influence dependability throughout the life of the system. Effective control requires the establishment of mechanisms and procedures to defend against sources of error being introduced during the realisation and maintenance of the system. Such defences need to take account of both random and systematic failure.

The means used to achieve dependability are based on the concept of taking precautions to minimise the possibility of failure occurring as a result of an error during the realisation phases. Precaution is a combination of:

prevention: concerned with lowering the probability of the impairment,
protection: concerned with lowering the severity of the consequences of the impairment.

The strategy to achieve dependability for the system, including the use of prevention and/or protection means, shall be justified in the safety case.


By defining a management process based on a life cycle, [CEN01] elaborates the means to ensure dependability through minimising the effects of errors and by controlling the factors influencing railway dependability (see section 6 of the standard). Methods, tools and techniques appropriate to engineering dependable systems are presented in other CENELEC standards, [CEN02] and [CEN01], and in the IEC standard [IEC01].

A general overview of the manner in which methods and techniques are used to support dependability engineering and management is given in [CEN01] (chapter 5.3.7, figure 12).

The following documentary evidence is a condition (required by the standards [CEN03] and [IEC01]) for the safety acceptance of the safety-related electronic system:

Evidence of quality management (Quality Management Report)
Evidence of safety management (Safety Management Report)
Evidence of functional and technical safety (Technical Safety Report)

These documents, included in a structured safety justification document (the Safety Case), have to present the methods and techniques used to develop the system and ensure its safety. Examples of methods and techniques to be used for the validation of safety digital architectures are given in the standards.

This chapter contains the basic criteria which are expected to provide the infrastructure and rules for understanding an assessment of a safety critical digital architecture. These assessment criteria have been derived from the State of the Art and the standards [CEN01], [CEN02], [CEN03] and [IEC01]. They provide the basis for the development of detailed criteria for the individual architectures.

6.2. Assessment activities

The following assessment activities may be used to assess the processes and products of the architecture.

6.2.1. Referential Examination

This activity aims to identify the safety requirements that have to be taken into account. The goal is to list the documentation, the standards and other information, such as assessment and certification reports, that are needed for the assessment of the computer architecture.

6.2.2. Safety Management Assessment

The safety management assessment will examine all technical and management activities, during the whole architecture life-cycle, to ensure that the safety-related systems and external risk reduction facilities allow the required functional safety to be attained.

Competence of staff, departments or other groups involved in safety management activities will also form part of this assessment.

6.2.3. Quality Management Assessment

The quality management system shall be examined systematically to assess compliance with EN 29001 and/or other applicable quality procedures specified by the developer.

6.2.4. Organisation Assessment

The aim of this element of the assessment is to assess the capability of the organisation to administer safety procedures. It has to ensure that the responsibilities of the staff and their competence and training requirements are clearly specified, and that this process is being implemented.

6.2.5. Development Phase Assessment

The assessment of the development phase shall examine all development activities in order to verify that they are undertaken in conformance with the relevant standards and with the required safety integrity level.


6.2.6. Safety Plan Assessment

The structure and the content of the safety plan shall be examined to check whether they conform to the ACRuDA Safety Plan Requirements.

6.2.7. Safety Case Assessment

The safety case structure and content shall be checked for conformance to the ACRuDA Safety Case Structure (see chapter 4.4.3).

6.3. Structure of the criteria

These criteria are primarily written for the assessors of safety critical digital architectures, but they are also expected to provide valuable guidelines for developers and users.

The basic criteria are presented in the form of process and product properties. They state the requirements for the life-cycle processes and products, and each requirement is devised to address a specific set of hazards. These requirements will, in general, be satisfied by using the relevant techniques and measures recommended by the safety critical standards. Therefore, with each set of basic criteria, a table of relevant techniques and measures is attached. These tables also identify the objects to which they apply.

The effectiveness with which these techniques control the hazards or cover the faults depends on various factors, such as their frequency of application, the accuracy of fault detection and the timeliness of fault negation. Effectiveness therefore depends on the degree of sophistication used to implement the measure. For example, the effectiveness of a coding technique may well depend on the size of the code word: the larger the code word, the more effective the implementation.
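The code-word-size remark can be illustrated numerically. As a rough model, a purely random corruption escapes a k-bit check word with probability of about 2^-k, so widening the check word sharply reduces the residual risk. The sketch below is an illustration, not an ACRuDA criterion; it estimates the escape probability by masking a CRC to k bits:

```python
import random
import zlib

def undetected_fraction(k_bits, trials=200_000):
    """Estimate the probability that replacing a data word with random
    garbage leaves a k-bit check word unchanged (theory: about 2**-k)."""
    mask = (1 << k_bits) - 1
    undetected = 0
    for _ in range(trials):
        original = random.randbytes(8)
        corrupted = random.randbytes(8)   # models a random corruption
        if corrupted != original and \
           zlib.crc32(original) & mask == zlib.crc32(corrupted) & mask:
            undetected += 1
    return undetected / trials

for k in (4, 8, 12):
    print(f"{k:2d}-bit check word: ~{undetected_fraction(k):.5f} undetected "
          f"(theory {2**-k:.5f})")
```

Real coded-processing schemes must also detect structured, non-random corruptions, so this random-error model only gives a lower limit on what the technique has to achieve.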

6.4. Process/Project criteria

A set of well-structured life cycle plans is essential for ensuring the product integrity of digital architectures. Suitable life cycle frameworks have been described in the standards [CEN03] and [IEC01], and in the ACRuDA Safety Case Structure (see chapter 4.4.3).

6.4.1. Contents

The structure and activities of the product life cycle shall provide a systematic approach to the development, production, support and maintenance of the product.

The activities required to identify, control or eliminate hazards at each life cycle phase shall be described. A structured plan of these activities constitutes the safety plan.

6.4.2. Basic Criteria

1. The life cycle plans shall cover all development phases and describe the processes used to ensure the quality, reliability, maintainability and integrity of the products.

2. The life cycle plans shall identify all the resources to be used and their essential qualities, such as the designers and their competence, the tools and their reliability, and the validation teams and their independence.

3. Each development phase shall precisely specify:
- the inputs, information and resources required to carry out the activity,
- a summary of the processes,
- its successful termination conditions,
- its outputs.

4. All development activities shall be covered by an appropriate safety plan. The safety plan should have the approval of the supplier's project manager and of the supplier's internal independent safety organisation.

5. Personnel and responsibilities:
- personnel in the safety organisation shall be suitably qualified,
- the designer/implementor shall be independent of the verifier and validator,
- personnel in the safety organisation shall have the competence to undertake this work.

6. In particular, the safety plan shall be compliant with the ACRuDA Safety Plan Requirements (see chapter 4.4.2). All life cycle activities shall be audited for compliance with the safety plan.

7. The supplier shall produce the safety case of the architecture, which shall be compliant with the requirements of the ACRuDA Safety Case Structure (see chapter 4.4.3).

8. The life cycle plan should cover the following plans:
- configuration and management plan,
- development plan,
- quality plan,
- maintenance plan,
- manufacturing plan,
- safety plan,
- verification and validation plan.

9. The techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01] shall be used to prepare and implement a life cycle plan. Variance from the recommendations of these standards should be fully described and justified. The techniques and measures from the standards which are applicable to life-cycle processes and products are listed in Table 1. This list is not complete. It is important to note that a SIL 4 architecture will need a combination of techniques and measures with which to provide a very high degree of protection against any identified hazard. For instance, a SIL 4 architecture safety planning process procedure would need to use a combination of several techniques and measures, such as checklist, audit and document review.

Activity/Object | Technique/Measure | Reference
Safety Planning and Quality Assurance | Checklist; Audit; Document inspection and walk-through of the specifications; Review after safety plan change; Review after each life cycle phase | [CEN03]-E.1
Resource Qualities | Repetitive and regular staff training; Completely independent designers, validators and assessor; Highly qualified and experienced staff | [CEN03]-E.3
Project Management | Definition of tasks and responsibilities; Consistency procedures after modification; Configuration management; Monitoring and control of project status | [IEC01]-I 7
Documentation | Guidelines for organisation scheme; Checklists for contents, unique form, on-line documents, formalised revision, interface description, environment studies, modification and maintenance procedures, manufacturing and application documents | [IEC01]-I 7 & G.2; [CEN03]-E.8
Construction | Use of well-tried and approved components | [IEC01]-I 7
Testing and Validation | Black-box testing from cause-consequence diagrams and boundary value cases; Statistical testing with a realistic distribution of input data and assumed failure modes; Proven by use | [IEC01]-I 7
Manufacturing | Requirements, precautions and audit plan of the actual manufacturing process by the safety organisation | [CEN03]-E.9
Installation and Maintenance | Requirements, precautions and audit plan of the actual installation and maintenance processes by the safety organisation | [CEN03]-E.9

Table 1: Life Cycle - Techniques and Measures

6.5. Requirements

A systematic approach to requirements development is essential to ensure high integrity.

6.5.1. Contents

The functionality, integrity, reliability and performance requirements of the architecture should be specified.

The desired features, such as protection against some specific component faults or a target time for fault detection, are regarded as an integral part of the requirements.

6.5.2. Basic criteria

1. The approach for establishing and identifying detailed requirements shall be described. This should include procedures for:
- derivation of the safety target from the top level architecture,
- decomposition of system level requirements into lower level requirements specifications,
- verifying the consistency of requirements,
- tracing their relationships to the design objects, components and code,
- providing traceability to the test specification, to enable testing of each requirement to validate the system,
- mechanisms to ensure that changes to requirements are fully controlled.

2. The safety critical digital architecture shall meet the SIL 4 requirements as prescribed by the standards [CEN03] and [IEC01].

3. The safety requirements shall consider human factors issues: reliability of the operators, information overloading, operator errors, etc.

4. The techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01] shall be used for the requirements. Variance from the recommendations of these standards should be fully described and justified. The techniques and measures from the standards which are applicable to life-cycle processes and products are listed in Table 2.

Activity/Object | Technique/Measure | Reference
Requirements Specification | Separation of safety-related functions from non-safety-related functions; Graphical description; Structured specification; Inspection of specification; Hazard log; Test specification | [IEC01]-2B; [CEN03]-E.2
Required Design Features | Protection against operator error; Protection against sabotage; Protection against single faults for discrete components and digital electronics; Physical independence (insulation); Target time for detection and negation of single faults; Retention of safe state; Target time for detection and negation of multiple faults; Dynamic fault detection; Program sequence monitoring; Measures against power supply malfunction; Secondary protection against systematic faults | [CEN03]-E.5

Table 2: Techniques and Measures for Requirements
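Program sequence monitoring, one of the required design features listed in Table 2, can be sketched in a few lines. This is an illustrative model only; the step names and the pass/fail handling are assumptions, not ACRuDA requirements. Each vital step stamps a checkpoint, and a cycle is accepted only if the complete, ordered sequence was observed:

```python
# Minimal sketch of program sequence monitoring: a cycle is valid only
# if every vital step ran exactly once, in the expected order.

EXPECTED_SEQUENCE = ["read_inputs", "vital_logic", "write_outputs"]

class SequenceMonitor:
    def __init__(self):
        self.trace = []

    def checkpoint(self, name):
        self.trace.append(name)

    def end_of_cycle_ok(self):
        ok = self.trace == EXPECTED_SEQUENCE
        self.trace = []      # re-arm for the next cycle
        return ok

monitor = SequenceMonitor()
monitor.checkpoint("read_inputs")
monitor.checkpoint("vital_logic")
monitor.checkpoint("write_outputs")
print(monitor.end_of_cycle_ok())     # complete, ordered cycle: True

monitor.checkpoint("write_outputs")  # steps skipped: sequence error
print(monitor.end_of_cycle_ok())     # False
```

In a real architecture the "cycle rejected" branch would drive the watchdog or force the outputs to the safe state, rather than merely returning a flag.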

6.6. Design

Digital architectures are designed to reduce random and systematic credible faults to an acceptable level by using appropriate techniques and measures.

6.6.1. Contents

The design describes all the elements of the architecture, their interrelationships and interfaces, and their role in fulfilling the requirements.

The techniques and measures used to achieve the design goal are also explained.

6.6.2. Basic criteria

1. The procedures used to derive the design from the requirements and to verify the design against the requirements shall be described.

2. The safety critical digital architecture shall provide the following functionality:
- implementation of requirements derived from the mitigation or elimination of hazards identified for the range of perceived applications of the architecture,
- execution of application programs,
- collection of inputs and delivery of outputs,
- detection and negation of faults,
- provision of timer and watchdog functions,
- fail-safe inputs and outputs,
- facility to install application programs.

3. The hazardous failure rate of any vital hardware component shall be derived from the overall safety target. For example, see the ERTMS Specifications, which state 10^-10 faults/hour [ERT96].

4. All credible failure modes for each hardware and software element of the architecture shall be identified.

5. The hardware components shall be able to perform the safety function in the presence of two faults (source: [IEC01] part 2, table 2).

6. Faults shall be detected with on-line, high diagnostic coverage (source: [IEC01] part 2, table 2). A fail-safe architecture depends very much on the effectiveness of its fault detection measures, but may not need any on-line diagnostics. A fail-operational architecture, however, needs detailed on-line diagnostic coverage to achieve its integrity and reliability, because without it, implementing any recovery mechanism is very difficult.

7. Hazardous faults left undetected on-line shall be detected by the (off-line) proof checks (source: [IEC01] part 2, table 2).

8. The architecture shall be designed to minimise the credible faults by using a combination of well-tried and well-defined fault avoidance and fault tolerance measures.

9. The design specification shall identify the components and modules of the architecture, and describe their functional and other characteristics (such as their integrity levels, failure rates and performance). It shall also describe interfaces, internally and with external equipment.

10. The failure modes of all the following components shall be identified, along with the techniques and measures used to eliminate or mitigate the hazards arising from such failures:
- main processor, co-processors and micro-controllers,
- watchdog and clock,
- I/O cards, data path and field bus,
- communication network,
- operating system or executive program.

11. A quantitative estimate of the reliability of the overall architecture (for the worst case scenarios) shall be presented. The processes, procedures and standards on which these estimates are based shall form part of this presentation.

12. The design of the architecture shall ensure that higher integrity level modules are not affected by lower integrity level modules. Appropriate analyses shall be used to justify this.

13. The design shall ensure that the architecture operates correctly in all foreseeable environmental conditions, such as EMC, noise and heat. The envelope for the environmental conditions and requirements shall be defined in the requirements specification.

14. All software components of the architecture shall conform to the [CEN02] norms and the relevant GAM principles. The detailed software assessment criteria are given in chapter 6.10.

15. The detailed hardware assessment criteria are given in chapter 6.11.

16. The techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01] shall be used for the design. Variance from the recommendations of these standards should be fully described and justified. The techniques and measures from the standards which are applicable to life-cycle processes and products are listed in Table 3.
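The split between on-line diagnostic coverage and off-line proof checks in the criteria above can be illustrated with a small calculation. The rates and the coverage figure below are assumptions chosen for illustration, not ACRuDA or ERTMS targets:

```python
# Illustrative split of a dangerous failure rate between on-line
# diagnostics (with coverage DC) and periodic off-line proof checks,
# which must catch the undetected residue.

lambda_dangerous = 1e-6       # dangerous failures per hour (assumed)
diagnostic_coverage = 0.99    # fraction caught by on-line diagnostics (assumed)

lambda_detected = lambda_dangerous * diagnostic_coverage
lambda_undetected = lambda_dangerous * (1.0 - diagnostic_coverage)

print(f"detected on-line:        {lambda_detected:.2e}/h")
print(f"left to the proof check: {lambda_undetected:.2e}/h")
```

The undetected residue is what the proof-check interval has to be dimensioned against: the longer the interval, the longer such a fault can stay latent.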

Activity/Object | Technique/Measure | Reference
Architecture | Dual digital channels based on composite fail-safety with fail-safe comparison; Single digital channel based on inherent fail-safety; Single digital channel based on reactive fail-safety; Diverse digital channels with fail-safe comparison; Justification of the architecture by quantitative reliability analysis of the hardware | [CEN03]-E.4
Processing Units | Comparator; Majority voting; Self-test by software (single channel only); Self-test by hardware (single channel only); Coded processing (single channel only); Reciprocal comparison by software | [IEC01]-2B
Invariable memory ranges | Signature of a double word (16 bit); Block replication | [IEC01]-2B
Variable memory ranges | Galpat or transparent Galpat test; Abraham test; Double RAM with hardware or software comparison and read/write test | [IEC01]-2B
I/O units and interfaces | Test pattern; Code protection; Multi-channelled parallel output; Monitored outputs; Input comparison | [IEC01]-2B
Data paths | Complete hardware redundancy; Inspection using test patterns; Transmission protocol; Transmission redundancy; Information redundancy | [IEC01]-2B
Power supply | Overvoltage protection with shut-off; Voltage control (secondary); Power-down with shut-off; Graceful degradation | [IEC01]-2B
Watchdog | Separate time basis and time-window; Combination of temporal and logical monitoring of program sequence; Temporal monitoring with on-line check | [IEC01]-2B
Clock | Reciprocal comparison in redundant configuration; Dual frequency timer |
Communication | Separation of electrical energy from communication lines; Spatial separation in redundant lines; Increase of interference immunity; Antivalent signal transmission |
Input and Output cards | Idle current principle; Test pattern; Electrical interlocking; Cross-monitoring of redundant units |
Software Components | Techniques and measures recommended by [CEN02] and [IEC01] part 3 | [CEN02]; [IEC01]-3

Table 3: Techniques and Measures for Design
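Two of the processing-unit techniques in Table 3, fail-safe comparison of dual channels and majority voting, can be sketched in a few lines. The channel values and the restrictive fallback below are illustrative assumptions, not part of the ACRuDA criteria:

```python
# Minimal sketch: fail-safe comparison (dual channel) and 2-out-of-3
# majority voting. Any unresolved disagreement forces the safe state.

SAFE_STATE = "restrictive"   # assumed safe output, e.g. signal at danger

def dual_channel_compare(a, b):
    """Composite fail-safety: any disagreement forces the safe state."""
    return a if a == b else SAFE_STATE

def majority_vote(a, b, c):
    """2-out-of-3 voting: a single deviating channel is outvoted;
    three-way disagreement falls back to the safe state."""
    for candidate in (a, b, c):
        if [a, b, c].count(candidate) >= 2:
            return candidate
    return SAFE_STATE

print(dual_channel_compare("proceed", "proceed"))   # proceed
print(dual_channel_compare("proceed", "stop"))      # restrictive
print(majority_vote("proceed", "proceed", "stop"))  # proceed
```

The design trade-off visible even in this toy model is the one the table implies: dual-channel comparison detects a single channel failure but cannot mask it, while 2-out-of-3 voting keeps the system operational through a single failure at the cost of a third channel.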

6.7. Validation and off line testing

Validation of architectures against their requirements, and dynamic off-line testing of their elements and assemblies, are essential to ensure their integrity.

6.7.1. Contents

A well structured validation and test plan is required. This plan shall describe all the activities, from test environment set-up and test scenario selection to test execution and analysis of the test results. It also describes the test organisation, test processes and test documentation. The test specifications, acceptance criteria and test results form an essential part of the evidence of safety.

6.7.2. Basic criteria

1. The plan shall define the validation test process by:
- identifying the requirements specification against which the validation test is based,
- identifying the different test phases, including unit, integration and requirements testing,
- specifying the test organisation and its responsibilities,
- describing the testing procedures, e.g. fault injection, statistical testing methods or regression testing,
- identifying the test environments and their required integrity and quality requirements.

2. Procedures to ensure and demonstrate the independence of testing from the design and integration activities shall be defined. Such procedures shall describe, explicitly, the requirements for independence of groups of personnel.

3. Each test phase shall be accompanied by a test specification describing the test objectives, test scenarios, configuration data and acceptance criteria.

4. The SIL 4 test techniques and measures (see Table 2.1), recommended by the standards, shall be used.

5. The test results shall record the frequency of tests, the fault detection success rate, coverage and mean detection times.

6. The test results shall be analysed to give quantitative estimates of the hidden faults and of their effects on reliability estimates.

7. The Assessor shall witness a representative sample of tests to ensure that the test procedures have been correctly implemented.

8. The test specification shall cover all credible failure modes.

9. The techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01] shall be used for the validation and off-line testing. Variance from the recommendations of these standards should be fully described and justified. The techniques and measures from the standards which are applicable to life-cycle processes and products are listed in Table 4.


Activity/Object | Technique/Measure | Reference
Hardware | Fault injection |
Software | Regression testing |
Verification & Validation | Project management; Documentation; Functional testing under environmental conditions; Interference immunity testing; Functional testing at ambient; Check-list; Calculation of failure rate; Static and dynamic analysis; 'Worst case' and failure analysis; Simulation; Statistical testing; Surge immunity testing; Expanded functional testing; 'Worst case' testing; Black-box testing | [CEN03]-E9; [IEC01]-2

Table 4: Techniques and Measures for Validation & Testing
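Statistical testing, listed in Table 4, rests on a simple bound: if N test cases drawn from a realistic input distribution all pass, the per-demand failure probability can be bounded at a chosen confidence level. The sketch below uses the classical zero-failure bound p <= -ln(1 - C)/N; this formula is a standard illustration, not one prescribed by the deliverable:

```python
import math

def zero_failure_bound(n_tests, confidence=0.95):
    """Upper bound on the per-demand failure probability after n_tests
    failure-free tests, at the given one-sided confidence level."""
    return -math.log(1.0 - confidence) / n_tests

for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10,} failure-free tests -> p <= {zero_failure_bound(n):.2e}")
```

The numbers make the well-known limitation visible: bounding failure rates at the levels SIL 4 demands would require an impractically large number of tests, which is why statistical testing is only one technique among the combination required by the criteria.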

6.8. Fault and failure analyses

An independent fault and failure analysis of the digital architecture shall show that the architecture has been thoroughly analysed to ensure that all credible faults are identified, that the fault control methods are effective, and that the residual faults are non-hazardous.

6.8.1. Contents

1. The analyses should be carried out as detailed in the safety plan. Their application, procedures and scope of analysis should be explained.

2. The main findings of the analyses should be available for examination.

6.8.2. Basic criteria

1. The analyses shall be planned and performed in a timely manner, so that their findings are effectively used in the development process.

2. Review and incorporation of the findings of the analyses shall be part of a formal implementation process.

3. The analyses shall identify all credible failure modes, and estimate their criticality and frequency of occurrence.

4. The types of failures considered shall be specified. They shall cover, as far as possible, all static and intermittent failures, combinations of failure modes, hazardous and safe failures, and latent and undisclosed failure modes.

5. The results and findings of the analyses shall be integrated in the safety case of the architecture; they shall form the core of the safety argument and the evidence to support functional and technical safety.

6. The faults arising from the following sources shall be considered:
- hardware and software and their interactions,
- environmental factors, e.g. EMC,
- network elements and data and field buses,
- operators and operating conditions,
- critical operations, including start-up and close-down.

7. The techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01] shall be used for fault and failure analysis. Variance from the recommendations of these standards should be fully described and justified. The techniques and measures from the standards which are applicable to life-cycle processes and products are listed in Table 5.

Activity/Object | Technique/Measure | Reference
Risk reduction | Preliminary Hazard Analysis (PHA); Fault Tree Analysis (FTA); Failure Mode, Effects and Criticality Analysis (FMECA); Hazard and Operability studies (HAZOP); Cause-consequence diagrams; Markov diagrams; Event tree; Reliability Block Diagram; Common Cause Failure Analysis; Historical event analysis; Zonal Analysis | [CEN03]-E6
Measures against systematic hardware failures | Failure detection via technical process (on-line); Programme sequence monitoring; Test by additional hardware; Standard test access port and boundary scan architecture; Code protection; Diverse hardware; Fault detection and diagnosis; Error detecting and correcting codes; Safety bag techniques; Diverse programming; Dynamic reconfiguration; Failure assertion programming; Recovery block; Backward or forward recovery; Graceful degradation; Artificial intelligence - fault correction; Re-try fault recovery mechanisms; Memorising executed cases | [IEC01]-2
Measures against environmental failures | Measures against voltage breakdown; Measures against voltage variations, overvoltage, low voltage; Separation of electrical energy lines from information lines; Failure detection via technical process (on-line); Programme sequence monitoring; Measures against temperature increase; Spatial separation in redundant lines; Test by additional hardware; Protection code; Increase of interference immunity; Antivalent signal transmission; Diverse hardware; Software architecture | [IEC01]-2

Table 5: Techniques and Measures for Fault & Failure Analysis

6.9. Operation, Maintenance and Support

A digital architecture is a "product", primarily designed to be used as a platform for delivering safety-critical applications. To fulfil this aim, it must be supported with an adequate operation and maintenance programme.

The objective of the overall operation and maintenance is to operate and maintain the safety architecture, its control system and the total combination of safety-related systems and external risk reduction facilities such that the designed functional safety is maintained.

6.9.1. Contents

User manual, maintenance manual, upgrade and new release procedures, FRACAS, user support.

6.9.2. Basic criteria

1. There shall be a maintenance plan for the product, which should include collection of field data. Inspection and off-line tests shall be performed at regular intervals.

2. A support service plan shall specify the support organisation, its responsibilities and policies. The support procedure shall explain the mechanisms used for fault reporting and for incorporating new releases.

3. Safety operation procedures, inspection procedures and maintenance procedures shall be formulated and defined in a way that ensures safety and minimises operator errors. All relevant issues from the hazard and safety analyses shall be addressed.

4. The digital architecture components shall be kept as simple as possible to reflect the limits of human capacity. Appropriate metrics may be used to assess the relative complexity of these components.

5. Data-driven systems (including parametric or configurable systems) shall be protected against possible errors arising from entry of incorrect data.

6. The control devices and means of surveillance shall be such that additional hazards due to operator error are remote.

7. There shall be a well-specified procedure for collecting and analysing the product's history-of-use data.

8. The techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01] shall be used for operation, maintenance and support. Variance from the recommendations of these standards should be fully described and justified. The techniques and measures from the standards which are applicable to life-cycle processes and products are listed in Table 6.

Activity/Object [Reference]: Technique/Measure

Operation and maintenance procedures [CEN03]-E10:
- Project management
- Documentation
- User & maintenance friendliness
- Limited operation possibilities
- Training in the execution of operational and maintenance instructions
- Protection against operating errors
- Protection against sabotage

Table 6: Techniques and Measures for Operation, Maintenance and Support


6.10. Software Assessment Criteria

6.10.1. Software integrity level

The required software integrity level shall be decided on the basis of the level of risk associated with the use of the software in the architecture. The safety integrity level shall be specified through the process identified in [CEN01].

6.10.2. Life cycle issues and documentation

1. Software Planning. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
   - Software configuration management plan
   - Software development plan
   - Software quality assurance plan
   - Software validation plan
   - Software maintenance plan
   - Software/hardware integration plan
   - Software integration plan
   - Software verification plan

2. Software Requirements. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
   - Software requirements specification
   - Software requirements test specification
   - Software requirements verification report

3. Software Design. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
   - Software architecture specification
   - Software design specification
   - Software design test specification
   - Software integration specification
   - Software architecture and design verification report

4. Software Module Design. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
   - Software module design specification
   - Software module test specification
   - Software module verification report

5. Code. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
   - Software source code and supporting documentation
   - Software source code verification report

6. Module Testing. The supplier shall produce the following document and the assessor shall perform a judgement on its contents:
   - Software module test report

7. Software Integration. The supplier shall produce the following document and the assessor shall perform a judgement on its contents:
   - Software integration report

8. Software/Hardware Integration. The supplier shall produce the following document and the assessor shall perform a judgement on its contents:
   - Software/hardware integration report

9. Software Validation. The supplier shall produce the following document and the assessor shall perform a judgement on its contents:
   - Software validation report

10. Software Assessment. The supplier shall produce the following document and the assessor shall perform a judgement on its contents:
    - Software assessment report

11. Software Maintenance. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
    - Software maintenance records
    - Software maintenance log

12. The techniques and measures equivalent to those recommended by the standard [CEN02] shall be used for the software development. Variance from the recommendations of this standard should be fully described and justified. The techniques and measures from the standard which are applicable to life-cycle processes and products are listed in Table 7.

Activity/Object [Reference]: Technique/Measure

Software requirements specification [CEN02]:
- Formal Methods: CCS, CSP, HOL, LOTOS, OBJ, temporal logic, VDM, Z, B
- Semi-Formal Methods: logic/function block diagrams, sequence diagrams, data flow diagrams, finite state machines/state transition diagrams, temporal Petri nets, decision/truth tables
- Structured Methodology: JSD, MASCOT, SADT, SSADM, Yourdon

Software architecture [CEN02]:
- Defensive programming
- Fault detection and diagnosis
- Error correcting codes
- Error detection codes
- Failure assertion programming
- Safety bag techniques
- Diverse programming
- Recovery block
- Backward recovery
- Forward recovery
- Re-try fault recovery mechanisms
- Memorising executed cases
- Artificial intelligence fault correction
- Dynamic reconfiguration of software
- Software error effect analysis
- Fault tree analysis

Software design and development [CEN02]:
- Formal Methods: CCS, CSP, HOL, LOTOS, OBJ, temporal logic, VDM, Z, B
- Semi-Formal Methods: logic/function block diagrams, sequence diagrams, data flow diagrams, finite state machines/state transition diagrams, temporal Petri nets, decision/truth tables
- Structured Methodology: JSD, MASCOT, SADT, SSADM, Yourdon
- Modular Approach: module size limited, information hiding/encapsulation, parameter number limit, one-entry/one-exit point in subroutines and functions, fully defined interface
- Design and Coding Standards: coding standard exists, coding style guide, no dynamic objects, no dynamic variables, limited use of pointers, limited use of recursion, no unconditional jumps, analysable programs
- Strongly typed programming language
- Structured programming
- Programming Language: ADA, MODULA-2, PASCAL, FORTRAN 77, C, PL/M, BASIC, assembler, ladder diagrams, functional blocks, statement list, subset of C with coding standards
- Language subset
- Validated translator
- Translator proven in use
- Library of trusted/verified modules and components
- Functional and Black-Box Testing: test cases from cause/consequence diagrams, prototyping/animation, boundary value analysis, equivalence classes and input partition testing, process simulation
- Performance Testing: avalanche/stress testing, response timing and memory constraints, performance specification
- Interface testing
- Data recording and analysis
- Fuzzy logic
- Object oriented programming
- Software verification and testing
- Formal proof
- Probabilistic Testing: failure probability per demand, failure probability during a certain period of time, probability of error containment, probability of failure-free execution, probability of survival, availability, MTBF or failure rate, probability of safe execution
- Static Software Analysis: boundary value analysis, checklists, control flow analysis, data flow analysis, error guessing, Fagan inspections, sneak circuit analysis, symbolic execution, walkthroughs/design reviews
- Dynamic Analysis and Testing: test case execution from boundary value analysis, test case execution from error guessing, test case execution from error seeding, performance modelling, equivalence classes and input partition testing, structure-based testing
- Metrics
- Traceability matrix

Software/hardware integration [CEN02]:
- Functional and Black-Box Testing: test cases from cause/consequence diagrams, prototyping/animation, boundary value analysis, equivalence classes and input partition testing, process simulation
- Performance Testing: avalanche/stress testing, response timing and memory constraints, performance specification

Software validation [CEN02]:
- Probabilistic Testing: failure probability per demand, failure probability during a certain period of time, probability of error containment, probability of failure-free execution, probability of survival, availability, MTBF or failure rate, probability of safe execution
- Performance Testing: avalanche/stress testing, response timing and memory constraints, performance specification
- Functional and Black-Box Testing: test cases from cause/consequence diagrams, prototyping/animation, boundary value analysis, equivalence classes and input partition testing, process simulation
- Modelling: data flow diagrams, finite state machines, formal methods, performance modelling, time Petri nets, prototyping/animation, structure diagrams

Assessment techniques [CEN02]:
- Static Software Analysis: boundary value analysis, checklists, control flow analysis, data flow analysis, error guessing, Fagan inspections, sneak circuit analysis, symbolic execution, walkthroughs/design reviews
- Dynamic Software Analysis: test case execution from boundary value analysis, test case execution from error guessing, test case execution from error seeding, performance modelling, equivalence classes and input partition testing, structure-based testing, test cases from cause/consequence diagrams, prototyping/animation, process simulation
- Cause-consequence diagrams
- Event tree analysis
- Software error effect analysis
- Common cause failure analysis
- Markov model
- Reliability block diagram
- Field trial before commissioning

Software quality assurance [CEN02]:
- Accredited to EN 29001
- Compliant with EN 29000-3
- Company quality system
- Software configuration management

Software maintenance [CEN02]:
- Impact analysis
- Data recording and analysis

Table 7: Techniques and Measures for Software

6.11. Hardware Assessment Criteria

6.11.1. Life cycle issues and documentation

1. Hardware Requirements. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
   - Hardware requirements specification
   - Hardware requirements test specification
   - Hardware requirements verification report

2. Hardware Design. The supplier shall produce the following documents and the assessor shall perform a judgement on their contents:
   - Hardware architecture specification
   - Hardware design specification
   - Hardware design test specification
   - Hardware integration specification
   - Hardware architecture and design verification report

3. Hardware Testing. The supplier shall produce the following document and the assessor shall perform a judgement on its contents:
   - Hardware test report

4. The techniques and measures equivalent to those recommended by the standards [CEN03] and [IEC01] shall be used for the hardware development. Variance from the recommendations of these standards should be fully described and justified. The techniques and measures from the standards which are applicable to life-cycle processes and products are listed in Table 8.

Activity/Object [Reference]: Technique/Measure

Architecture [IEC01]:
- Separation of safety-related systems from non safety-related systems
- Single electronic structure with self-test and supervision
- Dual electronic structure
- Dual digital channels based on composite fail-safety with fail-safe comparison
- Single electronic structure based on inherent fail-safety
- Single electronic structure based on reactive fail-safety
- Diverse electronic structure with fail-safe comparison
- Justification of the architecture by quantitative reliability analysis of the hardware

Processing units [IEC01]:
- Comparator
- Majority voter
- Self-test by software (one channel)
- Self-test supported by hardware (one channel)
- Coded processing (one channel)
- Reciprocal comparison by software

Invariable memory ranges [IEC01]:
- Signature of a double word (16 bit)
- Block replication

Variable memory ranges [IEC01]:
- Galpat or transparent Galpat test
- Abraham test
- Double RAM with hardware or software comparison and read/write test

I/O units and interfaces [IEC01]:
- Test pattern
- Code protection
- Multi-channelled parallel output
- Monitored outputs
- Input comparison

Clock [IEC01]:
- Reciprocal comparison in redundant configuration
- Dual frequency timer

Power supply [IEC01]:
- Overvoltage protection with shut-off
- Voltage control (secondary)
- Power-down with shut-off
- Graceful degradation

Watchdog [IEC01]:
- Separate time basis and time-window
- Combination of temporal and logical monitoring of program sequence
- Temporal monitoring with on-line check

Data paths [IEC01]:
- Complete hardware redundancy
- Inspection using test patterns
- Transmission protocol
- Transmission redundancy
- Information redundancy

Communication [IEC01]:
- Separation of electrical energy
- Spatial separation in redundant lines
- Increase of interference immunity
- Antivalent signal transmission

Input and output cards [IEC01]:
- Idle current principle
- Test pattern
- Electrical interlocking
- Cross-monitoring of redundant units

Table 8: Techniques and Measures for Hardware

7. TERMINOLOGY

7.1. Introduction

This document provides a list of working definitions for terms used in the ACRuDA project related to safety-critical applications. The following principles have been used in selecting and forming these definitions:


1. Existing definitions in accepted documents (standards, for example) should be used where possible. In these cases, the source document of the definition is indicated.

2. Where no satisfactory existing definition can be agreed, a new term or phrase should be coined rather than using an existing one in a new or non-standard way. This will reduce confusion.

3. If different definitions exist for the same term, the different definitions are presented (each definition is preceded by a number in brackets) and a definition in relation to the ACRuDA project will be agreed.

7.2. Terminology, Definitions and Abbreviations

1. Acceptance: The status given to any product by the final user. Source: [CEN01].

2. Accident: An unintended event or sequence of events that results in death, loss of a system or service, or environmental damage. Source: [CEN03].

3. Accreditation:
   (1) Procedure by which the technical competence and the impartiality of a testing laboratory is recognised. Source: [ITS01].
   (2) Formal recognition of the laboratory competence to achieve some tests or some established type tests. Source: [EN01].

4. Accreditation Body: Body which manages a laboratory accreditation system and pronounces accreditation. Source: [EN01].

5. Accreditation Criteria (for a laboratory): Set of requirements defined and applied by an accreditation body, and that a testing laboratory must satisfy to be accredited. Source: [EN01].

6. Accreditation System: System with its own procedures and management rules for carrying out laboratory accreditation. Source: [EN01].

7. Accredited Laboratory: Testing laboratory that has been accredited. Source: [EN01].

8. Apportionment: A process whereby the RAMS elements for a system are sub-divided between the various items which comprise the system to provide individual targets. Source: [CEN01].

9. Approval: The status given to any product by the requisite Authority when the product has fulfilled a set of predetermined conditions. Source: [CEN01].

10. Assurance of Conformity: Procedure resulting in a statement giving confidence that a product, process or service fulfils specified requirements.

11. Assessment:
    (1) The undertaking of an investigation in order to arrive at a judgement, based on evidence, of the suitability of a product. Source: [CEN01].
    (2) The process of analysis to determine whether the design authority and the validator have achieved a product that meets the specified requirements and to form a judgement as to whether the product is fit for its intended purpose. Source: [CEN03].

12. Assessment Inputs: In the ACRuDA project, the assessment inputs are all the data necessary to achieve the assessment (hardware, software, documentation, tools, standards, etc.).

13. Assessment Repeatability: The repetition of the assessment of the same product, with the same safety requirements specification, evaluated by the same assessor, must give the same overall verdict as the first assessment. Source: [ITS01].

14. Assessment Reproducibility: Assessment of the same product, with the same safety requirements specification, evaluated by another assessor, must give the same overall verdict as the first assessor. Source: [ITS01].

15. Assessor:
    (1) A body with responsibility for undertaking assessments. Source: [CEN01].
    (2) The person or agent appointed to carry out the assessment. Source: [CEN03].

16. Attribute: An abstract term qualifying the properties of an item of data.

17. Audit: A systematic and independent examination to determine whether the procedures specific to the requirements of a product comply with the planned arrangements, are implemented effectively and are suitable to achieve the specified objectives. Source: [CEN01].

18. Availability: The ability of a product to be in a state to perform a required function under given conditions, at a given instant of time or over a given time interval, assuming that the required external resources are provided. Source: [CEN01], [CEN02], [CEN03].


19. Behaviour: The description of any sequence of states and transitions likely to exist in one system.

20. Certification:
    (1) Formal declaration which confirms the results of an assessment and the fact that the assessment criteria were correctly applied. Source: [ITS01].
    (2) Action by a third party, demonstrating that the specific sample tested is in conformity with a specific standard or other normative document.

21. Certification Body or Notified Body: Impartial and independent body which achieves certifications. Source: [ITS01].

22. Certification System: A system that has its own rules of procedures and management for carrying out certification of conformity. Source: [ITS01].

23. Coding: The work of translating the results of the detailed design into a program using a given programming language; one of the phases of the software life cycle.

24. Cohesion: The degree to which measures taken interact with and depend on each other.

25. Commercial Off-The-Shelf (COTS) Software: Software defined by market-driven need, commercially available, and whose fitness for purpose has been demonstrated by a broad spectrum of commercial users. Source: [CEN02].

26. Common Cause Failure: A failure which is the result of an event(s) which, because of dependencies, causes a coincidence of failure states of components in two or more separate channels of a redundancy system, leading to the defined system failing to perform its required function. Source: [CEN01].

27. Common Mode Failure: Failure of apparently independent components or communication links due to an initiating event which affects them all. Source: [IDS01].

28. Common Mode Fault: Fault common to items which are intended to be independent. Source: [CEN03].

29. Compliance: A demonstration that a characteristic or property of a product satisfies the stated requirements. Source: [CEN01].

30. Component: A part of a product that has been determined to be a basic unit or building block. A component may be simple or complex. Source: [CEN03].

31. Configuration: The structuring and interconnection of the hardware and software of a system for its intended application. Source: [CEN03].

32. Configuration Management: A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements. Source: [CEN01].

33. Conformity: The degree to which a given real product corresponds to its description.

34. Conformance Testing: Testing whose purpose is checking whether the system satisfies its specification. Source: [LAP01].

35. Control Flow Analysis: Analysis of the sequence of execution in a computer program. This analysis can show unreachable code, dynamic halts or false entry points. Source: [IDS01].

36. Coverage: Measure of the representativeness of the situations to which a system is submitted during its validation compared to the actual situations it will be confronted with during its operational life. Source: [LAP01].

37. Criterion: A standard by which a correct judgement may be formed.

38. Criticality (system): Level of safety integrity of a function or component. Source: [IDS01].

39. Defensive Programming: Writing programs which detect erroneous input and output values and control flow. Such programs prevent propagation of errors and recover by software where possible. Source: [IDS01].

40. Dependability: Trustworthiness of a computer system such that reliance can justifiably be placed on the service it delivers. Source: [LAP01].

41. Dependent Failure: The failure of a set of events, the probability of which cannot be expressed as the simple product of the unconditional probabilities of the individual events. Source: [CEN01].

42. Design: The pre-build exercise of defining elements and their interconnection such that the product will meet its specified requirements. Source: [CEN03].

43. Detection (error): The action of identifying that a system state is erroneous. Source: [LAP01].

44. Deterministic Testing: Form of testing where the test patterns are predetermined by a selective choice. Source: [LAP01].

45. Development Environment: Set of organisational measures, procedures and standards which must be used during the development of the product. Source: [ITS01].

46. Development Process: Set of phases and tasks by which a product is built and which translates the specification into software and hardware. Source: [ITS01].

47. Disturbance: An unexpected influence of the environment on the behaviour of the equipment.

48. Diversity: A means of achieving all or part of the specified requirements in more than one independent and dissimilar manner. Source: [CEN03].

49. Dormant Fault: Internal fault not activated by the computation process. Source: [LAP01].

50. Dynamic Verification: Verification involving exercising the system. Source: [LAP01].

51. Effectiveness: The degree to which the safety measures taken actually achieve the desired results.

52. Element: A part of a product that has been determined to be a basic unit or building block. An element may be simple or complex. Source: [CEN03].

53. End User: Person, in contact with a product, who only uses the operational capacity of the product. Source: [ITS01].

54. Equipment: A functional physical item. Source: [CEN03].

55. Error: A deviation from the intended design which could result in unintended system behaviour or failure. Source: [CEN02], [CEN03].

56. Fail Safe: A concept which is incorporated into the design of a product such that, in the event of failure, it enters or remains in a safe state. Source: [CEN03].

57. Failure: A deviation from the specified performance of a system. A failure is the consequence of a fault or error in the system. Source: [CEN02], [CEN03].

58. Failure Cause: The circumstances during design, manufacture or use which have led to a failure. Source: [CEN01].

59. Failure Mode: The predicted or observed results of a failure cause on a stated item in relation to the operating conditions at the time of the failure. Source: [CEN01].

60. Failure Mode, Effects and Criticality Analysis (FMECA): A type of FMEA which is intended to determine, inductively, the nature and criticality of the consequences of failures of the equipment.

61. Failure Rate: The limit, if it exists, of the ratio of the conditional probability that the instant of time, T, of a failure of a product falls within a given time interval (t, t + delta(t)) and the length of this interval, delta(t), when delta(t) tends towards zero, given that the item is in an up state at the start of the time interval. Source: [CEN01].

62. Fault: An abnormal condition that could lead to an error in a system. A fault can be random or systematic. Source: [CEN01], [CEN02], [CEN03].

63. Fault Avoidance: The use of design techniques which aim to avoid the introduction of faults during the design and construction of a system. Source: [CEN02].

64. Fault Detection Time: Time span which begins at the instant when a fault occurs and ends when the existence of the fault is detected. Source: [CEN03].

65. Fault Mode: One of the possible states of a faulty product for a given required function. Source: [CEN01].

66. Fault Tolerance: The built-in capability of a system to provide continued correct execution, i.e. provision of service as specified, in the presence of a limited number of hardware or software faults. Source: [CEN02].

67. Fault Tree Analysis: An analysis to determine which fault modes of the product, sub-products or external events, or combinations thereof, may result in a stated fault mode of the product, presented in the form of a fault tree. Source: [CEN01].

68. FMEA: An acronym meaning Failure Mode and Effect Analysis. A qualitative method of reliability analysis which involves the study of the fault modes which can exist in every sub-product of the product and the determination of the effects of each fault mode on other sub-products of the product and on the required functions of the product. Source: [CEN01].

69. Formal Verification: Showing by formal mathematical proof or arguments that software implements its (formal mathematical) specification correctly. Source: [IDS02].

70. Formal Mathematical Method: Mathematically based method for the specification, design and production of software. Also includes a logical inference system for Formal Proofs of Correctness, and a methodological framework for software development in a formally verifiable way. Source: [IDS01].

71. Formal Mathematical Specification: A specification in a formal mathematical notation. Source: [IDS01].

72. Formal Proof of Correctness: A way of proving that a computer program follows its specification by a mathematical proof using formal rules. Source: [IDS01].

73. Formally Defined Syntax: A technique such as Backus Naur Form (BNF) used to define the syntax of a language, plus collateral definition of an annotation language. Source: [IDS01].

74. FRACAS: An acronym meaning Failure Reporting And Corrective Action System. The process of reporting a failure on test or in service, analysing its cause and implementing corrective action to prevent or reduce the rate of occurrence. Source: [CEN01].

Function: A mode of action or activity by which a product fulfils its purpose. Source: [CEN01].75.Functional Testing: Form of testing where the testing inputs are selected according to criteria relating tothe system's function. Source: [LAP01].

76.

Hazard:77.

(1) A condition which can lead to an accident. Source: [CEN03].
(2) A physical situation with a potential for human injury. Source: [CEN01].

78. Hazard Analysis: The process of identifying the hazards which a product or its use can cause. Source: [CEN03].

79. Hazard Log: The document in which all safety management activities, decisions made and solutions adopted are recorded or referenced. Sources: [CEN01], [CEN03].

80. Hazard Sequence: A sequence of hazards that can lead to an accident. Source: [CEN01].

81. Human Error: A human action (mistake) which can result in unintended system behaviour or failure. Source: [CEN03].

82. Information Flow Analysis: Identification of the input variables on which each output variable depends in a computer program. Used to confirm that outputs depend only on the relevant inputs, as specified. Source: [IDS01].

83. Independence (human): Freedom from intellectual, commercial and/or management involvement. Source: [CEN03].

84. Independence (technical): Freedom from any mechanism which can affect the correct operation of more than one item. Source: [CEN03].

85. Independent Body: A body which is separate and distinct, by way of management and other resources, from the bodies responsible for the development of the product. Source: [CEN01].

86. Item: Element under consideration. Source: [CEN03].

87. Maintainability: The probability that a given active maintenance action, for an item under given conditions of use, can be carried out within a stated time interval when the maintenance is performed under stated conditions and using stated procedures and resources. Sources: [CEN01], [CEN03].

88. Maintenance: The combination of all technical and administrative actions, including supervision actions, intended to retain an item in, or restore it to, a state in which it can perform its required function. Sources: [CEN01], [CEN03].

89. Measure: Something done with a view to the accomplishment of a purpose.

90. Operating Procedure: A set of rules for the definition of the correct utilisation of a product. Source: [ITS01].

91. Operational Environment: The organisational measures, procedures and standards which must be used for the operation of the product. Source: [ITS01].

92. Pertinence: The effectiveness of a single measure with respect to a specified desired result.

93. Process: A set of operations with defined inputs and outputs.

94. Product:
(1) A set of software and/or hardware which performs a designed function and is used or included in multiple systems. Source: [ITS01].
(2) A collection of elements, interconnected to form a system, sub-system, or item of equipment, in a manner which meets the specified requirements. Source: [CEN03].

95. Proof Obligations: The requirement to prove a theorem to demonstrate the correctness of a development step. Source: [IDS01].

96. Prototype: A rapidly produced program which is used to validate (part of) a specification. Source: [IDS01].

97. Quality: A user perception of the attributes of a product. Source: [CEN03].

98. RAMS: An acronym meaning a combination of Reliability, Availability, Maintainability and Safety. Source: [CEN01].

99. Random Faults: An occurrence of a fault based on probability theory and previous performance. Source: [CEN03].

100. Random Hardware Failure: Failures, occurring at random times, which result from a variety of degradation mechanisms in the hardware. Source: [CEN01].
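The dependency relation named in the Information Flow Analysis entry above can be sketched programmatically. The following Python fragment is purely illustrative (the toy program, the variable names and the `input_dependencies` helper are not from ACRuDA): it propagates, through a straight-line sequence of assignments, the set of program inputs on which each variable depends, so that an output's dependency set can be compared with the one the specification allows.

```python
import ast

def input_dependencies(src: str, inputs: set) -> dict:
    """For a straight-line program of simple assignments, compute for each
    variable the set of input variables on which its value depends."""
    deps = {name: {name} for name in inputs}
    for stmt in ast.parse(src).body:
        if not isinstance(stmt, ast.Assign):
            continue  # sketch only: anything but a simple assignment is ignored
        # Variables read on the right-hand side of the assignment.
        used = {n.id for n in ast.walk(stmt.value) if isinstance(n, ast.Name)}
        flowing = set()
        for var in used:
            flowing |= deps.get(var, set())
        for target in stmt.targets:
            if isinstance(target, ast.Name):
                deps[target.id] = flowing
    return deps

# Hypothetical toy program: per an imagined specification, 'brake' is an output.
program = """
scaled = speed * gain
limit = max_speed - margin
brake = scaled > limit
"""

deps = input_dependencies(program, {"speed", "gain", "max_speed", "margin"})
```

If the specification stated that `brake` must not depend on `gain`, comparing `deps["brake"]` with the specified input set would reveal the discrepancy, which is exactly the confirmation role the definition describes.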


101. Random Testing: See Statistical Testing. Source: [LAP01].

102. Real-time Function: Function required to be fulfilled within finite time intervals dictated by the environment. Source: [LAP01].

103. Real-time Service: Service required to be delivered within finite time intervals dictated by the environment. Source: [LAP01].

104. Real-time System: System fulfilling at least one real-time function or delivering at least one real-time service. Source: [LAP01].

105. Recovery (error): Form of error processing where an error-free state is substituted for an erroneous state. Source: [LAP01].

106. Reliability: Dependability with respect to the continuity of service. A measure of continuous correct service delivery; a measure of time to failure. Source: [LAP01].

107. Reliability Growth: The improvement of the system's ability to deliver correct service (a stochastic increase of the successive times to failure). Source: [LAP01].

108. Regression Verification: Verification performed after a correction, in order to check that the correction has no undesired consequences. Source: [LAP01].

109. Risk:
(1) The combination of the frequency, or probability, and the consequence of the hazardous event. Sources: [CEN02], [CEN03], [IDS02].
(2) The probable rate of occurrence of a hazard causing harm and the degree of severity of the harm. Source: [CEN01].

110. Risk Analysis: Analysis allowing identification of critical points and safety criteria in system elements. It can be performed at the different stages of building a system.

111. Safe State: A condition which continues to preserve safety. Source: [CEN03].

112. Safety: Freedom from an unacceptable level of risk. Sources: [CEN01], [CEN02], [CEN03].

113. Safety Case: The documented demonstration that the product complies with the specified safety requirements. Sources: [CEN01], [CEN03].

114. Safety Critical: Carries direct responsibility for safety. Source: [CEN03].

115. Safety Critical Software: Software used to implement a safety critical function. Source: [IDS01].

116. Safety Integrity:
(1) The likelihood of a system satisfactorily performing the required safety function under all the stated conditions within a stated period of time. Source: [CEN01].
(2) The likelihood of a safety related system achieving its required safety features under all the stated conditions within a stated operational environment and within a stated period of time. Source: [CEN03].

117. Safety Integrity Level:
(1) One of four possible discrete levels for specifying the safety integrity requirements of the safety functions to be allocated to the safety related products/systems. Safety integrity level 4 has the highest level of safety integrity and safety integrity level 1 the lowest. Source: [CEN01].
(2) A number which indicates the required degree of confidence that a system will meet its specified safety features. Source: [CEN03].

118. Safety Involved: Carries indirect responsibility for safety. Source: [CEN03].

119. Safety Plan:
(1) A documented set of time-scheduled activities, resources and events, serving to implement the organisational structure, responsibilities, procedures, activities, capabilities and resources that together ensure that an item will satisfy given safety requirements relevant to a given contract or project. Source: [CEN01].
(2) The implemented details of how the safety requirements of the project will be achieved. Source: [CEN03].

120. Safety Process: The series of procedures that are followed to ensure that the safety requirements of a product are identified, analysed and fulfilled. Source: [CEN03].

121. Safety Related: Carries responsibility for safety. Source: [CEN03].

122. Safety Related Software: Software which carries responsibility for safety. Source: [CEN02].

123. Safety Requirements: The requirements of the safety functions that have to be performed by the safety related products/systems, comprising safety functional requirements and safety integrity requirements. Source: [CEN01].
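Definition (1) of Risk above, frequency combined with consequence, is commonly operationalised as a risk matrix. The following sketch is illustrative only: the category labels, scoring rule and acceptance thresholds are invented for the example and are not taken from the CENELEC standards cited.

```python
# Ordered frequency and severity categories (hypothetical labels).
FREQUENCY = ["improbable", "remote", "occasional", "probable", "frequent"]
SEVERITY = ["insignificant", "marginal", "critical", "catastrophic"]

def risk_class(frequency: str, severity: str) -> str:
    """Combine the frequency and the consequence of a hazardous event into
    a qualitative risk class (thresholds invented for illustration)."""
    score = FREQUENCY.index(frequency) + SEVERITY.index(severity)
    if score >= 6:
        return "intolerable"
    if score >= 4:
        return "undesirable"
    if score >= 2:
        return "tolerable"
    return "negligible"
```

A real risk analysis would calibrate the categories and thresholds against the tolerability criteria agreed for the application; the point here is only the combination of the two dimensions named in the definition.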


124. Safety Requirement Specification: Specification of the safety necessary for a product, which is the basis for the assessment. The safety requirements specification must specify: the safety integrity target, the risks, the standards and rules to apply, the safety functions, the type of application considered (Interlocking, ATP, ATC, etc.), the different configurations, and assumptions on the environment of the product.

125. Security: Dependability with respect to the prevention of unauthorised access and/or handling of information. Source: [LAP01].

126. Semantic Analysis: Checking the relationship between input and output for every semantically possible path through a program, or part of a program. It can reveal semantically possible paths of which the programmer was unaware, as well as coding errors. Source: [IDS01].

127. Severity (failure): Grade of the failure consequences upon the system environment. Source: [IDS01].

128. Software: Intellectual creation comprising the programs, procedures, rules and any associated documentation pertaining to the operation of a data processing system. Source: [CEN02].

129. Software Assessment: The process of product evaluation, either by an official regulatory body or an independent third party, to establish that it complies with all necessary requirements, regulations and standards.

130. Software Errors Effects Analysis (SEEA): Analysis intended to determine, inductively and similarly to FMECA, the nature and criticality of the consequences of software failures.

131. Sponsor: Person or body who asks for an assessment of a product. Source: [ITS01].

132. Static Code Analysis: Using mathematical techniques to analyse a program and reveal its structure. It does not require execution of the program, but verifies the program against the specification. Techniques include control flow, data use, information flow and semantic analysis. Source: [IDS01].

133. Static Verification: Verification conducted without exercising the system. Source: [LAP01].

134. Statistical Testing: Form of testing where the test patterns are selected according to a defined probability distribution on the input domain. Source: [LAP01].

135. Structural Testing: Form of testing where the testing inputs are selected according to criteria relating to the system's structure.

136. Sub-system: A portion of a system which fulfils a specialised function. Source: [CEN03].

137. Supplier: In the ACRuDA project, the supplier is the person or body who builds and sells a product or a system. In the case of a product, the requirement, design, and validation phases are achieved by the supplier. In the case of a system, the requirements can be defined by the end user.

138. System:
(1) A set of sub-systems or elements which interact according to a design. Source: [CEN03].
(2) A specific installation with a particular goal and a particular operational environment. Source: [ITS01].

139. Systematic Failure: Failures due to errors in any safety life cycle activity, within any phase, which cause the system to fail under some particular combination of inputs or some particular environmental condition. Source: [CEN01].

140. Systematic Faults: An inherent fault in the specification, design, construction, installation, operation or maintenance of a system, sub-system, or equipment. Source: [CEN03].

141. Testing: Dynamic verification performed with valued inputs. Source: [EN01].

142. Testing Laboratory: Laboratory which performs tests. Source: [EN01].

143. Traceability: The ability to trace the history, application or location of an item or activity, or similar items or activities, by means of recorded identifications. Source: [NFX01].

144. Validation:
(1) Confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use have been fulfilled. Source: [CEN01].
(2) The activity of demonstrating, by test and analysis, that the product meets in all respects its specified requirements. Source: [CEN03].

145. Validator: The person or agent appointed to carry out validation. Sources: [CEN01], [CEN02], [CEN03].

146. Verification:
(1) Confirmation by examination and provision of objective evidence that the specified requirements have been fulfilled. Source: [CEN01].
(2) The activity of determining, by analysis and test, that the output of each phase of the life cycle fulfils the requirements of the previous phase. Sources: [CEN02], [CEN03].

147. Verifier: The person or agent appointed to carry out verification. Sources: [CEN01], [CEN02], [CEN03].
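Statistical Testing as defined above (entry 134, with Random Testing as its synonym) can be sketched in a few lines. Everything below is hypothetical and for illustration only: the unit under test, its seeded bug, the oracle and the input distribution are invented, not taken from the deliverable.

```python
import random

def statistical_test(impl, oracle, sample_input, n_tests=1000, seed=42):
    """Draw test patterns from a probability distribution over the input
    domain (via sample_input) and compare the implementation against an
    oracle; return the failing inputs."""
    rng = random.Random(seed)  # fixed seed keeps the campaign reproducible
    return [x for x in (sample_input(rng) for _ in range(n_tests))
            if impl(x) != oracle(x)]

def oracle(x):
    return min(2 * x, 100)          # specified behaviour: doubler saturating at 100

def unit_under_test(x):
    return 2 * x if x < 50 else 99  # seeded bug: saturates at 99 instead of 100

# Uniform distribution over the integer input domain [0, 60].
failures = statistical_test(unit_under_test, oracle,
                            lambda rng: rng.randint(0, 60))
```

With enough samples, inputs in the saturating region are drawn and every one of them exposes the discrepancy; the choice of distribution determines which regions of the input domain are exercised, which is the defining feature of the technique.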


8. References

8.1. European Council Directives

[DIN01] Council directive 96/48/EC: « Interoperability of the European High speed train network », 23 July 1996.

[DIN02] Council directive 93/465/EC: « Modules related to the different phases of assessment procedures of conformity and rules of affixing and using CE mark, intended to be used in the technical harmonisation directives », 22 July 1993.

[DIN03] Council directive 90/531/EEC: « Procurement procedures of entities operating in the water, energy,transport and telecommunications sectors », 17 September 1990.

8.2. European Technical Specifications

[STI01] AEIF - European Commission: « Interoperability for the Trans-European High speed Network - Specification of the Technical Requirements for Interoperability - Control Command », Ref. MT114FE1L - 03 October 1996 - DRAFT.

8.3. Standards

[CEN01] prEN 50126 - CENELEC: « Railway Applications: The specification and demonstration of dependability, reliability, availability, maintainability and safety (RAMS) », June 1997.

[CEN02] prEN 50128 - CENELEC: « Railway Applications: Software for Railway Control and Protection Systems », January 1997.

[CEN03] prENV 50129 - CENELEC: « Railway Applications: Safety Related Electronic Systems for Signalling », Version 1.0 - January 1997.

[IEC01] IEC 61508: « Functional Safety: Safety Related Systems », Parts 1 to 7:

Part 1: General Requirements,
Part 2: Requirements for Electrical/Electronic/Programmable Electronic Systems,
Part 3: Software Requirements,
Part 4: Definitions and Abbreviations of Terms,
Part 5: Guidelines on the Application of Part 1,
Part 6: Guidelines on the Application of Parts 2 and 3,
Part 7: Bibliography of Techniques and Measures.

[EN01] EN 45001: « General criteria for the operation of testing laboratories », 1989.

[EN02] EN 45002: « General criteria for the assessment of testing laboratories », 1989.

[EN03] EN 45003: « General criteria for the accreditation body of testing laboratories », 1989.

[EN04] EN 45011: « General criteria for certification bodies operating product certification », 1989.

[EN05] EN 45012: « General criteria for certification bodies operating certification of quality system », 1989.

[EN06] EN 45013: « General criteria for certification bodies operating certification of personnel », 1989.

[EN07] EN 45014: « General criteria for declaration of conformity by the suppliers », 1989.

[IDS01] Interim Defence Standard 00-55 (Draft): Part 1: Requirement, « The procurement of safety critical software in Defence Equipment »; Part 2: Evidence, « The procurement of safety critical software in Defence Equipment ». Issue 1, 5 April 1991.

[IDS02] Interim Defence Standard 00-56 (Draft): « Hazard analysis and safety classification of the computer and Programmable Electronic System elements of Defence Equipment ». Issue 1, 5 April 1991.

[NFX01] Norme Française NF X50-120: « Qualité - Vocabulaire » (« Quality - Vocabulary »), September 1987.

8.4. ACRuDA Project

[ACR01] ACRuDA Project: « State of the Art - Safety Architecture Synthesis », 24 February 1997. Reference: ACRuDA/INRETS/MK-PM/WP1/D1/97.12/V2.

[ACR02] ACRuDA Project: « State of the Art - Method Synthesis », 29 September 1997. Reference: ACRuDA/INRETS/PM-MK/WP1/D2/97.39/V3.

8.5. CASCADE Project

[GAM01] CASCADE Project - ESPRIT 9032 - Version 1.0 - 17 January 1997: « General assessment method: it provides the essentials to prepare and to handle the assessment of safety critical systems ». It is composed of:

Part 1: Rules,
Part 2: Guidelines,
Part 3: Examples.

8.6. ERTMS project

[ERT96] ERTMS project: « RAMS Requirements Specification », Volume 5: Annexes, Safety, 20/12/96.

8.7. Information Technology domain

[ITS01] ITSEC: « Information Technology Security Evaluation Criteria », Version 1.2 - June 1991.

[ITS02] ITSEM: « Information Technology Security Evaluation Manual », Version 1.0 - September 1993.

[ITS03] ECF01: « Schéma Français d’Evaluation et de Certification des Technologies de l’Information - Présentation du schéma », Version 2.0, 16 January 1997.

[ITS04] ECF03: « Schéma Français d’Evaluation et de Certification des Technologies de l’Information - Procédure d’évaluation et de certification », Version 1.0, 16 January 1997.

[ITS05] ECF04: « Schéma Français d’Evaluation et de Certification des Technologies de l’Information - Format des rapports et certificats », Version 1.0, 16 January 1997.

8.8. Others

[AQC01] « Assurance Qualité en Conception » (« Quality Assurance in Design »). Marc Reynier. CollectionMASSON.

[LAP01] J.C. Laprie: « Dependability: Basic Concepts and Terminology », Springer-Verlag, Wien - New York, 1992.

9. ANNEX I: STRUCTURE OF A SAFETY PLAN


The Safety Plan should include the following topics:

Chapter 1 - General Aspects

Introduction

This section provides an overview of the plan and includes any necessary background material.

Aims and Objectives

Scope of the plan

The policy and strategy for achieving safety

This section will provide the policy and strategy for achieving safety, together with the means for evaluating its achievement, and the means by which this is communicated within the organisation to ensure a culture of safe working.

Assumptions and Constraints

This part of the plan will detail any assumptions being made in connection with the product development, together with the constraints under which the safety construction is to be conducted.

Interfaces with other related programs and plans

Applicable standards

Normative and supporting documents should be listed in the Safety Plan.

Chapter 2 - Product Description

Description of the product

The main design characteristics of the product and its main applications are described in this section.

The main selected safety measures and techniques

The main selected measures and techniques used to meet the safety requirements should be listed in the Safety Plan.

Chapter 3 - Management Aspects

Details of staff structure

The role, responsibilities, competencies and relationships of the bodies undertaking tasks within the life cycle will be described in this section, together with the identification of the persons, departments, organisations or other units responsible for carrying out and reviewing each of the safety life-cycle phases. The relationships between these bodies will also be described.

Qualification and training of the staff

The procedures for ensuring that all staff involved in safety life-cycle activities are competent to carry out the activities for which they are accountable (competence of persons).

Chapter 4 - System Life Cycle and Safety Activities

The system life cycle and safety activities

This section outlines the safety life cycle and the safety activities to be undertaken within the life cycle, along with any dependencies.

The safety analysis, engineering, verification and validation

The analyses and validations to be applied during the life cycle should be clearly identified and should take into account the processes for ensuring an appropriate degree of personnel independence in tasks, and the necessary safety reviews, to demonstrate compliance of the management process with the Safety Plan.

Chapter 5 - Reporting, Control and Milestones

Details of all safety related deliverables and the milestones

This section presents the list of the main documents to be produced, the milestones, and the mechanism of the documentation review and acceptance.

Requirements for periodic safety validation and safety review

This section contains the planned safety reviews throughout the life cycle, appropriate to the safety relevance of the element under consideration, including any personnel independence requirements. These reviews shall be implemented, reviewed and maintained throughout the life cycle of the system.

Chapter 6 - Safety Case Plan

The safety case structure

The Safety Plan should include a safety case plan, which identifies the intended structure and principal components of the final safety case.

The mechanism to prepare the Safety Case

The procedures for the preparation and scheduling (draft, number of versions, etc.) of the safety case should be described in the Safety Plan.

Chapter 7 - Safety Maintenance

Process for analysing operation and maintenance performance

The process for analysing operation and maintenance performance, to ensure that the realised safety is compliant with the requirements.

Maintenance of safety-related documentation

The procedure for maintaining accurate documentation on potential hazards, safety related systems and external risk reduction facilities.

Each of the components of the Safety Plan shall be formally reviewed by the organisations concerned and agreement gained on the contents.

10. ANNEX II: STRUCTURE OF A PRODUCT SAFETY CASE

The Safety Case should include the following topics:

Chapter 1 - Contents

This section should contain a description of how the safety case has been constructed and how it will demonstrate that the product meets the safety requirements for its intended purpose.

Chapter 2 - High Level Documentation

List of Safety Case Documents


This list of documents comprises all the documents which form the safety case or support the safety arguments contained in the safety case. The structure of this list must be consistent with the structure of the safety case. If the safety case is self-contained, this section will not be necessary.

Safety Process

This section shall describe the process (plan, execution and evidence) through which the required level of safety has been met.

Glossary

A comprehensive glossary should be included to provide a clear definition of all technical terms used.

Chapter 3 - Safety Management Documentation

Documentation which relates to the management, organisation and control of safety should be listed in this section, which should include a summary of each document listed, for example:

Organisation Documentation

This should describe the organisational structure under which the product has been developed, and defines the roles, responsibilities and reporting structure of the personnel involved in management, quality, development, safety, maintainability, reliability and user support.

Development Plan

This defines the development of the product in terms of development stages and establishes the criteria for demonstrating and accepting that each stage has been completed. This is a "living" document which must reflect not only the original plan, but also the actual life cycle of the development that took place.

Quality Plan

The Quality Plan defines the quality requirements that will be applied to all aspects of the work in developing the product. This will include the Quality Management System (QMS) used on the project, together with a traceable path to enable demonstration that the QMS is in accordance with EN 29001 and related standards.

Safety Plan

The Safety Plan defines the way in which the safety of the product is to be assured. Details of the techniques and processes to be used, the stage at which they are to be used, and how the findings of each analysis are to be addressed as part of the development process shall be described.

Refer to chapter 4.4.2 and ANNEX I of this document for a detailed description of the Safety Plan structure. A clear description of the safety case structure should also be included within the Safety Plan. Those elements of the V&V Plan which relate directly to safety requirements may be addressed or referenced in this document.

Verification & Validation (V&V) Plan

This document defines the objective of, and the approach to be adopted in, demonstrating that the requirements described in the Requirement Specification documentation, and the safety criteria drawn from the various safety analyses, have been met. Procedures for, and evidence of, traceability of specific requirements to particular test elements of V&V activities shall be briefly described, and appropriate detailed documentation should be referenced. This document should address the V&V of all requirements, including those relating to safety which may have been covered in the Safety Plan.

Configuration Management Plan

This document describes the principles and processes by which the build standard of, and changes to, the product under consideration have been controlled throughout its lifecycle, from conception through detailed specification, design, build and validation. The Configuration Management Plan should detail the timing of design reviews, configuration baselines, status reporting mechanisms and procedures for deviation from prescribed processes. This document is vital, since traceability is a central requirement of a Safety Case and rigorous traceability is only truly achievable when all evidence is from configured sources.

Despite the rigorous application of management processes, there will be a number of occasions in any development activity when deviation from strict requirements will be unavoidable in order to maintain control of overall schedule and cost. For example, the design process in a particular development may dictate that all safety analyses are completed before a design is committed to production build. Events may be such that the delays caused by adhering strictly to this requirement could seriously impact the programme. To deal with this situation, a formal mechanism which manages deviation from procedures or requirements should be put in place. This will detail the mechanism for recording the assessment, acceptance and resolution or disposition of the deviation. This deviation procedure should be described, together with a reference to the list of deviations raised during the development and their resolution.

Clearly, because of the complex functionality of the product and its development process, each plan described above is likely to comprise a number of individual documents, which should be controlled as part of the overall configuration management system (as should all documentation). The structure of any such plan should be described within the Safety Case.

Chapter 4 - Product Element Documentation

Elements in the product, such as microprocessors, power supplies, previously designed modules, etc., which have received certification in their own right, must be described briefly, together with a reference to the documentation against which certification was granted. Reference must also be made to the notified body.

Evidence should be included to demonstrate that the supplier has fully assessed that the use of the assessed/certified element in the product is entirely within the functional and safety related application conditions specified for that element.

Chapter 5 - Reference v Self-containment

There are two viewpoints about how the basic Safety Case should be presented. The first is that the Safety Case is a self-contained document with no reference to other documents. There are inherent difficulties with this approach:

1. Product designs are the intellectual property of the suppliers, and any critical details which are proprietary to the supplier may not be included.

2. A self-contained safety case would be extremely bulky for a complex product, making distribution and availability difficult. Changes would be harder to incorporate.

3. A Safety Case of this nature would be difficult to produce as the lifecycle of the product itself develops, and would increase the tendency to produce the Safety Case at the end of the lifecycle.

4. It is not clear how the traceability of the data contained in the safety case could be shown with no external referencing.

The second viewpoint is that the Safety Case is a reference document describing the documentation relating to the product which is needed to demonstrate the safety of that product.

Chapter 6 - Product Section

General Aspects

A high level description of the product, and the scope and limitations of the safety analyses, shall be given in this section. The preliminary hazard list, the identification of the system boundary with respect to hazards, and the classification of those hazards in terms of risk and consequence should also be presented.

Safety Objective


The safety objectives must be clearly stated and a description included of the method for allocating the safety objectives from the top level product requirements to lower levels of the product. Reference should be made to documentation detailing the functional and safety requirements for all elements of the product, e.g. between and within hardware and software.

Description of the Architecture of the Product

The structure, functionality, operation, interfaces, and the environment envelope of the product shall be described. This shall include a list of applications for which the product was considered or envisaged by the designers. For each application, the element of configurability/adaptability must be considered and described.

Functional Elements

According to the ACRuDA definition (see deliverable D1 of the ACRuDA project), a safety critical architecture is a configurable, structured set of components which can be demonstrated to comply with critical or vital safety criteria or safety levels. In its simplest representation the architecture consists of a programmable electronic element with input and output ports or devices which interface with external equipment.

Modes of operation, including restricted and degraded operational modes, shall be described together with the failure modes of each of these.

Functional Safety Elements

The safety case must provide evidence that:

- degraded input data, e.g. erroneous, missing and irrational data,
- errors in the operation of the hardware, e.g. CPU, memories, interfaces, clocks, built-in test, watchdogs and communication links,
- errors in the software

are detectable, and that action following detection will bring the system to a safe, defined state in a timely and controlled manner.
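The detect-and-fall-back behaviour required here can be sketched as follows. The fragment is a minimal illustration only: the speed channel, its plausibility limits and the state names are invented for the example and are not taken from the deliverable.

```python
# Defined safe state reached whenever degraded input data is detected.
SAFE_STATE = "BRAKE_APPLIED"

def plausible(reading) -> bool:
    """Reject erroneous, missing or irrational data (hypothetical limits:
    a speed reading must exist and lie in [0, 350] km/h)."""
    return reading is not None and 0.0 <= reading <= 350.0

def control_step(reading, limit=160.0) -> str:
    if not plausible(reading):
        # Detection leads directly to the safe, defined state.
        return SAFE_STATE
    return "BRAKE_APPLIED" if reading > limit else "TRACTION_ALLOWED"
```

The essential property illustrated is that every detected anomaly, whether a missing value or an irrational one, maps onto the same defined safe state rather than onto undefined behaviour.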

Safety Studies

The following should be listed in this section:

- the safety analyses undertaken,
- the boundaries of these analyses (to what element of the product each is applicable),
- the stage in the lifecycle at which they were completed,
- any tools used (including databases).

Reference to well established practices or known standards which define the appropriate rules for the application of the methods and techniques used should be made here.

Evidence of safety may be considered in two categories: functional safety and technical safety. The elements of these are considered below:

Evidence of Functional Safety

Evidence must be provided that demonstrates that all functional requirements are met and that the design process used to achieve the specified functionality is such that the "services" the product provides meet Safety Integrity Level 4 (SIL4). Consideration of alternative designs or approaches should be presented, together with the rationale for the approach implemented.

Correct and full operation at the limits of "normal conditions" must be demonstrated.

The safety properties (or principles), such as redundancy, diversity, error detection, self test and information redundancy, shall be described together with the safety rationale for the design concept and the underlying assumptions about what is "safe". Reference shall be made to documentation containing the quantitative analysis which supports the product design approach.

The effects of faults due to hardware or software must be comprehensively analysed, and evidence that the effects have been addressed in the design should be presented. Established analyses such as HAZOP, HAZID, Hazard Analysis, Fault Tree and FMECAs shall be described and used as part of the Safety Assurance lifecycle, which shall be described in the Safety Plan documentation. A description of how the results of these analyses were fed back to the design should be given.
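As a minimal illustration of the quantitative side of such analyses, a fault tree combines basic-event probabilities through AND/OR gates. The tree structure and failure probabilities below are invented for the example (they are not results from the project), and the gate formulas assume statistically independent basic events, as real analyses must justify.

```python
def or_gate(*probs):
    """P(at least one event) = 1 - prod(1 - p_i), for independent events."""
    prod = 1.0
    for p in probs:
        prod *= (1.0 - p)
    return 1.0 - prod

def and_gate(*probs):
    """P(all events) = prod(p_i), for independent events."""
    prod = 1.0
    for p in probs:
        prod *= p
    return prod

# Hypothetical two-channel architecture: a channel fails on a CPU fault OR
# a memory fault; the top event needs both channels to fail, OR a failure
# of the comparator that cross-checks them.
p_channel = or_gate(1e-4, 5e-5)
p_top = or_gate(and_gate(p_channel, p_channel), 1e-6)
```

The sketch shows why redundancy helps (the AND gate multiplies the already small channel probabilities) and why shared elements such as the comparator then dominate the top-event probability.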

Evidence of Technical Safety

Susceptibility of the product. The supplier must demonstrate that the product functions correctly within the range of specified external influences, e.g. temperature, vibration, humidity, electromagnetic radiation (EMC), power supply variation, contaminants, etc. They must also demonstrate that, for conditions outside the defined specification for normal operation, the product will default to a defined safe state and, upon return to normal operating conditions, the product will behave in a safe, predictable manner.

Influences upon the susceptibility of other equipment. The supplier must define the characteristics of the product which could influence the operation of the system, application or other equipment, e.g. thermal radiation, electromagnetic emission, etc. Evidence of testing to confirm that these are as defined should be provided.

This section of the safety case should contain a description of the safeguards in the design which protect the safety properties of the product from compromise during the implementation of the product. For example, a product may have provision to accept site specific software. Reference should be made to evidence that demonstrates that the safety functionality of the product is unaffected (ring-fenced) when used within its defined range of application. Any mechanism to indicate that the ring-fence has been breached should be described in this section.

Verification and Validation

Verification and validation reports shall be included or referenced. V&V activities must demonstrate that, at every level, each requirement within the product has been tested and that there is traceability of evidence from the highest level requirement through to final testing. The key objective of this activity is to ensure that there is clear evidence that the product conforms with every requirement.
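The traceability evidence described above is, in essence, a coverage check over a requirements-to-tests matrix. A minimal sketch (the requirement and test identifiers are hypothetical, not from the deliverable):

```python
def uncovered_requirements(requirements, trace):
    """Given a trace mapping each test id to the requirement ids it
    verifies, return the requirements not covered by any test."""
    covered = {req for reqs in trace.values() for req in reqs}
    return sorted(set(requirements) - covered)

requirements = ["SR-001", "SR-002", "SR-003"]
trace = {
    "TC-10": ["SR-001"],
    "TC-11": ["SR-001", "SR-003"],
}
gaps = uncovered_requirements(requirements, trace)
```

An empty `gaps` list is the machine-checkable form of the claim that "each requirement has been tested"; here the check would flag the hypothetical requirement SR-002 as lacking V&V evidence.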

Chapter 7 - Parts and Materials

There are well established methods and standards which describe the approaches and procedures for developing a quantitative analysis of the reliability and availability of components. Evidence that such analysis has been undertaken should be included, and reference made to the procedures and plans used, and the results obtained. What is less commonly addressed at the design stage is the issue of component specification and obsolescence.

Components (i.e. COTS items such as resistors, ICs, ASICs, capacitors, gaskets, etc.) are very "fuzzy" items in that they have a vast range of parameters which specify their performance. Many of these parameters are a function of the fabrication process rather than the design and so can vary from supplier to supplier. These parameters need to be considered by the designer prior to inclusion in the design. Further, suppliers of complex components such as microprocessors will release only very limited data about the parametric performance of their devices, due to commercial sensitivity. In all cases, availability must be a major consideration in component selection, and multiple-sourced components from suppliers with a track record of long-term product support should be considered.

Evidence must be included to show that a formal parts and materials selection process has been established and followed. The process should include detailed criteria for the selection of components and assessment of suitability by parties other than the design team. The policy and approach taken to minimise the effect of obsolescence should also be discussed.

Chapter 8 - Ownership - Operation, Evolution, Modification

When the safety case of a rail system is accepted, the ownership of the safety case resides with the operator. If the operator wishes to make changes, it is for him to judge whether or not these changes impact the safety cases of system elements such as the product. If they do, then it is his responsibility to ensure that the element safety cases and the overall rail safety case are satisfactorily revised. It is essential, therefore, that at all points of hand-over of safety case ownership, the parties satisfy themselves that all documentation is available to enable any modification of the element to be undertaken and the revised safety case prepared.

Ideally, the most straightforward situation is that where all product documentation is available to the operator, who will then be equipped to make safety assessments of modifications at all levels of the system. In practice, documentation relating to significant elements of a product will not be available because of commercial confidentiality issues. In this situation, the responsibility for updating the safety case may reside with the supplier.

Modification may be required many years after initial delivery and, because any changes to the product could affect safety, it is essential that all documentation is available. The safety case should, therefore, contain a discussion of how issues of availability of documentation for design changes after hand-over have been addressed. This should include commercial aspects such as copyright, design authority, non-disclosure agreements and intellectual property rights, together with model agreements for dealing with these issues. As part of this, there should be a clearly defined process to address the correction of errors which may be discovered after hand-over of the product safety case.

Chapter 9 - User Support

While ultimately the responsibility for correct usage of the product rests with the user (the term "user" refers here to the operator or any body utilising the product or its application), there is a responsibility upon the supplier to take all necessary steps to ensure correct usage. This will be achieved not only by providing information as described in the preceding sections but also by offering the facility of a technical support programme to users of the product.

The safety case should include a model programme covering technical support to the user, repair capability, repair times and calibration of support equipment supplied, including any test equipment and support tools (hardware and software). The model should also consider the needs of in-service support which, in addition to the above, will cover issues such as spares holdings and spares manufacturing capability, including ownership of support equipment and retention of repair facilities.

The nature and level of support provided by the supplier will be the subject of commercial discussion between supplier and user. However, while support issues may not directly impact the safety-related functioning of the product, the lack of a clear strategy and of adequate contractual agreements for user support will almost certainly result in significant difficulties in obtaining appropriate documentation and expert technical help. This in turn can lead to poorly effected modifications and flawed safety arguments. Ultimately, this could affect the safety of the system. It is, therefore, essential that a structure for post-hand-over support is developed and presented as part of the safety case.

Chapter 10 - Conclusion, Safety Argument Summary

The conclusion of the safety case shall summarise the evidence and state whether or not the product meets all its specified safety requirements. The constraints on the precise configuration, operation and application of the product shall be summarised.

11. ANNEX III: STRUCTURE OF AN ASSESSMENT PLAN

The plan is based on the assessment plan described in [ITS05]. The assessment plan is structured according to the following chapters:

Chapter 1 - Introduction

Context


A description of the context of the assessment is presented in this section. This description must contain:

- the identity of the supplier,
- the identity of the sponsor,
- the identity of the assessors,
- the identity of the notified body.

Objectives

Scope of the report

Chapter 2 - Management aspects

Special dispositions of the assessment

The assessment process should be considered as a project. The management of this process should then follow the project management requirements.

Special dispositions in the quality process may be taken: a specific project organisation of the staff, specific confidentiality agreements and configuration management requirements can be identified in this chapter, in addition to the dispositions of the Quality Handbook.

It is recommended to define a specific assessment steering committee. The members should be identified by name.

Relations between the assessment activity and the supplier’s activity.

In the case of a simultaneous assessment, the relations between the development phases of the architecture and the assessment activity must be described carefully in this chapter. It is recommended that the assessment start during the specification phase of the architecture. Special care has to be taken with the modification process in the development. The assessor has to define how modifications during the development impact his investigations.

In the case of a consecutive assessment, the chapter must identify in the planning the review meetings with the supplier and the supplier activities needed to answer the assessment needs.

Estimation of time and costs

The duration and costs of the assessment depend on the kind of assessment: simultaneous or consecutive.

The estimation contains a planning of the activity. This planning must describe the duration of each activity and the effort, in man-months, allocated to each activity.

It is recommended to identify by name the assessors (including the subcontractors) who are going to work on each activity.

The risk of underestimating the times and costs is a decrease in the global quality of the assessment (e.g. safety analyses that are not exhaustive).

Details of the staff structure

The role, responsibilities, competencies and relationships of all the assessors are given in this chapter.

Qualification and training of the staff

This section describes the procedure for ensuring that all staff involved in the safety activities are competent to carry out the activities for which they are accountable.


Chapter 3 - Description of the product

Functionality of the product

This section shall contain a summary of the operational role and the functions of the product.

History of development

This section shall present the development phases of the product (with tools, standards, methods, techniques, etc.).

Architecture of the product

This section shall present the high-level architecture of the product: the separation between safety and non-safety components, and the apportionment of the safety functions between hardware and software. A description of the hardware and software must be given, in particular of the components concerned with safety.

Chapter 4 - Description of the safety requirements specification

A good understanding of the content of the safety requirements specification is necessary for the understanding of the assessment report.

This chapter shall contain:

- the safety integrity target,
- the specification of the safety functions,
- the specification of the safety mechanisms.

Chapter 5 - Description of the assessment

The assessment work can be divided into work packages.

For each work package, it is necessary to define:

- the work to achieve, with great precision,
- the names of the assessors,
- a planning (with duration and meetings) to achieve the work,
- the effort, in man-months, allocated,
- all the inputs necessary to achieve the assessment,
- the technical assessment report to produce.
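One possible way to record these items, sketched here with hypothetical field names that are not prescribed by this guide, is a simple work-package structure:

```python
# Illustrative sketch only: a record of the items the assessment plan
# must fix for each work package. Field names are hypothetical.

from dataclasses import dataclass

@dataclass
class WorkPackage:
    task: str                   # the work to achieve, precisely stated
    assessors: list             # assessors identified by name
    duration_weeks: int         # planning, including review meetings
    effort_man_months: float    # allocated effort
    inputs: list                # all inputs needed for the assessment
    report: str                 # technical assessment report to produce

wp1 = WorkPackage(
    task="Evaluate the hardware safety mechanisms",
    assessors=["J. Smith"],
    duration_weeks=6,
    effort_man_months=1.5,
    inputs=["Safety case", "Hardware design documents"],
    report="TAR-WP1",
)
```

Keeping the work packages in such an explicit form makes it straightforward to check later, in the technical assessment report, which planned work was actually achieved.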

If external assessors take part in the assessment process, a procedure to collect all the reports and compose the final report must be defined. The notified body has to do this work.

The minimum documentation supplied is the documentation identified in the safety case (chapter 4.4.3 of the present guide).

Several assessment activities can be defined. All these activities are based on the criteria.

Preliminary activities

- examine the definition of the product to assess: the assessor should evaluate the relevance of the definition of the product from the point of view of safety. For example, the assessor can consider that some elements should be integrated into the definition of the product to assess because they have a potential impact on the safety of the whole architecture.


- examine the safety requirements specification (safety integrity target, standards, risks, etc.): the assessment plan must explain whether the allocation of the safety requirements is taken as an input of the assessment or whether this activity will be examined by the assessor.
- preliminary analysis of the architecture: a system approach is recommended for the assessment of complex architectures. That is to say: first a global comprehension of the architecture with a global safety study, and then a detailed comprehension of each component.
- verification of the set of criteria.

If relevant, the assessor should prefer to evaluate the results of the supplier using a different method of evaluation than the method used by the supplier.

Writing of the reports

The report should be compliant with the [EN01] standard (chapter 5.4.3 of the standard).

Chapter 6 - Conclusions

In the conclusion of the assessment plan, the assessor should give a justification that the process defined above covers all the risks identified in the safety requirements specification. This justification should be based only on the work of the supplier examined by the assessor and on the results of the assessor.

The main conclusion of the assessment states whether the criteria are verified and whether there is any risk of hazardous failure. The recommendations can recall that the results of the assessment are valid for a particular version of the product when it is configured in a certain way.

Annex A - Terminology and abbreviations

This annex must identify all the acronyms and terms used in the assessment plan.

12. ANNEX IV: STRUCTURE OF A TECHNICAL ASSESSMENT REPORT

This is a generic plan which can be adapted by the assessors and the notified body. This plan is based on the assessment plan defined in [ITS05].

Chapter 1 - Introduction

Context

A description of the context of the assessment is presented in this section. This description must contain:

- the identification of the assessment,
- the name and the version of the product,
- the identity of the supplier,
- the identity of the sponsor,
- the total duration of the assessment,
- the identity of the assessors.

Objectives

This section presents the objectives of the technical assessment report.

Scope of the report


This section must specify all the assessment tasks covered by the technical assessment report. In general, it covers the whole assessment. If this is not the case, justification must be given.

Organisation

This section presents the organisation of the technical assessment report.

Chapter 2 - Summary

This chapter is the basis of all information on the results of the assessment published by the assessor. This summary must therefore not contain confidential information on the product (commercial, technical, etc.).

This chapter must contain:

- the identification of the product and the version,
- a brief description of the product,
- a brief description of the safety characteristics of the product,
- a summary of the main conclusions of the assessment.

Chapter 3 - Description of the product

Functionality of the product

This section must contain a summary of the operational role and the functions of the product. This summary is composed of:

- the type of data processed,
- the different kinds of users.

History of development

This section must present the development phases of the product (with tools, standards, methods, techniques, etc.).

Architecture of the product

This section must present the high-level architecture of the product: the separation between safety and non-safety components, and the apportionment of the safety functions between hardware and software. A description of the hardware and software must be given, in particular of the components concerned with safety.

Description of hardware

This section gives details on all the hardware components necessary for the assessment.

Description of software

This section gives details on all the software components necessary for the assessment.

Chapter 4 : Description of the safety requirements specification

A good understanding of the content of the safety requirements specification is necessary for the understanding of the assessment report.

This chapter refers to the safety requirements specification or re-describes it in its totality.


Chapter 5 : Assessment

History of the assessment

This section is structured like Chapter 3. It must present the assessment process and its main stages:

- as expected and defined in the assessment plan at the beginning of the assessment,
- as really reached during the assessment.

These main stages can be: meetings, deliveries, end of technical work, etc.

Assessment procedure

A summary of the assessment plan must be presented in this section: the tasks of the assessors defined in the assessment plan and the work packages achieved. All the differences between the proposed work of the assessment plan and the real work achieved must be recorded and argued.

A summary of the conformity of the real delivered assessment inputs to the intended assessment inputs must be given. All the differences between the delivered assessment inputs and the intended assessment inputs must be recorded and argued.

Boundaries of the assessment

This section must identify, clearly and precisely, all the components of the assessed product and all the hypotheses made on the components not assessed.

Constraints and hypothesis

This section must identify all the constraints encountered and all the hypotheses made during the assessment.

Chapter 6 : Summary of the assessment results

This chapter must supply a summary of the results of the assessment. The structure of the chapter is based on the effectiveness assessment, the conformity assessment and the criteria.

Each sub-section of the chapter must contain the name of the assessor and the reference of the work package. All the criteria must be covered.

Chapter 7 - Remaining risks/remaining scenarios

This chapter presents all the remaining risks/scenarios discovered during the assessment. For each risk/scenario, it is necessary to describe:

- the safety function concerned by the risk/scenario,
- a description of the risk/scenario,
- the task during which the assessor found the risk/scenario,
- the work package during which the assessor found the risk/scenario,
- the person who found the risk/scenario,
- the date of discovery,
- whether the risk/scenario was corrected or not and, if corrected, the date of correction,
- the origin of the risk/scenario.
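A minimal sketch of such a record, with hypothetical field names not prescribed by this guide, could be:

```python
# Illustrative sketch only: a record carrying the items required for
# each remaining risk/scenario. Field names are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class RemainingRisk:
    safety_function: str             # safety function concerned
    description: str                 # description of the risk/scenario
    task: str                        # task during which it was found
    work_package: str                # work package during which it was found
    found_by: str                    # person who found it
    found_on: str                    # date of discovery
    origin: str                      # origin of the risk/scenario
    corrected_on: Optional[str] = None   # date of correction, None if open

    @property
    def corrected(self) -> bool:
        return self.corrected_on is not None
```

Recording the correction date rather than a bare flag keeps the open/corrected status and its evidence in a single field.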

Chapter 8 - Conclusions and recommendations

The main conclusion of the assessment states whether the product reaches the safety requirements specification and whether there is any risk of hazardous failure. The recommendations can contain suggestions towards the sponsor and the supplier. The recommendations recall that the results of the assessment are valid for a particular version of the product when it is configured in a certain way.

Annex A - List of assessment inputs

This annex must identify all the assessment inputs, with their version and the date of delivery and reception.

Annex B - Terminology and abbreviations

This annex must identify all the acronyms and terms used in the technical assessment report.

Annex C - Assessed configuration

All the configurations of this product examined during the assessment must be clearly and precisely identified. All the hypotheses or configurations not taken into account must be described. The hardware and the software assessed must be described, in particular the elements pertinent to the assessment: the safety functions.

Annex D - Work package reports

This annex is optional if all the assessment reports issued from the work packages are contained in chapter 6 of the assessment report (only one assessor). If this annex exists, it must be composed of all the work and results necessary to justify the verdicts of the assessors.

13. ANNEX V: STRUCTURE OF THE CERTIFICATION REPORT

This report is based on the certification report described in [ITS05]. The certification report is public, should not contain any confidential information and should contain as a minimum:

Chapter 1 - Introduction

Objectives

This section presents the objectives of the certification report.

Terminology and abbreviations

This section identifies all the acronyms and terms used in the certification report.

References

This section defines the list of all the references used in the certification report.

Chapter 2 - Results

Conclusion

This section describes:

- the precise identification of the product (with the identification number, version number, etc.),
- the conclusions in terms of remaining risks/scenarios,
- the recommendations of use.

Context of the assessment

- a definition of the criteria used,
- the identity of the supplier and, if necessary, the identity of subcontractors,
- the identity of the sponsor,
- the date and the duration of the assessment.

Chapter 3 - Description of the product

Description of the product

This section must contain a detailed description of the operational role, the components and the functions of the product.

Description of hardware

This section gives details on all the hardware components assessed.

Description of software

This section gives details on all the software components assessed.

Description of the documentation

This section gives details on all the documentation associated with the product. The minimum is the user documentation.

Chapter 4 - Assessment

Technical assessment report

The reference(s) of the technical assessment report.

Main results of the assessment

This section describes the safety requirements specification, the criteria applied and the results of the assessment. All remarks useful for understanding the results are supplied.

Chapter 5 - Recommendations of use

This chapter describes all the recommendations necessary for the users of the product.

Some recommendations can be made on the configuration and safe use of the product, notably by describing procedural, technical and organisational measures. Some recommendations can lead to a restriction of use of the product.

Chapter 6 - Certification

This chapter describes the scope of the certificate.


14. ANNEX VI: STRUCTURE OF A CERTIFICATE

The certificate is based on the certificate described in [ITS05].

The certificate contains :

- the identification of the version of the product assessed,
- the identification of the certification report (with how and where the report can be accessed),
- the identity of the sponsor and the supplier,
- the identification of the notified body: symbol, drawing, mark,
- a mention of the importance of the certificate and the boundaries of the assessment and certification responsibilities,
- the signature of the supplier and the notified body.
