
Approved by EC


Submitted to the EC on 01/10/2013

COMPETITIVENESS AND INNOVATION FRAMEWORK PROGRAMME ICT Policy Support Programme (ICT PSP)

Project acronym: e-SENS

Project full title: Electronic Simple European Networked Services

ICT PSP call identifier: CIP-ICT-PSP-2012-6

ICT PSP main theme identifier: CIP-ICT-PSP-2012-6-4.1 Basic Cross Sector Services

Grant agreement n°: 325211

D3.1 Guidelines to the assessment the sustainability and maturity of building blocks

Deliverable Id : D3.1

Deliverable Name : Guidelines to the assessment the sustainability and maturity of building blocks

Version : v 1.0

Status : Final

Dissemination Level : Public

Due date of deliverable : M6

Actual submission date : 01.10.2013

Work Package : WP3, Task 3.1

Organisation name of lead partner for this deliverable : Ministry of Economic Affairs and Communications of Estonia

Author(s): Jaak Tepandi

Partner(s) contributing : DIGST.DK Denmark; JM NRW Germany; EISA Estonia - Ministry of Economic Affairs and Communications; SGMAP France - MINISTERE DE LA JUSTICE; Tudor Luxembourg; NL-MEA Netherlands - (TNO), Forum Standaardisatie; Difi Norway; TUBITAK Turkey

Abstract. Deliverable D3.1 is intended for assessing various types of building blocks. It presents a procedure for assessment, criteria and information to be used in the four assessment steps, and organisational aspects of assessment. The assessment procedure comprises proposal, consideration, assessment and recommendation steps. A research update on already existing studies is provided in an Annex. The deliverable is based on the CAMSS methodology, ADMS, the e-SENS WP6 deliverable D6.1, suggestions from e-SENS stakeholders, standards, and other sources.


History

Version | Date | Changes made | Modified by
0.2 | 30.04.2013 | D3.1 Guideline principles | Jaak Tepandi
0.3 | 13.05.2013 | Guideline principles are extended and updated to a draft Guideline; comments of WP3 participants have been incorporated. | Jaak Tepandi
0.5 | 28.05.2013 | The draft Guideline criteria have been elaborated, extended, and updated according to suggestions from the WP3 Brussels meeting on 17.05 and teleconference on 28.05. | Jaak Tepandi
0.7 | 31.05.2013 | New presentation of the assessment procedure has been elaborated. Assessment criteria from different sources have been merged. New template for the report has been applied. The draft Guideline has been extended and updated throughout the text according to suggestions from T3.2 and other WP3 participants. | Jaak Tepandi
0.8 | 19.07.2013 | The proposal information/criteria have been aligned with WP6. The Guidelines have been closer aligned with the e-SENS deliverable template. Proposals and comments of WP3, WP5, WP6, and other e-SENS project participants have been taken into account. | Jaak Tepandi
0.9 | 01.09.2013 | Conformance to e-SENS template has been improved. Chapter for research update and input for the analysis has been added. Flexibility of the Guidelines has been emphasized. Proposals and comments from WP3 and other e-SENS project participants have been inserted. The Guidelines have been submitted for the e-SENS Deliverable Revision Cycle. | Jaak Tepandi
1.0 | 30.09.2013 | Proposals and comments for the previous version from WP1, WP3, WP5, and WP6 participants have been incorporated. | Jaak Tepandi
1.0 | 30.09.2013 | Final check by WP1 | Carsten Schmidt


Table of Contents

HISTORY........................................................................................................................................................... 3

TABLE OF CONTENTS ........................................................................................................................................ 4

LIST OF FIGURES ............................................................................................................................................... 7

LIST OF TABLES ................................................................................................................................................ 8

LIST OF ABBREVIATIONS AND GLOSSARY ......................................................................................................... 9

EXECUTIVE SUMMARY ................................................................................................................................... 11

1 INTRODUCTION ..................................................................................................................................... 14

1.1 SCOPE AND OBJECTIVE OF THE GUIDELINES .................................................................................................... 14

1.2 WP3 GENERAL OBJECTIVES AND VISION ...................................................................................................... 14

1.3 METHODOLOGY OF WORK ......................................................................................................................... 15

1.3.1 CAMSS ............................................................................................................................................. 16

1.3.2 ADMS .............................................................................................................................................. 17

1.3.3 Other sources .................................................................................................................................. 18

1.4 RELATIONS TO INTERNAL E-SENS ENVIRONMENT ........................................................................................... 18

1.4.1 Relations to e-SENS Work Packages ............................................................................................... 18

1.4.2 Reviewing and commenting on the Guidelines ............................................................................... 19

1.5 RELATIONS TO EXTERNAL E-SENS ENVIRONMENT .......................................................................................... 19

1.6 QUALITY MANAGEMENT ........................................................................................................................... 19

1.7 RISK MANAGEMENT ................................................................................................................................. 20

1.8 LEGAL ISSUES .......................................................................................................................................... 22

1.9 STRUCTURE OF THE DOCUMENT .................................................................................................................. 22

2 THE ASSESSMENT PROCEDURE AND TARGETS ....................................................................................... 23

2.1 THE ASSESSMENT PROCEDURE .................................................................................................................... 23

2.2 THE TARGETS OF ASSESSMENT .................................................................................................................... 25

3 PROPOSAL AND CONSIDERATION STEPS: DOCUMENTATION OF FORMAT ............................................ 27

3.1 THE PROPOSAL STEP ................................................................................................................................. 27

3.1.1 Complete list of proposal information/criteria ............................................................................... 28

3.1.2 The highly important proposal information/criteria ....................................................................... 32

3.2 THE CONSIDERATION STEP ......................................................................................................................... 33

4 THE ASSESSMENT STEP .......................................................................................................................... 35


4.1 INTEGRATION AND USE OF ASSESSMENT CRITERIA .......................................................................................... 35

4.1.1 Integration of Assessment Criteria from Various Sources ............................................................... 35

4.1.2 Use of Assessment Criteria .............................................................................................................. 36

4.2 STANDARDISATION CRITERIA ...................................................................................................................... 37

4.2.1 Maturity .......................................................................................................................................... 38

4.2.2 Openness......................................................................................................................................... 39

4.2.3 Intellectual property rights ............................................................................................................. 40

4.2.4 Life cycle, maintenance, service levels, security .............................................................................. 41

4.3 CRITERIA FOR ALIGNMENT WITH EXISTING POLICY FRAMEWORKS ...................................................................... 43

4.3.1 Basic alignment criteria .................................................................................................................. 44

4.3.2 Applicability .................................................................................................................................... 45

4.3.3 Potential .......................................................................................................................................... 47

4.4 BUSINESS NEED CRITERIA .......................................................................................................................... 48

4.4.1 Basic business need criteria ............................................................................................................ 49

4.4.2 Market support ............................................................................................................................... 51

5 THE RECOMMENDATION STEP: CRITERIA AND CLASSIFICATION ............................................................ 52

5.1 RECOMMENDATION CRITERIA ..................................................................................................................... 52

5.2 PROPOSED CLASSIFICATION ........................................................................................................................ 54

6 ORGANIZATIONAL ASPECTS OF ASSESSMENT ........................................................................................ 55

6.1 A QUESTIONNAIRE FORM FOR THE ASSESSMENT OF EUROPEAN BUILDING BLOCKS ............................................... 55

6.2 DEALING WITH CURRENT LSPS THAT DO NOT COMPLY TO THE CRITERIA UPFRONT ................................................ 55

7 CONCLUSION ......................................................................................................................................... 57

8 REFERENCES .......................................................................................................................................... 59

9 CONTRIBUTORS ..................................................................................................................................... 62

10 APPENDIX 1. RESEARCH UPDATE AND INPUT FOR THE ANALYSIS .......................................................... 63

10.1 INTERIM EVALUATION OF THE ISA PROGRAMME ............................................................................................ 64

10.2 THE FEASIBILITY AND SCENARIOS FOR THE LONG-TERM SUSTAINABILITY OF THE LSPS ............................................ 65

10.3 STUDY ON ANALYSIS OF THE NEEDS FOR CROSS-BORDER SERVICES .................................................................... 66

10.4 SPOCS D3.1 ASSESSMENT OF EXISTING E-DELIVERY SYSTEMS & SPECIFICATIONS ................................................ 66

10.5 SPOCS D4.1 ASSESSMENT OF SELECTED NATIONAL APPROACHES AND POSSIBLE SOLUTIONS ................................ 67

10.6 STORK D5.1 EVALUATION AND ASSESSMENT OF EXISTING REFERENCE MODELS AND COMMON SPECS ................... 67

10.7 STORK D6.6 EVALUATION REPORT ............................................................................................................ 68

11 APPENDIX 2. PROPOSAL INFORMATION/CRITERIA BASED ON CAMMS ................................................. 69

12 APPENDIX 3. PROPOSAL INFORMATION/CRITERIA BASED ON WP6 DELIVERABLE D6.1 ......................... 71


13 APPENDIX 4. THE CAMSS ASSESSMENT CRITERIA .................................................................................. 72


List of Figures

Figure 1. The procedure for assessment with main stakeholders (Executive Summary) .................... 13

Figure 2. The procedure for assessment with main stakeholders ....................................................... 24


List of Tables

Table 1. Abbreviations and glossary ..................................................................................................... 10

Table 2. Development of the Guidelines and collaboration with other WPs ....................................... 16

Table 3. Quality Management for the Guidelines................................................................................. 20

Table 4. Risk Management for the Guidelines ...................................................................................... 21

Table 5. The complete list of proposal information/criteria based on CAMSS, D6.1, and ADMS....... 32

Table 6. The highly important proposal information/criteria in the recommended order of

presentation .......................................................................................................................................... 33

Table 7. Consideration criteria .............................................................................................................. 34

Table 8. Assessment criteria for standardization: maturity .................................................................. 39

Table 9. Assessment criteria for standardization: openness ................................................................ 40

Table 10. Assessment criteria for standardization: intellectual property rights .................................. 40

Table 11. Assessment criteria for standardization: life cycle................................................................ 43

Table 12. Assessment criteria for alignment: basic .............................................................................. 45

Table 13. Assessment criteria for alignment: applicability ................................................................... 46

Table 14. Assessment criteria for alignment: potential ........................................................................ 48

Table 15. Assessment criteria for business need: basic ........................................................................ 50

Table 16. Assessment criteria for business need: market support ....................................................... 51

Table 17. Knock-out criteria .................................................................................................................. 53

Table 18. Recommendation criteria ..................................................................................................... 54

Table 19. Dealing with current LSPs that do not comply to the criteria upfront.................................. 56

Table 20. Contributors to D3.1 ............................................................................................................. 62

Table 21. Appendix. Proposal information/criteria based on CAMSS ................................................. 70

Table 22. Appendix. The CAMSS assessment criteria ......................................................................... 77


List of Abbreviations and Glossary

Acronym Explanation

A2A Administration to Administration

ABB Architecture Building Block

ADMS Asset Description Metadata Schema

ADMS.SW Asset Description Metadata Schema for Software

BB Building Block, represents a (potentially re-usable) component of business, IT, or architectural capability that can be combined with other building blocks to deliver architectures and solutions (a TOGAF 9 definition)

BCSS Basic Cross Sector Services

BOMOS Management and Development Model for Open Standards

CAMSS Common Assessment Method For Standards And Formal Specifications. Final Draft Revision of CAMSS. Version 1.0, March 2012

CIP Competitiveness and Innovation Programme

e-CODEX Online access to judicial procedures for claimants, defendants and legal professionals (e-Justice Communication via Online Data Exchange)

e-SENS Electronic Simple European Networked Services

e-ID Electronic identity

epSOS Smart Open Services for European Patients (Cross-border access to patient information online)

EIA European Interoperability Architecture

EIF European Interoperability Framework

(F)RAND (Fair,) reasonable, and non-discriminatory terms

Guidelines Guidelines to the assessment the sustainability and maturity of building blocks (the current document)

H/R Highly important / recommended


HBB High Level Building Block

IDABC Interoperable Delivery of European eGovernment Services to public Administrations, Businesses and Citizens

ISA Interoperability Solutions for European Public Administrations programme

ISMS Information security management system

LSP Large Scale Pilot

OpenBRR Open Software Business Readiness Rating

PEPPOL Interoperable eProcurement solutions, such as e-invoices and e-signatures for important documents

SBB Solution Building Block

SLA Service Level Agreement

SPOCS Simple Procedures Online for Cross- Border Services (Online Points of Single Contact to help businesses expand into other countries)

STORK Secure idenTity acrOss boRders linKed (Electronic identity for easier access to public services)

Target of assessment An artifact (typically in a TOGAF 9 sense: "Artifact - an architectural work product that describes an aspect of the architecture"), submitted for assessment in the proposal step, to be assessed through the consideration, assessment, and recommendation steps

TA, Technical Annex e-SENS Technical Annex v3.0

TOA See "Target of Assessment" (used in tables)

WP Work Package

Table 1. Abbreviations and glossary


Executive Summary

The aim of the e-SENS (Electronic Simple European Networked Services) project is to provide generic

interoperable solutions for cross-border public services in Europe. e-SENS is another Large Scale Pilot

project launched by the European Commission to support the realisation of European

interoperability policies. All LSPs already launched facilitate the use of innovative technologies for

deployment of EU-wide services in selected areas and, in turn, the development of a digital single

market. The existing and completed Large Scale Pilots have already proven that providing

cross-border services can be made simpler. In numerous domains, technical building blocks have

been developed and piloted, which enable seamless cross-border services respecting all the various

challenges and requirements faced. The essence of the new e-SENS pilot is to consolidate and

solidify the work achieved to date, to industrialise the solutions, and to extend their potential to

more and different domains.

The goal of e-SENS Work Package 3 is to pave the way for sustainability and long-term governance of

the LSP building blocks and their usage and interoperability within all European Member States and

Associated Countries. WP3 aims to present proposals for sustainable building blocks such as e-ID,

e-Signatures, e-Documents, e-Delivery, that have emerged from the Large Scale Pilots relevant to

the e-SENS project. This proposal should support competitiveness, openness for future technologies,

and interoperability.

Task T3.1 of the Work Package 3 supports presentation and assessment of building blocks. Its main

deliverable is ‘Guidelines to the assessment the sustainability and maturity of building blocks’

(referred to as "Guidelines", which is the current document).

The objective of the Guidelines is to propose a documentation of format and defining criteria for the

maturity and sustainability assessment of building blocks in close cooperation with WP3 (in

particular Task 3.2 ‘Sustainability assessment’), WP 5, and WP 6. Task 3.1 will be active from M1 to

M6.

The current version of the Guidelines builds upon existing work carried out in the European

Interoperability Framework and is based on the e-SENS Technical Annex v 3.0, the CAMSS

methodology (Final Draft Revision of CAMSS, Version 1.0, March 2012), the ADMS (Asset Description

Metadata Schema, Specification Version 1.00), discussions with e-SENS project participants, and

various other e-SENS project materials.


Following the e-SENS Technical Annex and the CAMSS methodology, the current document uses an assessment procedure that comprises proposal, consideration, assessment, and recommendation steps, as follows.

1. In the proposal step, a target of assessment (for example, an Architecture Building Block or

High level Building Block, including support artifacts such as guidelines) is provided to Task

3.2 by WP6 Architectural Board. The target of assessment is provided using the proposal

information/criteria. The proposal information/criteria provide general information about

the proposed target of assessment, its status, items provided for assessment, and other.

2. In the consideration step, consideration criteria are used before the actual assessment, to

validate information received and relevance of the proposal.

3. In the assessment step, the criteria used are categorised under standardisation, alignment

with existing policy frameworks, and business need. Additional information will be sought

from other Work Packages and external stakeholders.

4. In the recommendation step, recommendation criteria are applied to conclude with a

classification (Discarded, Observed, Accepted, Recommended and Mandatory) of the target

of assessment. Depending on the e-SENS mandate, it may be agreed to avoid the

"Mandatory" recommendation in the assessments. This classification will be reported back

to the WP6 Architectural Board for use by other Work Packages.

Figure 1 captures the primary process flow.

The Guidelines, including the sets of criteria provided, are intended for assessing various types of

building blocks. During a specific assessment, relevant criteria are selected depending on the target

submitted for evaluation - the Guidelines are flexible and should help with assessment. Assessments

may be done iteratively, in cooperation with WP6, WP5, and other stakeholders.

The document starts with the background given in the introduction. Section 2 presents a procedure

for assessment and an overview of the targets to be assessed. Section 3 describes the criteria for the

proposal and consideration steps. Section 4 is devoted to the assessment step and criteria, section 5

- to the recommendation step. Section 6 presents the organizational aspects of assessment. To help conduct future assessments, results of already existing assessments are identified and analysed for re-use in Appendix 1.

The Guidelines are a living document that can be updated after changes in underlying materials, due to experience gained in pilot assessments, as a result of new agreements, and other. It is proposed that a review of this document takes place 3 months before each update of D3.2 "Assessment on the maturity of building blocks" - i.e., in months M15, M27, and M33.


Figure 1. The procedure for assessment with main stakeholders (Executive Summary)


1 Introduction

1.1 Scope and objective of the Guidelines

Task T3.1 of the Work Package 3 supports presentation and assessment of building blocks in line

with the European Interoperability Architecture1 (EIA) and the European Interoperability

Framework2 (EIF). This work is done in close cooperation of WP3 with WP5 and WP6. Concretely,

this subtask develops guidelines and a procedure to assess the status of existing as well as new core

building blocks, such as e-ID, e-Signatures, e-Documents, and e-Delivery.

The main deliverable of Task T3.1 is ‘Guidelines to the assessment the sustainability and maturity of

building blocks’ ("Guidelines", the current document). Task 3.1 will make it easier to focus on the

generic, modular and re-usable character of these building blocks.

The objective of the Guidelines is to propose a documentation of format and defining criteria for the

maturity and sustainability assessment of building blocks in close cooperation with WP3 (in

particular Task 3.2 ‘Sustainability assessment’), WP 5, and WP 6. Task 3.1 is active from M1 to M6.

The Guidelines are intended for assessing various types of building blocks. A procedure and a

questionnaire form are presented for the assessment of building blocks.

Deliverable 3.1 is a living document that has to be updated after changes in underlying materials,

due to experience gained in pilot assessments, as a result of new agreements, and other.

1.2 WP3 General Objectives and Vision

The goal of e-SENS Work Package 3 is to pave the way for sustainability and long-term governance of

the LSP building blocks and their usage and interoperability within all European Member States and

Associated Countries.

WP3 aims to present proposals to sustain core building blocks such as e-ID, e-Signatures, e-Documents, and e-Delivery that have emerged from the Large Scale Pilots relevant to the e-SENS project. These proposals should support competitiveness, openness for future technologies, and interoperability.

1 European Interoperability Architecture, http://ec.europa.eu/isa/documents/isa_2.1_eia-finalreport-commonvisionforaneia.pdf
2 European Interoperability Framework (EIF) for European public services, http://ec.europa.eu/isa/documents/isa_annex_ii_eif_en.pdf

1.3 Methodology of Work

The methodology for developing the Guidelines builds upon existing work carried out in the

European Interoperability Framework3 and is based on the CAMSS methodology4, the ADMS5,

discussions with e-SENS project participants, and various other e-SENS project materials.

CAMSS has been proposed as a reference process and criteria for Member States to perform

assessments of formal specifications and therefore serves as a good resource for the Guidelines. It is

also identified as a starting point for the Guidelines in the Technical Annex. The Guidelines extend

and modify the CAMSS methodology in scope, detail, and process, for example by introducing

categories of criteria required in the Technical Annex but missing in CAMSS.

During the assessment (starting from the proposal step) it is necessary to describe the

interoperability assets. This is the main task of the Asset Description Metadata Schema. For this

reason, ADMS was used as another source for the Guidelines.

Development of the Guidelines comprised the following process and collaboration activities with other WPs.

Deadline / duration | Activities | Deliverables
1.04.2013 | Start of the task T3.1 |
1.04-15.05.2013 | Development of main principles for the Guidelines. Preliminary discussions with participants. | Main principles for the Guidelines
17.05.2013 | Discussion of the main principles with participants (in Brussels) | Adjustment of main principles
May-June 2013 | Preparation and discussion of the preliminary draft of the Guidelines via e-mail and/or teleconference | Preliminary draft of the Guidelines
11-12.07.2013 | Discussion of the preliminary draft of the Guidelines (in Tallinn, 11-12 July 2013) | Adjustment of the preliminary draft
01.07-19.07.2013 | Preparation and discussion of the draft version of the Guidelines via e-mail and/or teleconference (bi-weekly teleconferences of WP3 and Task 3.2, teleconferences of the WP3/WP5/WP6 joint Summer Task Force, private communications etc.) | Draft version of the Guidelines
01.08-30.09.2013 | Preparation of the intermediate draft and final versions and their discussion as before | Draft and final versions of the Guidelines

Table 2. Development of the Guidelines and collaboration with other WPs

3 European Interoperability Framework (EIF) for European public services, http://ec.europa.eu/isa/documents/isa_annex_ii_eif_en.pdf
4 Common Assessment Method For Standards And Formal Specifications (CAMSS). Final Draft Revision of CAMSS; Version 1.0, March 2012
5 ADMS: Asset Description Metadata Schema; Specification Version 1.00. Release date 18/04/2012

The assessment methodology is given by the assessment procedure, criteria, and guidelines in the main body of this document. It follows the level of detail for assessment methodology used in CAMSS, ADMS, and the Technical Annex.

In case the terminology (e.g. the notion of a building block) depends on deliverables of other WPs, reference is made to those deliverables. For presenting the assessment guidelines, the notion of a "target of assessment" is utilized (please see the List of Abbreviations and Glossary section).

The following sections give a brief overview of CAMSS, ADMS, and other sources used.

1.3.1 CAMSS

Among other methodologies, the starting point for this work is the CAMSS methodology (Final Draft Revision of CAMSS, Version 1.0, March 2012). CAMSS is a method to assess formal specifications in the scope of eGovernment; the deliverable is, however, not restricted to CAMSS. The method is an initiative of the European Commission performed in close coordination with Member States. It allows for transparency in the choice of eGovernment solutions and standards. However, whereas the CAMSS methodology is limited to specifications and standards, the scope of WP3 is wider, including broader issues of long-term sustainability such as governance and funding. Considering the limitations of CAMSS, additional approaches must be defined to ensure that the full business model for the sustainability of the building blocks and pilots is duly taken into account. Further holistic aspects have to be taken into consideration, as described in the EIF and overall ISA activities, including the assessment of business models.


In the CAMSS methodology, three scenarios are described. As the objective of the Guidelines is to propose a documentation of format and defining criteria for the maturity and sustainability assessment of building blocks, the CAMSS scenario "Assessment scenario 1 – An assessment of a standardisation organisation" is not applicable to the e-SENS project. Therefore in this document,

the CAMSS scenarios "Assessment scenario 2 – An assessment of a formal specification for adoption

by public administrations" and "Assessment scenario 3 – An assessment and selection of formal

specifications for specific business needs and requirements" are taken into account.

According to assessment scenario 2, a formal specification is assessed in order to evaluate and

provide a recommendation on the possible adoption of the formal specification by the public

administrations. This assessment scenario can be triggered by a public administration or related

external stakeholders. The outcome for this assessment scenario will be the adoption of a certain

formal specification by the public administrations.

According to assessment scenario 3, a proposed set of formal specifications are assessed and

evaluated in order to select and adopt the most relevant formal specification for specific business

needs and requirements. If a certain business need arises and requires the adoption of a relevant

formal specification, the business need should be examined to list the relevant requirements, and

based upon those requirements, a selection of relevant formal specifications may be established.

The outcome for this assessment scenario will be the selection and adoption of relevant formal

specifications for the specific business needs and requirements.

The basic difference between scenarios 2 and 3 is the availability of specific business needs and requirements in scenario 3. In case such needs and requirements are not available, preference is given to scenario 2.
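As a minimal illustration of this selection rule (not part of CAMSS itself; the function name and inputs below are hypothetical), the choice between the two scenarios can be expressed as follows:

```python
def select_camss_scenario(business_needs: list[str]) -> int:
    """Choose between CAMSS assessment scenarios 2 and 3.

    Scenario 3 presupposes specific business needs and requirements;
    when none are available, scenario 2 (assessment of a formal
    specification for adoption by public administrations) is preferred.
    """
    return 3 if business_needs else 2


# No documented business needs yet -> scenario 2
print(select_camss_scenario([]))                           # 2
# A concrete business need is available -> scenario 3
print(select_camss_scenario(["cross-border e-delivery"]))  # 3
```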

1.3.2 ADMS

CAMSS makes use of the ADMS (Asset Description Metadata Schema) - a vocabulary to

describe interoperability assets. Where possible, the current Guidelines use ADMS as well. In

particular, the controlled vocabularies used during the assessment procedure are based on ADMS

(Asset Description Metadata Schema. Specification Version 1.00).


1.3.3 Other sources

Annex II of the Regulation on European standardisation6 comprises requirements for the

identification of ICT technical specifications. The JoinUp repository7 currently covers the

documentation of semantic and open-source software interoperability assets. The EFIR8 contains

interoperability assets of the Member States.

1.4 Relations to Internal e-SENS Environment

1.4.1 Relations to e-SENS Work Packages

WP3 has close connections with WP5 and WP6, both in the development of the Guidelines and in performing the assessments. The collaboration activities with other WPs during the development of the Guidelines are presented in Chapter 1.3 of the current document. Joint activities of the assessment procedure are

depicted in Chapter 2.

Task 3.1 will make it easier to focus on the generic, modular and re-usable character of building

blocks being piloted in WP5 and further refined and consolidated in WP6.

According to the Technical Annex (p 71), one of the Objectives of WP5 is to support WP3 in its

mission to develop Long Term Sustainability activities for e-SENS services and building blocks, by

providing pilot-level and domain-level validation of requirements from a functional and

organizational acceptance viewpoint.

One of the tasks of WP6 deliverable D6.19 is taking stock - creating an ICT Architectural Baseline for

e-SENS (Technical Annex p 112). Each phase in the different WP6 Subgroups will have the main focus

area of Inception. This focus area comprises assessment of ICT maturity and, together with WP3 and WP5, of the overall maturity in accordance with the maturity model (Technical Annex p 114).

e-SENS Task 3.2 will use the Guidelines for the sustainability and maturity assessment. Task 3.4 will

contribute to assessment of business need criteria.

6 REGULATION (EU) No 1025/2012 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 25 October 2012 on European standardisation. OJ L 316, 14.11.2012, pp 12-33, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2012:316:0012:0033:EN:PDF
7 Joinup, https://joinup.ec.europa.eu/catalogue/all?current_checkbox=1
8 EFIR, http://ec.europa.eu/isa/actions/04-accompanying-measures/4-2-4action_en.htm
9 WP6 deliverable "D6.1 Executable ICT Baseline Architecture"


1.4.2 Reviewing and commenting on the Guidelines

The current version of the Guidelines has taken into account comments on the earlier versions from the contributors listed in Section 9 of the current document.

The Guidelines are a living document that has to be updated after changes in underlying materials,

due to experience gained in pilot assessments, as a result of new agreements, and other.

1.5 Relations to External e-SENS Environment

The Guidelines can be used for each architectural, solution and high level building block. Their novelty is that they go beyond the state-of-the-art methodologies that focus solely on the assessment of standards and specifications. This deliverable makes it possible to focus on a set of standards and specifications. If the Guidelines are well accepted within the e-SENS project, they will be published on joinup.ec.europa.eu. In this way they will be shared with a variety of stakeholders: institutional, national, private, industrial, judicial professions etc. This will enable these groups, and standardization organizations in particular, to profit from this knowledge, and they are free to refine the methodology for their purposes.

1.6 Quality Management

The overall quality of the development process of the Guidelines is based on the following.

1. Use of the most relevant, authoritative, and up-to-date frameworks and methodologies in

development of the Guidelines.

2. Adherence to the requirements stated in the Technical Annex.

3. Adherence to the e-SENS project templates, language requirements, and other quality

specifications.

4. Staged development of the Guidelines comprising planning, preliminary draft, intermediate

drafts, and the final version (see Chapter 1.3).

5. Coordination of the development activities with other WPs.

6. Reviews of the main principles and the drafts in collaboration with other WPs (the External

Quality Team).

7. Discussions via e-mail and / or teleconference (bi-weekly teleconferences of WP3 and Task 3.2, teleconferences of the WP3/WP5/WP6 joint Summer Task Force, private communications etc.).

8. Keeping records of reviews and feedback.


Category | Remarks | Checked by
Conformance to e-SENS template | OK (see item 3 above) | Freek van Krevel, Anette Junikiewicz, Jaak Tepandi
Language & Spelling | OK (see item 3 above) | Freek van Krevel, Anette Junikiewicz, Jaak Tepandi
Delivered on time | OK (see item 4 above) | Freek van Krevel, Anette Junikiewicz
Each technology description contains the correct elements | OK (see items 1-2, 5-7 above) | Authors, contributors
Consistency with description in the TA and in other e-SENS deliverables | OK (see items 1-2, 5-7 above) | Authors, contributors
Contents is fit for purpose | OK (see items 1-2, 5-7 above) | Authors, contributors
Contents is fit for use | OK (see items 1-2, 5-7 above) | Authors, contributors
Commitment within WP | OK (see items 1-2, 5-8 above) | Authors, contributors

Table 3. Quality Management for the Guidelines

1.7 Risk Management

The initial risk is that the deliverable is not accepted in the remainder of the project. Another risk is that the Guidelines are not followed - the core building blocks are being produced within one single silo and cannot be handed over properly to WP3 for a sustainability assessment. Mitigation measures for this should be taken in Task 3.2, so that the Guidelines are well accepted and have sufficient authority to be followed. Early awareness mechanisms and consensus building around the Guidelines prior to the sustainability assessment cycles in M18, M30 and M36 smooth and accelerate the outcome of the sustainability assessment itself.


Description | Probability | Impact | Priority | Response | Owner
Non-acceptance of the Guidelines | Likely | High | Top | Further consensus building prior to assessments, escalation to MB or GA. Flexibility of the Guidelines | WP3 leader
No proper follow-up by Task 3.2 | Likely | High | Top | Bringing the point to the attention of the Task 3.2 leader. Guidelines as a supporting aid, not a mandatory list | WP3 leader and Task 3.2 leader
No further adjustments to comply with experience gained in pilot assessments, results of new agreements, and other | Likely | Medium | Medium | The Task 3.2 leader needs to adapt the Guidelines in close cooperation with the Task 3.1 leader | Task 3.2 leader
There are no resources for assessment of all BBs | Low | Medium | Medium | Prioritization of BBs for assessment, planning for assessments, liaison with the e-SENS project management | T3.1, T3.2, WP3, WP5, WP6, WP1
Not all the data for assessments are available | Medium | Low | Medium | The assessment criteria of the Guidelines can be modified. Flexibility of the assessment | WP3, WP5, WP6, external stakeholders

Table 4. Risk Management for the Guidelines


1.8 Legal Issues

Not applicable within this deliverable.

1.9 Structure of the Document

The document is structured as follows:

Section 1 presents the introduction

Section 2 - a procedure for assessment and an overview of the targets to be assessed

Section 3 - the criteria for the proposal and consideration steps

Section 4 is devoted to the assessment step and criteria

Section 5 - to the recommendation step

Section 6 deals with organizational aspects of assessment

Section 7 presents the conclusions

Sections 8 and 9 - references and contributors

To help conduct future assessments, results of already existing assessments have been analysed to enable their identification and re-use. This part of the Guidelines is presented in Appendix 1, in order to enable the main users of the Guidelines (the experts performing specific assessments) to have fast access to the main body of the Guidelines (the evaluation procedure and criteria). Appendices 2, 3, and 4 provide additional information - proposal information/criteria based on CAMSS and on WP6 deliverable D6.1, as well as the CAMSS assessment criteria.

The footnotes for the references are given at the beginning of each chapter and are not duplicated

for other occurrences of the same reference throughout the chapter.


2 The Assessment Procedure and Targets

2.1 The Assessment Procedure

As presented in the introduction, WP3 has assessed applying CAMSS10 scenarios as a means to

develop a procedure for the assessment of building blocks.

CAMSS scenarios 2 and 3, especially the assessment step (CAMSS, section 2.4), are considered as a basis for organizational aspects of assessment, together with relevant frameworks such as

BOMOS2i11, ISO/IEC 1220712, ISO/IEC 25000 series standards13, and other sources as appropriate.

The assessment is conducted by the e-SENS WP3 Assessment Panel, providing transparency of the

assessment process.

Following the e-SENS Technical Annex and the CAMSS methodology, the current document uses the

assessment procedure that comprises proposal, consideration, assessment, and recommendation

steps (Figure 1).

1. In the proposal step, a target of assessment (for example, an Architecture Building Block or

High level Building Block, including support artifacts such as guidelines) is provided to Task

3.2 by WP6 Architectural Board. The target of assessment is provided using the proposal

criteria. The proposal criteria provide general information about the proposed target of

assessment, its status, items provided for assessment, and other. Some of the proposal

criteria are highly important; the others are recommended.

2. In the consideration step, consideration criteria are used before the actual assessment, to

validate information received and relevance of the proposal.

3. In the assessment step, the criteria used are categorised under standardisation, alignment

with existing policy frameworks, and business need. Additional information will be sought

from other Work Packages and external stakeholders.

4. In the recommendation step, recommendation criteria are applied to conclude with a classification (Discarded, Observed, Accepted, Recommended, and Mandatory) of the target of assessment. Depending on the e-SENS mandate, it may be agreed to avoid the "Mandatory" recommendation in the assessments. This classification will be reported back to the WP6 Architectural Board for use by other Work Packages.

10 Common Assessment Method For Standards And Formal Specifications (CAMSS). Final Draft Revision of CAMSS. Version 1.0, March 2012.
11 BOMOS2i, http://www.forumstandaardisatie.nl/fileadmin/os/publicaties/HR_BOMOS_English_translation_Jan2013.pdf
12 ISO/IEC 12207:2008. Systems and software engineering. Software life cycle processes
13 ISO/IEC 25000:2005. Software Engineering. Software product Quality Requirements and Evaluation (SQuaRE). Guide to SQuaRE

Figure 1 captures the primary process flow. One of the design principles for this document is flexibility of the assessment - the Guidelines are intended as supportive material to help with assessment. Assessments may be done iteratively, in cooperation with WP6, WP5, and other stakeholders. As an example, there are several options for timing piloting with respect to assessments. In some cases, it would be useful to have piloting results before the assessment and use these results in evaluating the criteria. On the other hand, depending on the situation and decision of the Architectural Board, the assessment results may be used to decide which building blocks to pilot.

There can be many additional connections between the stakeholders. In particular, the Assessment

Panel communicates with other Work Packages and external stakeholders during the assessment

step. This communication may involve prioritizing the criteria used in assessment, considering

additional criteria, evaluating the criteria, specifying procedures for the recommendation criteria,

etc.
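Purely as an illustration of the four steps and the classification outcomes described above, the procedure could be sketched as follows. The class and function names, the score structure, and the thresholds are hypothetical and are not defined by the Guidelines; actual evaluation is performed by the WP3 Assessment Panel against the criteria in Sections 3-5.

```python
from dataclasses import dataclass, field
from enum import Enum


class Classification(Enum):
    # Possible outcomes of the recommendation step
    DISCARDED = "Discarded"
    OBSERVED = "Observed"
    ACCEPTED = "Accepted"
    RECOMMENDED = "Recommended"
    MANDATORY = "Mandatory"  # may be avoided, depending on the e-SENS mandate


@dataclass
class TargetOfAssessment:
    """A target of assessment submitted by the WP6 Architectural Board."""
    name: str
    proposal_info: dict                          # proposal information/criteria (Section 3.1)
    consideration_ok: bool = False               # outcome of the consideration step
    scores: dict = field(default_factory=dict)   # standardisation / alignment / business need
    classification: Classification | None = None


def run_assessment(toa: TargetOfAssessment) -> Classification:
    # Consideration step: validate the information received and the
    # relevance of the proposal before the actual assessment.
    if not toa.consideration_ok:
        toa.classification = Classification.DISCARDED
        return toa.classification

    # Assessment step: criteria grouped under standardisation, alignment
    # with existing policy frameworks, and business need (Section 4).
    # The scores are assumed to have been filled in by the Assessment Panel.
    total = sum(toa.scores.values())

    # Recommendation step: an illustrative, non-normative mapping of the
    # aggregate score to the classification reported back to WP6.
    if total >= 8:
        toa.classification = Classification.RECOMMENDED
    elif total >= 5:
        toa.classification = Classification.ACCEPTED
    else:
        toa.classification = Classification.OBSERVED
    return toa.classification
```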

Figure 2. The procedure for assessment with main stakeholders


2.2 The Targets of Assessment

Since the main topic for the current Guidelines is documentation of format and defining criteria for

assessment, only selected building block examples are mentioned below to improve understanding

of the assessment issues. Further selection of building blocks will be based on input from WP6

during the assessment phase.

The core building blocks for the first cycle of the sustainability assessment are as follows:

e-ID

e-Signatures

e-Documents

e-Delivery

Other building blocks, such as Semantics and Circle of Trust, can be added at a later stage as applicable.

The targets of assessment are classified under broader categories of Architecture Building Blocks and

Solution Building Blocks. The Architecture Building Blocks provided for assessment by WP6 may

include but are not constrained to the following types.

Architecture strategy, frameworks, and methodologies

Service contracts, service level agreements, policies

Documentation and guidelines, including software and deployment descriptions

Specification of interfaces of components considering the building block as black box

Specification of the functionality of the building block

Detailed design of the building block and its internal components

Other types

It has been agreed at the management board meeting of 2 July 2013 that Solution Building Blocks are beyond

the targets of assessment for WP3. For the sake of completeness, the following types can also be

regarded as targets for assessment, albeit for other purposes than in WP3.

Implementation of the building block in terms of source files and its documentation

Implementation of the building block in terms of executable files

Other types

These types may occur in combination.

The complete list of building blocks and targets for assessment will be provided by WP6

Architectural Board.


To deal with different types of building blocks, the current Guidelines propose a wide set of criteria that should be applied selectively, as follows (see the sketch after this list).

Some criteria may be applicable to a specific type of building blocks

The criteria may be highly important or recommended

There can be assessment levels defined for the criteria
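For illustration only, a single criterion and its selective application could be represented as in the sketch below; the field names simply mirror the three points above and are not part of the Guidelines.

```python
from dataclasses import dataclass


@dataclass
class Criterion:
    """One assessment criterion from the Guidelines' criteria set."""
    name: str
    applicable_to: set           # building block types the criterion applies to
    importance: str              # "H" (highly important) or "R" (recommended)
    assessment_levels: tuple = ("not met", "partially met", "met")


def select_criteria(criteria, bb_type):
    """Pick the criteria relevant for a specific type of building block."""
    return [c for c in criteria if bb_type in c.applicable_to]


# Hypothetical example criterion
maturity = Criterion(
    name="Maturity of the underlying specification",
    applicable_to={"Specification of interfaces", "Specification of functionality"},
    importance="H",
)
print([c.name for c in select_criteria([maturity], "Specification of interfaces")])
```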


3 Proposal and Consideration Steps: Documentation of Format

The documentation of format provides two sets of information/criteria. First, it describes the proposal information/criteria used by WP6 to provide building blocks for assessment. Second, it describes the consideration criteria used by Task T3.2 to preliminarily evaluate the proposal.

The documentation of format is based on several sources:

CAMSS14 sections 3.1 and 3.2

deliverable "D6.1 Executable ICT Baseline Architecture"

Asset Description Metadata Schema15 (ADMS)

other relevant standardization and asset description initiatives such as the Regulation on

European standardisation16, the JoinUp platform17, the European Federated Interoperability

Repository18 (EFIR), and others.

3.1 The Proposal Step

As input to the proposal step, a target of assessment is provided by WP6 Architectural Board. The

target of assessment is provided using the proposal information/criteria. In the e-SENS reference

materials, the following sets of proposal information/criteria are available.

Proposal information/criteria based on CAMSS Section 3.1 and the ADMS specification v 0.8. The main categories of criteria during the proposal step comprise asset description and relationship. For reference, the CAMSS information/criteria are given in Appendix 2.

Proposal information/criteria based on WP6 deliverable "D6.1 Executable ICT Baseline

Architecture" (section 2.5, "Building Block Evaluation Model"). For reference, the D6.1

information/criteria are given in Appendix 3.

14 Common Assessment Method For Standards And Formal Specifications (CAMSS). Final Draft Revision of CAMSS. Version 1.0, March 2012.
15 ADMS. Asset Description Metadata Schema. Specification Version 1.00. Release date 18/04/2012
16 REGULATION (EU) No 1025/2012 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 25 October 2012 on European standardisation. OJ L 316, 14.11.2012, pp 12-33, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2012:316:0012:0033:EN:PDF
17 JoinUp, https://joinup.ec.europa.eu/catalogue/all?current_checkbox=1
18 EFIR, http://ec.europa.eu/isa/actions/04-accompanying-measures/4-2-4action_en.htm

Proposal information/criteria based on the ADMS concept "Semantic Asset" (section 5.5.1 of

the document "ADMS. Asset Description Metadata Schema. Specification Version 1.00.

Release date 18/04/2012").

In case of a software component, the proposal information/criteria may be based on the

ADMS.SW19 main concepts: software project, software release, software package, software

repository (section 4.2 of the document "Asset Description Metadata Schema for Software

1.00. ADMS.SW 1.00. Release date 20/05/2012"). As WP3 will not evaluate SBBs, this set is mentioned here for the sake of completeness and in case of possible subsequent modifications.

These proposal information/criteria are integrated into one set according to two principles: (1) merging the information/criteria from CAMSS, ADMS, and the D6.1 deliverable, and (2) keeping ADMS as the central model for the merge.
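The two principles can be read as a simple merge over dictionaries keyed by ADMS property names, as in the hypothetical sketch below; the entries shown are abbreviated examples taken from the table in Section 3.1.1, not the complete sets.

```python
# Abbreviated, illustrative excerpts only - not the complete sets.
adms = {
    "Name": "Name of the asset",
    "Version": "Version number or other designation of the asset",
    "Publisher": "Organisation making the asset available",
}
camss = {"Name": "CAMSS P.7", "Version": "CAMSS P.8", "Publisher": "CAMSS P.16"}
d61 = {"Name": "D6.1 Section 2.5.1"}


def merge_proposal_criteria(adms, camss, d61):
    """Merge CAMSS and D6.1 references into the ADMS-keyed model.

    ADMS is kept as the central model: every ADMS property becomes one
    entry, and the CAMSS / D6.1 identifiers are attached as references.
    """
    merged = {}
    for prop, description in adms.items():
        merged[prop] = {
            "description": description,
            "references": [r for r in (camss.get(prop), d61.get(prop)) if r],
        }
    return merged


print(merge_proposal_criteria(adms, camss, d61)["Name"])
# {'description': 'Name of the asset', 'references': ['CAMSS P.7', 'D6.1 Section 2.5.1']}
```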

The following sub-sections include (1) the complete list of proposal information/criteria and (2) for

convenience, the highly important information/criteria in the recommended order of presentation.

Terminological note. Different sources refer differently to the proposal items: D6.1 - as "information and criteria"; ADMS - as "property" and "relationship"; CAMSS - as "criteria". In the current document, the proposal items are referred to as "information/criteria".

3.1.1 Complete list of proposal information/criteria

The complete list for the proposal information/criteria given in the table below comprises the

following fields.

Nr - category number.

Category - CAMSS category or D6.1.

Description - category description.

Reference - reference for the information/criteria (CAMSS number, ADMS, or D6.1). Note:

only CAMSS information/criteria that are included in the ADMS v 1.0 are given.

Information/criteria - name of the information/criteria.

Description - description of the information/criteria, or reference to source information.

H/R - highly important / recommended. The table indicates only highly important (H) items; all other information/criteria are recommended. Highly important information/criteria are: (1) those that are used to Identify an asset according to the table in ADMS Section 4; (2) those that are included in the deliverable D6.1 but are not assessed by WP3. Note: for items assessed by WP3, a preliminary optional assessment by WP6 can still be included in the proposal.

19 Asset Description Metadata Schema for Software 1.00. ADMS.SW 1.00. Release date 20/05/2012, https://joinup.ec.europa.eu/asset/adms_foss/asset_release/admssw-100

Nr Category Description Reference Information/criteria Description H/R

1 CAMSS asset description

Providing the general information for the proposed TOA

CAMSS P.1

Alternative name

Alternative name for the asset. Note: this information may be used to provide additional access points, e.g. allowing indexing of any acronyms, nicknames, shorthand notations or other identifying information under which a user might expect to find the asset

CAMSS P.2

Date of creation

Creation date of this version of the asset

H

CAMSS P.3

Date of last modification

Date of latest update of asset H

CAMSS P.4

Description Descriptive text for the asset, if applicable including its purpose and functions

H

CAMSS P.5

ID URI for the asset (a globally unique, but not always human-readable identifier)

H

ADMS Identifier Any identifier for the asset (a human-readable identifier)

H

CAMSS P.6

Keyword Word or phrase to describe the asset

ADMS Metadata date Date of the most recent update of the metadata for the Asset

CAMSS P.7

Name Name of the asset. See also D6.1 Section 2.5.1

H

CAMSS P.8

Version Version number or other designation of the asset

H

ADMS Format Format in which an asset is distributed (e.g. PDF, XSD, RDF/XML, HTML, ZIP). See also ADMS Section 6.3

H

ADMS Version notes Description of changes between this version and the previous version of the Asset


2 CAMSS relationship

Providing the information on the status of the specific TOA being proposed

CAMSS P.9, D6.1

Asset type Classification of an asset according to a controlled vocabulary, e.g. code list, metadata schema. See also ADMS Section 6.2, D 6.1 Section 2.5.4

H

ADMS Contact point Contact point for further information about an asset

CAMSS P.10

Current version Current or latest version of the asset

ADMS Home page A Web page that is fully dedicated to the asset

ADMS Included asset An asset that is contained in the asset being described, e.g. when there are several vocabularies defined in a single document

ADMS Included item Item that is contained in the asset (e.g. a concept in a controlled vocabulary, an individual code in a code list or any other ‘atomic’ element)

CAMSS P.13

Interoperability level

Level according to the European Interoperability Framework (EIF 2.0) for which an asset is relevant

H

CAMSS P.14

Language Language of an asset if it contains textual information, e.g. the language of the terms in a controlled vocabulary or the language that a specification is written in

CAMSS P.11

Main documentation

The main documentation or specification of the asset

ADMS Metadata language

Language of the metadata for the asset

ADMS Metadata publisher

Organisation making the metadata for the asset available

ADMS Next version Newer version of the asset

CAMSS P.15

Previous version Older version of the asset


CAMSS P.16

Publisher Organisation making the asset available

H

ADMS Related asset Assets related to the asset

ADMS Related documentation

Documentation that contains information related to the asset

ADMS Related web page

A Web page that contains information related to the asset

ADMS Distribution Implementation of the asset in a particular format

ADMS Repository origin

Repository that contains the primary description of the asset

ADMS Sample Sample of the asset

CAMSS P.18

Spatial coverage Geographic region or jurisdiction to which the asset applies

CAMSS P.20

Status Status of the asset (Completed, Under development, Deprecated, Withdrawn), see ADMS sections 5.6.13 and 6.10

ADMS Temporal coverage

Time period relevant to the asset, e.g. its validity

ADMS Theme Theme or sector to which the asset applies

ADMS Translation Translation of the asset

3 D6.1 Items from D6.1 not included above

D6.1 Relationship to ICT Strategies in EC/MS

See D6.1 section 2.5.2

D6.1 Requirements See D6.1 section 2.5.3 H

D6.1 Generic/Specific See D6.1 section 2.5.5 H

D6.1 Architecture Framework / Component Specifications

See D6.1 section 2.5.6 H

D6.1 Standards See D6.1 section 2.5.7

D6.1 Open Source See D6.1 section 2.5.8 H

D6.1 Relationship and Couplings

See D6.1 section 2.5.9 H


D6.1 Related Artifacts and Solution Building Blocks

See D6.1 section 2.5.10 H

D6.1 Ownership See D6.1 section 2.5.11 H

D6.1 Life Cycle Management

See D6.1 section 2.5.12 H

D6.1 In use? See D6.1 section 2.5.13 H

D6.1 Technical Maturity

See D6.1 section 2.5.14 H

D6.1 Business Maturity

See D6.1 section 2.5.15 H

D6.1 Market Maturity

See D6.1 section 2.5.16

D6.1 Evaluation See D6.1 section 2.5.17 H

Table 5. The complete list of proposal information/criteria based on CAMSS, D6.1, and ADMS

Further relevant information/criteria from ADMS and other sources can be included under this main

classification.
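For readers who prefer a machine-readable view, the proposal items of Table 5 could be captured, for example, as in the following minimal Python sketch; the class and field names are illustrative assumptions only, not part of CAMSS, ADMS, or D6.1.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ProposalItem:
    category: str                # e.g. "CAMSS asset description", "CAMSS relationship", "D6.1"
    reference: str               # e.g. "CAMSS P.7", "ADMS", "D6.1"
    name: str                    # name of the information/criteria, e.g. "Name", "Version"
    description: str             # description of the information/criteria or reference to source
    highly_important: bool       # True for "H" items; all other items are recommended
    value: Optional[str] = None  # content supplied by the proposer

def missing_highly_important(items: list) -> list:
    # Lists the highly important items for which the proposer supplied no content.
    return [item.name for item in items if item.highly_important and not item.value]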

3.1.2 The highly important proposal information/criteria

For convenience, the following table comprises the highly important proposal information/criteria

(the criteria having "H" in the H/R field in the table above) in the recommended order of

presentation, omitting the CAMSS category data and the H/R field.

Reference Information/criteria Description

CAMSS P.7

Name Name of the asset. See also D6.1 Section 2.5.1

CAMSS P.2

Date of creation Creation date of this version of the asset

CAMSS P.3

Date of last modification Date of latest update of asset

CAMSS P.4

Description Descriptive text for the asset (if applicable, including its purpose and functions)

CAMSS P.5

ID URI for the asset (a globally unique, but not always human-readable identifier)

ADMS Identifier Any identifier for the asset (a human-readable identifier)


CAMSS P.8

Version Version number or other designation of the asset

ADMS Format Format in which an asset is distributed (e.g. PDF, XSD, RDF/XML, HTML, ZIP). See also ADMS Section 6.3

CAMSS P.9, D6.1

Asset type Classification of an asset according to a controlled vocabulary, e.g. code list, metadata schema. See also ADMS Section 6.2, D 6.1 Section 2.5.4

CAMSS P.13

Interoperability level Level according to the European Interoperability Framework (EIF 2.0) for which an asset is relevant

CAMSS P.16

Publisher Organisation making the asset available

D6.1 Requirements See D6.1 section 2.5.3

D6.1 Generic/Specific See D6.1 section 2.5.5

D6.1 Architecture Framework / Component Specifications

See D6.1 section 2.5.6

D6.1 Open Source See D6.1 section 2.5.8

D6.1 Relationship and Couplings See D6.1 section 2.5.9

D6.1 Related Artifacts and Solution Building Blocks

See D6.1 section 2.5.10

D6.1 Ownership See D6.1 section 2.5.11

D6.1 Life Cycle Management See D6.1 section 2.5.12

D6.1 In use? See D6.1 section 2.5.13

D6.1 Technical Maturity See D6.1 section 2.5.14

D6.1 Business Maturity See D6.1 section 2.5.15

D6.1 Evaluation See D6.1 section 2.5.17

Table 6. The highly important proposal information/criteria in the recommended order of presentation

3.2 The Consideration Step

In the consideration step, consideration criteria are used to preliminarily evaluate the quality of the information/criteria received and the relevance of the proposal. Following CAMSS (Section 3.2), the main categories of criteria for the consideration step are correctness, relevance, and legislation and regulation, as detailed in the table below.

The criteria are defined as YES/NO questions, and all answers should be YES in order to consider the

formal specification for further assessment. The criteria may be adapted by the Assessment Panel to


fit their specific needs and requirements; in particular, relevant additional criteria from ADMS and

other sources can be included under this main classification. The Assessment Panel also decides how

to make the checks needed. Note. All the checks made in the consideration step are preliminary and

the conclusions may change in the following steps.

Nr. Category Description Nr. Criteria

1 Correctness The proposal for the assessment is provided correctly and the necessary documentation is provided

C.1 Is the proposal information correctly provided?

Is the proposal provided by an eligible entity?

Is the proposal provided within the given time-frame?

C.2 Is the documentation provided for performing the assessment complete?

2 Relevance The proposal for the assessment falls within the relevant areas of the public administration

C.3 Is the purpose of the TOA related and of interest to the e-SENS community?

C.4 Is the TOA meeting its purpose?

3 Legislation and regulation

The proposal for the assessment does not conflict with any legislative or regulatory requirements.

C.5 Is the TOA not violating the regulatory requirements for the area of application?

Is the TOA sufficiently supported by the legislation implemented in the respective jurisdictions?

Table 7. Consideration criteria
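As a minimal sketch of the all-YES rule described above (the criterion identifiers follow Table 7; the function name and answer encoding are illustrative assumptions):

def passes_consideration(answers: dict) -> bool:
    # All consideration criteria must be answered YES (True) for the proposal to be
    # considered for further assessment; the conclusion remains preliminary.
    required = ["C.1", "C.2", "C.3", "C.4", "C.5"]
    return all(answers.get(criterion, False) for criterion in required)

# Example: a single NO answer (here C.4) stops the proposal at this step.
example = {"C.1": True, "C.2": True, "C.3": True, "C.4": False, "C.5": True}
print(passes_consideration(example))  # prints: False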


4 The Assessment Step

This section proposes a method for the integration and use of assessment criteria and presents three sets of criteria based on the e-SENS Technical Annex, the CAMSS assessment criteria, and other sources.

4.1 Integration and Use of Assessment Criteria

4.1.1 Integration of Assessment Criteria from Various Sources

The assessment criteria originate from various sources. First, a number of sustainability assessment

criteria have been provided in the e-SENS Technical Annex (pp. 60-62) and in discussions with the

e-SENS project participants. Second, CAMSS (Section 3.3) presents the following main categories of

criteria for the assessment step: applicability, maturity, openness, intellectual property rights,

market support, and potential. Third, there are other frameworks and standards (such as ADMS, the WP6 deliverable D6.1, ISO/IEC 12207, the ISO/IEC 25000 series standards, and others) potentially suitable for assessment. Fourth, the criteria should be aligned with the WP5 requirement model, containing scenarios, use cases, and types of requirements (legal, business, and technical; functional and non-functional).

There are several ways for integration of these criteria.

1. Taking the Technical Annex as the basis, integrating CAMSS criteria into the criteria of the

Technical Annex, and aligning with the standards and the WP5 requirement model.

2. Taking the CAMSS criteria as the basis and merging the criteria from the Technical Annex into CAMSS.

3. Taking some other framework or standard (such as ADMS, D6.1, ISO/IEC 12207, the ISO/IEC 25000 series standards, the Open Software Business Readiness Rating, or others) as the basis and integrating the Technical Annex and CAMSS criteria into the selected framework.

20 Common Assessment Method For Standards And Formal Specifications (CAMSS). Final Draft Revision of CAMSS. Version 1.0, March 2012.
21 ADMS. Asset Description Metadata Schema. Specification Version 1.00. Release date 18/04/2012.
22 WP6 deliverable "D6.1 Executable ICT Baseline Architecture".
23 ISO/IEC 12207:2008. Systems and software engineering. Software life cycle processes.
24 ISO/IEC 25000:2005. Software Engineering. Software product Quality Requirements and Evaluation (SQuaRE). Guide to SQuaRE.
25 Open Software Business Readiness Rating (OpenBRR), http://www.oss-watch.ac.uk/resources/archived/brr


As proposed by WP3, this section presents a solution based on the first option - taking the Technical

Annex as the basis, integrating CAMSS criteria into the criteria of the Technical Annex, and aligning

with the standards, deliverable D6.1, and the WP5 requirement model.

According to the Technical Annex, analysis of sustainability of the building blocks will take into

account standards, policy frameworks, and business needs. The following sub-sections are devoted

to these sets of criteria. In the tables, the acronym TOA is used for the notion of "target of assessment".

To preserve compatibility with CAMSS, the CAMSS categories are fully integrated into the

assessment criteria based on the e-SENS Technical Annex and complemented where necessary,

indicating the source for the additional criteria.

Some of the terms in the criteria (e.g., “significant market share”) may be perceived differently by

different respondents. Adjustment of these terms is left to the Assessment Panel, as it would be too

restrictive to specify these terms precisely in the current document, for example giving exact

numerical threshold values.

4.1.2 Use of Assessment Criteria

The sets of criteria provided in this section are quite extensive and suitable to perform assessment

of different building blocks. During a specific assessment, the criteria may be prioritized and relevant

criteria selected depending on the target presented for evaluation. Additional criteria may be

introduced by the Assessment Panel, for example longevity / experience of the owners, volatility

(likelihood to change, ability to be compatible with former versions), adaptability to other

situations/contexts, adoption (past, short term, long term), and others. In summary, the Guidelines

are flexible and are intended as a supportive material to help with assessment.

The Assessment Panel may find it useful to start a specific assessment with prioritization and

selection of criteria applicable to the current TOA. Alternatively, prioritization and selection of

applicable criteria may be part of the assessment.

Assessment of the criteria may be performed as follows.

The set of standardisation criteria can be assessed within WP3 in close collaboration with

WP6, WP5, industry partners, and other relevant stakeholders.

Alignment with existing policy frameworks can be assessed in collaboration with other Work

Packages, including WP4 and the e-SENS legal expertise centre.


The business need criteria can be assessed by involving stakeholders from both inside and

outside the e-SENS project, e.g. by using questionnaires, semi-structured interviews, or

other means to get the opinion of important end users and software/service providers.

It is recommended to use at least the following set of answers: ‘YES’, ‘NO’, and ‘Not applicable’. A

‘NO’ answer does not imply immediate discarding of the formal specification.

The Assessment Panel may define certain criteria within the assessment step as being knock-out or

blocking criteria. A ‘NO’ answer for a knock-out criterion will lead to discarding the proposed formal

specification. Examples of knock-out criteria are provided in the section devoted to recommendation

criteria.

It is recommended to provide a justification for each answer, comprising the following elements as

applicable.

The reasoning and possible evidence for the answer.

Definition of possible quantitative measures.

Specification of the topic used in the assessment in case of a criterion involving more than

one topic.
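One possible way to record the recommended answers, justifications, and knock-out markings described above is sketched below; the record structure and names are illustrative assumptions rather than a prescribed format.

from dataclasses import dataclass

@dataclass
class CriterionResult:
    criterion_id: str        # e.g. "A.9", or a criterion derived from the Technical Annex
    answer: str              # "YES", "NO", or "Not applicable"
    justification: str = ""  # reasoning/evidence, quantitative measures, topic assessed
    knock_out: bool = False  # marked as a knock-out/blocking criterion by the Assessment Panel

def leads_to_discarding(results: list) -> bool:
    # A NO answer discards the proposed formal specification only for knock-out criteria;
    # a NO on any other criterion does not by itself imply discarding.
    return any(r.knock_out and r.answer == "NO" for r in results)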

4.2 Standardisation Criteria

The set of standardisation criteria includes technical, organisational and semantic criteria that can be

used to assess the maturity for standardizing the building blocks. Potential criteria include the

following.

Change management and maintenance, licensing, common specifications of components,

openness, support and manuals, security features, service levels (availability of the building

block, emergency helpdesk) as well as the deployment and running of these components in

an organisation’s IT infrastructure.

Life cycle workflow over the project lifetime (development, revisions, updates, work in

progress, and incremental version releases).

Maintaining building blocks beyond the project’s lifetime and who will be responsible for the

different parts.

Interoperability between the LSPs and Building Blocks in Member States.

For reference, the following CAMSS categories of criteria are included in this section.

Maturity


Openness

Intellectual property rights

Relevant criteria from international standards, ISO/IEC 12207:2008 "Systems and software

engineering. Software life cycle processes" and ISO/IEC 25000:2005 "Software Engineering. Software

product Quality Requirements and Evaluation (SQuaRE)" have been taken into account.

The set of standardisation criteria can be assessed within WP3 in close collaboration with WP6, WP5,

industry partners, and other relevant stakeholders.

4.2.1 Maturity

The following table presents maturity criteria based on CAMSS.

Nr Category Description Nr Sub-Category Description Nr Criteria

2 Maturity

A TOA should in itself be mature enough for adoption by public administrations. This category addresses the development status, the quality, guidelines and stability of the TOA.

2.1 Development status

For the ‘development status’, the current development status of the TOA in the development cycle is addressed.

A.9 Has the TOA been sufficiently developed and in existence for a sufficient period to overcome most of its initial problems?

2.2 Quality

For ‘quality’, the level of detail in the TOA and the conformance of implementations is addressed.

A.10 Are there existing or planned mechanisms to assess conformity of the implementations of the TOA (e.g. conformity tests, certifications)?

A.11 Has the TOA sufficient detail, consistency and completeness for the use and development of products?

2.3 Guidelines

For the ‘guidelines’, the existence of implementation guidelines or reference implementations is addressed.

A.12 Does the TOA provide available implementation guidelines and documentation for the implementation of products?

A.13 Does the TOA provide a reference (or open source) implementation?

2.4 Stability

For ‘stability’, the level of change to the TOA and the stability of underlying technologies is addressed.

A.14 Does the TOA address backward compatibility with previous versions?

A.15 Have the underlying technologies for implementing the TOA been proven?


A.15 Have the underlying technologies for implementing the TOA been stable?

A.15 Have the underlying technologies for implementing the TOA been clearly defined?

If the Assessment Panel finds it necessary, other indicators may be applied. Examples include measures characterizing real-world usage of the TOA, such as the volume of transactions made using the TOA, the frequency of such transactions, and others.

Table 8. Assessment criteria for standardization: maturity

4.2.2 Openness

The following table presents openness criteria based on CAMSS.

Nr Category Description Nr Sub-Category Description Nr Criteria

3 Openness

A TOA should be sufficiently open and available, to be relevant for adoption by public administrations. This category addresses the openness of the organisation maintaining the TOA and its decision-making process, and openness of the documentation and accessibility of the TOA.

3.1 Organisation

For the ‘openness’ of the organisation maintaining the TOA, the level of openness for participating in this organisation is addressed.

A.16 Is information on the terms and policies for the establishment and operation of the organisation maintaining the TOA publicly available?

A.17 Is participation in the creation process of the TOA open to all relevant stakeholders (e.g. organisations, companies or individuals)?

3.2 Process

For the ‘process’, the level of openness regarding the development and decision-making process for the TOA is addressed.

A.18 Is information on the standardisation process publicly available?

A.19 Is information on the decision-making process for approving TOAs publicly available?

A.20 Are the TOAs approved in a decision making process which aims at reaching consensus?


A.21 Are the TOAs reviewed using a formal review process with all relevant external stakeholders (e.g. public consultation)?

A.22 Can all relevant stakeholders formally appeal or raise objections to the development and approval of TOAs?

3.3 Documentation

For the openness of the ‘documentation’, the accessibility and availability of the documentation of the TOA is addressed.

A.23 Is relevant documentation of the development and approval process of TOAs publicly available (e.g. preliminary results, committee meeting notes)?

A.24 Is the documentation of the TOA publicly available for implementation and use on reasonable terms?

Table 9. Assessment criteria for standardization: openness

4.2.3 Intellectual property rights

The following table presents intellectual property rights criteria based on CAMSS.

Nr Category Description Nr Sub-Category Description Nr Criteria

4 Intellectual property rights

A TOA should be licensed on (F)RAND terms or even on a royalty-free basis in a way that allows implementation in different products. This category addresses the availability of the documentation on the IPR and the licenses for the implementation of the TOA.

4.1 IPR Documentation

For the ‘documentation of the intellectual property rights’, the availability of the information concerning the ownership rights of the TOA is addressed.

A.25 Is the documentation of the IPR for TOAs publicly available?

4.2 Licenses

For the ‘licenses’ within the intellectual property rights, a (fair) reasonable and non-discriminatory ((F)RAND) or even royalty-free basis is addressed for the use and implementation of the TOA.

A.26 Is the TOA licensed on a (F)RAND basis?

A.27 Is the TOA licensed on a royalty-free basis?

Table 10. Assessment criteria for standardization: intellectual property rights


4.2.4 Life cycle, maintenance, service levels, security

Depending on the target of assessment, criteria presented in the following tables may be relevant

for assessment. These criteria are selected according to the standardisation criteria in the Technical

Annex and are based on international standard ISO/IEC 12207:2008 "Systems and software

engineering. Software life cycle processes", as well as on the ISO/IEC 20000, ISO/IEC 25000, and

ISO/IEC 27000 series of standards.

Criteria referring to interoperability are presented in the next section.

The range of possible criteria presented in the above standards is extremely wide. Only selected

criteria most relevant to the requirements presented in the Technical Annex are presented below.

This selection may be expanded or restricted during a specific assessment.

Nr Category Description Nr Sub-Category Description Nr Criteria

Life Cycle management

The life cycle management process provides life cycle policies, processes, and procedures.

There should exist a life cycle management process.

Is an organisation available for providing life cycle policies, processes, and procedures for the TOA?

Is life cycle workflow over the project lifetime (development, revisions, updates, work in progress, and incremental version releases) established?

Maintenance

The maintenance process provides cost-effective support to the TOAs during their life-cycle, including change management.

Maintenance implementation

There should be an organisation, resources, plans and procedures for conducting the maintenance activities.

Does a maintaining organisation exist? Note: CAMSS assessment criteria A.42, A.43, A.44, A.45 are taken into account in the maintenance category.

Has the maintaining organisation developed, documented, and executed plans and procedures for conducting the maintenance activities?

Does the maintenance organisation for the TOA have sufficient finances and resources for the long term?


Problem analysis

The problem reports or modification requests should be analysed for their impact.

Does the maintainer have procedures for analysing the problem reports or modification requests for their impacts on the organization, the existing system, and its interfaces?

Modification implementation

It should be determined and documented which software items need to be modified.

Does the maintainer have procedures for determining which software units and versions need to be modified?

Migration

Migration of a system or software product (including data) should be planned, documented, and performed.

Are there procedures for developing, documenting, and executing migration plans, including the system, data, and users?

Disposal

Ending the existence of a TOA should be planned, documented, and performed.

Are there procedures for developing, documenting, and executing disposal plans for TOAs?

Service levels

The services related to the TOA should be agreed with the customers.

SLA If applicable, there should be service level agreements relating to the availability of the TOA.

Do SLAs relating to the availability of the TOA exist?

If applicable, there should exist an emergency helpdesk for the TOA.

Is there an emergency helpdesk for the TOA?

Security

Systems, data, and resources should be protected from accidental or malicious acts.

ISMS The maintainer should have an information security management system.

Does the maintainer have a system based on a business risk approach, to establish, implement, operate, monitor, review, maintain and improve information security?

Identification

Information security requirements should be understood.

Has the maintainer analysed and understood the information security requirements related to the TOA?


Risks Information security risks should be assessed.

Has the maintainer assessed the information security risks related to the TOA?

Controls

Information security controls should be selected and implemented.

Has the maintainer selected and implemented information security controls related to the TOA?

Continuity

Business continuity management

Is the maintainer managing business continuity, including development, implementation, testing, and improvement of the business continuity plans?

Monitor

The effectiveness of the ISMS should be monitored, maintained, and improved.

Is the maintainer monitoring, maintaining, and improving the effectiveness of the ISMS?

Table 11. Assessment criteria for standardization: life cycle

4.3 Criteria for Alignment with Existing Policy Frameworks

This set of criteria addresses alignment of the individual building blocks with existing policy

frameworks. Potential criteria include the following.

Criteria resulting from the EIA and the EIF. In particular, the criteria present in the EIF/EIA and

missing in the current set of criteria will be considered for inclusion. Criteria include the EIF

conceptual model for public services (presented also in Technical Annex p. 36), as well as

coverage of A2A services. Examples: user authorization and aggregation of services.

Alignment with the EIF and EIA.

The proposed solutions must be compliant with the EU legal framework on data protection

and legislation on electronic signatures according to the WP5 Requirement model.

Alignment with national frameworks of the participating countries.

Potential incompatibilities between Member States.

Maintenance of the legal validity of information exchanged across borders.

Adherence to the data protection legislation in both originating and receiving countries.

26 European Interoperability Architecture, http://ec.europa.eu/isa/documents/isa_2.1_eia-finalreport-commonvisionforaneia.pdf
27 European Interoperability Framework (EIF) for European public services, http://ec.europa.eu/isa/documents/isa_annex_ii_eif_en.pdf


For reference, the following CAMSS categories of criteria are included in this section.

Applicability

Potential

Alignment with existing policy frameworks will be assessed in collaboration with other Work

Packages, including WP4 and the e-SENS legal expertise centre.

4.3.1 Basic alignment criteria

The following table presents basic criteria for alignment with existing policy frameworks based on

requirements from the Technical Annex.

Nr Category Description Nr Sub-Category Description Nr Criteria

Interoperability

The LSPs and Building Blocks in Member States should be interoperable.

EIA The TOA should conform to the European Interoperability Architecture.

Are there any disagreements between the TOA and the EIA?

EIF The TOA should conform to the European Interoperability Framework.

Are there any disagreements between the TOA and the EIF?

A2A services

The TOA should support A2A services, if applicable.

If applicable, does the TOA support A2A services, e.g. user authorization and aggregation of services?

Compliance

The proposed solutions should be compliant with the EU legal framework on data protection and legislation on electronic signatures.

Data protection

The proposed solutions should be compliant with the EU legal framework on data protection.

Are the proposed solutions compliant with the EU legal framework on data protection?

Electronic signatures

The proposed solutions should be compliant with the EU legislation on electronic signatures.

Are the proposed solutions compliant with the EU legislation on electronic signatures?

Member States

Alignment with national frameworks of the participating countries and avoiding potential incompatibilities between Member States.

National frameworks

Alignment with national frameworks of the participating countries.

Is the TOA aligned with national frameworks of the participating countries?


Incompatibilities

Avoiding potential incompatibilities between Member States.

Are potential incompatibilities between Member States avoided?

Legal The legal validity of information exchanged must be maintained across borders.

Information

Maintenance of the legal validity of information exchanged across borders.

Is the legal validity of information exchanged maintained across borders?

Protection

Data protection legislation in both originating and receiving countries must be respected.

Data protection

Adherence to the data protection legislation in both originating and receiving countries.

Is data protection legislation in both originating and receiving countries respected? Note 1: this question does not assume analysis of combinations of originating and receiving countries. Note 2: depending on assessment resources, the Assessment Panel may decide to cover only the basic EU data protection legislation.

Table 12. Assessment criteria for alignment: basic

4.3.2 Applicability

The following table presents applicability criteria based on CAMSS.

Nr Category Description Nr Sub-Category Description Nr Criteria

1 Applicability

A TOA should be usable and easily implementable in different products to be relevant for adoption by public administrations. This category addresses the definition of functional scope and area of application, the possible reusability in other areas, the possible alternative specifications, and the compatibility with and dependency on other specifications or technologies.

1.1 Area of application

For the ‘area of application’, the functionalities and intended use of the TOA are addressed within the context of interoperability and eGovernment.

A.1 Does the TOA address and facilitate interoperability between public administrations?

A.2 Does the TOA address and facilitate the development of eGovernment?


1.2 Requirements

For the ‘requirements’, the functional and non-functional requirements for using and implementing the TOA are addressed. This criterion is related to the use of assessment scenario 3.

A.3 Are the functional and nonfunctional requirements for the use and implementation of the TOA clearly defined?

1.3 Reusability

For ‘reusability’, the level of reusability of the TOA in the same or other areas of application is addressed.

A.4 Is the TOA applicable and extensible for implementations in different domains?

1.4 Alternatives

For the ‘alternatives’, the degree to which the TOA adds value compared to alternative TOAs in the same area of application is addressed.

A.5 Does the TOA provide sufficient added value compared to alternative TOAs in the same area of application?

1.5 Compatibility

For ‘compatibility’, the compatibility of the TOA with other TOAs in the same area of application is addressed.

A.6 Is the TOA largely compatible with related (not alternative) TOAs in the same area of application?

1.6 Dependencies

‘Dependencies’ addresses the degree of independence of the TOA from specific vendor products, platforms or technologies.

A.7 Is the TOA largely independent from specific vendor products?

A.8 Is the TOA largely independent from specific platforms or technologies?

Table 13. Assessment criteria for alignment: applicability


4.3.3 Potential

The following table presents criteria for potential based on CAMSS.

Nr Category Description Nr Sub-Category Description Nr Criteria

6 Potential

A TOA should have sufficient and positive future consequences, evolution and impact for being adopted by public administrations. This category addresses the consequences and impact of using or adopting the TOA, the advantages and risks, the maintenance and possible future developments.

6.1 Impact

For the ‘impact’, the minimisation of the consequences of using and adopting the TOA is addressed. The consequences can be evaluated and described in terms of different aspects.

A.33 Is there evidence that the adoption of the TOA positively impacts organisational processes?

Is there somebody who directly benefits from the specification?

A.34 Is there evidence that the adoption of the TOA positively impacts the migration of current systems?

A.35 Is there evidence that the adoption of the TOA positively impacts the environment?

A.36 Is there evidence that the adoption of the TOA positively impacts the financial costs?

A.37 Is there evidence that the adoption of the TOA positively impacts the security?

A.38 Is there evidence that the adoption of the TOA positively impacts the privacy?

A.39 Is there evidence that the adoption of the TOA positively impacts the administrative burden?

A.40 Is there evidence that the adoption of the TOA positively impacts the disability support?

Is there evidence that the adoption of the TOA advances or is supported by the emerging technologies such as cloud computing or Internet of Things? Note: the Assessment Panel may decide to provide a summarizing conclusion here, as the responses to this question from different assessors or experts might be very different.


6.2 Risks For the ‘risks’, the level of uncertainty is addressed for using and adopting the TOA

A.41 What are the risks? What is the probability of their emergence? Are they related to the adoption of the TOA?

Table 14. Assessment criteria for alignment: potential

4.4 Business Need Criteria

The set of business need criteria addresses the interest of the market for the target of assessment,

its validation, and user-related aspects. The business need criteria should be aligned with the WP5

requirement model, containing scenarios, use cases, and types of requirements (legal, business, and technical; functional and non-functional). Potential criteria include the following.

Need for the building block by end users. For example, this need could be characterized by

potential change in the quality of the service delivered to the citizen/business by the

administration before and after adopting the building block. Where applicable, SWOT

analysis may be used.

Opportunities for software/service providers to put the building block into use. For example,

availability of a commercially-oriented, robust Business Plan for investment, built upon an

underlying commercially sustainable business model. A business case should also take into

account how a target of assessment will help public partners in achieving their missions.

Relevance of having the same components integrated as European (shared) building blocks

across different Use Cases and their usefulness in the development of eGovernment cross-

border services.

Potential of the building block to be adopted by the market and be used in cross-border

eGovernment services.

Where applicable the costs and benefits of adopting the building block, including the

assessment of the Return on Investment. Cost and benefit analysis can be performed per

each stakeholder group involved (e.g. government, private companies, users, etc). It is also

possible to consider different time frames (e.g. short, medium and long run), as the outcomes of the costs and benefits will change depending on the time frame considered; a simple illustration is sketched after this list.

Possibility for a broader geographic and sector usage.

Where applicable, the confidence level of the Assessment Panel in the relevance of the

business need/purpose may be indicated (e.g. strong intuition / validated by technical

experts / validated by administrations-users / reflected in national-international regulation).
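The per-stakeholder and per-time-frame cost and benefit analysis mentioned in the list above could be organised, for example, as in the following sketch; the function, the stakeholder groups, and all figures are invented for illustration only and are not e-SENS data.

def roi(benefits: float, costs: float) -> float:
    # Return on Investment expressed as a ratio: (benefits - costs) / costs.
    return (benefits - costs) / costs if costs else 0.0

# Purely illustrative figures, per stakeholder group and time frame (in EUR).
analysis = {
    ("government", "short run"): {"costs": 200_000.0, "benefits": 150_000.0},
    ("government", "long run"):  {"costs": 250_000.0, "benefits": 600_000.0},
    ("users", "long run"):       {"costs": 50_000.0,  "benefits": 120_000.0},
}
for (group, horizon), figures in analysis.items():
    print(group, horizon, round(roi(figures["benefits"], figures["costs"]), 2))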

For reference, the following CAMSS categories of criteria are included in this section.


Market support

This set of criteria will be assessed by involving stakeholders both inside and outside the e-SENS

project, e.g. by using questionnaires, semi-structured interviews, or other means to get the opinion

of important end users and software/service providers.

4.4.1 Basic business need criteria

The following table presents the basic business need criteria based on the Technical Annex.

Nr Category Description Nr Sub-Category Description Nr Criteria

Business need

Need for the TOA by end users.

Change

Potential change in the quality of the service delivered to the citizen/business by the administration before and after adopting the TOA.

Are positive changes in the quality of the service delivered to the citizen/business by the administration before and after adopting the TOA foreseen?

Usage

Opportunities for software/service providers to put the TOA into use.

Do opportunities exist for software/service providers to put the TOA into use?

Business plan

Availability of a commercially-oriented, robust Business Plan for investment, built upon an underlying ‘commercially sustainable’ business model.

Is the Business Plan for investment built upon an underlying ‘commercially sustainable’ business model?

Business case

A business case should take into account how a TOA will help public partners in achieving their missions.

Does the business case take into account how a TOA will help public partners in achieving their missions?


Sharing

Relevance of having the same components integrated as European (shared) building blocks across different Use Cases.

Could the TOA be integrated as a European (shared) building block across different Use Cases? Note: the Assessment Panel may decide to evaluate only one of the following two subcategories: "Sharing" (Section - Basic business need criteria) and "Reusability" (Section - Applicability)

Cross-border

Usefulness of the TOA in the development of eGovernment cross-border services.

Could the TOA be useful in the development of eGovernment cross-border services?

Market

Potential of the TOA to be adopted by the market and be used in cross-border eGovernment services.

Does the TOA have the potential to be adopted by the market and be used in cross-border eGovernment services?

ROI Where applicable the costs and benefits of adopting the TOA, including the assessment of the Return on Investment.

If applicable, is evaluation of the costs and benefits of adopting the TOA available?

Geographic

Possibility for a broader geographic and sector usage.

Is there a possibility for a broader geographic and sector usage? Note: the Assessment Panel may decide to evaluate only one of the following two subcategories: "Geographic" (Section - Basic business need criteria) and "Reusability" (Section - Applicability)

Table 15. Assessment criteria for business need: basic


4.4.2 Market support

The following table presents market support criteria based on CAMSS.

Nr Category Description Nr Sub-Category Description Nr Criteria

5 Market support

A TOA should have sufficient market acceptance and support in order to be adopted by public administrations. This category addresses the proven and operational implementations of the TOA, the market share and demand for the products, and the support from users and communities.

5.1 Implementations

For the ‘implementations’, the existence of proven and best practice implementations for the TOA is addressed, in different domains and by different vendors.

A.28 Has the TOA been used for different implementations by different vendors/suppliers?

A.29 Has the TOA been used in different industries, business sectors or functions?

5.2 Market demand

For ‘market demand’, the penetration and acceptance of products implementing the TOA in the market is addressed.

A.30 Do the products that implement the TOA have a significant market share of adoption?

5.3 Users For the ‘users’, the diversity of the end-users of the products implementing the TOA is addressed.

A.31 Do the products that implement the TOA target a broad spectrum of end-users?

5.4 Interest groups

For the ‘interest groups’, the degree of support from different interest groups is addressed.

A.32 Does the TOA have strong support from different interest groups?

Payer For the 'Payer' the existence of groups ready to pay for the service is addressed.

Who is willing to pay for the service?

Competition

For the 'Competition' the existence of competing solutions is addressed.

To what extent does the TOA compete with other solutions available in member countries?

Support

For the 'Support' the existence of support for the market is addressed.

Is there any support available for the market in using the TOA?

Table 16. Assessment criteria for business need: market support


5 The Recommendation Step: Criteria and Classification

In the recommendation step, recommendation criteria are applied, concluding with a classification of the target of assessment to be reported back to the WP6 Architectural Board for use by other Work Packages.

5.1 Recommendation Criteria

Following CAMSS (Section 3.4), the main categories of criteria in the recommendation step are the same as those used for grouping the assessment criteria. These categories comprise those from CAMSS (applicability; maturity; openness; intellectual property rights; market support; potential) and those from the Technical Annex and other sources (life cycle, maintenance, service levels, security; basic alignment criteria; basic business need criteria).

For these main categories, a score is generated based on the number of ‘YES’ answers; this score indicates the level of meeting the criteria for the target of assessment. Based on the discussion in the

Assessment Panel, an evaluation and possible comments are provided for the main categories. A

possible scale for the evaluation of the categories could include the following range: Very low, Low,

Moderate, High, Very high. The Assessment Panel may assign different weights to each category to

obtain the final evaluation.

Note 1. A more complicated scale for scores may be used by the Assessment Panel, for example assigning weights to the ‘YES’ answers, specifying in more detail the way of generating a score based on the number of ‘YES’ answers, specifying the levels in more detail, comprising different scale level values, etc. Note 2. The Assessment Panel may decide to consider additional recommendation criteria, such as usability, backward compatibility, and others.
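One possible realisation of such a scoring scheme is sketched below; the thresholds, the default weights, and the function names are illustrative assumptions to be decided by the Assessment Panel, not values prescribed by these Guidelines.

def category_score(answers: list) -> float:
    # Share of YES answers among the applicable (non "Not applicable") criteria.
    applicable = [a for a in answers if a != "Not applicable"]
    return sum(a == "YES" for a in applicable) / len(applicable) if applicable else 0.0

def to_scale(score: float) -> str:
    # Maps the share of YES answers to the five-level evaluation scale.
    for bound, label in [(0.2, "Very low"), (0.4, "Low"), (0.6, "Moderate"), (0.8, "High")]:
        if score < bound:
            return label
    return "Very high"

def weighted_evaluation(category_scores: dict, weights: dict) -> float:
    # Panel-defined weights per category; categories without a weight default to 1.0.
    total_weight = sum(weights.get(c, 1.0) for c in category_scores)
    return sum(s * weights.get(c, 1.0) for c, s in category_scores.items()) / total_weight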

The specified knock-out criteria are checked to confirm that they were all met, and an answer is provided for the main categories. The CAMSS Assessment Library and Tools, including the tools for Scenarios 2 and 3, may be used for this purpose. Both tools propose the knock-out criteria presented in the following table. Additional criteria may be stated by the Assessment Panel.

28 Common Assessment Method For Standards And Formal Specifications (CAMSS). Final Draft Revision of CAMSS. Version 1.0, March 2012.
29 CAMSS Assessment Library and Tools, including the tool for Scenario 2, https://webgate.ec.europa.eu/fpfis/mwikis/idabc-camss/index.php/CAMSS_Assessment_Library_and_Tools


Nr Category Description Nr Sub-Category Description Nr Criteria

1 Applicability

A formal specification should be usable and easily implementable in different products to be relevant for adoption by public administrations.

1.1 Area of application

For the ‘area of application’, the functionalities and intended use of the formal specification are addressed within the context of interoperability and eGovernment.

A.1 Does the formal specification address and facilitate interoperability between public administrations?

6 Potential

A TOA should have sufficient and positive future consequences, evolution and impact for being adopted by public administrations. This category addresses the consequences and impact of using or adopting the TOA, the advantages and risks, the maintenance and possible future developments.

6.3 Maintenance and future developments

For the ‘maintenance’ and future developments, the support and the planned or existing actions to maintain, improve and develop the TOA in the long term are addressed.

A.43 Does the maintenance organisation for the formal specification have sufficient finances and resources for the long term?

Table 17. Knock-out criteria

The recommendation criteria are presented in the following table.

The table columns are: Nr., Category, Automated score, Knock-out criteria met?, Evaluation, and Comments; the last four columns are completed during the assessment. The categories are:

R.2 (CAMSS) Maturity
R.3 (CAMSS) Openness
R.4 (CAMSS) Intellectual property rights
(Technical Annex and other sources) Life cycle, maintenance, service levels, security
(Technical Annex and other sources) Basic alignment criteria
R.1 (CAMSS) Applicability
R.6 (CAMSS) Potential
(Technical Annex and other sources) Basic business need criteria
R.5 (CAMSS) Market support

Table 18. Recommendation criteria

5.2 Proposed Classification

The classification of the target of assessment is based on CAMSS (p 36), Technical Annex (pp. 33-35),

PEPPOL, and other relevant sources. It uses the assessment and recommendation criteria and

comprises the following classes.

Discarded: the target of assessment is not found relevant and should not be used in the

public administration

Observed: the target of assessment is found relevant but should not be used in the current

state or version

Accepted: the target of assessment is accepted and could be used in the public

administrations (and for the specific business needs)

Recommended: the target of assessment is accepted and is preferred for the use in public

administrations (compared to alternative targets of assessment)

Mandatory: the target of assessment is accepted and public administrations are obliged to

use the target of assessment when applicable (or provide an explanation). Note. Depending

on the e-SENS mandate, it may be agreed to avoid the "Mandatory" recommendation in the

assessments.

The compliance levels of the criteria for LSPs that are necessary for a building block to be assessed as satisfying the Accepted or higher levels may be defined.

For the last three classes, maturity levels (Technical Annex pp. 33-35 and PEPPOL) may be assessed

in more detail if needed.
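For illustration only, the classification outcome could be represented as a simple enumeration mirroring the classes listed above; the assignment of a class remains a decision of the Assessment Panel.

from enum import Enum

class Classification(Enum):
    DISCARDED = "Discarded"      # not found relevant, should not be used
    OBSERVED = "Observed"        # relevant, but not in the current state or version
    ACCEPTED = "Accepted"        # could be used in public administrations
    RECOMMENDED = "Recommended"  # preferred compared to alternative targets of assessment
    MANDATORY = "Mandatory"      # may be avoided, depending on the e-SENS mandate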


6 Organizational aspects of assessment

This chapter presents a questionnaire form for the assessment of building blocks and discusses how to deal with current LSPs that do not comply with the criteria upfront.

6.1 A Questionnaire Form for the Assessment of European Building

Blocks

The questionnaires for the consideration, assessment, and recommendation steps follow the

assessment framework, procedure, and criteria, and are given in the tables of the preceding

sections. For convenience, the proposal information/criteria are also presented as a separate Excel

file.

6.2 Dealing with Current LSPs that do not Comply with the Criteria Upfront

The solution for dealing with current LSPs that do not comply with the criteria upfront is based on a

case-by-case analysis of the sources for non-compliance and application of appropriate solutions.

Possible sources and solutions are presented in the following table. The requests indicated in

possible solutions below may be provided by WP3, WP5, WP6, or other stakeholders.

In the course of assessment, additional sources for non-compliance and possible solutions will be

considered as needed.

Source for non-compliance Possible solution

Insufficient presentation of the building block for

assessment (for example, inadequate

documentation of format)

Request for improvement of the presentation

A simple problem with reference to an otherwise

acceptable building block

Request to correct the problem in the

subsequent version of the building block

A complex problem with reference to an

otherwise acceptable building block

Request to provide a plan to correct the problem

in the subsequent versions of the building block

The project is classified as discarded or observed,

with parallel building blocks providing adequate

solutions in assessment

Providing a parallel building block for assessment


The project is classified as discarded or observed,

with no parallel building blocks providing

adequate solutions in assessment

Providing a request to a relevant EU initiative for

initiation of a new building block of the same

type as the initial one

Table 19. Dealing with current LSPs that do not comply with the criteria upfront


7 Conclusion

In the e-SENS Project, a number of high-level building blocks need to be assessed. The objective of

the current Guidelines is to help run the sustainability assessment smoothly, making it eventually easier to produce generic, modular and re-usable building blocks. The Guidelines are

primarily intended for assessing architectural and high-level building blocks. In the first instance, the

focus will be on the high-level building blocks: electronic identities, electronic signatures, electronic

documents and electronic delivery. At a later stage, more high-level building blocks can be added.

The Guidelines will also be used to assess the architectural building blocks that are beneath those

high-level building blocks.

This document has listed a series of guidelines for executing the assessment based on state-of-the-art methodologies.

The Guidelines are first of all based on the CAMSS and ADMS methodologies. Considering the focus of building block assessment, the CAMSS scenarios 2 and 3 are taken into account. As CAMSS makes use of ADMS, the current Guidelines use ADMS as well. In particular, the controlled vocabularies used during the assessment procedure are based on ADMS.

When doing the assessment, some points need to be taken into account.

i. The assessment panel has some degrees of freedom when executing the sustainability

assessment. These pertain in particular to assigning weights to make some considerations more visible, removing some criteria, or considering additional criteria if there is a need.

The assessment criteria are flexible and can be adjusted. This document clearly marks where

there is room for manoeuvre. Assessments may be done iteratively, in cooperation with

WP6, WP5, and other stakeholders.

ii. The "Mandatory" recommendation can be dropped by the assessment panel, since e-SENS may not have the mandate to make "Mandatory" recommendations in the assessments.

iii. The questionnaires for the consideration, assessment, and recommendation steps follow the

assessment framework, procedure, and criteria, and are given in the tables of the preceding

sections. For convenience, the proposal information/criteria are also presented as a

separate Excel file.

iv. Deliverable 3.1 is a living document that has to be updated after changes in underlying materials, due to experience gained in pilot assessments, as a result of new agreements, and for other reasons.


v. It is better to execute the assessment and try to learn from it than to keep discussing the individual criteria over and over again.


8 References

1. Regulations

1.1. REGULATION (EU) No 1025/2012 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 25 October 2012 on European standardisation. OJ L 316, 14.11.2012, pp 12-33, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2012:316:0012:0033:EN:PDF

1.2. Amended proposal for a regulation of the European Parliament and of the Council on guidelines for trans-European telecommunications networks and repealing Decision No 1336/97/EC. COM(2013) 329 final. Brussels, 28.5.2013, http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=COM:2013:0329:FIN:EN:PDF

2. European wide scale frameworks, projects, and initiatives

2.1. European Interoperability Framework (EIF) for European public services, http://ec.europa.eu/isa/documents/isa_annex_ii_eif_en.pdf
2.2. European Interoperability Architecture, http://ec.europa.eu/isa/documents/isa_2.1_eia-finalreport-commonvisionforaneia.pdf

2.3. EFIR, http://ec.europa.eu/isa/actions/04-accompanying-measures/4-2-4action_en.htm

2.4. JoinUp, https://joinup.ec.europa.eu/catalogue/all?current_checkbox=1

2.5. Data Catalog Vocabulary (DCAT),

http://www.w3.org/2011/gld/wiki/Data_Catalog_Vocabulary

2.6. BOMOS (Management and Development Model for Open Standards),

2.7. BOMOS2i,

http://www.forumstandaardisatie.nl/fileadmin/os/publicaties/HR_BOMOS_English_translat

ion_Jan2013.pdf

2.8. Open Software Business Readiness Rating (OpenBRR), http://www.oss-

watch.ac.uk/resources/archived/brr

3. e-SENS project materials and deliverables

3.1. e-SENS Technical Annex v 3.0

3.2. WP6 deliverable "D6.1 Executable ICT Baseline Architecture"

4. ADMS and ADMS.SW

4.1. ADMS v 1.0. https://joinup.ec.europa.eu/asset/adms/release/100

4.2. ADMS. Asset Description Metadata Schema. Specification Version 1.00. Release date

18/04/2012

4.3. Asset Description Metadata Schema for Software 1.00. ADMS.SW 1.00. Release date

20/05/2012, https://joinup.ec.europa.eu/asset/adms_foss/asset_release/admssw-100

Approved by EC

D3.1 Guidelines to the assessment the sustainability and maturity of building blocks 60

5. Common Assessment Method For Standards And Formal Specifications (CAMSS). Final Draft

Revision of CAMSS. Version 1.0, March 2012. See also the following sources.

5.1. CAMSS Project – Workshop 2. Agreement on Revision of CAMSS. Wednesday, March 7th

2012 (fail Presentation of EC on CAMSS.ppt)

5.2. CAMSS Assessment Library and Tools, including the tool for Scenario 2,

https://webgate.ec.europa.eu/fpfis/mwikis/idabc-

camss/index.php/CAMSS_Assessment_Library_and_Tools

5.3. https://webgate.ec.europa.eu/fpfis/mwikis/idabc-camss/

6. European LSPs and building blocks

6.1. Overview of LSPs, https://ec.europa.eu/digital-agenda/en/cross-border-pilots

6.2. SPOCS, http://www.eu-spocs.eu/, http://joinup.ec.europa.eu/software/spocs/home

6.3. e-CODEX, http://www.e-codex.eu/home.html

6.4. epSOS, http://www.epsos.eu/home.html,

https://openncp.atlassian.net/wiki/display/ncp/OpenNCP+Home

6.5. PEPPOL EIA Repository, http://www.peppol.eu/peppol_components/peppol-eia/eia

6.6. STORK and STORK 2.1, https://www.eid-stork.eu/,

https://joinup.ec.europa.eu/software/stork/home, https://www.eid-stork2.eu/

7. European project assessments

7.1. Interim evaluation of the ISA programme. Specific contract n°4 under framework contract

n°DI/06693, 31 October 2012, Final Report.

http://ec.europa.eu/isa/documents/interim_evaluation_of_the_isa_programme.pdf

7.2. The feasibility and scenarios for the long-term sustainability of the Large Scale Pilots,

including 'ex-ante' evaluation. Task 1,2,3 Report. 11-03-2013, SMART 2012/0059, Deloitte.

7.3. Study on Analysis of the Needs for Cross-Border Services and Assessment of the

Organisational, Legal, Technical and Semantic Barriers. D1.3 Inventory of cross-border

eGovernment services & D2.1 Analysis of existing and future needs and demand for cross-

border eGovernment services. A study prepared for the European Commission DG

Communications Networks, Content & Technology. European Union, 2012

7.4. SPOCS D3.1 Assessment of existing e-Delivery systems & specifications required for

interoperability, Version 1.00, 15/01/2010, http://www.eu-

spocs.eu/index.php?option=com_processes&task=showProcess&id=18&Itemid=61

7.5. SPOCS D4.1 Assessment of selected national approaches and possible solutions towards

interoperability for eService Directories, Version 1.00, 26/01/2010, http://www.eu-

spocs.eu/index.php?option=com_processes&task=showProcess&id=18&Itemid=61

7.6. STORK D5.1 Evaluation and assessment of existing reference models and common specs.

STORK, Final revision 1.3, 04/10/2010, https://www.eid-

stork.eu/index.php?option=com_processes&act=list_documents&s=1&Itemid=60&id=312

Approved by EC

D3.1 Guidelines to the assessment the sustainability and maturity of building blocks 61

7.7. STORK D6.6 STORK Evaluation Report, Version 1.0, Final, 23/12/11, https://www.eid-

stork.eu/index.php?option=com_processes&act=list_documents&s=1&Itemid=60&id=312

8. International standards and frameworks

8.1. TOGAF 9.1, http://pubs.opengroup.org/architecture/togaf9-doc/arch/toc.html

8.2. TOGAF 9.1, Definitions, http://pubs.opengroup.org/architecture/togaf9-

doc/arch/chap03.html

8.3. ISO/IEC 12207:2008. Systems and software engineering. Software life cycle processes

8.4. ISO/IEC 25000:2005. Software Engineering. Software product Quality Requirements and

Evaluation (SQuaRE). Guide to SQuaRE

8.5. ISO/IEC 27001:2005. Information technology - Security techniques - Information security

management systems - Requirements


9 Contributors

Name Surname Organisation Country

Anette Junikiewicz JM NRW, Germany DE

Anni Buhr SSI DK

Sven Rostgaard Rasmussen DIGST DK

Uuno Vallner RISO EE

Dea Hvillum SGMAP France - MINISTERE DE LA JUSTICE FR

Jean-Marc Pellet SGMAP France - MINISTERE DE LA JUSTICE FR

Bertrand Grégoire Tudor Institute LU

Jasper Roes TNO NL

Marijke Salters Bureau Forum Standaardisatie NL

Freek Van Krevel Ministry of Economic Affairs NL

Jack Verhoosel TNO NL

Klaus Vilstrup Pedersen DIFI NO

Vural Celik TUBITAK TR

Cagatay Karabat TUBITAK TR

Various From the WP3 Brussels meeting on 17.05, the teleconference on 28.05, the Tallinn meeting on 11-12.07, and various e-SENS meetings and teleconferences

Table 20. Contributors to D3.1

Note: the table is sorted by the country code and then by surname.


10 Appendix 1. Research Update and Input for the Analysis

To support future assessments, the results of already existing assessments are identified and re-used. According to the description of Task T3.1.2 of the Technical Annex, this is done before assessing the building blocks. The following sections present the goals and methodology, as well as the main findings and the possibilities for re-using the existing assessments.

According to the Technical Annex, this subtask involves identification and re-use of already existing assessments, but not of the assessment tools and frameworks. Moreover, the research update focuses on the assessment of building blocks relevant to e-SENS. Based on the Technical Annex and other materials, the following possible sources of existing assessments were analysed.

WP5 and WP6 deliverables and other e-SENS materials, including the BSCW and Basecamp servers

The library http://joinup.ec.europa.eu/

General ISA, LSP, and e-SENS materials

SPOCS: Online Points of Single Contact to help businesses expand into other countries

e-CODEX: Online access to judicial procedures for claimants, defendants and legal professionals

epSOS: Cross-border access to patient information online

PEPPOL: Interoperable eProcurement solutions, such as e-invoices and e-signatures for important documents

STORK and STORK 2.1: Electronic identity for easier access to public services

The assessment results from standardisation bodies or other relevant bodies

The list of materials used for assessment is provided in the references section of this document.



10.1 Interim Evaluation of the ISA programme

This section is based on the report "Interim evaluation of the ISA programme. Specific contract n°4

under framework contract n°DI/06693, 31 October 2012, Final Report".

The aim of this interim evaluation, as stated in Article 13(3) of Decision 922/2009/EC, is to examine issues such as the relevance, effectiveness, efficiency, utility, sustainability and coherence of the ISA programme's actions, and to assess performance against the objective of the ISA programme and the rolling work programme.

In this interim evaluation, quantitative and qualitative data were collected using a selected set of

methodologies and tools. During the data collection phase, the evaluation team used a combination

of desk research, surveys, phone or face-to-face interviews and also case studies of specific actions

of the work programme in order to fulfil the objectives of this evaluation. Furthermore, written

requests were addressed to specific stakeholders to gather additional information on specific sub-

issues of the interim evaluation.

As a general study, this evaluation did not propose assessments of specific LSPs or other projects that might be re-used in e-SENS project assessments. It issued eleven main general recommendations. Examples of recommendations that might also prove useful in the e-SENS context include the following.

Recommendation 1. The ISA programme must ensure that all stakeholders involved in the ISA

programme are well aware of the objectives of each action, their contribution to the programme

objectives and their intended and current results.

Recommendation 3. The ISA programme should give priority to the activities related to the

Assessment of ICT implications of EU legislation, taking into account that the expected results were

not delivered, even though it is perceived as an important need by European public administrations.

Recommendation 7. Members of the ISA Working groups should report nationally to the ISA

Coordination group members to ensure that ISA solutions are aligned with the needs and initiatives

at national level.

Recommendation 9. The ISA programme should establish a control mechanism to ensure the reuse

of ISA solutions, besides interoperability, related to both carried forward solutions from IDA and

IDABC and new ISA solutions already producing concrete results within the lifecycle of the

programme.


Recommendation 11. The ISA programme should consider in due-time the sustainability of the ISA

actions by identifying different sustainability options. These options include the internal charge-back

to Commission services based on the use of specific solutions, the financial support from

Commission services having developed specific ISA solutions and the financial support from an ISA

follow-on programme or other similar EU programmes.

10.2 The Feasibility and Scenarios for the Long-Term Sustainability of

the LSPs

This section is based on the study "The feasibility and scenarios for the long-term sustainability of

the Large Scale Pilots, including 'ex-ante' evaluation. Task 1,2,3 Report. 11-03-2013, SMART

2012/0059, Deloitte".

According to the Task 3 report, the objective of this study is to assess the sustainability and future

roll-out of the services developed by the Competitiveness and Innovation Framework Programme

(CIP) Information and Communication Technologies Policy Support Programme (ICT-PSP) Large Scale

Pilots (LSPs). The study investigated possible organisational, financial, resource, and governance

aspects for LSP long-term sustainability, provided one possible scenario based on cost/economic

benefit analysis and an impact assessment of possible scenarios, and provided a roadmap and

concrete recommendations for putting the proposed scenario in place.

The methodology of the study addresses its key question: what type of model is needed to create a

sustainable Digital Service Infrastructure for the provision of cross-border eGovernment Services?

The study starts with development of a conceptual model for the analysis, analysing the current

situation and real-life examples, and proposing the success criteria (Task 1 Report); proceeds with

proposing basic scenarios, their preliminary assessment, selection of five specific scenarios for

further elaboration, and their preliminary overview (Task 2 Report); continues with impact analysis,

cost-benefit analysis, and findings for scenarios (Task 3 Report); and will conclude with comparison

of the scenarios, recommendations, and roadmap (Task 4 Report).

The study is highly relevant with respect to the e-SENS project in general, and specifically for WP3

and its Task T3.3 "Scenarios and recommendations for sustainability models". It is already used extensively in e-SENS. From the current Guidelines perspective, the study comprises a number of

overviews that can be taken into account in LSP assessment.

Analysis of the current situation of LSPs (Task 1 Report, Section 3)

Lessons learnt from other real-life examples (Task 1 Report, Section 4)


Preliminary suitability assessment of the basic scenarios with reference to the building

blocks (Task 2 Report, Section 4)

Results of the surveys concerning the building blocks (Task 3 Report, Section 3)

Cost-benefit analysis (Task 3 Report, Section 5).

10.3 Study on Analysis of the Needs for Cross-Border Services

This section is based on the report "Study on Analysis of the Needs for Cross-Border Services and

Assessment of the Organisational, Legal, Technical and Semantic Barriers. D1.3 Inventory of cross-

border eGovernment services & D2.1 Analysis of existing and future needs and demand for cross-

border eGovernment services. A study prepared for the European Commission DG Communications

Networks, Content & Technology. European Union, 2012".

The goal of this study is to assess the real needs and demand, the costs and benefits, and the barriers for cross-border services where interoperability is a key factor.

Different methodologies, such as statistical analysis, user surveys, holistic qualitative assessment and stakeholder workshops, were used in the study.

From the current Guidelines perspective, the report may provide information about the needs for cross-border services, an overview of barriers to their implementation, and statistical data about LSP usage that can be taken into account in future assessments.

10.4 SPOCS D3.1 Assessment of Existing e-Delivery Systems &

Specifications

This section is based on the report "SPOCS D3.1 Assessment of existing e-Delivery systems &

specifications required for interoperability, Version 1.00, 15/01/2010".

The report gives a systematic review of approaches regarding e-Delivery, eSafe, and security

architectures in the participating Member States of the European Union, evaluating the solution

concepts of Austria, France, Germany, Greece, Italy, the Netherlands, and Poland.

The methodology was based on developing a questionnaire to get an overview of national solutions,

identifying the relevant solution experts, sending the questionnaire to the experts, and interviewing

them for solution descriptions.


An overview of the results is given in the executive summary and in the Conclusions for interconnecting infrastructures in Chapter 3. The main body of the report (Chapter 2, with more details given in Appendix C) covers the solutions of all countries considered systematically, according to a common structure. This makes it possible to obtain the current status of each national implementation and a managerial understanding of it. It may be possible to re-use the results of this report in future assessments of e-SENS Building Blocks related to e-Delivery, eSafe, and security architectures in Austria, France, Germany, Greece, Italy, the Netherlands, and Poland.

10.5 SPOCS D4.1 Assessment of Selected National Approaches and

Possible Solutions

This section is based on the report "SPOCS D4.1 Assessment of selected national approaches and

possible solutions towards interoperability for eService Directories, Version 1.00, 26/01/2010".

This document is the result of the stock-taking and assessment tasks of SPOCS Work Package 4. SPOCS WP 4 concentrates on interoperable eService Directories. The eService Directories store information about available electronic services supporting the application process of a service provider. SPOCS concentrates on communication between Service Providers and Points of Single Contact in different Member States. The objective is to define common specifications and guidelines supporting the piloting of these procedures.

The methodology of the deliverable involves research carried out by the SPOCS partners and a questionnaire sent to the contributing Member States, assessment and discussion of the results, and an assessment process. Electronic services in Italy, Poland, Greece, Austria, the Netherlands, and Germany are addressed in more detail.

This deliverable is the first step in analysing the existing infrastructure of the Points of Single Contact solutions and the national infrastructures, identifying existing specifications, components and interfaces that support the SPOCS idea. The results of this assessment may be re-used where national solutions for the interoperability of eService Directories are relevant to the assessment of specific e-SENS Building Blocks.

10.6 STORK D5.1 Evaluation and Assessment of Existing Reference

Models and Common Specs

This section is based on the report "D5.1 Evaluation and assessment of existing reference models

and common specs. STORK, Final revision 1.3, 04/10/2010".


The purpose of this report is to present standards, different electronic identifiers and registration procedures, the authentication schemes, a summary of the data available in different tokens or their corresponding identity providers, and a discussion of the models. It also evaluates the two most important models: the middleware model and the Pan European Proxy Service model.

The methodology of the report involves extracting data from IDABC and other reports, and revising the data by means of questionnaires answered by the participating Member States.

From the e-SENS assessment perspective, the report may provide re-usable information about

identity federation standards (Chapter 2), identity providers and e-IDs in different Member States

(Austria, Belgium, Estonia, Finland, France, Greece, Italy, Lithuania, Luxembourg, The Netherlands,

Portugal, Slovakia, Slovenia, Spain, Sweden, and The United Kingdom are considered in Chapters 3,

4, and 5), and architectural models (Pan European Proxy Service model, middleware model, and

unified PKI model, together with their integration in different interoperability models, as well as

evaluation of these models in Chapter 6).

10.7 STORK D6.6 Evaluation Report

This section is based on the report "D6.6 STORK Evaluation Report, Version 1.0, Final, 23/12/11".

The goal of this deliverable was to evaluate whether the plans of the STORK pilots focused on the right targets of the project (ex ante), whether the execution of the pilots took place according to the defined plan (mid term), and whether the results of the pilots matched the promised outcomes (ex post).

The methodology underlying the report is based on general methods for auditing, reviewing and

evaluating ICT pilots, making it possible to compare the results of the different evaluations of the

several pilots: ex ante, mid term and ex post.

The results of this evaluation may be re-used in particular if the future assessments performed in

the framework of the e-SENS project concern pilots evaluated in the D6.6 STORK Evaluation Report:

cross-border authentication platform for electronic services, SaferChat, e-ID Student mobility, e-ID

electronic delivery, change of address, and the European Commission Authentication Service

integration with STORK.


11 Appendix 2. Proposal Information/Criteria Based on CAMSS

Following CAMSS (Section 3.1) and the ADMS specification v 0.8, the main categories of information/criteria used to document the proposed asset during the proposal step comprise the asset description and relationship (status) categories and criteria, according to the following table. The table is included for reference.

Nr. Category Description Nr. Criteria Description

1 Asset description

Providing the general information for the proposed TOA

P.1 Name Name of the asset

P.2 Date of creation Creation date of this version of the asset

P.3 Date of last modification Date of latest update of asset

P.4 Description Descriptive text for the asset

P.5 ID URI for the asset

P.6 Keyword Word or phrase to describe the asset

P.7 Alternative name Alternative name for the asset. Note: this information may be used to provide additional access points, e.g. allowing indexing of any acronyms, nicknames, shorthand notations or other identifying information under which a user might expect to find the asset

P.8 Version Version number or other designation of the asset

2 Relationship Providing the information on the status of the specific TOA being proposed

P.9 Asset type Classification of an asset according to a controlled vocabulary, e.g. code list, metadata schema

P.10 Current version Current or latest version of the asset

P.11 Documentation Document that further describes an asset or gives guidelines for its use


P.12 Domain Government sector that an asset or repository applies to, e.g. “law” or “environment” according to a controlled vocabulary

P.13 Interoperability level Level according to the European Interoperability Framework (EIF 2.0) for which an asset is relevant

P.14 Language Language of an asset if it contains textual information, e.g. the language of the terms in a controlled vocabulary or the language that a specification is written in

P.15 Previous version Older version of the asset

P.16 Publisher Organisation responsible for a repository, asset or release

P.17 Release Implementation of the asset in a particular format

P.18 Spatial coverage Geographic region or jurisdiction to which the asset applies

P.19 Subject Theme or subject of an asset, e.g. “elections” or “immigration” according to a general or domain specific controlled vocabulary

P.20 Status Indication of the maturity of an asset or release

Table 21. Appendix. Proposal information/criteria based on CAMSS
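To illustrate how the proposal information of Table 21 could be recorded as ADMS metadata, the following sketch (using the rdflib Python library) describes a hypothetical asset. The asset URI, the literal values, and the exact property choices (e.g. adms:version for P.8) are assumptions made for this example and should be verified against the ADMS specification before real use.

```python
# Illustrative sketch only: records a hypothetical asset with a few of the
# proposal fields from Table 21 (name, dates, description, version, status).
# Property choices follow common ADMS/Dublin Core usage but are assumptions
# to be checked against the ADMS specification.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF, XSD

ADMS = Namespace("http://www.w3.org/ns/adms#")

g = Graph()
g.bind("adms", ADMS)
g.bind("dcterms", DCTERMS)

asset = URIRef("http://example.org/assets/e-delivery-building-block")  # P.5 ID (hypothetical)
g.add((asset, RDF.type, ADMS.Asset))
g.add((asset, DCTERMS.title, Literal("e-Delivery building block")))          # P.1 Name
g.add((asset, DCTERMS.created, Literal("2013-06-01", datatype=XSD.date)))    # P.2 Date of creation
g.add((asset, DCTERMS.modified, Literal("2013-09-15", datatype=XSD.date)))   # P.3 Date of last modification
g.add((asset, DCTERMS.description, Literal("Cross-border electronic delivery asset (example).")))  # P.4
g.add((asset, ADMS.version, Literal("1.0")))                                 # P.8 Version (property name assumed)
g.add((asset, ADMS.status, URIRef("http://purl.org/adms/status/UnderDevelopment")))  # P.20 Status (example value)

print(g.serialize(format="turtle"))
```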


12 Appendix 3. Proposal Information/Criteria Based on WP6

Deliverable D6.1

Following the WP6 deliverable "D6.1 Executable ICT Baseline Architecture" (section 2.5, "Building

Block Evaluation Model"), a Building Block will be described using the following information/criteria.

The list is included for reference.

Name

Relationship to ICT Strategies in EC/MS (in cooperation with WP3)

Requirements (in cooperation with WP5)

Type

Generic/Specific

Architecture Framework / Component Specifications

Standards

Open Source

Relationship and Couplings

Related Artifacts and Solution Building Blocks

Ownership

Life Cycle Management

In use?

Technical Maturity

Business Maturity (WP5)

Market Maturity (WP3)

Evaluation

For descriptions of the individual items in the above list, we refer to "D6.1 Executable ICT Baseline Architecture" (section 2.5, "Building Block Evaluation Model").


13 Appendix 4. The CAMSS Assessment Criteria

The CAMSS original categories, sub-categories, and criteria are presented in the following table. The

table is included for reference.

Nr Category Description Nr Sub-Category Description Nr Criteria

1 Applicability

A formal specification should be usable and easy implementable in different products to be relevant for adoption by public administrations. This category addresses the definition of functional scope and area of application, the possible reusability in other areas, the possible alternative specifications, the compatibility and dependency on other specifications or technologies.

1.1 Area of application

For the ‘area of application’, the functionalities and intended use of the formal specification are addressed within the context of interoperability and eGovernment.

A.1 Does the formal specification address and facilitate interoperability between public administrations?

A.2 Does the formal specification address and facilitate the development of eGovernment?

1.2 Requirements

For the ‘requirements’, the functional and non-functional requirements for using and implementing the formal specification are addressed. This criterion is related to the use of assessment scenario 3

A.3 Are the functional and nonfunctional requirements for the use and implementation of the formal specification clearly defined?

1.3 Reusability

For ‘reusability’, the level of reusability of the formal specification in the same or other areas of application is addressed.

A.4 Is the formal specification applicable and extensible for implementations in different domains?


1.4 Alternatives

For the ‘alternatives’, the degree to which the formal specification adds value compared to alternative TOAs in the same area of application is addressed.

A.5 Does the formal specification provide sufficient added value compared to alternative TOAs in the same area of application?

1.5 Compatibility

For ‘compatibility’, the compatibility of the formal specification with other TOAs in the same area of application is addressed.

A.6 Is the formal specification largely compatible with related (not alternative) TOAs in the same area of application?

1.6 Dependencies

‘Dependencies’ addresses the degree of independence of the formal specification from specific vendor products, platforms or technologies.

A.7 Is the formal specification largely independent from specific vendor products?

A.8 Is the formal specification largely independent from specific platforms or technologies?

2 Maturity

A formal specification should in itself be mature enough for adoption by public administrations. This category addresses the development status, the quality, guidelines and stability of the formal specification.

2.1 Development status

For the ‘development status’, the current development status of the formal specification in the development cycle is addressed.

A.9 Has the formal specification been sufficiently developed and in existence for a sufficient period to overcome most of its initial problems?

2.2 Quality For ‘quality’, the level of detail in the formal specification and the conformance of implementations is addressed.

A.10 Are there existing or planned mechanisms to assess conformity of the implementations of the formal specification (e.g. conformity tests, certifications)?

A.11 Has the formal specification sufficient detail, consistency and completeness for the use and development of products?

2.3 Guidelines

For the ‘guidelines’, the existence of implementation guidelines or reference implementations is addressed.

A.12 Does the formal specification provide available implementation guidelines and documentation for the implementation of products?


A.13 Does the formal specification provide a reference (or open source) implementation?

2.4 Stability For ‘stability’, the level of change to the formal specification and the stability of underlying technologies is addressed.

A.14 Does the formal specification address backward compatibility with previous versions?

A.15 Have the underlying technologies for implementing the formal specification been proven, stable and clearly defined?

3 Openness

A formal specification should be sufficiently open and available to be relevant for adoption by public administrations. This category addresses the openness of the standardisation organisation and decision-making process, and the openness of the documentation and accessibility of the formal specification.

3.1 Organisation

For the ‘openness’ of the organisation, the level of openness for participating in the standardisation organisation is addressed.

A.16 Is information on the terms and policies for the establishment and operation of the standardisation organisation publicly available?

A.17 Is participation in the creation process of the formal specification open to all relevant stakeholders (e.g. organisations, companies or individuals)?

3.2 Process For the ‘process’, the level of openness regarding the development and decision-making process for the formal specification is addressed.

A.18 Is information on the standardisation process publicly available?

A.19 Information on the decision making process for approving TOAs is publicly available?

A.20 Are the TOAs approved in a decision making process which aims at reaching consensus?

A.21 Are the TOAs reviewed using a formal review process with all relevant external stakeholders (e.g. public consultation)?


A.22 All relevant stakeholders can formally appeal or raise objections to the development and approval of TOAs?

3.3 Documentation

For the openness of the ‘documentation’, the accessibility and availability of the documentation of the formal specification is addressed.

A.23 Relevant documentation of the development and approval process of TOAs is publicly available (e.g. preliminary results, committee meeting notes)?

A.24 Is the documentation of the formal specification publicly available for implementation and use on reasonable terms?

4 Intellectual property rights

A formal specification should be licensed on (F)RAND terms or even on a royalty-free basis in a way that allows implementation in different products. This category addresses the availability of the documentation on the IPR and the licences for the implementation of the formal specification.

4.1 IPR Documentation

For the ‘documentation of the intellectual property rights’, the availability of the information concerning the ownership rights of the formal specification is addressed.

A.25 Is the documentation of the IPR for TOAs publicly available?

4.2 Licences For the ‘licences’ within the intellectual property rights, a (fair) reasonable and non-discriminatory ((F)RAND) or even royalty-free basis is addressed for the use and implementation of the formal specification.

A.26 Is the formal specification licensed on a (F)RAND basis?

A.27 Is the formal specification licensed on a royalty-free basis?

5 Market support

A formal specification should have sufficient market acceptance and support in order to be adopted by public administrations. This category addresses the proven and operational implementations of the formal specification, the market share and demand for the products, and the support from users and communities.

5.1 Implementations

For the ‘implementations’, the existence of proven and best practice implementations for the formal specification is addressed, in different domains and by different vendors.

A.28 Has the formal specification been used for different implementations by different vendors/suppliers?


A.29 Has the formal specification been used in different industries, business sectors or functions?

5.2 Market demand

For ‘market demand’, the penetration and acceptance of products implementing the formal specification in the market is addressed.

A.30 Do the products that implement the formal specification have a significant market share of adoption?

5.3 Users For the ‘users’, the diversity of the end-users of the products implementing the formal specification is addressed.

A.31 Do the products that implement the formal specification target a broad spectrum of end-uses?

5.4 Interest groups

For the ‘interest groups’, the degree of support from different interest groups is addressed.

A.32 Has the formal specification a strong support from different interest groups?

6 Potential

A formal specification should have sufficient and positive future consequences, evolution and impact for being adopted by public administrations. This category addresses the consequences and impact of using or adopting the formal specification, the advantages and risks, the maintenance and possible future developments.

6.1 Impact For the ‘impact’, the minimisation of the consequences of using and adopting the formal specification is addressed. The consequences can be evaluated and described in terms of different aspects.

A.33 Is there evidence that the adoption of the formal specification positively impacts organisational processes?

A.34 Is there evidence that the adoption of the formal specification positively impacts the migration of current systems?

A.35 Is there evidence that the adoption of the formal specification positively impacts the environment?

A.36 Is there evidence that the adoption of the formal specification positively impacts the financial costs?

A.37 Is there evidence that the adoption of the formal specification positively impacts the security?


A.38 Is there evidence that the adoption of the formal specification positively impacts the privacy?

A.39 Is there evidence that the adoption of the formal specification positively impacts the administrative burden?

A.40 Is there evidence that the adoption of the formal specification positively impacts the disability support?

6.2 Risks For the ‘risks’, the level of uncertainty is addressed for using and adopting the formal specification

A.41 Are the risks related to the adoption of the formal specification acceptable?

6.3 Maintenance and future developments

For the ‘maintenance’ and future developments, the support and the planned or existing actions to maintain, improve and develop the formal specification in the long term are addressed.

A.42 Does the formal specification have a defined maintenance organisation?

A.43 Does the maintenance organisation for the formal specification have sufficient finances and resources for the long term?

A.44 Does the formal specification have a defined maintenance and support process?

A.45 Does the formal specification have a defined policy for version management?

Table 22. Appendix. The CAMSS assessment criteria

