Door to Door Information for Air Passengers
D6.1 Evaluation plan
Editor: TUB (Technische Universität Berlin)
Deliverable nature: Report
Dissemination level: Public
Date: planned 29 February 2016 | actual 29 February 2016
Version | no. of pages 1.0 74
Keywords: Evaluation Plan / Impact Evaluation / Process Evaluation /
Usability Evaluation
DORA Deliverable D6.1
2 / 74
Disclaimer
This document contains material, which is the copyright of certain DORA consortium parties,
and may not be reproduced or copied without permission.
All DORA consortium parties have agreed to full publication of this document.
The commercial use of any information contained in this document may require a license
from the proprietor of that information.
Neither the project consortium as a whole nor any individual party of the consortium
warrants that the information contained in this document is capable of use, or that use of
the information is free from risk, and neither accepts liability for loss or damage suffered
by any person using this information.
Impressum
Project acronym/name DORA Door to Door Information for Air Passengers
Project number/type 643973 Research and Innovation Action
WP number/leader 6 TUB
Task(s) no.(s)/leader(s) 6.1 TUB
Copyright notice
2015/2016/2017/2018 Center for Technology and Society, Technische Universität Berlin
and members of the DORA consortium
Executive Summary
This deliverable ‘D6.1 Evaluation Plan’ starts with a general chapter on the background and
purpose of evaluation in DORA. It shows the relation of the DORA project to Horizon 2020,
the biggest Research and Innovation programme of the European Union, and to the specific
call ‘MG 1.3-2014: Seamless Air Mobility’.
It continues by describing the purpose of the deliverable, which is to set the plan for all
evaluation activities that will be carried out within the DORA project. It moreover describes
the approach to the evaluation, presents a working plan for the evaluation activities and sets
the frame for the responsibilities of the partners involved. Besides this clearly structured
delineation of the evaluation framework, the evaluation plan also aims at explaining the
methods and indicators to be used, as well as presenting guidelines for the application of
specific methods by the partners involved in evaluation.
The following chapter summarises the aims of the project as they were described in the
‘Description of Work’ document. Aims are understood as the realisation of single
components of the DORA service and their final integration. The components being
described are: “Design and Implementation of a DORA Service Platform (PLA) and a Long
described are: “Design and Implementation of a DORA Service Platform (PLA) and a Long
Term Door-to-Door Journey Planner (DJP)”, “Design and Implementation of End-User-
Applications for Seamless Mobility Information (APP / SWA / WEB)”, “Implementation of
Personal Information Services (FIS / ILR / SRS / TMS)”, “Implementation of an Incident and
Information Management for Airports (MIM)”, “Design and Implementation of a System for
Detection of Waiting Time in Airports (WTS)” and “Design and Implementation of an Indoor
Location and Routing at the Terminal (IP / IR)”.
The following chapter (1.5) identifies and describes all objectives that will be evaluated
within this project. The objectives are ordered along their corresponding tasks in Work
Package 6. Together with the subsequent chapter, which describes the framework
conditions at the implementation sites in Berlin and Palma de Mallorca, the aforementioned
objectives provide the basis for the description of the methods that will be applied to
measure whether – or to what extent – the objectives were reached.
Chapter 2 is dedicated to the description of the iterative DORA evaluation approach. It
describes the evaluation activities and their relations to each other, and presents methods
and information on how they can be applied. It contains detailed information on the four
evaluation activities that will be carried out in DORA:
The Technical evaluation chapter (2.3.1) describes how the technical features of DORA will
be evaluated. The DORA developments are divided into three main
categories: Services, Applications and Technology. Each category follows a different
development methodology, which is described by giving information on methods and
timelines for its accomplishment, oriented along the ‘Evaluation Roadmap’.
The chapter Usability evaluation (2.3.2) describes the main goal of the Usability Evaluation,
which is to assess usability criteria in order to identify possible problems with the design of
the interfaces at an early stage, during the conceptual phase and during the integration
phase. It also describes how to realize a usability evaluation at each of the three
development steps of the DORA system: Usability Evaluation of the Concept for the Web
and App GUI / Usability Evaluation of the DORA Prototype / Usability Evaluation of the
Integrated Final Product (DORA system Alpha Version). The chapter ends with an overview
of the operationalization of these evaluation activities, including information on the
methods to be applied, the partners involved, and the timing of their execution.
The chapter Results evaluation (2.3.3) focuses on the description of the evaluation activities
related to the assessment of the results of the project. It comprises information related to
Impact Evaluation and the Evaluation of User Satisfaction. It moreover includes information
on the application of impact evaluation, shows which indicators are applied for the
assessment of the objectives, describes why a control site approach is planned for two
indicators, and shows which methods (user tests / surveys with questionnaires and
interviews) will be applied for the assessment of ‘User Satisfaction’. Methods and remarks
concerning the operationalization are summarized in tables.
The chapter Process evaluation (2.3.4) starts with an explanation of the intention of this
evaluation activity, which is to understand what influences the project process, either
positively or negatively, and how experiences gathered during the early phase of the project
have supported its further course. It looks at strengths and weaknesses (drivers and
barriers), thereby focusing on HOW an outcome or product is realized rather than on how
the impact of the project performs. The chapter also summarizes information concerning
the operationalization of the three phases of this activity at its end.
A major ambition of the DORA project is to develop a system that is easily transferable to
other cities and their airports in Europe. Thus, chapter 3 provides information on the
relevance of the results from all evaluation activities carried out within DORA for other
cities that are considering implementing the system.
Being aware that every development or implementation process is constantly subject to
change and risk, chapter 4 describes ‘Issues that can jeopardise the evaluation’. Since these
risks are mainly related to the technical development of the components or to their
integration (e.g. incompatibilities of systems or a lack of information that is necessary for
the development of the system), this chapter tries to foresee a range of possible risks in
order to prepare for the case of their occurrence. Mechanisms to avoid or minimise the
risks are also described.
The Evaluation Plan ends with chapter 5 “Operationalization / Roles and Responsibilities”. It
provides a quick overview of the DORA objectives and the related Work Packages, shows
the involvement of the partners in the tasks of Work Package 6 and the timeline for the
deliverables of the single tasks as described in the DoW. It presents a table with an overview
of all evaluation tasks and sub-tasks, the related methods for assessment and the time
frames for their accomplishment.
List of Authors
Organisation Authors Main organisations’ contributions
TUB Michael Abraham
TUB Norman Döge
CSE Konstantinos Koutsopoulos Chapter 2.3.1 Technical Evaluation; Review
TUB Mandy Töppel Consultation for Usability Evaluation
VMZ Jan-Niklas Willing Review
VMZ Tom Schilling Review
Table of contents
1 INTRODUCTION ............................................................................................... 10
1.1 Horizon 2020 ................................................................................................... 10
1.2 MG 1.3-2014: Seamless Air Mobility................................................................. 10
1.3 Purpose of the deliverable ............................................................................... 11
1.4 Aims of the Project .......................................................................................... 12
1.5 Objectives........................................................................................................ 15
1.5.1 Objectives related to Task 6.2 Technical Evaluation and Assessment ................... 15
1.5.2 Objectives related to Task 6.3 Usability Assessment............................................. 16
1.5.3 Objectives related to Task 6.4 Evaluation of Impact and Process.......................... 17
1.6 City and Airport Description Berlin ................................................................... 21
1.7 City and Airport Description Palma de Mallorca ............................................... 23
2 EVALUATION APPROACH AND METHODOLOGY................................................ 25
2.1 Introduction .................................................................................................... 25
2.2 DORA Evaluation Approach.............................................................................. 25
2.3 Methods and Operationalization...................................................................... 30
2.3.1 Technical Evaluation ............................................................................................ 30
2.3.2 Usability Evaluation.............................................................................................. 40
2.3.3 Results Evaluation ................................................................................................ 47
2.3.4 Process Evaluation ............................................................................................... 57
3 TRANSFERABILITY ............................................................................................ 63
4 ISSUES THAT CAN JEOPARDIZE THE EVALUATION ............................................. 65
5 OPERATIONALIZATION / ROLES AND RESPONSIBILITIES.................................... 67
6 REFERENCES .................................................................................................... 72
7 ANNEX............................................................................................................. 73
List of Figures
Figure 1 Interlinkage of DORA Objectives .............................................................................17
Figure 2 Dora Airports ..........................................................................................................21
Figure 3 DORA Evaluation Approach .....................................................................................26
Figure 4 Evaluation tasks and their relations.........................................................................27
Figure 5 Web Content Accessibility Guidelines 2.0 - success criteria examples for DORA......41
Figure 6: Baseline and Business-As-Usual Scenario [4] ..........................................................48
Figure 7 Control Site Evaluation in DORA ..............................................................................52
Figure 8 Categories of process barriers and drivers and examples [4] ...................................59
Figure 9 WP 6 Effort per partner...........................................................................................69
List of Tables
Table 1: Development Methodologies ..................................................................................31
Table 2: Evaluation Roadmap for Services ............................................................................33
Table 3: DJP Development Roadmap ....................................................................................34
Table 4: IR Development Roadmap.......................................................................................35
Table 5: ILR Development Roadmap .....................................................................................35
Table 6: FS Development Roadmap ......................................................................................36
Table 7: TMS Development Roadmap ...................................................................................37
Table 8: SRS Development Roadmap ....................................................................................37
Table 9: Operationalisation - usability test of conceptual variants ........................................43
Table 10: Operationalisation - usability test of first prototype ..............................................44
Table 11: Operationalisation - last usability test before trial phase .......................................45
Table 12 DORA objectives and their indicators .....................................................................49
Table 13 DORA Indicators and field test data gathering methods (control site approach) .....53
Table 14 DORA Indicators and field test data gathering methods (survey) ............................55
Table 15 DORA objectives and the related Work Packages ...................................................67
Table 16 DORA deliverables and the related Work Packages ................................................69
Table 17 Overview of DORA Evaluation Tasks .......................................................................70
Abbreviations
Abbreviation Explanation
APP DORA App
BER Berlin Brandenburg Airport Willy Brandt
D#.# Deliverable number #.# (D7.1 deliverable 1 of work package 7)
DJP Door-2-Door Journey Planner
DoA Description of Action of the project
DORA Door to Door Information for Airports and Airlines
EC European Commission
EU European Union
FIS Flight Information Service
H2020 Horizon 2020 Programme for Research and Innovation
ILR Intermodal Landside Router
IP Indoor Positioning
IR Indoor Routing
M# #th month of the project (M1=June 2015)
MIM Mobility and Incident Management Panel
PLA DORA Platform
SRS Strategic Routing Service
SWA DORA Smart Watch App
SXF Berlin-Schönefeld Airport
TMS Trip Monitoring Service
TXL Berlin-Tegel Airport
PMI Palma de Mallorca airport
WEB DORA Web GUI
WMS Web Map Service
WP Work Package
WTS Waiting Time Service
1 INTRODUCTION
The purpose of this deliverable is to give an overview of all evaluation activities that will be
carried out within the DORA project. This chapter provides information on the framework
conditions and goals of the deliverable. The respective call and funding scheme, the purpose
of the deliverable, and its structure will be described.
1.1 Horizon 2020
Horizon 2020 is the biggest Research and Innovation programme of the European Union
ever, with nearly €80 billion of funding available over 7 years (2014 to 2020). It is the
financial instrument implementing the Innovation Union, a Europe 2020 flagship initiative
aimed at securing Europe's global competitiveness. It is regarded as an investment in the
future of Europeans and seeks to boost smart, sustainable and inclusive growth and
jobs. [1]
1.2 MG 1.3-2014: Seamless Air Mobility
With this call the European Commission addresses the specific challenge to “… create links
between people and exchanges for business, leisure and culture within Europe and
worldwide” by improving flight connections between European cities. The specific approach
to shortening travel times is to focus on door-to-door travel comprising the entire travel
chain, which consists of several single segments. These can be enhanced with regard to the
time efficiency, seamlessness, robustness and accessibility of the European air transport
system. Further, and more concretely, the call is looking for research and innovation
projects targeting products and services that enhance the customer experience, minimize
the travel duration of air passengers and provide them with integrated and comprehensive
information to plan their travel. Such projects could also seek to directly improve the
accessibility of airports and planes.
Another specific focus lies on the provision of mobility information for passengers with
reduced mobility.
In conclusion, the call is clearly addressing intermodality that links all kinds of transportation
modes, be it landside or aircraft transportation. This way, transportation should become
smarter, greener and more integrated.
The expected impacts are being formulated in the call as follows:
- 90% of the travels involving air transport within Europe can be completed in 4 hours
door to door.
- Passengers can make informed decisions.
- Air transport is well connected to other modes.
1.3 Purpose of the deliverable
This deliverable sets the plan for all evaluation activities that will be carried out within the
DORA project. It describes the approach to the evaluation, presents a working and time
plan for the evaluation activities and sets the responsibilities of the partners involved.
Moreover, it includes explanations of the methods and indicators to be used, as well as
guidelines for the application of specific methods.
DORA is a project composed of a high number of partners across Europe. The DORA services
will be developed jointly by these partners. The high number of partners and the complexity
of the factors influencing the development of the various technical components make the
development process a comprehensive task. It can be assumed that a range of unforeseen
events might occur, which affects the evaluation activities. Thus, this evaluation plan does
not present a final, rigid plan that has to be implemented by all means. Rather, it must be
regarded as a flexible, evolving plan that takes actual occurrences into consideration. It sets
the frame for the evaluation activities and describes the methods that should be applied,
but also seeks to present alternative methods for the case that an initially planned method
turns out to be infeasible.
This deliverable is crucial for all other work packages since it describes how to evaluate
whether the objectives of each work package, as they were described in the ‘Description of
Work’ document, have been reached after the finalization of the project. Evaluation will
focus specifically on the work packages 3 ‘Concept Specification’, 4 ‘Software development
and integration’ and 5 ‘Pilot Execution’, since all steps during the development and testing
phase of the technical features will be evaluated. Thus an iterative approach is pursued that
constantly evaluates the different steps of the service and feeds its results back into the
further development. The results from evaluating the tests during the pilot execution will
finally deliver evidence of the success or failure of the project.
Further, this evaluation plan will be the basis for the other deliverables that will be
produced within work package 6:
- D6.2 Technical Evaluation and Assessment Report [M30] as a summary of the results
from the technical pretests
- D6.3 Usability Assessment Report [M24] – as a summary of the results from the
Usability pre-tests
- D6.4 Final Evaluation Report [M36] – as a summary of the findings resulting from the
impact, usability and process evaluation.
In this sense the deliverable at hand is also relevant for work package 7 ‘Dissemination and
Exploitation’ – it contributes to the creation of sound results that can be disseminated on
European level in order to enable other cities and airports to learn from the findings of the
DORA project.
1.4 Aims of the Project
This chapter summarises the aims of the project as they were described in the ‘Description
of Work’ document. Aims are understood here as high-level objectives that describe the
overall service that will result from the project and its single technical features. The
objectives, which will be the subject of the following chapter, are the measurable goals of
the project that will accomplish these aims and that will be evaluated within the scope of
Work Package 6 of the DORA project. The single components of each project aim, as
defined in the scope of the DORA project requirements specifications (Task 2.4 Technical
Requirement Analysis and Specification), are added as acronyms in brackets after each
headline. Their full names are mentioned in the descriptions.
Design and Implementation of a DORA Service Platform (PLA) and a Long Term Door-to-
Door Journey Planner (DJP)
As the basic DORA component, a seamless and integrated Door-to-Door Journey Planner
(DJP) will be realized that integrates existing transport-mode-specific real-time information
services in one overall intermodal traffic information platform – the DORA Service Platform
(PLA) for land and air transport. The intermodal platform will gather and analyse the
required traffic information in real time, provide the air passenger with seamless
information covering all transport modes and all stages of the journey to the airport, and
suggest optimal routes for travel to and from airports. The platform will also be designed
for usage by further interested third parties, such as travel agencies, airlines, airports, etc.
Via open interfaces that will be established, they can embed the DORA functionalities in
already existing or new
applications. The concrete target of this objective is to make the Intermodal Platform for
Air and Land Transport available in its full functionality. Furthermore, the Journey Planner
will enable the end-user applications to optimize routing and will integrate technology
solutions for recognition of waiting queues and indoor location services in airports.
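The open-interface idea described above can be illustrated with a minimal sketch of how a third party might query the DJP for a door-to-door route. The function names, parameters and response fields are purely hypothetical assumptions for illustration; the actual DORA interfaces are defined in the project's technical specifications.

```python
# Hypothetical sketch of a third-party client querying the DORA Door-to-Door
# Journey Planner (DJP) via the platform's open interface. All names and
# fields are illustrative assumptions, not the project's actual API.

def build_djp_request(origin, flight_no, departure_iso, profile="default"):
    """Assemble query parameters for a door-to-door route to the airport."""
    return {
        "origin": origin,            # start address or coordinates
        "flight": flight_no,         # lets the planner look up gate/terminal
        "departure": departure_iso,  # scheduled flight departure (ISO 8601)
        "profile": profile,          # e.g. "reduced_mobility", "family"
    }

def pick_best_route(routes):
    """Choose the route with the smallest total door-to-door duration."""
    return min(routes, key=lambda r: r["duration_min"])

params = build_djp_request("Alexanderplatz, Berlin", "AB1234",
                           "2016-06-01T14:30:00")
candidate_routes = [
    {"modes": ["bus", "train"], "duration_min": 95},
    {"modes": ["taxi"], "duration_min": 70},
]
best = pick_best_route(candidate_routes)  # the 70-minute taxi route
```

In a real integration the request would be sent over the established open interface and the candidate routes would come back from the platform rather than being constructed locally.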
Design and Implementation of End-User-Applications for Seamless Mobility Information
(APP /SWA /WEB)
Based on the concept of the Door-to-Door Journey Planner (DJP) described above, the DORA
project will design and implement applications (APP) for smartphones and the internet.
These applications enable the end users to receive seamless mobility information covering
all transport modes and the entire mobility chain from the origin to the airports and planes,
including the final destination. The route, the mode change nodes and the indoor path in
the terminal are shown on maps to support the orientation of the passenger. Detailed
information on expected disruptions, such as road works, and on unpredictable events is
given. The smartphone application will be realised as an intermodal routing planner that
integrates and processes the available real-time information on land transport means,
terminal procedures, and air transport. A trip monitoring function of the smartphone
application will register disturbances on the selected route and provide alternative routes
when necessary. Besides the smartphone applications, the project will also create a
corresponding web application (WEB) and web interfaces for the end users with the same
functionality, to be integrated into web portals or existing and new mobile applications of
airlines, airports, and further interested stakeholders to allow broader usage.
The following components will be realized in the scope of this goal:
DORA App (APP) for iOS and Android
DORA WEB GUI (WEB)
Possibly: Smart Watch App (SWA)
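The trip monitoring behaviour described above can be sketched as a simple buffer check: compare the remaining travel time against the time left before boarding and trigger a re-route when the buffer becomes too small. The threshold and all names below are illustrative assumptions, not the actual TMS logic.

```python
# Illustrative sketch of the trip monitoring function: trigger an
# alternative-route search when a disturbance shrinks the time buffer.
# The 30-minute threshold is an assumed value for illustration only.

MIN_BUFFER_MIN = 30  # assumed minimum buffer before boarding

def needs_reroute(remaining_travel_min, minutes_until_boarding,
                  disruption_delay_min=0):
    """Return True if the current route no longer reaches the gate in time."""
    effective_travel = remaining_travel_min + disruption_delay_min
    return minutes_until_boarding - effective_travel < MIN_BUFFER_MIN

# Undisrupted trip: 50 min travel, 100 min until boarding -> 50 min buffer
ok = needs_reroute(50, 100)                                 # False
# A 25-min disruption shrinks the buffer to 25 min -> re-route needed
disrupted = needs_reroute(50, 100, disruption_delay_min=25)  # True
```

In the real service, the remaining travel time and disruption delays would be fed continuously from the platform's real-time data rather than passed in by hand.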
Implementation of Personal Information Services (FIS / ILR / SRS / TMS)
The DORA mobility information platform and the smartphone application will also be
designed to ensure personalized services, in order to meet the requirements of individual
passengers and specific end-user groups, such as people with reduced mobility, families
with children, frequent travellers, etc. Thus, the target is that end users can individually
configure the smartphone application so that the provided routing information is calculated
in accordance with individual mobility preferences and constraints. The personal
information services, targeting at least three different traveller groups, will be evaluated by
the end users during the planned project trials. The following components can be subsumed
under these goals:
Flight Information Service (FIS)
Intermodal Landside Router (ILR) for Berlin and Palma
Strategic Routing Service (SRS)
Trip Monitoring Service (TMS)
Implementation of an Incident and Information Management for Airports (MIM)
The Incident and Information Management for Airports system (AIRVIS – Sec. 1.3) is based
on the cooperation of all operation centers involved in the control and information of
airport-bound transport and ensures that in case of disruption the airport remains
accessible and air passengers are informed consistently. The DORA information portal
(Mobility & Incident Management Panel – MIM) will also be linked with this incident and
information management system for land and air transport, which is already in place in
Berlin. The AIRVIS system will be extended by terminal strategies, including security gates
and terminal incidents. The system will be transferred to Palma de Mallorca in the scope of
the project.
Design and Implementation of a System for Detection of Waiting Time in Airports (WTS)
It is the traveller’s responsibility to arrive at the airport with enough time to complete all
ticketing, baggage check, and security clearance procedures, but the waiting times at
check-in and security control are not predictable. To handle this, the DORA project will
design and implement a waiting time detection system (Waiting Time Service – WTS) in the
airports based on image recognition. The information gathered in this way will be processed
by the DORA information platform and the corresponding end-user applications for route
optimisation. To ensure the privacy of passengers and employees in the airports, the
planned video observation procedures will be fully anonymised, without the possibility of
recognising individuals on the images. Additionally, users can also contribute to the
detection of delays and queues by verifying and uploading events into the system. These
notifications can be triggered by the detection of changes in the regular or current
movement pattern of the user in combination with the current location.
Design and Implementation of an Indoor Location and Routing at the Terminal (IP / IR)
The DORA project will explore innovative technologies, e.g. WLAN, BLE or beacon
technologies, for indoor location in the airports, to be used by the DORA system to suggest
optimal routes to the passengers through terminals, security gates, etc. The passengers will
be located in the airport buildings by analysing the data available in the WLAN base stations
installed in the airports, so that optimal routes through the buildings can be estimated
based on the passengers’ locations. Besides using the WLAN base stations, the project will
also elaborate further options for the location service, such as the development of indoor
location beacons using WiFi and/or BLE radio systems. After selection of an appropriate
approach, the location service will be implemented and linked to the overall DORA
platform, including its functional testing. The location service will also be a part of the
planned field trials. Components of this goal are:
Indoor Positioning (IP)
Indoor Router (IR)
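One common technique for the kind of WLAN/BLE-based indoor positioning discussed above is a weighted-centroid estimate from received signal strengths. This is only an illustrative sketch of that general technique, not the approach the project will select, and the beacon coordinates and RSSI values are invented.

```python
# Illustrative sketch of WLAN/BLE-based indoor positioning: a weighted
# centroid of known beacon positions, weighted by received signal strength.
# One common technique, not necessarily DORA's chosen method; all beacon
# positions and RSSI values below are invented for illustration.

def weighted_centroid(beacons):
    """Estimate (x, y) from [(x, y, rssi_dbm), ...] measurements."""
    # Convert RSSI in dBm to a linear weight: stronger signal -> closer beacon.
    weighted = [(x, y, 10 ** (rssi / 10.0)) for x, y, rssi in beacons]
    total = sum(w for _, _, w in weighted)
    x_est = sum(x * w for x, _, w in weighted) / total
    y_est = sum(y * w for _, y, w in weighted) / total
    return x_est, y_est

# Three beacons at known terminal positions; the strongest signal (-50 dBm)
# pulls the estimate towards the beacon at the origin.
measurements = [(0.0, 0.0, -50), (10.0, 0.0, -70), (0.0, 10.0, -70)]
x, y = weighted_centroid(measurements)
```

Fingerprinting or trilateration against a calibrated signal map would be the usual refinements once a technology (WLAN base stations vs. dedicated beacons) has been selected.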
1.5 Objectives
In this chapter the objectives of the DORA project will be described. This is a first step
towards identifying appropriate methods for measuring whether they were achieved. The
goal of this chapter is to identify and describe all of the objectives that will be evaluated
within this project. The next chapter will describe which methods will be used to measure
them. The objectives are described ordered along their tasks in Work Package 6; Task 6.1 is
not covered since it is the drafting of this deliverable, the ‘Evaluation Plan’.
1.5.1 Objectives related to Task 6.2 Technical Evaluation and Assessment
These objectives are addressing the existence, functionality and availability of the technical
components of the DORA service. The results from this evaluation will be summarized in
D6.2 ‘Technical Evaluation and Assessment report’. This is an intermediate report which will
summarize the results from the technical pre-tests (Validation of functionality / Performance
assessment / Troubleshooting tests / Platform test). The following DORA objectives will be
evaluated in this task:
Objective 6.2.1: Development of Intermodal Platform for Air and Land Transport
Main result: The platform is available in its full functionality, enables the Smartphone
application for routing optimization, and integrates technology solutions for recognition of
waiting queues and indoor location service.
Verification: Successful technical testing of the integrated DORA solution
Objective 6.2.2: Availability of Mobile Smartphone Application for Seamless Mobility
Information
Main result: The smartphone application is available in all project areas with wireless
Internet access and is able to show the best possible routes to/from and in the airports in
the areas involved in the trials.
Verification: Successful technical testing of the smartphone application
Objective 6.2.3: Functionality of Personal Information Services
Main result: It is technically possible for the end users to configure the smartphone
application according to the individual mobility preferences and personal constraints.
Verification: The personal information service is validated by real end users.
Objective 6.2.4: Integration of Mobility and Incident Management Panel into Operation
Centers for Airports
Main result: The Intermodal platform for Air and Land transport is available and connected
to different operation centres.
Verification: The information management system is integrated into different operation
centres in Berlin and Palma de Mallorca.
Objective 6.2.5: Functionality of System Component for Detection of Waiting Time in
Airports
Main result: The DORA system can recognize the queues at designated points in the
airports (queue detection system).
Verification: The queue detection and related calculation of the waiting time in airports is
integrated with the DORA system.
Objective 6.2.6: Functionality of Indoor Location and Navigation Service Component
Main result: The fully functional location service is implemented and linked to the overall
DORA platform.
Verification: The indoor location and navigation service is integrated with the DORA system.
1.5.2 Objectives related to Task 6.3 Usability Assessment
This objective focuses on the usability of the overall DORA service including all of its components. It addresses only prototype variants of the system and seeks to identify problems in its usability in order to eliminate them during the development phase. In contrast to the previous group of objectives that are dealing with the technical readiness of the components, the objectives in this group are dealing with the usability of the components by users. Results from this evaluation will be summarized in D6.3 ‘Usability Assessment report’ which will summarize the results from the Usability pre-tests (test ofconceptual variants / first prototype test / pre-test before trial phase). The main objective of this task can be formulated as follows:
Objective 6.3: Identification of usability problems of prototypes of the DORA service
Main result: Prototypes of the DORA service were tested in trials and pre-tests with the involvement of real end users. Usability problems were identified, summarized and reported back to the developers. The prototype tests comprise all components of the service, including the platform as well as the application. They consist of the identification of usability problems at three steps during the development process:
Figure 1 Interlinkage of DORA Objectives
- GUI Concept of DORA application and website
- First Prototype test
- Final pre-test before trial phase
1.5.3 Objectives related to Task 6.4 Evaluation of Impact and Process
The objectives subsumed under this category are crucial for the DORA project insofar as they describe the quality of the project's results. In contrast to the previous groups of objectives, which focus on possible improvements of the technical and usability features during the development phase, the objectives described in this chapter deal with the description of the final product of the DORA project. The results of the impact and process evaluation will be summarized in D6.4 'Final evaluation report'.
These objectives and their interlinked effects and impacts are shown in Figure 1. Later on they will be described in detail and with a focus on their measurable impacts.
Objective 6.4.1: Reduction of travel time
Main result: By making use of the DORA integrated information system, including the application, the overall time needed for a typical European air journey, including the necessary transport to and from the airports, will be reduced by up to 20 %. Thereby DORA can make a significant contribution to the EU's high-level objective for 2050 that 90 % of travels involving air transport within Europe can be completed in 4 hours door to door. DORA comprises three features that will contribute to achieving this objective:
- real time based intermodal routing services for landside transport
- waiting time reduction in the terminal based on better information provided by the detection system
- personalized indoor routing
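As an illustration of how savings on the single segments combine into an overall door-to-door reduction, the following sketch uses purely hypothetical segment times and saving rates; none of these numbers come from the DoW or from measured DORA data.

```python
# Hypothetical door-to-door segment times in minutes for a typical
# intra-European air trip (illustrative values only, not DORA data).
baseline = {
    "landside_to_airport": 60,
    "terminal_processes": 90,   # check-in, security, waiting, walking
    "flight": 120,
    "landside_from_airport": 50,
}

def total_minutes(segments):
    """Total door-to-door travel time."""
    return sum(segments.values())

def with_dora(segments, landside_saving=0.15, terminal_saving=0.20):
    """Apply assumed relative savings from intermodal routing (landside)
    and waiting-time / indoor-routing information (terminal)."""
    s = dict(segments)
    s["landside_to_airport"] *= 1 - landside_saving
    s["landside_from_airport"] *= 1 - landside_saving
    s["terminal_processes"] *= 1 - terminal_saving
    return s

before = total_minutes(baseline)
after = total_minutes(with_dora(baseline))
reduction = 1 - after / before   # overall door-to-door reduction
```

Under these assumed savings the overall reduction comes to roughly 11 %; whether the "up to 20 %" objective and the 4-hour goal are reached depends on the segment times actually measured in the field trials.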
Objective 6.4.2: Passengers can make informed decisions
This goal also contributes to shorter travel times: with DORA, passengers will have information that enables them to organize their trips in a time-efficient way, adjusted to personal needs (this is important for mobility-impaired people). To reach the goal that 90 % of DORA App users consider the information provided as useful, the DORA App will provide information on:
- expected and real time travel time for various modes and multimodal travel chains
- delays in air and landside transport
- alternative routes in case of disruptions or technical breakdowns in the transport system
- personalized information reflecting the user´s preferences
Objective 6.4.3: Air transport is well connected to other modes – accessibility of the airport
This objective contributes to travel time reduction as well. But by ensuring the accessibility of the airport through information on different modes of transportation, DORA will also make the journey to/from the airport less stressful for passengers. Thus the goal is that 90 % of DORA App users consider the information on how to travel to and from the airports as helpful. More specifically, two features will provide information:
- the DORA App will provide multimodal route recommendations covering landside and air transport information
- the Operation Centre Application ensures accessibility of airports even in case of
disruptions
Objective 6.4.4: Improved service quality in the airport terminal
This objective addresses the reduction of waiting and walking times at the airports. DORA will provide information to reduce waiting and walking times in the terminal by up to 20 %. Two features will be implemented within DORA to achieve this goal:
- the DORA waiting time detection system that helps passengers to complete clearance procedures in the terminal
- the personalized DORA Indoor routing system that helps passengers to find the most comfortable way to the plane in the terminal
Objective 6.4.5: Shift to greener landside transport modes
Travelers often consider public transport to the airports too uncomfortable for carrying bulky luggage or too unreliable to arrive at the terminal in time. Schedules, tariffs or even the existence of alternative travel services are often unknown, so passengers frequently choose to travel by taxi or car. In order to overcome these barriers, DORA will implement a time-reliable and comprehensive mobility information system that shows up-to-the-minute transport options, indicates related costs and travel times, and reflects the transport infrastructure needed for personal requirements. This way, DORA will contribute to reaching a share of 50 % of public transport trips to the airports in Berlin and Palma de Mallorca, and for the corresponding trips from the airports to the final destinations. To reach this goal, the DORA App will provide an intermodal route planner with information on travel time, costs and CO2 emissions, which can demonstrate the attractiveness of public transport.
Objective 6.4.6: Inclusive transport services responding to special needs of travelers
To make air transport accessible and user-friendly for everyone, travel opportunities need to be equitable for all groups of travelers, including those with any type of disability. The personalized indoor routing services provided by DORA will assist these user groups and help to provide seamless, barrier-free access to the plane and to special assistance services at the airports. Thus the DORA App provides personalized travel information including indoor routing for special user groups such as handicapped or elderly persons and families. This way the goal can be reached that 10 % of DORA App users, including users with and without special mobility needs, use the indoor routing services.
Objective 6.4.7: Robustness in case of disruptions
Airports are critical infrastructure and may easily be affected by man-made and natural threats. The potential threats are manifold and range from unattended luggage in the terminal to accidents and breakdowns in the transport infrastructure. The DORA Operation Centre Application will respond to the occurrence of these threats by providing predefined substitution processes for managing air and landside traffic and consistent passenger information in stations, streets and terminals. It will also provide passengers with alternative transport options. Thus the objective is to develop and implement a number of predefined substitution processes and information strategies.
Objective 6.4.8: Results from process evaluation are available
Within the DORA project a process evaluation will be carried out. It will assess the main drivers and barriers that occurred during the main phases of the project, including the development and testing phase. It will also assess how the partners reacted to these barriers and drivers: how did they overcome barriers, and how did they make use of unexpected drivers during the project process? This information is essential for the analysis of whether the project was implemented successfully. Together with the results from the impact and result evaluation, these findings from the process evaluation can give valuable hints for the transferability of the DORA service to other airports in Europe. DORA will prepare a Process Evaluation Report that summarizes the drivers, barriers, solutions and their relevance for the transferability to other airports that were identified
during the project lifetime.
Objective: Field trials
This task comprises field trials to prove the DORA concept in Berlin and Palma de Mallorca. 500 real end users will be involved in the tests and will give feedback on usability. This objective is verified when the field trial with the fully operational and implemented DORA system has been successfully conducted with 500 people at both airports.
The following objectives are listed in the DoW as such. In fact, however, they will be assessed or verified within the deliverables of Work Packages other than WP 6.
Objective 6.4.9: New cooperation and business framework (Deliverable 3.5)
The main result of this objective is that appropriate cooperation and business frameworks for DORA have been analyzed and that the most suitable and sustainable solution(s) were selected to develop a cooperation and business framework. The outcome will be a report including the definition of a business framework for the various identified stakeholders interested in using the DORA system.
Objective 6.4.10: Wide public awareness
This objective seeks to showcase the pilot of the DORA solution at Berlin Airport and PMI to an extent that it will be widely reported and attract public interest. The outcomes, i.e. the scientific and technological advancements, will be published in leading academic and industry journals. The establishment of an Advisory Group is also part of this objective. The visible results of this objective are that the pilot showcases are covered in traditional and social media and in 2-6 scientific or industry journal publications produced annually. The Advisory Group meetings will be held regularly and their results will be documented.
Figure 2 DORA Airports
1.6 City and Airport Description Berlin
The proof of the DORA concept and the test of the system will be executed in the field trials
at the airports of Palma de Mallorca and Berlin (Figure 2 DORA Airports), including the respective transportation regions.
The two airports together serve more than 40 million air passengers per year, which corresponds to almost 5 % of all intra-European flights. Germany's second-largest airline Air Berlin holds a major hub at Berlin-Tegel airport and serves connections to Palma de Mallorca up to six times a day, which together with other airlines amounted to a total of 690,000 passengers travelling between these two destinations in 2013.
In addition, the trial sites cover a range of typical travelers that are highly relevant for intra-European air transport in general:
- a significant number of tourists, in particular from Germany, starting a journey from Berlin;
- air commuters: a growing number of residents of Palma de Mallorca who migrated from Germany but still work in Germany.
The DORA system will be realized as a technical solution that is easily transferable to other airports. For Berlin, it is envisaged in the long term to realize the DORA system for the new Berlin airport BER. At present, BER is planned to open in 2017, which is in line with the DORA time schedule. However, to avoid risks for the DORA project and to ensure that the DORA system can be evaluated in an operating Berlin airport, the technical requirements for a DORA application in Berlin will be analysed for the existing airport (SXF) and the future airport (BER). The implementation of the system and the necessary hardware components will be done in the airport actually in operation at the time of the system testing.
At the Berlin airport, 85 % of all departing passengers to destinations in Europe have flight times of less than 2 hours. This indicates that DORA can have a significant impact on reaching the 4-hour travel time goal of the EU.
Flughafen Berlin Brandenburg GmbH (FBB) ensures the air traffic infrastructure for the German capital region of Berlin-Brandenburg by operating the region's two airports, Schoenefeld (SXF) and Tegel (TXL). FBB has around 1,500 employees and cooperates with more than 300 companies to fulfil the requirements of FBB's stakeholders such as airlines, ground handling or air traffic control. Altogether more than 19,000 jobs are directly linked to the airports. In 2013, a total of 26,319,144 passengers flew to or from FBB's airports. This represents an increase of 4.2 % in comparison to 2012. With respect to passenger numbers, the metropolitan region of Berlin-Brandenburg ranks 3rd in Germany and 13th within Europe. The region's positive development over the past years creates demand for new infrastructure. Berlin Brandenburg Airport (BER) will meet this demand as the new single airport location in the south-east of Berlin, close to the existing Schoenefeld Airport. BER's ongoing construction integrates more state-of-the-art Information and Communication Technology (ICT) than many other European infrastructure projects. In DORA, FBB will work on the technical concept, set up a test environment and work on reasonable business models. FBB expects continuing growth of passenger figures over the next years and sees a significant increase in customer experience and satisfaction through DORA.
Berlin Brandenburg Airport will have the best possible connections. With the six-lane A113 motorway and the six-lane B96a, or alternatively the four-lane B96, Berlin provides many alternative
routes connecting the inner city and the airport. A high-tech traffic management system called AIRVIS observes these routes and offers comprehensive traffic information and guidance before and during the trip. Together with Berlin-Brandenburg public transportation, AIRVIS ensures 24-hour accessibility of Berlin's inner city.
All air traffic in the region will be operated from the Schoenefeld site in southeast Berlin. BER will be a new-generation airport: functional and cosmopolitan, distinguished by its modern architecture. A starting capacity of up to 27 million passengers is expected.
Depending on passenger development, the airport can be expanded to accommodate up to
45 million passengers.
1.7 City and Airport Description Palma de Mallorca
Palma de Mallorca is a Spanish municipality and city, capital of the island of Mallorca and of the autonomous community of the Balearic Islands. It is located in the western Mediterranean Sea, in the southwest of the island of Mallorca, about 250 km east of the Iberian Peninsula. The municipality covers an area of 208.63 km2. The city is located in the centre of the Bay of Palma, about 13 metres above sea level. With more than 400,000 inhabitants (2011), Palma de Mallorca is the eighth-largest city in Spain by population and the largest in the Balearic Islands. In addition, its metropolitan area includes nine localities reaching 560,240 inhabitants in an area of 1,015.88 km2, making it the 14th-largest in Spain. More than 50 % of the population lives in the Palma metropolitan area. Mallorca is a major holiday destination with over 9 million tourists a year, all of whom pass through the city of Palma on arrival, and many of them return to visit the attractive historic city centre. The island has a high car ownership rate with 900 vehicles per 1,000 inhabitants. Mobility culture is still very much based on the private car, with a modal share of 58 % of trips on the island. The main means of public transport in Palma is the urban bus, and in recent years several improvements to increase the use of public transport have been made, including intermodality measures to connect public transport to cycling and car use. Most of the non-motorised trips in Palma are made on foot. The city has created pedestrian areas on routes towards the historic city centre and within the historic centre itself. The cycling lane network has a length of 43.5 km. In 2011, a new public bicycle system called Bicipalma was launched. To improve accessibility, attractiveness, environment and health, the different levels of government have designed a favourable legislative and political framework for the promotion of sustainable mobility. This is shown by the recent adherence of the municipality of Palma de Mallorca to the CIVITAS Forum Network, by signing the CIVITAS Forum Network Declaration.
AENA, the operator of Palma de Mallorca Airport, is the world's leading airport operator. It manages 46 airports and 2 heliports in Spain and participates directly and indirectly in the management of
another 15 airports worldwide. Palma de Mallorca International Airport is located eight kilometres south-east of the city of Palma on the Balearic island of Majorca, Spain. The airport is operated by AENA (Spanish Airports and Air Navigation Authority). More commonly known as Son Sant Joan Airport or Aeroport de Son Sant Joan, it is the third-largest airport in Spain after Madrid's Barajas Airport and Barcelona Airport. The airport serves 22 million passengers and records 180,000 aircraft movements annually. It handles 15,000 t of cargo. During the summer months it is one of the busiest airports in Europe, and it was used by 22.7 million passengers in 2012. The airport is the main base for the Spanish carrier Air Europa and also a focus airport for the German carrier Air Berlin. The airport was awarded the Airport Service Quality (ASQ) Award in February 2012 by the Airports Council International (ACI). AENA – Palma de Mallorca Airport will mainly participate in the deployment of the field trials in Palma.
2 EVALUATION APPROACH AND METHODOLOGY
2.1 Introduction
The evaluation of the DORA service, including its single components and as an integrated system, is a complex task. Since the service will be developed from the very beginning, it is very important to carry out several intermediate assessments. Thus DORA will pursue an iterative approach, which is described in the following chapter.
2.2 DORA Evaluation Approach
Evaluation activities in DORA will pursue an approach based on the three pillars:
- Iterative evaluation,
- Impact evaluation and
- Process evaluation.
During the development phase of the system: to identify necessary adaptations to user needs at an early stage, an iterative evaluation procedure will accompany all phases of the development. Test persons will be involved in the concept phase and during the pilot test. The usability of the
service will be tested in pilot activities at the test sites with up to 500 real end users. Another
focus of this iterative evaluation will be to evaluate the technical functionality throughout
the development process. The technical evaluation is accomplished by measurements and
pre-tests of the partial components and systems. It aims at resolving and validating interoperability issues as well as at producing performance indicators.
Amongst others, an emphasis is thereby set on indoor navigation in airport buildings up to
the gate. The results of these tests will be analyzed and incorporated in the development
process of the system. This way this iterative evaluation will support the development of a
highly user-friendly and technically well-functioning system.
Evaluation of the impacts (impacts and outputs) of the project will be conducted during and
after the pilot test of the system. By asking the question “Were the objectives of the project
achieved?” this pillar of the DORA evaluation approach represents the core of the evaluation
activities. By evaluating the expected impacts with the corresponding indicators and by
applying suitable measurements and methods for data gathering (qualitative methods with questionnaires, interviews, focus groups and, if applicable, a control site), the successes of the
project can be demonstrated. Since the success of the system strongly depends on the user
satisfaction, this indicator will be assessed in depth. Surveys addressing the target groups (end users) and experts will be conducted to find out to what degree the user is satisfied with the newly developed service.
Figure 3 DORA Evaluation Approach
Evaluation of the process that leads to the project's implementation: by conducting surveys with project partners and other relevant stakeholders, it will assess which barriers hindered the development and implementation processes, which drivers had a positive influence on these processes, and which lessons were learnt from these experiences. The surveys will be carried out three times during the project lifetime. Together with the results from the impact evaluation, the results from the process evaluation will deliver a comprehensive picture of the expectable benefits of implementing the system, as well as of the process of its implementation. This is important information for other cities/airports that are planning to transfer the system. Moreover, the process evaluation will deliver insights on possible improvements of the ongoing implementation process of DORA.
The DORA evaluation approach is visualized in the following figure:
Due to the complex structure of the DORA service with its various components and different development and testing phases, the evaluation approach seeks to be simple and sound. Thus, wherever possible, evaluation activities should be combined between the single tasks of the evaluation and their corresponding partners that will carry out the evaluation activities.
The following figure shows a simplified time and task structure that illustrates the relation between the evaluation activities and the months in which they will be carried out. It focuses on the tasks of WP 6: 'technical evaluation', 'usability evaluation' and 'impact evaluation'.
Figure 4 Evaluation tasks and their relations
The figure relates the DORA development steps (Version I and Version II of the components, the concept for the Web/App GUI, the integrated prototype, the integrated final product (alpha version) and the field test) to the technical evaluation (Objectives 6.2.1 - 6.2.6), the usability evaluation (Objective 6.3) and the impact evaluation (Objectives 6.4.1 - 6.4.7), together with the corresponding deadlines: project month 1 = June 2015 (project start); month 24 (May 2017) = deadline of Task 6.3; month 30 (November 2017) = deadline of Task 6.2; month 36 (May 2018) = deadline of Task 6.4.
This graph gives a rough overview of the times at which the single evaluation steps will be carried out. Since it is not yet definite at what times the single steps of the development of the components and their integration into the final DORA service will take place, the times indicated in the table show only the deadlines of the deliverables of Work Package 6. Besides deliverable 6.1 'Evaluation Plan', these are:
D6.2 in month 30: Technical Evaluation and Assessment report. It will summarize the
results from the technical pre-tests and describe alternative and/or backup
approaches for proper and uninterrupted service provisioning.
D6.3 in month 24: Usability Assessment report. It will bring together the results from the usability pre-tests and represent a summary of the three intermediate short reports containing information on the identified usability problems and recommendations for improvements that will be provided to the developers of the system during the development phase.
D6.4 in month 36: Final evaluation report that will summarize the findings resulting
from the impact, usability and process evaluation. It will be the central document to
demonstrate to which degree the objectives of the project were achieved and how
they were achieved (process evaluation) and contain recommendations for
transferability to other cities.
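The project-month references used throughout this plan (M1 = June 2015) can be converted to calendar dates with a small helper. This is a convenience sketch for the reader, not part of the DORA system; the deadline months are those listed above.

```python
def project_month_to_date(month, start_year=2015, start_month=6):
    """Convert a DORA project month (M1 = June 2015) to (year, calendar month)."""
    offset = start_month - 1 + (month - 1)   # months elapsed since January of start_year
    return start_year + offset // 12, offset % 12 + 1

MONTH_NAMES = ["January", "February", "March", "April", "May", "June",
               "July", "August", "September", "October", "November", "December"]

def label(month):
    """Human-readable calendar label for a project month, e.g. M24 -> 'May 2017'."""
    year, m = project_month_to_date(month)
    return f"{MONTH_NAMES[m - 1]} {year}"

# Deliverable deadlines as project months (from the list above).
deadlines = {"D6.3": 24, "D6.2": 30, "D6.4": 36}
```

Applied to the deadlines above, this reproduces May 2017 for D6.3 (M24), November 2017 for D6.2 (M30) and May 2018 for D6.4 (M36).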
The graph further lists the DORA development steps. They can be described according to the 'Description of Work' with the following tasks and delivery dates:
'Version I Component X': The architectures of the components are being developed within tasks '3.2.1 Waiting Time Detection (= Waiting Time Service: WTS)', '3.2.2 Indoor-Location and Indoor Routing (IP / IR)', '3.2.3 Intermodal Landside Router (ILR) and Flight Routing Service (= Flight Information Service: FIS)', '3.2.4 Trip Monitoring Service (TMS)' and '3.2.5 Door-to-Door Journey Planner (DJP) and Strategic Routing Service (SRS)'. First versions are being developed in the scope of tasks 4.1 and 4.2, which are due at the end of 2016 (M18/M19).
Version II Component X: Here the same components as in Version I are dealt with. They are improved or developed further according to the results of the technical evaluation carried out after the first development step.
Concept for Web-App-GUI: This concept demonstrates the first draft of the task ‘3.3
Specification of DORA Applications’ that includes tasks ‘3.3.1 Operation Centre
Application’ and ‘3.3.2 End User Applications’. It is being developed in the scope of
work package 3. These subtasks are due in month 14.
Integrated Prototype: It is the first draft for the integration of the system in which all
single components are technically integrated. It is being developed in the scope of
work package 4 Software Development and Integration. It will be developed between
months 10 to 24.
Integrated Final Product (Alpha Version): It contains the components of the first
integrated prototype that were improved or further developed according to the
findings of technical and usability evaluation. It is being developed in the scope of
work package 4 Software Development and Integration. The integrated final product
will be due in month 24.
'Field Test' describes the test of the final DORA service. It will be tested by involving a number of end users to demonstrate whether the DORA objectives were achieved. It will focus on the evaluation of impacts and user satisfaction. The field test will be carried out within the scope of work package '5 Pilot Execution' and be finalized in month 36 with the delivery of the final evaluation report.
2.3 Methods and Operationalization
2.3.1 Technical Evaluation
DORA developments are divided into three main categories: Services, Applications and Technology. Each category follows a different development methodology (Table 1).
Table 1: Development Methodologies
Development Category Methodology Details
Services:
D2D Journey Planner (DJP)
Indoor Router (IR)
Intermodal Landside Router
(ILR)
Flight Information Service (FIS)
Trip Monitoring Service (TMS)
Strategic Routing Service (SRS)
Waterfall It is expected that a clear view of
the final outcome is defined and
remains stable until the
completion of the project. The
different components are
progressing in parallel.
Applications:
Operation Center (MIM)
Web (WEB)
App (APP)
Agile The aspects of the applications
are expected to evolve
throughout the project’s lifetime.
The final outcome will be subject
to gradual inclusion and update of
ergonomic requirements.
Technology:
Waiting Time Detection (WTS)
Indoor Location (IP)
Prototyping The final outcome involves hardware components. A limited number of prototype revisions can be produced, and the process can last for a few iterations in order to produce mature and efficient products.
Evaluation Plan Approach for Waterfall Methodology Related Developments
The evaluation/validation plan to be followed for the Services is aligned with the related phases of the waterfall methodology (Requirements Analysis, Design, Development, Testing) and is therefore broken down into sub-plans to be followed for the proper completion of each phase. The forward evolution of the waterfall methodology necessitates that each phase be evaluated positively, since it defines the starting point for the next one (the evaluation of the Requirements Analysis was performed during the preparation of D6.1).
i. The output of the Requirements Analysis was verified to ensure that the Use Cases wish list is properly addressed and that no part of the system behavior is left without an adequately described requirement. The Technical Requirements and Specification Task (T2.4) took place from M4 (September 2015) to M9 (February
2016) while the Use Cases were produced by the end of M7 (December 2015). During
M8 (January 2016) the existing set of requirements that had been produced up to that
point was analyzed against the Use Cases so as to be consolidated as well as to be
completed in case of identification of any non-addressed aspect. The outcome of this
process was provided as feedback to Task 2.4.
ii. The design phase, as far as the services are concerned, is expected to have reached a
mature state by the end of the first year (M12). The aspects addressed by the
requirements will have been taken into account in the reference architecture as well as
in the design of the services. The work produced by Tasks 3.1 & 3.2 will be analyzed on
the basis of the consolidated requirements during M11 and M12 and feedback will be
provided to the design teams.
iii. The services design teams will identify the development roadmap of all the architecture
components to be produced during M10 and M11. A preliminary high-level roadmap is
provided in the next paragraphs to indicate the various correlations among the services.
Times of different development steps could change and the evaluation phases will adapt
accordingly, but the deadlines of the deliverables will be kept. For each version/release of the developed systems, the set of requirements satisfied will be identified. For each of the requirements relating to a specific version release, a set of tests and benchmarking indicators, mainly relating to performance, will be detailed. The combination of release dates and test and benchmarking indicators will be used to plan the activities that will be performed in the context of the actual evaluation task (Task 6.2).
iv. The execution and output of the evaluation activities planned in the context of Task 6.2
will realize the testing phase of the developments and will be carried out in the staging
environment. The findings will be analyzed and feedback will be provided to the
development teams for improvements or resolution of bugs.
v. The evaluation process relating to the overall integration will address the maintenance phase of the services development methodology. The evaluation/validation actions performed in the context of the operation of the overall platform will result, where applicable, in feedback to be taken into account by the responsible development team.
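The per-release bookkeeping described in steps iii and iv can be pictured as a simple coverage check: every requirement claimed by a release should have at least one test or benchmarking indicator attached. All identifiers below (release names, requirement IDs, test names) are invented for illustration only.

```python
# Hypothetical mapping of planned releases to the requirement IDs they satisfy.
release_requirements = {
    "DJP-v1": ["REQ-001", "REQ-002"],
    "DJP-v2": ["REQ-003"],
}

# Hypothetical mapping of requirements to their tests/benchmarking indicators.
requirement_tests = {
    "REQ-001": ["test_rest_endpoints", "bench_response_time"],
    "REQ-002": ["test_data_structures"],
    "REQ-003": [],   # no test defined yet -> gap to report to the design team
}

def untested_requirements(release):
    """Return the requirements of a release that have no test or indicator attached."""
    return [req for req in release_requirements[release]
            if not requirement_tests.get(req)]
```

Running such a check before each release date would flag gaps like REQ-003 above, so the evaluation task (Task 6.2) can request missing test definitions in time.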
Evaluation Plan Details
The following matrix summarizes the integration roadmap among the various components as well as the roadmap regarding interface definition and setup among them. In the indicated project months, a number of aspects will be evaluated, as indicated in the next paragraphs. The full integration phase is expected to take place between M19 and M22. Any
changes in the roadmaps presented will be reflected on the Evaluation Plan and the
scheduling of the corresponding activities.
Table 2: Evaluation Roadmap for Services
PS ILR SR FS TM IR DJP Open API
B2B API
PS M14-M15
ILR M18-M19
M14-M15
SR M14-M15
M13-M14
M14-M15
M14-M15
FS M18-M19
M14-M15
M19-M20
M13-M14
TM M14-M15
M15-M16
M14-M16
IR M18-M19
M14-M15
DJP M18-M19
M16-M17
M18-M19
M16-M17
M18-M19
M16-M17
- M21-M22
M14-M15
Integration/Basic Transactions – I/F Setup
D2D Journey Planner (DJP), Intermodal Landside Router (ILR), Strategic Routing (SRS), Flight Information Service (FIS), Trip Monitoring Service (TMS), Indoor Router (IR)
D2D Journey Planner
The specification of the DJP (D2D Journey Planner) service started in M7 and completes in
M14 whereas the development of the service lasts from M12 to M21.
DJP interfaces with the following services:
Intermodal Landside Router (ILR)
Strategic Routing Service (SRS)
Flight Information Service (FIS)
Trip Monitoring (TMS)
Indoor Router (IR)
The following table summarizes the development roadmap along with the
features/requirements to be evaluated per release. The evaluation of the final release will
ensure that all the UC-related functionality is available and thus that Objectives 6.2.1, 6.2.3,
6.2.4, 6.2.5 and 6.2.6 have been achieved.
Table 3: DJP Development Roadmap
Month M14-M15 | M16-M17 | M18-M19 | M19-M22
Details Data structures and REST functionalities verified; Open B2B API | DJP_003 Basic Transactions | DJP_004 Basic Transactions | DORA Open API; Full Integration
PS I/F setup Integration Support of UCs
ILR I/F setup Integration Support of UCs
SR I/F setup Integration Support of UCs
FS I/F setup Integration Support of UCs
TM I/F setup Integration Support of UCs
IR I/F setup Integration Support of UCs
Indoor Router
The specification of the IR (Indoor Router) service started in M5 and completes in M12
whereas the development of the service lasts from M10 to M21. IR interfaces with the
following services:
- D2D Journey Planner (DJP)
- Strategic Routing (SR)
It also interfaces with the following prototypes:
- Queue Detection System (QD)
- Indoor Location System (IL)
The following table summarizes the development roadmap along with the aspects to be
evaluated per release. The evaluation of the final release will ensure that all the UC-related
functionality is available and thus that Objective 6.2.6 has been achieved.
Table 4: IR Development Roadmap
Month M12-M13 | M14-M15 | M16-M17 | M18-M19 | M19-M22
Details Basic Transactions | Data structures and REST functionalities verified | Basic Transactions | Basic Transactions | Full Integration
DJP I/F setup Integration Support of UCs
SR I/F setup Integration Support of UCs
IL Integration Support of UCs
QD Integration Support of UCs
Intermodal Landside Router
The specification of the Intermodal Landside Router (ILR) service started in M7 and
completes in M12 whereas the development of the service lasts from M10 to M21.
ILR interfaces with the following services:
- Strategic Routing Service (SRS)
- D2D Journey Planner (DJP)
The following table summarizes the development roadmap along with the aspects to be
evaluated per release. The evaluation of the final release will ensure that all the UC-related
functionality is available and thus that Objectives 6.2.1, 6.2.4, 6.2.5 and 6.2.6 have been
achieved.
Table 5: ILR Development Roadmap
Month M14-M15 | M16-M17 | M18-M19 | M19-M22
Details Data structures and REST functionalities verified; Open B2B API | Basic Transactions | Basic Transactions | DORA Open API; Full Integration
DJP I/F setup Integration Support of UCs
SR I/F setup Integration Support of UCs
Flight Service
The specification of the Flight Search (FS) service started in M7 and completes in M12
whereas the development of the service lasts from M8 to M19.
FS interfaces with the following services:
- Strategic Routing (SR)
- D2D Journey Planner (DJP)
The following table summarizes the development roadmap along with the aspects to be
evaluated per release. The evaluation of the final release will ensure that all the UC-related
functionality is available and thus that Objective 6.2.1 has been achieved.
Table 6: FS Development Roadmap
Month M14 | M16 | M18 | M19
Details Data structures and REST functionalities verified; Open B2B API | Basic Transactions | Basic Transactions | DORA Open API
DJP I/F setup Integration Support of UCs
SR I/F setup Integration Support of UCs
Trip Monitoring
The specification of the Trip Monitoring (TM) service started in M7 and completes in M12
whereas the development of the service lasts from M10 to M19. TM interfaces with the
following services:
- Door2Door Journey Planner Service (DJP)
The following table summarizes the development roadmap along with the aspects to be
evaluated per release. The evaluation of the final release will ensure that all the UC-related
functionality is available and thus that Objective 6.2.4 has been achieved.
Table 7: TMS Development Roadmap
Month M14-M15 | M15-M16 | M18-M19 | M19-M22
Details Data structures and REST functionalities verified; Open B2B API | DORA Open API | Basic Transactions | Full Integration
DJP I/F setup Integration Support of UCs
Strategic Routing Service
The specification of the Strategic Routing (SR) service started in M7 and completes in M14
whereas the development of the service lasts from M10 to M19.
SR interfaces with the following services:
- Intermodal Landside Router (ILR)
- Door2Door Journey Planner Service (DJP)
- Flight Search (FS)
- Indoor Router (IR)
The following table summarizes the development roadmap along with the aspects to be
evaluated per release. The evaluation of the final release will ensure that all the UC-related
functionality is available and thus that Objectives 6.2.1, 6.2.3, 6.2.4, 6.2.5 and 6.2.6 have
been achieved.
Table 8: SRS Development Roadmap
Month M13-M14 | M14-M15 | M18-M19 | M19-M22
Details Data structures and REST functionalities verified | Data structures and REST functionalities verified | Basic Transactions | Full Integration
ILR I/F setup Integration Support of UCs
DJP I/F setup Integration Support of UCs
FS I/F setup Integration Support of UCs
IR I/F setup Integration Support of UCs
Evaluation Plan Approach for Agile Methodology Related Developments
Agile development methodology is expected to be applied for the web- and app-based
applications. The applications will use the services infrastructure, but they also focus on
other aspects such as ergonomic features, responsiveness and performance. All the
qualitative aspects will be evaluated in the context of Task 6.3 while functional aspects will
be evaluated in Task 6.2.
Although the agile developments are expected to be highly dynamic, with evaluation phases
being an integral part of each design-development-test iteration, the evaluation plan from
the project perspective, covering the main requirements collected in the context of T2.4, is
presented below and will be followed without hindering the actual methodology (e.g.
Scrum).
The analysis of the requirements performed during M8 also included requirements
addressing the applications.
The final versions of the applications will be validated against the functionality envisaged in
the UCs, thus ensuring that Objectives 6.2.2, 6.2.3, 6.2.4, 6.2.5 and 6.2.6 have been achieved.
Operation Centre Application
The specification of the Operation Centre Application starts in M9 and completes in M14
whereas the development lasts from M10 to M24. Since the Operation Centre will be
developed in a number of design-develop-test iterations according to the agile methodology
principles, three evaluation phases will be scheduled for M18, M21 and M24.
Web & App
The specification of the Web application and Mobile App starts in M9 and completes in M14
whereas the development lasts from M10 to M24. As with the Operation Centre, the
applications will be evaluated during the same phases in M18, M21 and M24.
Evaluation Plan Approach for Prototypes
The prototypes that will be produced in the project (Queue Detection, Indoor Location) are
enabling technologies that are utilized by both applications and services. They are expected
to produce information which other components use to perform more complex calculations
and associations, sometimes transparently to the end user, for the provision of added-value
services.
The process for evaluating the prototypes is quite similar to the waterfall methodology
related developments. The starting point for the composition of the requirements is the
wish list of features relating to the system components utilizing the prototypes. The
requirements for the prototypes were produced in M7 (December 2015). Similar to the
waterfall methodology case the prototype requirements were evaluated against the wish list
so as to identify any missing aspects or non-addressed functionality.
Once the requirements were consolidated (M9) a plan for the production of prototypes was
composed and is appended in the next paragraphs. Since the number of prototype revisions
is restricted, the main factors that may impose the need for a revision have been clearly
identified and correlated with the actual testing and evaluation results that have to be
collected. The progress of the overall system components has been taken into account so
that the evaluation plan for the prototypes aligns with the availability of those components
(or mockups) that need to be present for validating the prototypes as close to real
conditions as possible.
Proper functionality of the prototypes is closely related with the achievement of Objectives
6.2.5 and 6.2.6.
Waiting Time Detection
The Waiting Time Detection prototypes will be developed from M8 to M19. A first
evaluation of the initial versions will be performed by Eureva in its premises during M12-M14,
whereas later and more mature versions will be evaluated on site (airports) during
certain experimentation phases that will be scheduled in advance in coordination with the
airport administration. It is expected that the on-site evaluation will be performed from
M16 to M19.
Indoor Location
The design of the Indoor Location prototypes (DORA WiFi Beacons) started in M5 and will
evolve until M12, while related software and integration tasks start in M8 and last until M18.
The first prototypes were provided in M8. The prototype will be evaluated by both UPVLC and
CSE in their premises during M9 and M10. CSE will focus on laboratory testing (CSE setup)
of battery and power management issues prior to delivering a number of Beacons to
UPVLC. UPVLC will then focus on using the Beacons for accurate position resolution.
An evaluation of the prototypes under close-to-real conditions will also be attempted; for
this purpose the beacons will be installed in public places (UPVLC setup) where ambient
conditions, crowds and building peculiarities can affect the performance of the system.
The first phase of the evaluation is therefore expected to produce feedback regarding:
- Tx power adjustment
- SSID management
- Fingerprinting
- Ambient conditions impact
- Beacon transmission intervals
- Battery life in relation to the above aspects
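The position-resolution work mentioned above relies on Wi-Fi fingerprinting. As an illustration only, the following sketch shows a nearest-neighbour fingerprint lookup; the beacon names, positions and RSSI values are hypothetical and are not taken from the actual UPVLC setup:

```python
import math

# Hypothetical fingerprint database: position -> expected RSSI (dBm) per beacon.
# In a real deployment these values would be measured during a calibration walk.
fingerprints = {
    (0.0, 0.0): {"beacon_a": -45, "beacon_b": -70, "beacon_c": -80},
    (5.0, 0.0): {"beacon_a": -60, "beacon_b": -55, "beacon_c": -75},
    (5.0, 5.0): {"beacon_a": -75, "beacon_b": -50, "beacon_c": -55},
}

def locate(measurement: dict) -> tuple:
    """Return the fingerprint position closest to the measured RSSI vector
    (nearest neighbour in signal space, a common baseline method)."""
    def distance(expected: dict) -> float:
        shared = set(expected) & set(measurement)
        return math.sqrt(sum((expected[b] - measurement[b]) ** 2 for b in shared))
    return min(fingerprints, key=lambda pos: distance(fingerprints[pos]))

# A reading taken near (5, 5): strongest signals from beacons b and c.
print(locate({"beacon_a": -73, "beacon_b": -52, "beacon_c": -57}))  # -> (5.0, 5.0)
```

Factors listed above such as Tx power, transmission intervals and ambient conditions directly change the measured RSSI values, which is why they are part of the evaluation feedback.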
The collected results will be analyzed and potential design changes will be identified during
M10. In view of the actual power consumption details, the implementation of the BLE
feature on the same or separate boards will be analyzed in M10. A next run of prototypes
addressing all the collected feedback as well as the necessity for BLE will be provided in
M12. In the second prototype, attention will be paid to ensuring that further customization
will not need hardware redesign. The second prototype will also address several other
aspects such as casing, mounting and battery recharging. The evaluation of the second
prototypes will be performed inside airports between M16 and M18.
2.3.2 Usability Evaluation
This chapter describes why and how the Usability Evaluation will be carried out.
The main goal of the Usability Evaluation is to assess usability criteria in order to identify
possible problems with the design of the interfaces at an early stage, during the conceptual
phase and during the integration phase. The usability evaluation in DORA will also follow an
iterative approach. It will be carried out in three steps during the development phase of the
DORA system (cp. blue boxes in Figure 4 Evaluation tasks and their relations). The subjects of
usability evaluation during these development steps will be:
- Usability Evaluation of the Concept for the Web and App GUI
- Usability Evaluation of the DORA Prototype
- Usability Evaluation of the Integrated Final Product (DORA system Alpha Version)
When it comes to usability evaluation of websites or apps, there is a wide range of possible
criteria that can be assessed. Those criteria are described in various standards, such as
ISO/IEC TR 25060:2010 of the International Organization for Standardization. It
contains "… formats (CIF), that document the specification and evaluation of the usability of
interactive systems. It provides a general overview of the CIF framework and contents,
definitions, and the relationship of the framework elements. The intended users of the
framework are identified, as well as the situations in which the framework may be applied."
Moreover, standards for systems and software engineering, systems and software products,
quality requirements and evaluation (SQuaRE), a common industry format (CIF) for usability
as well as a general framework for usability-related information are included. [7]
Another standard that includes criteria for usability evaluation is DIN EN ISO 9241
'Ergonomics of the interaction between humans and systems', released by the German
national standardization institute. It seeks to set the frame for the evaluation of dialogue
systems and presents standards for keyboards, dialogue menus, user guidance,
terminologies, display of information, requirements for optical electronic displays, etc., or
DIN EN ISO 14915 on software ergonomics for multimedia user interfaces, which includes
requirements for the design, navigation and combination of media. [6]
There are also plenty of guidelines and standards that have to be taken into consideration
when designing and developing websites or apps that can meet the special requirements of
people with disabilities. Since these groups are relevant for the development of the DORA
system, accessibility standards will be in the focus of the usability evaluation. The 'World
Wide Web Consortium (W3C)' has developed the Web Content Accessibility Guidelines
(WCAG), which can be found, for example, at the website of the Web Accessibility Initiative.
The WCAG were developed "… through the W3C process in cooperation with individuals and
organizations around the world, with a goal of providing a single shared standard for web
content accessibility that meets the needs of individuals, organizations, and governments".
The following figure presents criteria that will serve as an example for the criteria that will
be assessed within the usability evaluation of the DORA service:
Figure 5 Web Content Accessibility Guidelines 2.0 - success criteria examples for DORA
- Text Alternatives: Provide text alternatives for any non-text content so that it can be
changed into other forms people need, such as large print, braille, speech, symbols or
simpler language.
- Time-based Media: Provide alternatives for time-based media.
- Adaptable: Create content that can be presented in different ways (for example simpler
layout) without losing information or structure.
- Distinguishable: Make it easier for users to see and hear content, including separating
foreground from background.
- Keyboard Accessible: Make all functionality available from a keyboard.
- Enough Time: Provide users enough time to read and use content.
- Seizures: Do not design content in a way that is known to cause seizures.
- Navigable: Provide ways to help users navigate, find content, and determine where they
are.
- Readable: Make text content readable and understandable.
- Predictable: Make Web pages appear and operate in predictable ways.
- Input Assistance: Help users avoid and correct mistakes.
- Compatible: Maximize compatibility with current and future user agents, including
assistive technologies.
The iterative usability approach pursued by DORA foresees three assessments during the
design process. (Formative evaluation is a type of usability evaluation that helps to "form"
the design for a product or service. Formative evaluations involve evaluating a product or
service during development, often iteratively, with the goal of detecting and eliminating
usability problems. [10]) This
formative approach seeks to save financial and working-time resources linked to the
technical development of the system. Thus, the following formative usability evaluation
methods will be taken into consideration for application in DORA (selection) [12]:
- Focus groups: a small group of potential end users or developers (5-8) discusses the
requirements of a system.
- User diaries: the users document their behavior and experiences after interacting with
the system.
- Field observation: users are observed and interviewed in their natural environment.
- Retrospective interview: after the interaction with a system, users are interviewed face
to face, via telephone or via video connection.
- Questionnaires: users are asked to reply to questions in a questionnaire after they have
interacted with the system. One example is the 'User Experience Questionnaire (UEQ)'.
It is structured to "… allow a quick assessment of the user experience of interactive
products. The format of the questionnaire supports users to immediately express
feelings, impressions, and attitudes that arise when they use a product." [11]
- Heuristic evaluation: usability experts review the interface and compare it against
accepted usability principles.
- Thinking out loud: users say aloud what they are doing during the use of a system, why
they do it, what they are thinking and what feelings they have.
- Participatory workshop: people are brought together to seek and discuss their opinions
concerning usability issues, and to elaborate solutions for identified problems, in a
friendly and creative atmosphere.
The most appropriate methods for the assessment of each DORA usability evaluation step
will be selected in the scope of the project. In places, a combination of several methods can
be appropriate. Also, due to time, financial or personnel limitations, some methods might
have to be carried out in a reduced or simplified way. In order to carry the respective
method out in a sound manner, close attention will be paid to the preparation of the
usability tests.
The U.S. Department of Health and Human Services has released a guideline for planning a
usability test. [8] It gives information concerning all features of the test, including its scope,
purpose, schedule and location, the required equipment, the participants and their roles, etc.
These guidelines (cp. ANNEX) will be taken into consideration for the preparation of the
usability evaluation in DORA.
In the following, the features of the three user tests that will be carried out in DORA will be
described. Details of the implementation of the tests are still subject to specification. The
times for carrying them out (in the DoW all usability evaluation activities are scheduled from
month 16 = Sept. 2016 to month 24 = May 2017), the DORA partners to be involved and the
resources to be allocated will be defined during the further development of the DORA system:
Table 9: Operationalisation - usability test of conceptual variants
1) Assessment of different conceptual variants of the system
Short
description
The different conceptual drafts of the Web and App GUI will be assessed by
target group persons. The evaluation will result in identifying which variant of the
system is the most preferred one.
Method(s) Participatory Workshop: Up to 5 members of the DORA Advisory Board and
up to 5 usability experts will be brought together in a workshop. The
conceptual drafts will be presented and discussed along usability criteria and
qualitative questions. The same review workshop will be carried out with
representatives of the predefined DORA User Groups. The results will be
analyzed and reported in words and graphs to the developers of the system.
Time Preparation: Months 11 - 12
Conduct: Month 13
Reporting: Month 14
Partners
involved
Partners involved in development of the front end (App, Web GUI), VMZ,
ETRA, TUB
Specifics The dates for conducting this evaluation were changed compared to the
times indicated in the DoW. This is because the first draft of the conceptual
Web and App GUI will already be finalized in month 14 (July 2016).
Tasks - Identification of reviewers
- Planning of Workshops
- Preparation of usability criteria
- Analysis of results and reporting
Table 10: Operationalisation - usability test of first prototype
2) Testing first prototype
Short
description
Testing the first prototype of the system in order to identify usability problems.
This prototype includes the first functioning components. The test persons will
have to accomplish several tasks and report the experiences made while using
the prototype in a simple questionnaire. This way, problems in handling the
system and corresponding improvements will be identified.
Method(s) To complete this assessment, the test users will report their experiences with
using the system in a questionnaire that asks about them in a structured way
according to predefined usability criteria. A standardized questionnaire is
available that will be used for this: the 'User Experience Questionnaire (UEQ)'
contains 26 bipolar items on a 7-step scale. The scales are ordered along
criteria such as attractiveness, perspicuity, efficiency, stimulation, novelty or
dependability.
Ideally these test users will be chosen according to the user groups as
defined in WP 2.2. The number of test persons should be at least 5.
Additionally, up to 5 usability experts will be asked to accomplish the tests.
The analysis will be accomplished by applying the same method as for the
assessment of the different conceptual variants.
Time Preparation: One month before the development of the first prototype will
be finished (Task 4.4 Integration and Initial Testing (M16-
M24))
Conduct: During first month of existence of first prototype (M16-M24)
Reporting: One month after conduct of tests (M16-M24)
Partners
involved
Partners involved in development of the front end (App, Web GUI), VMZ,
ETRA, TUB
Specifics The tests should be carried out towards the beginning of the time period of
Task 4.4 Integration and Initial Testing (M16-M24). Towards the end of this
period the last system pre-test before the trial phase should be conducted.
Each set of usability evaluations takes 3 months; accordingly the first
prototype test could be conducted in months 16 to 18, and the last pre-test in months
22 to 24.
This test cannot be carried out under real-time conditions. For example, it will
not be possible to plan and travel the whole trip between Berlin and Palma
in reality. Thus, 2-3 test scenarios have to be simulated (programmed) for
the test users.
Tasks - Identification of test persons
- Identification of interviewees
- Preparation of guidelines and selection of usability criteria (by experts) for
the interviews
- Analysis of results and reporting
- Creation of trip simulations (test scenarios) for test users
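Since the UEQ data gathered in this test will feed the analysis, a minimal sketch of how such bipolar 7-point items could be aggregated may be useful. The item-to-scale grouping below is a placeholder chosen for illustration, not the official UEQ scoring key (which also reverses the polarity of some items):

```python
# Illustrative aggregation of bipolar 7-point questionnaire items.
# Raw answers (1..7) are rescaled to -3..+3 so that 0 is neutral, then
# averaged per scale. The item-to-scale mapping below is a placeholder,
# not the official UEQ scoring key.
scales = {
    "attractiveness": [1, 12, 14],
    "efficiency": [9, 20, 22],
    "novelty": [3, 10, 26],
}

def score(answers: dict) -> dict:
    """answers maps item number (1..26) to a raw rating 1..7."""
    rescaled = {item: raw - 4 for item, raw in answers.items()}
    return {
        name: sum(rescaled[i] for i in items) / len(items)
        for name, items in scales.items()
    }

answers = {i: 6 for i in range(1, 27)}  # one test user rating everything "6"
print(score(answers))  # each scale averages 6 - 4 = +2.0
```

With at least 5 test users plus up to 5 experts, as planned above, the per-scale means would simply be averaged across participants.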
Table 11: Operationalisation - last usability test before trial phase
3) Last system pre-test before trial phase
Short
description
This test will evaluate the entire system with regard to the main usability
objectives of the project (Objective 6.3: identification of usability problems of
prototypes of the DORA service).
Method(s) This test should be conducted under ‘real’ conditions: Up to 5 experts should
test the Web and App service throughout the entire trip between Berlin and
Palma. This can be accomplished by mobilizing members of the DORA
consortium when they travel between these cities for DORA project
meetings.
These test persons should report their behavior and experiences after
interacting with the system in user diaries. The user diary will be structured
along the User Experience Questionnaire (UEQ). Afterwards they should
present their experiences in a focus group meeting, e.g. during a DORA
consortium meeting, in which further necessary usability adjustments of the
system will be discussed.
Time Preparation: One month before the development of the Integrated Final
Product (Alpha Version) will be finished (Task 4.4 Integration
and Initial Testing (M16-M24))
Conduct: During the first month of existence of the integrated final
product (M16-M24)
Reporting: One month after conduct of tests (M16-M24)
Partners
involved
Partners involved in development of the front end (App, Web GUI), VMZ,
ETRA, TUB
Specifics These tests should be carried out towards the end of the time period of Task
4.4 Integration and Initial Testing (M16-M24), for example during months 22
to 24 (cp. 2, Testing first prototype).
Tasks - Identification of test persons
- Identification of interviewees
- Preparation of a template for the user diaries
- Preparation of a focus group meeting in Berlin or Palma de Mallorca
- Analysis of results and reporting
Note: The usability evaluation has many overlaps with the technical evaluation. Criteria such
as functionality, robustness, responsiveness and performance of the system that will be
evaluated in Task 6.2 Technical Evaluation also influence the usability of the system. Thus,
both of these evaluation activities will have to be accomplished in close coordination. This
refers to the planning, conduct, analysis and reporting.
The results of each step of usability evaluation carried out within this work package will be
(three) short intermediate reports about the identified usability problems, with
recommendations for improvements. They will be disseminated within the project
consortium. These short reports will be summarized in the deliverable 'D6.3 Usability
Assessment report' to be delivered in month 24.
2.3.3 Results Evaluation
This chapter describes the evaluation approach and methods to be used for the
evaluation of the results of the DORA project. According to the description of work that was
granted by the Commission, this section comprises:
- Impact Evaluation
- Evaluation of User Satisfaction
- Process Evaluation
It should be mentioned from the beginning of this chapter that only the first two elements,
impact evaluation and evaluation of user satisfaction, refer purely to the assessment of the
results of DORA. Process evaluation, however, assesses the project process and does not
focus on the results of the project alone; it also looks at the processes over the entire time
span of the project. Thus it can be regarded as an evaluation task that is ongoing throughout
the project but that will be finally accomplished by compiling a final report at the end of the
project.
Impact Evaluation (Task 6.4.1) can be defined as follows:
“Impact evaluation illustrates changes which are attributed to an intervention such as a
project, measure or policy which was planned and implemented to reach a formulated goal.
In contrast to outcome monitoring which examines whether targets have been achieved,
impact evaluation is structured to answer the question: How would outcomes (…) have
changed if the intervention had not been undertaken?”[4]
Thus, in order to carry out a sound impact evaluation, the following steps have to be
taken into consideration:
- Before data collection.
This is the basis for comparing the data with the after data collection; this comparison
will show the effect of the measure. Thus, the first step is to establish a baseline
('before' data), possibly with data collections at different points in time. This way,
long-term developments can be identified.
- Establishing a Business-as-Usual scenario.
This step is necessary to identify what would have happened if the measure had not
been introduced. There are plenty of factors that can possibly influence the subject
under evaluation. An example for DORA could be that, even without using the DORA
App, the travel times of travelers are reduced because new, shorter connections
between sites were built during the time of the data assessment. These effects have to
be excluded from the 'after' measurements to know the exact effect that can be
attributed to the measure.
Figure 6: Baseline and Business-As-Usual Scenario [4]
- Collection of after data (ex-post), after the measure was implemented.
This is to compare the situations before and after the implementation and to analyze
the results from this comparison. As mentioned, the effects of other factors that had an
influence on the after data have to be excluded.
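The three steps above amount to a simple calculation: the effect attributable to a measure is the after measurement compared against the business-as-usual expectation, not against the raw baseline. A sketch with purely hypothetical travel-time values (not DORA measurements):

```python
# Hypothetical numbers for illustration only (not DORA measurements).
baseline_travel_min = 60.0  # before data: average door-to-gate travel time
after_travel_min = 45.0     # ex-post measurement with the DORA service in use

# Business-as-usual: what would have happened anyway, e.g. a new, shorter
# connection cut travel times by 5 minutes independently of the measure.
external_reduction_min = 5.0
bau_travel_min = baseline_travel_min - external_reduction_min  # 55.0

# A naive comparison wrongly credits the external change to the measure:
naive_effect = baseline_travel_min - after_travel_min          # 15.0 minutes
# The correct effect compares against the business-as-usual scenario:
measure_effect = bau_travel_min - after_travel_min             # 10.0 minutes

print(f"attributable reduction: {measure_effect} min "
      f"({measure_effect / bau_travel_min:.0%} of BAU)")
```

The gap between the naive and the corrected figure is exactly the external effect that the text above requires to be excluded.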
In DORA, Impact Evaluation will focus on the expected impacts to be evaluated during the
pilot execution at the Berlin and Mallorca airports (described in chapter 1.6 under the
heading 'Objectives related to Task 6.4 Evaluation of Impact and Process'). The goal of the
evaluation activities is to prove whether the objectives of the DORA project were achieved
and to what extent this happened.
An important difference to the impact evaluation methodology described in Figure 6 is,
however, that in DORA the impacts of the project cannot be measured over time, before and
after the implementation of the new service. There will be only one central pilot execution
of the service at the airports in Berlin and Palma de Mallorca. These field tests will take place
on a single day. Long-term effects or impacts resulting from applying DORA at airports can
thus not be measured within the lifetime of DORA, since the project ends with the pilot test.
Consequently Impact Evaluation in DORA will focus on the measurements that can be carried
out during the pilot test of the final product. The results of these analyses will be
documented in the final evaluation report. This will also include recommendations for future
implementations that resulted from the assessments carried out within DORA .
Since the results of this assessment should show to what extent the project's outcome
meets the formulated objectives, the step of choosing adequate methods and indicators is of
utmost importance.
The indicators should be chosen in a way that they clearly match the objectives. A
pragmatic criterion for their selection is also that they are reliably and feasibly measurable.
Possibilities of 'bundling' should be examined (one indicator can be taken to measure the
effects of several components or features of a project).
As mentioned above, Objectives 6.4.1 to 6.4.7 are the subject of the Impact Evaluation
carried out within the DORA project. They are precisely described and already include
descriptions of indicators. The following table gives an overview of these objectives and
their corresponding indicators.
Table 12: DORA objectives and their indicators
Objective Nr. | Objective description | Indicator | Indicator description
6.4.1 | The necessary time needed for transport to and from the airports will be reduced by up to 20% | Travel Time | Time needed for trips to and from the airport (destinations: accommodation or home)
6.4.2 | 90% of DORA App users consider the information provided as useful | Informed decision making | Number of passengers that consider the information provided as useful (real-time travel info / delays / alternative routes / personalized information provision)
6.4.3 | 90% of DORA App users consider information on how to travel to the airport by different modes of transportation as helpful | Multimodal Accessibility | Percentage of users that perceive information on the accessibility of airports by different transportation modes as helpful
6.4.4 | Service quality: waiting times in the terminal will be reduced by up to 20% | Service quality waiting time | Percentage of time reduction for waiting at queues at airports
6.4.5 | DORA will contribute to the goal that 50% of all trips to/from the airports in Berlin and Palma de Mallorca will be done by public transport | Modal Split | Percentage of travelers that reach (or leave) the airports in Berlin and Palma de Mallorca by public transportation
6.4.6 | 10% of DORA App users use indoor routing services | Accessibility | Percentage of DORA App users that are making use of the indoor routing services
6.4.7 | Implementation of a number of predefined substitution processes and information strategies for disruption cases (number = tbd) | Robustness of service | Number of predefined substitution processes and information strategies for disruption cases developed and implemented in the DORA service
After the objectives have been clearly defined and the indicators to measure them have been identified, the next step in drafting the evaluation plan is to find appropriate methods for conducting the measurements and data assessments. Here, the method of ‘Control Site Evaluation’ offers a number of advantages, especially in evaluating the time-dependent impacts of 6.4.1 and 6.4.4. These should be assessed with a control site approach, since it allows a direct comparison between users who have and have not been exposed to the measure:
Control Site Evaluation:
The comparison of a group or area that was exposed to the effect of a specific intervention with a group or area that was not exposed to it is regarded as the ‘strongest and thus preferable evaluation design’ [4]. These groups have equivalent characteristics, e.g. regarding their size and composition (age / gender / social status) – the only difference is the exposure to the measure. Both groups – the one with and the one without exposure to the measure – will be tested before and after measure implementation at the same times and with the same data collection methods. The reasons why this method is the most preferable one are:
The comparison will show in a simple and sound way which effect the measure has. Depending on the method used to assess the groups, specific effects can be easily compared and analyzed. Applying this evaluation design allows the information gathered to be clearly allocated to the single objectives of the measure.
The most important advantage is that external factors can be excluded: the results from comparing the groups will show the real effect of the measure, because external effects coming from other factors are excluded. This is because external effects (such as cheaper fuel prices, which have no relation to the measure but could possibly lead to an increased usage of private cars) affect both sites or both groups. This way, impacts of the measure can be clearly separated from other factors that influence the groups concerning the indicator assessed.
Reflecting the indicators presented in Table 12, it can be stated that only for objectives 6.4.1 and 6.4.4 is a direct comparison between users and non-users necessary to evaluate the impact of the measure. Since the two indicators are not related to each other, it is, from a methodological point of view, also possible to measure them separately.
All other indicators (6.4.2; 6.4.3; 6.4.5; 6.4.6; 6.4.7) can be assessed ex post by users of the DORA service answering a set of questions addressing the aforementioned objectives.
Operationalization
As described in the DOW, it is planned to have one central pilot execution of the service at the airports in Berlin and Palma de Mallorca on a single day. It is, moreover, envisaged to conduct this field test with 500 test users at both airports. This will be the best opportunity to apply the control site evaluation and the evaluation of the other criteria at the same time.
Application of a control site evaluation in DORA for the indicators 6.4.1 and 6.4.4
Compared to the evaluation design illustrated in the graph above (‘Baseline and Business-As-Usual Scenario’), the DORA approach of proving the planned effects by applying a control site evaluation can be illustrated as follows:
Figure 7 Control Site Evaluation in DORA
However, with regard to the operationalization there are several requirements that have to be taken into consideration when conducting a control site evaluation:
- Following the formal definition of the “Control Site Evaluation”, this kind of evaluation is possible as long as 50 % of the test users at each airport use the DORA app and 50 % accomplish the same tasks without using the app.
- The method and practical implementation of the measurement process should safeguard the comparability of times and perceived quality (assessed persons should have the same starting and ending points at the same time).
- The people carrying out the tests should be chosen randomly, but their travel behavior should reflect the average traveler on this route.
This means that the results from the tests (e.g. finding routes) accomplished by the control group not using the DORA app represent the ‘before’ data. The results from the tests accomplished by the test group using the DORA app represent the ‘after’ data. Since the indicators 6.4.1 and 6.4.4 address the forerun and follow-up movement to or from both airports as well as the waiting times at both airports, it is necessary to have altogether 6 groups of DORA users and non-users. As already mentioned, both indicators are independent from each other. This means that time comparisons within the groups at all 6 locations of the travel chain can be conducted independently.
Taking the forerun to the new Berlin airport BER as an example, this would mean that two randomly chosen persons have to start from the same location and travel towards the airport. One has access to the usually available travel information, and one uses the DORA service and has been briefed before. By giving both the same target location at the airport (e.g. terminal entrance), it is safeguarded that all framing conditions are homogeneous except the information provided by the DORA service. Measuring the time between departure and arrival at the airport and comparing the DORA user and non-user with each other enables the evaluator to analyze a potential travel time reduction. In order to provide reliable results, this procedure has to be performed with an appropriate number of user and non-user tandems representing the different kinds of travelers for each of the six parts of the travel chain. The more tandems the project can motivate to participate in the field test, the more reliable the final results are. After the tests, the participants of both groups have to complete a questionnaire with questions addressing all indicators explicitly. (Important: technical functionalities can be assessed with the same questionnaire.)
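The tandem comparison described above amounts to a simple before/after calculation. The following Python snippet is an illustrative sketch only, not part of the DORA toolchain; all tandem figures and function names are hypothetical.

```python
# Illustrative sketch: relative travel-time reduction from user / non-user
# tandem measurements on the same origin-destination relation.
# All figures are hypothetical examples, not project data.

def mean(values):
    return sum(values) / len(values)

def travel_time_reduction(tandems):
    """tandems: list of (non_user_minutes, user_minutes) pairs,
    one pair per tandem travelling the same relation at the same time."""
    before = mean([t[0] for t in tandems])  # control group: 'before' data
    after = mean([t[1] for t in tandems])   # DORA users: 'after' data
    return (before - after) / before        # relative reduction

# Hypothetical measurements for the forerun to one airport (minutes):
tandems = [(62, 50), (75, 58), (55, 47), (80, 66)]
reduction = travel_time_reduction(tandems)
print(f"Relative travel time reduction: {reduction:.1%}")
```

The resulting percentage could then be compared against the 20 % target of objective 6.4.1; the same calculation applies to waiting times for objective 6.4.4.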
The technical part of the operationalization, the concretely realizable sample size for the control site evaluation, and the concrete technique for taking the measurements depend strongly on the final product and the related technical options for measuring the required information. This has to be decided during the course of the research project. A preliminary overview of possible options is shown in Table 13.
Table 13 DORA Indicators and field test data gathering methods (control site approach)

Objective 6.4.1
Indicator: Travel Time – time needed for trips to and from the airport (destinations: accommodation or home)
Field test data gathering method: Groups using the service and control groups not using the service have to reach the same location, starting at the same time from the same origin; measurements are taken separately for the forerun / follow-up movement in Berlin and in Palma; documentation of travel times and mode choices during the test in a travel protocol by all participants of the field test.

Objective 6.4.4
Indicator: Service quality waiting time – percentage of time reduction for waiting at queues at airports
Field test data gathering method: Groups using the service and control groups not using the service have to process queues at the airport, entering the airport at the same time; measurements are taken separately for the start and end at the Berlin and Palma airports.
Measuring the indicators of the objectives 6.4.2, 6.4.3, 6.4.5 and 6.4.6
The evaluation of the indicators of the objectives 6.4.2, 6.4.3, 6.4.5 and 6.4.6 is vital to assess whether the developed DORA service has fulfilled its task of making the whole journey more seamless and easier for the travelling individual. This part of the evaluation also contains the Evaluation of User Satisfaction (Task 6.4.2), which assesses specifically whether the majority of DORA testers consider the information provided by DORA as useful. Additionally, user satisfaction, seen as the general perception of the service, is a key indicator of the further success or failure of the DORA service. Thus this indicator will be evaluated in depth. The single components of the service will be evaluated by conducting surveys among end users during the trial phase. The surveys addressing users will be carried out during test site implementation at the Berlin and Mallorca pilot sites for those parts of the travel chain that include the forerun / follow-up movement and the movement at the airport at both pilot sites.
The survey will consist of a predefined set of closed questions in a standardized questionnaire to measure the indicators in the form of variables on a nominal or ordinal scale. Since it is not certain whether the test users will be real travelers or supernumeraries, the number and complexity of the questions will be reduced to a minimum. Moreover, it depends on the way the field test is realized whether the indicators will be measured separately for each part of the travel chain, for the parts from origin / destination to the gate, or for the whole travel chain.
In regard to the users it is planned to address, if possible, the user groups described in D2.2 in equal shares, in order to take measurements for the travel chains typical for each user group. The way the measurements are technically performed depends firstly on the way the field test is carried out and secondly on the technical possibility to provide an online questionnaire in the DORA App or as an external online questionnaire. If neither option can be realized within the project, the interviews will have to be performed face to face. In each case the interview will be of a retrospective nature, conducted after the test user has successfully finished the trip.
Due to this uncertainty, Table 14 below illustrates objectives, indicators and an overview of potential measurement methods that have to be specified in more detail throughout the project runtime.
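As a sketch of how answers on such scales could later be aggregated, the following Python snippet computes the share of respondents rating the provided information as useful (indicator 6.4.2). The data, the 1-to-5 coding and the usefulness threshold are purely illustrative assumptions, not part of the deliverable.

```python
# Illustrative sketch: aggregating ordinal survey answers (1 = worst ... 5 = best)
# for indicator 6.4.2. Hypothetical data and a hypothetical threshold.

from collections import Counter

def share_useful(answers, threshold=4):
    """Share of respondents rating usefulness at `threshold` or above."""
    useful = sum(1 for a in answers if a >= threshold)
    return useful / len(answers)

# Hypothetical answers from the retrospective questionnaire:
answers = [5, 4, 4, 3, 5, 5, 4, 2, 5, 4]
share = share_useful(answers)
print(f"{share:.0%} of users rate the information as useful (target: 90 %)")
print(Counter(answers))  # distribution over the ordinal scale
```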
Table 14 DORA Indicators and field test data gathering methods (survey)

Objective 6.4.2 – 90 % of DORA App users consider the information provided as useful
Indicator: Informed decision making
Field test data gathering method: Can be assessed separately for each part of the travel chain, for the parts from origin / destination to the gate, or for the whole travel chain. The test user has to be questioned after having successfully finished the trip. Questions can be asked online as part of the DORA App, as an external online questionnaire, or face to face by interviewers. Answer categories should be on an ordinal scale (1 = worst, 5 = best) or a nominal scale.

Objective 6.4.3 – 90 % of DORA App users consider information on how to travel to the airport by different modes of transportation as helpful
Indicator: Multimodal Accessibility
Field test data gathering method: same as for 6.4.2.

Objective 6.4.5 – DORA will contribute to the goal that 50 % of all trips to and from the airports in Berlin and Palma de Mallorca will be done by public transport
Indicator: Modal Split
Field test data gathering method: Can be assessed separately for the forerun and follow-up movement at both pilot sites, for test users using the DORA service, for the parts from origin / destination to the gate, or for the whole travel chain. The test user has to be questioned after having successfully finished the trip. Questions can be asked online as part of the DORA App, as an external online questionnaire, or face to face by interviewers. Answer categories should be on an ordinal or nominal scale.

Objective 6.4.6 – 10 % of DORA App users use indoor routing services
Indicator: Accessibility
Field test data gathering method: Can be assessed separately at both pilot sites, for test users using the DORA service, for the parts from origin / destination to the gate, or for the whole travel chain. The test user has to be questioned after having arrived at the gate. Questions can be asked online as part of the DORA App, as an external online questionnaire, or face to face by interviewers. Answer categories should be on an ordinal or nominal scale.
Specifications concerning objective 6.4.5 (example):
The questions for measuring the variables related to this indicator could look as follows:
Did the DORA App inform you well about your public transport travel options? YES/NO
Did you choose public transport for this part of the travel chain? YES/NO
Was DORA the reason why you chose public transport? YES/NO
Do you think the information on travel alternatives provided by DORA will motivate you to use public transport more often for reaching / leaving the airports? (no, rather no, rather yes, yes)
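A minimal sketch of how such yes/no answers could be evaluated for the modal split indicator follows; the record keys and the data are hypothetical illustrations, not the agreed DORA questionnaire coding.

```python
# Illustrative sketch: evaluating yes/no questionnaire answers for
# objective 6.4.5 (modal split). Keys and data are hypothetical.

def modal_split_pt(records):
    """Return (share of trips done by public transport,
    share of those PT trips where DORA was the stated reason)."""
    pt = [r for r in records if r["chose_pt"]]
    pt_share = len(pt) / len(records)
    dora_share = sum(1 for r in pt if r["dora_was_reason"]) / len(pt) if pt else 0.0
    return pt_share, dora_share

records = [
    {"chose_pt": True,  "dora_was_reason": True},
    {"chose_pt": True,  "dora_was_reason": False},
    {"chose_pt": False, "dora_was_reason": False},
    {"chose_pt": True,  "dora_was_reason": True},
]
pt_share, dora_share = modal_split_pt(records)
print(f"PT share: {pt_share:.0%}, DORA stated as reason: {dora_share:.0%}")
```

The PT share could then be compared against the 50 % goal, while the second figure indicates how much of it can plausibly be attributed to DORA.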
Consequently, the assessment of these objectives in the scope of the Impact Evaluation delivers results regarding the satisfaction of the users with the DORA service. However, the perception of the quality of the service that leads to high user satisfaction is composed of a number of additional factors such as reliability, safety, privacy issues, comparison to other apps, availability and accessibility of the service, etc. In order to assess this range of factors, DORA will add specific questions to the questionnaire that will also be handed out to the participants of the test execution after the trial. These questions will address the mentioned additional factors directly.
Which partners will support the conduct of these evaluation activities will have to be clarified in the scope of the detailed planning of these evaluations. Partners who committed themselves to supporting WP 6 are listed in chapter ‘4 OPERATIONALIZATION / ROLES AND RESPONSIBILITIES’. The results from the evaluation activities carried out within the evaluation of impacts and user satisfaction will be analyzed and documented. Recommendations for future implementations will be formulated. These findings will be summarized in “D6.4 Final evaluation report” in month M36.
---
Based on the aim that the system resulting from DORA should be easily transferable to other airports throughout Europe, DORA will carry out a Process Evaluation (task 6.4.3). It will deliver a comprehensive picture of the modalities required for its successful implementation and important results for the improvement of the ongoing implementation process of the DORA project. The assessment will be carried out by conducting three surveys with the most important stakeholders of the project. Questionnaires will be developed that contain tables giving an overview of fields in which such barriers or drivers could occur. In order to facilitate a learning process during the project lifetime, short internal reports for each reporting period will be distributed within the project.
Since the concept of process evaluation remains a rather new way of evaluating projects comprehensively, and since the implementation of the method requires thorough preparation, the following chapter will describe the purpose, approach and method in depth.
2.3.4 Process Evaluation
Purpose: The method of process evaluation is grounded on the experience that most measures differ from the originally intended plan – what was planned in the very beginning mostly differs from the final product. This is caused by a range of factors that influence the processes of planning, developing, testing, implementing and operating a product or project; moreover, it is a natural process of adaptation that every project goes through to a certain extent. Process evaluation seeks to assess these dynamics of a measure in order to understand what has influenced the measure process in either a positive or a negative way. Thus it looks at the strengths and weaknesses, or in other words the drivers and barriers, that affect these processes. It looks at HOW an outcome or product is realized rather than at its impact. [4]
This helps to understand which failures, changes or delays have occurred during the planning and implementation process, but it also tries to find explanations for why processes have worked well or what influenced them in a way that made them work even better than expected. For the DORA project this means that the results from this evaluation can deliver very useful information for:
- Transferability: Other cities or airports can learn from the experiences Berlin and Palma de Mallorca and their airports made in developing and implementing the DORA services. Failures that occurred can be avoided, and events that had a positive effect in DORA can be repeated.
- Improvements of the ongoing development and implementation processes in DORA: It will detect barriers to the process that can be avoided in the further process and detect positive events whose effect can be maximized for the further project processes.
- Documentation of positive and negative events that occurred during the project process: This can be very helpful, for example, to justify possible delays to the management level.
Approach: Process evaluation can achieve these results because it asks in a systematic way about the events a project was exposed to. Moreover, it asks different people who participated in the project process. The opinions of developers, transportation planners, public transportation operators, administrations, municipal bodies, managers or scientists will be assessed. Together, the different viewpoints can provide valuable insights into the measure process. This demonstrates the approach behind process evaluation: it listens to the stories that different kinds of people tell about one project. These ‘stories behind the figures’ are the basis for learning from the experiences made.
Barriers and Drivers: Process evaluation assesses which barriers and drivers had an effect on a project process. It especially refers to unforeseen events that shaped the project process. For the DORA project, barriers and drivers can be described as follows:
- Barriers are events or conditions that have a negative effect on accomplishing the working tasks of developing, testing and implementing the DORA service at the airports in Berlin and Palma de Mallorca as initially planned.
- Drivers are events or conditions that have a positive or stimulating effect on reaching the objectives during developing, testing and implementing the DORA service in both cities and airports.
There are several categories into which barriers and drivers can be grouped. They are shown in Figure 8. In the practice of DORA, a (fictitious) ‘planning’ barrier could be that, in the financial planning, a certain necessary technical device for measuring waiting times turned out to be much more expensive than initially estimated. The negative effect on the measure process is that additional funding now has to be sought before the development process can be continued. A ‘financial’ driver could be that another airport is interested in implementing DORA and therefore contributes additional funding to hire more developers to speed up the implementation process. Other categories of drivers and barriers, including some abstracted examples, are shown in the following figure:
Figure 8 Categories of process barriers and drivers and examples [4]

Political / strategic
Barriers: Opposition of key actors based on political and/or strategic motives; lack of a sustainable development agenda or vision; impacts of a local election; conflict between key (policy) stakeholders due to diverging beliefs in directions of solution.
Drivers: Commitment of key actors based on political and/or strategic motives; presence of a sustainable development agenda or vision; positive impacts of a local election; coalition between key (policy) stakeholders due to converging (shared) beliefs in directions of solution.

Institutional
Barriers: Impeding administrative structures, procedures and routines; impeding laws, rules, regulations and their application; hierarchical structure of organisations and programs.
Drivers: Facilitating administrative structures, procedures and routines; facilitating laws, rules, regulations and their application; facilitating structure of organisations and programs.

Cultural
Barriers: Impeding cultural circumstances and lifestyle patterns.
Drivers: Facilitating cultural circumstances and lifestyle patterns.

Involvement, communication
Barriers: Insufficient involvement or awareness of key (policy) stakeholders; insufficient consultation, involvement or awareness of citizens or users.
Drivers: Constructive and open involvement of key (policy) stakeholders; constructive and open consultation and involvement of citizens or users.

Planning
Barriers: Insufficient technical planning and analysis to determine requirements of measure implementation; insufficient economic planning and market analysis to determine requirements for measure implementation; lack of user needs analysis: limited understanding of user requirements.
Drivers: Accurate technical planning and analysis to determine requirements of measure implementation; accurate economic planning and market analysis to determine requirements for measure implementation; thorough user needs analysis and good understanding of user requirements.

Organisational
Barriers: Failed or insufficient partnership arrangements; lack of leadership; lack of individual motivation or know-how of key measure persons.
Drivers: Constructive partnership arrangements; strong and clear leadership; highly motivated key measure persons; key measure persons as ‘local champions’.

Financial
Barriers: Too much dependency on public funds and subsidies; unwillingness of the business community to contribute financially.
Drivers: Availability of public funds and subsidies; willingness of the business community to contribute financially.

Technological
Barriers: Additional technological requirements; technology not available yet; technological problems.
Drivers: New potentials offered by technology; new technology available.
Activities related to barriers and drivers: In practice, when a negative or positive effect on a project occurs, people want to find out what exactly happened. In a methodological process evaluation, this step is the assessment of barriers or drivers. But what happens next is that people think about what follows from this insight. A positive example in DORA could be that, in case another city or airport unexpectedly is interested in implementing DORA, the managers of the project could decide to contact even more cities and airports to boost the development of the service even further. In the method of process evaluation, this step is called the assessment of activities related to barriers and drivers. It is a necessary step to fully assess the process of a measure and to find out what really happened and why. This assessment of activities is also essential for reporting and documenting the measure process. The reader will be very interested in how problems were solved and how positive events were utilized to carry out the measure more efficiently.
Method for DORA: There are several methods suitable for assessing the processes of a project or measure. They differ mainly in the degree of detail in which they observe the processes. For the DORA project, a project with many partners spread over various countries in Europe, a rather practical and easy-to-conduct method based on standardized forms seems to be the appropriate choice.
Using standardized forms to collect the data relevant for process evaluation has advantages that make it the appropriate method for process evaluation in DORA: only one or a few persons are required to complete the forms; the form is simple and standardized and can be completed by everyone, even without special knowledge; and the evaluation of the forms can be achieved in a simple way because the completed forms are easily comparable. This also makes the results easily comparable with the results from other project sites, which is an important asset for assessing the transferability of the DORA service to other airports in Europe.
The following simplified structure of a template has proven practicable:
1) General Information
- It should firstly contain a part with general information, such as the name of the task in DORA it refers to.
- Then the time period the assessment refers to should be indicated.
- The target group of the working task should also be named, as well as the partners that were involved in realizing the working task.
- It is also important to state which person has completed the form and who can be contacted by the evaluator.
2) Barriers
- Should be described in a brief and clear way. Only the 1-3 most important barriers should be named.
- Each barrier should be described by answering the questions: What happened and how did it actually occur? Which negative impact did it have on the process of DORA?
3) Drivers
- Should be described in the same clear and brief way as the barriers, for the 1-3 most important drivers.
- For the drivers, too, questions should be answered asking what really happened, how exactly it occurred, and which positive impact it had on DORA.
4) Activities
- Should be described in a brief and clear way, using simple words.
- They should answer the question: What did you undertake to make use of the drivers, and what did you do to overcome the barriers mentioned in the previous part of the completed template?
5) Any other comment
- This is a very important part of the form, because here you have the chance to describe any other relevant information to explain the measure process. This can be extraordinary conditions, or your estimation of the risks to reaching the objectives of DORA or of your task in DORA. You can also add pictures, illustrations or methods you think could be useful for others. With regard to the time needed for completing and processing the form, this information should also be described briefly and clearly.
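The five-part template above can be mirrored in a simple data structure so that completed forms remain directly comparable across partners and sites. The following Python sketch is purely illustrative; all field names are assumptions, not the agreed DORA form.

```python
# Illustrative sketch: the five-part standardized process evaluation form
# as a data structure. Field names follow the template outline above and
# are otherwise assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Event:
    description: str   # what happened and how did it occur?
    impact: str        # which (negative/positive) impact on DORA?

@dataclass
class ProcessEvaluationForm:
    task: str                      # 1) general information
    period: str
    contact_person: str
    partners_involved: List[str]
    barriers: List[Event] = field(default_factory=list)   # 2) the 1-3 most important
    drivers: List[Event] = field(default_factory=list)    # 3) the 1-3 most important
    activities: List[str] = field(default_factory=list)   # 4) responses to 2) and 3)
    comments: str = ""                                    # 5) any other comment

# Hypothetical completed form (the fictitious 'planning' barrier from above):
form = ProcessEvaluationForm(
    task="T6.4.3 Process Evaluation", period="M10-M12",
    contact_person="N.N.", partners_involved=["TUB"],
    barriers=[Event("Measuring device more expensive than planned",
                    "Development paused until additional funding is found")],
)
assert len(form.barriers) <= 3  # template asks for the 1-3 most important only
```

Identical structure across all partners is what makes the later cross-site comparison straightforward.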
Operationalization - Process Evaluation
First Phase:
Month 10: Finalization of the Process Evaluation questionnaire and guideline for completion
Month 11: First Process Evaluation: questionnaires will be sent to DORA partners for completion within 2 weeks
Month 12: Evaluation and analysis of questionnaire results and summary in a short report. The short report will be distributed to the partners.
Second Phase:
Month 22: Second Process Evaluation: questionnaires will be sent to DORA partners for completion within 2 weeks
Month 23: Evaluation and analysis of questionnaire results and summary in a short report. The short report will be distributed to the partners.
Final Phase:
Month 33: Final Process Evaluation: questionnaires will be sent to DORA partners for completion within 2 weeks
Month 34: Evaluation and analysis of questionnaire results and summary in a short report. The short report will be distributed to the partners.
Months 35 and 36: Analysis of all three process evaluation results and reporting in the final evaluation report
3 TRANSFERABILITY
A major ambition of the DORA project is that the system developed should be easily transferable to other cities and their airports in Europe. Thus the DORA system for seamless air mobility will be comprehensively evaluated and tested in Berlin and Palma de Mallorca.
The results from the evaluation activities described in this evaluation plan are a central point of reference for other cities to assess whether the DORA system could be implemented in their cities. They will provide insights into the results that can be expected from implementing DORA, into the technical functionalities and requirements of each component as well as of the integrated service, and they will provide information on the requirements necessary to manage the implementation process successfully. To ensure that cities can learn from the positive and negative experiences made within DORA, the results of the evaluation activities will be analysed carefully, and recommendations that have to be taken into consideration by other cities wishing to implement the system will be formulated. The deliverable “D6.4 Final evaluation report” will summarize these findings resulting from the impact, usability and process evaluation and describe the resulting recommendations.
The DORA system is designed, from the beginning of the development phase, to be easily transferable to other cities. Features to ensure transferability include:
- DORA is based on a modular structure that combines locally available service components with new technologies. The overall integrated system concept aims to be easily technically transferable to other European airports and regions. This way, the service components and products developed within DORA can be used as a toolbox and can be integrated into further (existing or new) service applications, such as in operation centres of landside and air transport, airlines or travel agencies, addressing the overall mobility needs of different user groups.
- DORA puts special emphasis on research, development and transferability of innovative systems to optimize the procedures in the terminal based on new technologies, such as indoor location and waiting time detection and their integration into user-specific indoor-routing services. Especially the included development of new cooperation and business models supports the transferability to other cities.
- DORA puts emphasis on the reproducibility and transferability of both the technology and the business framework, i.e. the innovation as a whole. Hence the business and cooperation framework will be easily adaptable to other contexts and configurations beyond the DORA project, to allow market expansion of the product to further areas, which in return will provide incentives for continued innovation.
- DORA established an Advisory Board Group composed of representatives of various stakeholder groups. This group is included in the evaluation activities (e.g. for expert interviews). This also assures that they function as supporters of the dissemination of DORA, promoting it and creating an interest in adopting and further developing DORA beyond the project lifetime. Some of this communication will utilize organizational-level targeting, and some will utilize the personal and business contacts of the consortium partners to allow for highly targeted contacts. Another key element in this layer will be the ability to invite professionals to workshops and on-site showcase demonstrations at PMI and TXL.
The findings from the results, usability and process evaluation are essential for all dissemination activities. They provide proven and valid facts and reasons, as well as the necessary preconditions to communicate and promote the DORA system. Together with the structural drivers for transferring DORA to other cities, the results from the evaluation will build a strong base for this transfer.
4 ISSUES THAT CAN JEOPARDIZE THE EVALUATION
It is hard to foresee what will happen over the course of the DORA project, but it is
certainly helpful to think about issues that could have a negative effect on the planned and
necessary evaluation activities. This makes it possible to prevent these problems from the
beginning or to prepare a set of possible solutions for overcoming them.
The success of the DORA project mainly depends on the technical development of the
components and the integrated services that will constitute the final project result. Thus,
technical problems would have a crucial effect on the evaluation. They can be caused by
changes in global technical development or by unexpected technical problems, such as
incompatibilities between systems or a lack of information necessary for the development
of the system, although these risks are minimized by the specification of requirements in
T2.4. Such problems could lead to delays and make it impossible to keep to the timeline
and finalise the evaluation of the DORA system within the lifetime of the project.
In addition to technical problems, other issues could have a negative effect on the
evaluation activities. These include partner-related risks, planning problems, collaboration
issues and other external risks described in the DOW chapter “Critical Implementation
risks and mitigation actions”.
A particularly risky issue is linked to the usability evaluation. The success of the three
usability evaluation steps strongly depends on identifying appropriate test persons, who
should ideally represent the pre-defined user groups. They should be familiar with technical
details and terms so that they can assess which usability problems exist in the different
stages of the system. There are, however, only limited means to incentivise these experts to
participate in the evaluation, which could make it necessary to pursue an alternative, less
valid usability evaluation method. However, the DORA project is large and well connected
to various partners who can be motivated and mobilised to join the evaluation activities.
Another issue refers to the evaluation of impacts described in chapter “2.3.3 Results
Evaluation”. The validity of this evaluation increases considerably if it can be carried out as
a control site evaluation. This, however, depends on a sufficiently high number of test users
at both airports. If this number cannot be reached, the other methods described will have to
be used. This could increase the effort needed to re-schedule and re-plan this evaluation
activity and might lead to less satisfactory evaluation results.
To address evaluation risks from the very beginning, the project seeks to detect delays at
an early stage by carrying out a ‘process evaluation’. This evaluation activity helps to
identify occurring barriers in the development process in a systematic way. The assessment
within the process evaluation also includes the elaboration of solutions to overcome these
barriers.
This compilation of possible issues that could jeopardize the evaluation within DORA does
not claim to be complete; it is not possible to know what will happen in the future. It is,
however, intended to sensitise all partners involved in the evaluation in order to ensure
high-quality evaluation results.
5 OPERATIONALIZATION / ROLES AND
RESPONSIBILITIES
This chapter gives a quick overview of the DORA objectives and the related Work Packages.
It also shows the involvement of the partners in the tasks of Work Package 6 and the
timeline for the deliverables of the single tasks as described in the DOW. Further details on
the distribution of work between the partners, the description of the tasks and the delivery
dates are included in chapter ‘2.3 Methods and Operationalization’ (see the headlines
‘Operationalization’ at the end of each subchapter).
Table 15 DORA objectives and the related Work Packages

Objective | Related Work Packages
6.2.1: Intermodal Platform for Air and Land Transport | WP4, based on requirements and concepts from WP2 and WP3
6.2.2: Mobile Smartphone Application for Seamless Mobility Information and SDK | WP4, based on requirements and concepts from WP2 and WP3
6.2.3: Personal Information Service | WP4 – service development, WP6 – service validation
6.2.4: Integration with Incident and Information Management for Airports | WP5
6.2.5: System for Detection of Waiting Time in Airports | WP4, based on requirements and concepts from WP2 and WP3
6.2.6: Indoor Location and Navigation Service | WP4, based on requirements and concepts from WP2 and WP3
7: New Cooperation and Business Framework | WP3 – general DORA framework, WP7 – exploitation of the framework towards the stakeholders
8: Preparation of a usable platform for planning trips | WP5 and WP6
9: Wide Public Awareness | WP7
Roles and Responsibilities of WP6
TUB is WP leader and leads Tasks 6.1, 6.3 and 6.4. The contributions will be reports
that deliver information either on the improvement of the system during the
development phase or on the quality and impacts of the system.
VMZ and UPVLC will contribute to the definition of the evaluation plan and the technical evaluation related to routing and platform together with indoor routing and navigation and usability assessments.
ETRA will contribute to the impact evaluation and the user satisfaction assessment within the Palma de Mallorca pilot.
ETRA will support the preparation and conduction of surveys.
FBB will provide a feedback of the test results to the evaluation team.
AENA and Eureva will participate in the impact evaluation by contributing to the definition of relevant indicators for the measurement and providing feedback to the preparation of surveys.
CSE will lead Task 6.2 and perform technical validation and performance evaluation of the overall platform as well as of the separate components, providing testing mechanisms that verify the operation of the platform against the defined requirements.
LUT supports creating the evaluation plan and conducting impact and process evaluation while ensuring interaction with the Advisory Group.
EMT will provide relevant indicators for the measurement of the PMI pilot.
VBB will contribute to Task 6.2 Technical Evaluation and Task 6.4 Evaluation based on project results, focusing on alternative and/or backup approaches for proper and uninterrupted service provisioning, as well as on barriers, drivers and lessons learnt with regard to the continuous operation of DORA services.
The following figure gives an overview of the planned resources (man-months) of each
partner involved in Work Package 6.
Figure 9 WP 6 Effort per partner
Reporting
The following deliverables have to be prepared within Work Package 6:
Table 16 DORA deliverables and their due dates

Deliverable | Time
D6.1 Evaluation plan | M9 – Feb. 2016
D6.2 Technical Evaluation and Assessment report | M30 – Nov. 2017
D6.3 Usability Assessment report | M24 – May 2017
D6.4 Final evaluation report | M36 – May 2018
These deliverables are split up into several sub-tasks or working steps. The following
overview compiles these sub-tasks and briefly describes their corresponding methods, the
time frames of these activities, the outcomes, the respective deliverable and the deadline
for finalisation.
Note: Boxes marked in yellow describe deliverables and their deadlines for submission to
the EC. The other deadlines refer to working steps necessary to complete the deliverables.
These might be subject to change depending on the progress of the respective
developments.
Figure 9 data – effort per partner (man-months): TUB 29.00, CSE 6.00, UPVLC 5.00, ETRA 4.50, VMZ 4.00, LUT 3.00, Eurescom 2.00, FBB 2.00, VBB 1.00, EMT 0.50, Eureva 0.50
Table 17 Overview of DORA Evaluation Tasks

Technical Evaluation and Assessment: Work Package 6, Task 2

Sub-task | Method | Timeframe | Outcome / Deliverable | Deadline
Validation of Interfaces | Verification of information flows across all interfaces; simulation of Use Case scenarios | Months 13-15 | Intermediate report to be included in the final D6.2 | Month 16
Open API Evaluation | Test data and stimuli executed against DORA Open API interfaces | Months 15-21 | Intermediate report to be included in the final D6.2 | Month 22
B2B API Evaluation | Test data and stimuli executed against DORA B2B API interfaces | Months 15-19 | Intermediate report to be included in the final D6.2 | Month 20
Basic Transactions Evaluation among DORA components | Unit, integration and performance tests | Months 16-21 | Intermediate report to be included in the final D6.2 | Month 22
Initial App Evaluation | Testing of apps with service interfaces and functionality mock-ups | Months 18-22 | Intermediate report to be included in the final D6.2 | Month 23
App Evaluation with real data | Evaluation of apps with integrated services | Month 24 | Intermediate report to be included in the final D6.2 | Month 25
Prototype Evaluation | Laboratory and custom testbed experiments | Months 10-16 | Intermediate report to be included in the final D6.2 | Month 17
Prototype Evaluation | On-site evaluation and performance measurements | Months 16-19 | Intermediate report to be included in the final D6.2 | Month 20
Overall Initial Integration Testing | Integration tests in staging environment with emulated input | Months 20-23 | Intermediate report to be included in the final D6.2 | Month 24
Final Integration Testing | On-site testing with equipment | Months 24-29 | Final D6.2 integrating all previous intermediate reports | Month 30

Usability Evaluation: Work Package 6, Task 3

Sub-task | Method | Timeframe | Outcome / Deliverable and Deadline
Usability test of conceptual variants | Participatory workshop | Months 11-14 | Short intermediate report (internal): Month 14; D6.3 Usability Assessment report: Month 24
Usability test of first prototype | Survey with standardized questionnaires | Months 16-24 | Short intermediate report (internal): tbd; D6.3 Usability Assessment report: Month 24
Last usability test before trial phase | Tests: reporting in user diaries, discussion in focus group workshop | Months 16-24 | Short intermediate report (internal): tbd; D6.3 Usability Assessment report: Month 24

Results evaluation: Work Package 6, Task 4

Tasks | Method | Timeframe | Outcome / Deliverable | Deadline
Impact evaluation (Task 6.4.1); evaluation of user satisfaction (Task 6.4.2) | Control site assessment, time measurements, user tests, surveys with questionnaire and interviews | Months 19-36 | D6.4 Final evaluation report | Month 36

Process evaluation: Work Package 6, Task 4

Sub-tasks | Method | Timeframe | Outcome / Deliverable | Deadline
Preparation and conduct of first phase | Survey with standardized questionnaire (to project partners) | Months 10-12 | Short intermediate report (internal) | Month 12
Conduct of second phase | Survey with standardized questionnaire (to project partners) | Months 22-23 | Short intermediate report (internal) | Month 23
Conduct of final phase and reporting | Survey with standardized questionnaire (to project partners) | Months 33-34 | Short intermediate report (internal) | Month 34
Analysis of all results | Reporting in the “Final evaluation report” | Months 35-36 | D6.4 Final evaluation report | Month 36
6 REFERENCES
[1] http://ec.europa.eu/programmes/horizon2020/what-horizon-2020, seen: November
2015
[2] Chesbrough, H. (2003) Open Innovation: The New Imperative for Creating and Profiting
from Technology. Boston, MA: Harvard Business Press. 227 p. ISBN 1-57851-837-7.
[3] Adner, R. and Kapoor, R. (2010) ‘Value creation in innovation ecosystems: how the
structure of technological interdependence affects firm performance in new
technology generations’, Strategic Management Journal, vol. 31, iss. 3, pp. 306-333.
[4] Dziekan K., Riedel V., Müller S., Abraham M., … (2013) Evaluation Matters – A
Practitioners’ Guide to Sound Evaluation for Urban Mobility Measures. Münster / New
York / München / Berlin: Waxmann. ISBN 978-3-8309-2881-2
[5] Atteslander P. (2010) Methoden der empirischen Sozialforschung. Berlin: Erich Schmidt
Verlag GmbH & Co. KG. ISBN 978-3-503-12618-7
[6] Sarodnick F., Brau H. (2011) Methoden der Usability Evaluation – Wissenschaftliche
Grundlagen und praktische Anwendung. Bern: Verlag Hans Huber, Hogrefe AG. ISBN
978-3-456-84883-9
[7] http://www.iso.org/iso/catalogue_detail.htm?csnumber=35786, seen: February 2016
[8] http://www.usability.gov/how-to-and-tools/methods/planning-usability-testing.html,
seen: February 2016
[9] https://www.w3.org/WAI/intro/wcag, seen: February 2016
[10] http://www.usabilitybok.org/formative-evaluation, seen: February 2016
[11] http://www.ueq-online.org/, seen February 2016
[12] http://www.usetree.de/wp-content/uploads/2015/07/Uebersicht-Usability-
Methoden.pdf, seen: February 2016
7 ANNEX
GUIDELINES FOR PLANNING A USABILITY TEST
(According to U.S. Department of Health & Human Services)
“Planning a Usability Test
One of the first steps in each round of usability testing is to develop a plan for the test. The
purpose of the plan is to document what you are going to do, how you are going to conduct
the test, what metrics you are going to capture, number of participants you are going to test,
and what scenarios you will use.
Typically, the usability specialist meets with the site or product owner and members of the
development team to decide on the major elements of the plan. Often, the usability specialist
then drafts the plan, which circulates to management and the rest of the team. Once
everyone has commented and a final plan agreed upon, the usability specialist revises the
written plan to reflect the final decisions.
Elements of a Test Plan
Scope: Indicate what you are testing: Give the name of the Web site, Web application,
or other product. Specify how much of the product the test will cover (e.g. the
prototype as of a specific date; the navigation; navigation and content).
Purpose: Identify the concerns, questions, and goals for this test. These can be quite
broad; for example, "Can users navigate to important information from the
prototype's home page?" They can be quite specific; for example, "Will users
easily find the search box in its present location?" In each round of testing, you
will probably have several general and several specific concerns to focus on.
Your concerns should drive the scenarios you choose for the usability test.
Schedule & Location: Indicate when and where you will do the test. If you have the schedule
set, you may want to be specific about how many sessions you will hold in a day
and exactly what times the sessions will be.
Sessions: You will want to describe the length of the sessions and schedule the meetings
with the participants. If tests are carried out individually, remember to leave
time, usually 30 minutes, between sessions to reset the environment.
Equipment: Indicate the type of equipment you will be using in the test; desktop, laptop,
mobile/Smartphone. If pertinent, include information about the monitor size
and resolution, operating system, browser etc. Also indicate if you are planning
on recording or audio taping the test sessions or using any special usability
testing and/or accessibility tools.
Participants: Indicate the number and types of participants you will be recruiting.
Describe how these participants were or will be recruited and consider
including the screener as part of the appendix.
Scenarios: Indicate the number and types of tasks included in testing. Typically, for a 60
min. test, you should end up with approximately 10 (+/-2) scenarios for desktop
or laptop testing and 8 (+/- 2) scenarios for a mobile/smartphone test. You may
want to include more in the test plan so the team can choose the appropriate
tasks.
Metrics: Subjective metrics: Include the questions you are going to ask the participants
prior to the sessions (e.g., background questionnaire), after each task scenario
is completed (ease and satisfaction questions about the task), and overall ease,
satisfaction and likelihood-to-use/recommend questions when the session is
completed.
Quantitative metrics: Indicate the quantitative data you will be measuring in your test (e.g.,
successful completion rates, error rates, time on task).
Roles: Include a list of the staff who will participate in the usability testing and what
role each will play. The usability specialist should be the facilitator of the
sessions. The usability team may also provide the primary note-taker. Other
team members should be expected to participate as observers and, perhaps, as
note-takers.”[8]