
Vehicle-to-Vulnerable roAd user cooperaTive communication and sensing teCHnologies to imprOVE transpoRt safety

D3.1 – System architecture and functional specifications Project contract n.: FP6-2004-IST-4 – 027014

Workpackage, workpackage title: WP3, Overall System Specification

Task, task title: T3.1, Functional Specification – T3.2, System Architecture

Deliverable n.: D3.1

Document title: D3.1 – System architecture and functional specifications

Deliverable type: PUBLIC

Document preparation date: 15.10.2007

Authors:

L. Andreone (CRF), F. Visintainer (CRF), D. Gavrila (DC), U. Beutnagel-Buchner (BOSCH), M. Pieve (PIAGGIO), U. Neubert (TUC), R. Benso (FAB), L. Blankers (LCMG), R. Kloibhofer (ARC), A. Sikora (SFIDK), K. Meinken (USTUTT), R. Montanari (UNIMORE), D. Margaritis (HIT), M. Fowkes (MIRA)

Project co-funded by the European Commission

DG-Information Society and Media

in the 6th Framework Programme


This project has been co-funded by the European Commission DG-Information Society and Media in the 6th Framework Programme. The content of this publication is the sole responsibility of the project partners listed herein and does not necessarily represent the view of the European Commission or its services.


Document Control Sheet

Project name: WATCH-OVER

Workpackage, workpackage title: WP3, Overall System Specification

Task, task title: T3.1, Functional Specification – T3.2, System Architecture

Document title: WATCH-OVER_D3.1-System architecture and functional specifications

Main author(s): F. Visintainer, L. Andreone (CRF)

Other author(s): A. Guarise (CRF), D. Gavrila (DC), U. Beutnagel-Buchner (BOSCH), M. Pieve, P. Cravini (PIAGGIO), U. Neubert, B. Fardi (TUC), R. Benso (FAB), M. Konijn, L. Blankers, G-J. van Bakel (LCMG), R. Kloibhofer, G. Triebnig (ARC), A. Sikora (SFIDK), K. Meinken (USTUTT), R. Montanari, L. Etzler, C. Ferrarini, I. Ducci (UNIMORE), A. Mousadakou, V. Portouli (HIT), M. Fowkes (MIRA)

Date of submission to consortium: 15.10.2007

Date of submission to European Commission: 15.10.2007

Revision history: see next page


Revision History (version – date – author(s) – description)

1.0 – 04.09.2006 – A. Guarise, L. Andreone – Document structure
1.1 – 22.09.2006 – D. Gavrila, U. Beutnagel-Buchner, M. Pieve, M. Konijn, R. Montanari, C. Ferrarini – Document structure revision
2.0 – 16.10.2006 – All authors involved so far – Structure final revision
3.0 – 24.11.2006 – A. Guarise, L. Andreone, D. Gavrila – First input to the document content
3.1 – 15.12.2006 – M. Pieve, P. Cravini, U. Beutnagel-Buchner, U. Neubert, B. Fardi – Partners' input. Added Sections 3.1.2, 4.2.1, 4.2.3, 4.3.2 and 4.3.3
3.2 – 16.03.2007 – R. Benso, G-J. van Bakel, R. Kloibhofer, A. Sikora, K. Meinken, R. Montanari, L. Etzler, C. Ferrarini, M. Pieve, I. Ducci, A. Mousadakou, V. Portouli, M. Fowkes, G. Triebnig – Added Section 3.1. Revision of Sections 3.3 and 4.2.2. Edition of chapter 5
3.3 – 12.04.2007 – L. Etzler, K. Meinken – Revision of chapter 5
4.0 – 18.04.2007 – A. Guarise, L. Andreone – Revision of the first complete document draft
5.0 – 5.06.2007 – F. Visintainer – Added CRF demonstrator description; updated data fusion; drafted technical annex; added TUC and LCMG contribution; general revision; added executive summary and conclusion
5.0 – 6.06.2007 – A. Mousadakou – Reviewed chapters 2 and 5
6.0 – 8.06.2007 – A. Sikora – Added some aspects (thanks to LogicaCMG) into 3.3; reworked 4.3.3
6.0 – 11.06.2007 – L. Andreone – Revision of the document and inputs to all paragraphs
6.2 – 25.06.2007 – F. Visintainer – Added contribution of SFIDK, DC and Piaggio; revised document
6.3 – 13.07.2007 – L. Andreone – Document revision
6.4 – 18.07.2007 – F. Visintainer – Added "Classification and main features of WATCH-OVER actors". Harmonised document according to the consolidated classification
6.5 – 30.08.2007 – F. Visintainer – Integrated UNIMORE corrections and last contributions of Piaggio, SFIDK and TUC
6.6 – 10.09.2007 – L. Andreone – Complete revision of the document
6.6 – 27.09.2007 – R. Kloibhofer, D. Margaritis, A. Sikora, U. Neubert – Revised and modified chapters 2, 3, 4
6.7 – 27.09.2007 – F. Visintainer – Changes according to Bosch contribution; changes in titles of 4.3.1, 4.3.2, 4.3.3
6.8 – 28.09.2007 – F. Visintainer – General revision
6.9 – 07.10.2007 – D. M. Gavrila – Final revision
7.0 – 10.10.2007 – F. Visintainer – Accepted revisions; overall check


List of abbreviations

ABS    Antilock Braking System
CA     Collision Avoidance
CAN    Controller Area Network
CMOS   Complementary Metal-Oxide Semiconductor
CSS    Chirp Spread Spectrum
DG     Directorate General
EC     European Commission
ECU    Electronic Control Unit
EU     European Union
GNSS   Global Navigation Satellite System
GPS    Global Positioning System
HMI    Human Machine Interface
ID     Identifier
IR     Infra-Red
ISO    International Standard Organisation
IST    Information Society Technologies
LAN    Local Area Network
LBT    Listen Before Talk
LCD    Liquid Crystal Display
LIDAR  Light Detection And Ranging
LVDS   Low Voltage Differential Signaling
MCU    Micro Controller Unit
MCPed  Pedestrian communication module
MCPtw  PTW communication module
MCVe   Vehicle communication module
MCVru  VRU communication module
OBU    On Board Unit
PC     Personal Computer
PTW    Powered Two Wheeler
RADAR  Radio Detection And Ranging
RF     Radio Frequency
RFID   Radio Frequency Identification
ROI    Region of Interest
Rx     Receiver
Ta.b   Task (number a.b)
TTFF   Time To First Fix
Tx     Transmitter
UC     Use Case
VRU    Vulnerable Road User
WPx.y  Work Package (number x.y)


Table of contents

Document Control Sheet .......................................................... 2
Revision History ................................................................ 3
List of abbreviations ........................................................... 4
Table of contents ............................................................... 5
List of Figures ................................................................. 6
List of Tables .................................................................. 6
1. Executive Summary ............................................................ 7
2. Use cases description: application scenarios, operative conditions and basic requirements ... 8
   2.1. Scenarios and use cases ................................................. 8
3. System specifications ........................................................ 12
   3.1. Overall system requirements and specifications ......................... 12
   3.2. Vision sensing specifications ........................................... 14
        3.2.1. Functional specifications of vision sensing ..................... 14
        3.2.2. Camera specifications of vision sensing ......................... 17
   3.3. Communication specifications ............................................ 19
4. System architecture .......................................................... 21
   4.1. Overall system architecture ............................................. 21
   4.2. Components architectural design ......................................... 23
        4.2.1. Vision based sensor .............................................. 23
        4.2.2. Communication device ............................................. 23
        4.2.3. Data fusion module ............................................... 28
5. Conclusions .................................................................. 33
References ...................................................................... 34


List of Figures

Figure 1: WATCH-OVER vision sensor detection area .............................. 16
Figure 2: Undesired optical effects in an imager ............................... 19
Figure 3: representation of WATCH-OVER actors .................................. 21
Figure 4: Reference architecture and components selection ...................... 22
Figure 5: Architecture of the Vision sensor .................................... 23
Figure 6: Architecture of the on board communication device .................... 24
Figure 7: Architecture of the wearable communication device .................... 25
Figure 8: System architecture of the CSS-based system for relative localization and communication ... 25
Figure 9: System architecture of the CSS-based system for relative localization, self-positioning and communication ... 26
Figure 10: Logical view on GNSS-based system (dashed area is optional) ......... 28
Figure 11: WATCH-OVER data fusion principle .................................... 29
Figure 12: WATCH-OVER data fusion for VRU detection ............................ 31
Figure 13: CRF demonstrator vehicle
Figure 14: CRF prototype schematic architecture
Figure 15: Information processing structure of sensing devices
Figure 16: PC architecture and communication structure
Figure 17: Architecture of the Video sensing platform
Figure 18: WATCH-OVER DC demonstrator vehicle: Mercedes-Benz E-class limousine
Figure 19: PC architecture and CAN communication in the Daimler vehicle
Figure 20: Overview of the driver warning and automatic braking concept in the Daimler demonstrator
Figure 21: A variable list of objects where each object is split into m CAN messages
Figure 22: PTW WATCH-OVER demonstrator – Piaggio MP3
Figure 23: PTW system architecture
Figure 24: Space under the vehicle seat
Figure 25: Space in the rear carrier

List of Tables

Table 1: Selected scenarios for the functional specifications .................. 8
Table 2: Use case 1 summary description ........................................ 9
Table 3: Use case 2 summary description ........................................ 10
Table 4: Scenarios of medium occurrence ........................................ 11
Table 5: Classification and main features of WATCH-OVER actors ................ 14
Table 6: Specifications of vision sensing component ............................ 15
Table 7: Specifications of the imager and the lens system ...................... 18
Table 8: Communication device specifications ................................... 20
Table 9: Communication tasks and special issues of the different radio modules ... 27
Table 10: Expected effects of data fusion ...................................... 32


1. Executive Summary

The WATCH-OVER project (2006-2008) is a specific targeted project co-funded by the European Commission, Information Society and Media, ICT for Transport, in the strategic objective "eSafety Co-operative Systems for Road Transport". Its main objective is the design and development of a cooperative system for the prevention of accidents involving vulnerable road users (VRUs) in urban and extra-urban areas. The system is based on short range communication and vision sensors. WATCH-OVER examines the detection of vulnerable road users in complex but realistic traffic scenarios where VRUs (i.e. pedestrians, cyclists and motorcyclists) mix with non-VRU traffic (i.e. cars, trucks).

This Deliverable provides the functional specifications and architecture of the WATCH-OVER system, which were investigated in Work Package 3, Tasks 3.1 and 3.2 respectively. The outline of the document is as follows. The Deliverable first summarizes the use cases, as reported in Deliverable 2.1, which form the basis of the envisioned WATCH-OVER system. Although WATCH-OVER will consider multiple scenarios, its core objective is to adequately handle the most frequent ones, namely those involving a crossing VRU (with or without occlusion) while the ego vehicle moves straight.

The Deliverable subsequently covers the functional system specifications. First, an overview of the components of the WATCH-OVER system is given, together with its instantiations on board the CRF and Daimler vehicle demonstrators, on the Piaggio powered two wheeler and on the pedestrian modules. Thereafter, the individual system components (vision sensing, communication and data fusion) are specified. It should be noted that the functional system specifications should not be read as the requirements of a commercially viable VRU protection system, but rather as the specification of the system that WATCH-OVER aims to develop within this project.

As becomes apparent in the document, the WATCH-OVER system specifications represent a clear improvement over the state of the art, as represented by the precursor EU project on VRU protection, SAVE-U (2002-2005). Among the main improvements are:

• improved vision sensing (e.g. new Bosch NIR camera, reduction of the false detection rate by a factor of 3-5, increase in processing speed to 10 Hz, extension of the detection range down to 3 m);

• the addition of a communication module (with a much wider detection range of up to 100 m, allowing detection of even completely occluded VRUs);

• data fusion between vision sensing and communication for improved robustness.

The Deliverable thereafter covers the system architecture, subdivided by the main system components. Finally, this Deliverable provides a description of the three demonstrators envisioned within WATCH-OVER: two vehicles (CRF and Daimler) and one motorcycle (Piaggio). In particular, the HMI component is addressed, as currently planned.


2. Use cases description: application scenarios, operative conditions and basic requirements

The automotive industry faces two major areas of research for the protection of road users: secondary safety (systems aimed at reducing the consequences of accidents) and primary safety (driving support systems providing safety related information that can be used to warn the drivers or to undertake specific emergency manoeuvres). Projects like WATCH-OVER are further developing stand-alone systems for obstacle identification, and in parallel they are investigating the extent to which novel wireless communication technologies can complement the sensor based systems, in order to cover a wider range of road scenarios. The WATCH-OVER system is therefore a cooperative system that uses communication as a ranging system in combination with a vision sensor based system. It fuses the information from both sources to warn the driver about the presence of a potential obstacle (pedestrian, bicyclist, rider) and enables the activation of specific vehicle actuators (e.g. the brake) in order to avoid or mitigate the accident.

2.1. Scenarios and use cases

In [WATCH-OVER:D2.1], the selected use cases were divided into two groups, the first one with an estimated high occurrence and high relevance for road safety and the second one including use cases of expected lower occurrence. Table 1 lists the two selected use cases of the first group. The corresponding use case IDs from [WATCH-OVER:D2.1] are listed in the second column.

No. | Use case IDs in D2.1 | Description
1   | 1                    | Pedestrian (or cyclist) crossing the road.
2   | 2-3                  | Pedestrian (or cyclist) crossing the road, occluded by parked or stopped cars or other obstacles.

Table 1: selected scenarios for the functional specifications

These use cases have been parameterised to enable the subsequent definition of the system functionality and specifications. The key parameters, identified by experts and evaluated by end-users by means of an on-line questionnaire, are the following:


- type of vehicle
- type of vulnerable road user (VRU)
- relative trajectories
- vehicle speed
- vulnerable user speed
- time to collision
- time of the day
- weather conditions

Table 2 lists the parametrisation of scenario UC1.

CASE NAME: VRU crossing the road
CASE ID: UC1
STATUS: Final
GOAL: High estimated occurrence
TYPE OF VEHICLE: Car, truck
TYPE OF VRU: Pedestrian, cyclist
TYPE OF ROAD: Straight road
RELATIVE TRAJECTORIES: Perpendicular crossing trajectories
VEHICLE SPEED: Up to 50 km/h
VRU SPEED: Slow (walking pedestrian, bicycle's speed less than 25 km/h); Fast (running pedestrian, bicycle's speed more than 25 km/h)
TIME TO COLLISION: Detection time > time to collision: avoidable accident; Detection time ≅ time to collision: unavoidable accident, collision mitigation still possible
TIME OF THE DAY: Day; Dusk/Dawn; Night
WEATHER: All weather conditions
SCENARIO DESCRIPTION:
  Step 1: A vehicle is driving in its own lane
  Step 2: A VRU is crossing the road
  Step 3: The vehicle is warned. Depending on the time advance, the vehicle will avoid the accident or mitigate its consequences

Table 2: Use case 1 summary description

Table 3 lists the parametrisation of UC2.

CASE NAME: VRU crossing the road, occluded by parked or stopped cars or other obstacles
CASE ID: UC2
STATUS: Final
GOAL: High estimated occurrence
TYPE OF VEHICLE: Car, truck
TYPE OF VRU: Pedestrian, cyclist
TYPE OF ROAD: Straight road
RELATIVE TRAJECTORIES: Perpendicular crossing trajectories
VEHICLE SPEED: Up to 50 km/h
VRU SPEED: Slow (walking pedestrian, bicycle's speed less than 25 km/h); Fast (running pedestrian, bicycle's speed more than 25 km/h)
TIME TO COLLISION: Detection time > time to collision: avoidable accident; Detection time ≅ time to collision: unavoidable accident, collision mitigation still possible
TIME OF THE DAY: Day; Dusk/Dawn; Night
WEATHER: All weather conditions
SCENARIO DESCRIPTION:
  Step 1: A vehicle is driving in its own lane
  Step 2: A VRU is crossing the road, coming out from behind a parked car or another occluding obstacle
  Step 3: The vehicle is warned. Depending on the time advance, the vehicle will avoid the accident or mitigate its consequences

Table 3: Use case 2 summary description
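Both use cases share the same parameter set. As a purely illustrative aid (field names are hypothetical and not part of the deliverable), such a parameterisation could be encoded as a simple data structure for later filtering in the risk-assessment logic:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UseCase:
    """Hypothetical encoding of a WATCH-OVER use-case parameterisation (illustrative only)."""
    case_id: str
    name: str
    vehicle_types: List[str]
    vru_types: List[str]
    road: str
    trajectories: str
    max_vehicle_speed_kmh: float            # up to 50 km/h for UC1/UC2
    vru_occluded: bool                      # UC2: VRU emerges from behind an obstacle
    times_of_day: List[str] = field(default_factory=lambda: ["day", "dusk/dawn", "night"])
    weather: str = "all"

UC1 = UseCase("UC1", "VRU crossing the road",
              ["car", "truck"], ["pedestrian", "cyclist"],
              "straight road", "perpendicular crossing", 50.0, vru_occluded=False)
UC2 = UseCase("UC2", "VRU crossing the road, occluded by parked/stopped cars",
              ["car", "truck"], ["pedestrian", "cyclist"],
              "straight road", "perpendicular crossing", 50.0, vru_occluded=True)
```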

The second group of scenarios is estimated to have a lower occurrence and a medium expected impact on road safety.

No. | Use case IDs in D2.1 | Description
3   | 4                    | Vehicle turning left at an intersection, pedestrian crossing the road from the right to the left (or from the left to the right).
4   | 6-9                  | Vehicle turning right at an intersection, pedestrian (or cyclist) crossing the road from the right to the left (or from the left to the right).
5   | 7-8                  | Vehicle on a crossroad, pedal cyclist crossing the road from the right (or from the left).
6   | 12-14                | Powered Two Wheeler (PTW) arrives from the left side (or from the right side) at an intersection, paths perpendicular.
7   | 13                   | PTW arrives from the left side at an intersection, paths perpendicular, occluded by a parked car or other obstacles.
8   | 16                   | PTW (or pedal cyclist) and vehicle travelling in opposite directions, vehicle turns in front of the PTW.

Table 4: Scenarios of medium occurrence


In these scenarios, powered two wheelers are considered Vulnerable Road Users, like cyclists and pedestrians. Their functionalities are shown in the next section (Table 5). These scenarios will be evaluated with respect to their feasibility with the technologies under development; they will be addressed in the testing phase, where system performance will also be evaluated against system applicability. The core objective of the system development is to design a system that is robust at least for use cases 1 and 2, which correspond to the most frequent scenarios. The feasibility of the remaining use cases will be evaluated during the development and testing phases.

3. System specifications

3.1. Overall system requirements and specifications

The WATCH-OVER project goal is the design and development of an integrated cooperative system for the prevention of accidents involving vulnerable road users (VRUs) in urban and extra-urban areas. The system will be based on the cooperation among on-board modules and devices inserted in wearable objects (e.g. helmets, shoes, watches, jackets, backpacks, consumer electronics) or integrated in Powered Two Wheelers. The project foresees the adaptation of low cost communication technologies, in combination with sensor technologies, in order to cover the most critical road scenarios (see chapter 2). The WATCH-OVER cooperative system should guarantee:

• a wider scenario coverage including blind spots;
• simple and low cost on-board systems;
• a flexible and open architecture.

The core of the system is the interaction between an on-board module and a user module. This interaction exploits innovative wireless short range communication technologies (as an extension of autonomous vehicle systems). The cooperative low cost platform will extend the current coverage of state-of-the-art technologies and will be open to the integration of localisation technologies. The main functionalities which will be supplied by the WATCH-OVER on-board platform are:

• real-time detection of pedestrians, cyclists, motorcyclists equipped with the WATCH-OVER module;

• calculation of the relative position of the user with respect to the vehicle (relative motion analysis);
• detection of dangerous situations (external scenario reconstruction, filtering of specific situations);
• appropriate warning to the driver, providing information only in really dangerous situations.

The vulnerable road user module will be able:

• to promptly respond to the vehicle's interrogation, delivering its identification parameters;
• to send back self-localisation parameters;
• to give feedback to the road user via an appropriate HMI (visual or acoustic warnings).
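As an illustration of the interaction listed above, the following minimal sketch shows one possible payload a VRU module could return when interrogated by a vehicle. All names and fields are hypothetical; the actual message format is defined by the communication work described in Sections 3.3 and 4.2.2.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class VruResponse:
    """Hypothetical VRU-module reply to a vehicle interrogation (illustrative only)."""
    module_id: int                            # identification parameters
    vru_class: str                            # "pedestrian", "cyclist" or "ptw"
    position: Optional[Tuple[float, float]]   # self-localisation (e.g. GPS lat/lon), if available
    speed_mps: Optional[float]                # kinematic data, if available
    timestamp_ms: int                         # time of measurement

def vru_hmi_warning(level: str) -> None:
    """Feedback hook on the VRU side: trigger the module's HMI (visual or acoustic warning)."""
    print(f"VRU HMI warning: {level}")
```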

In particular, a set of wireless short range technologies will be adopted for the users' detection mechanisms. The specific nature of the application requires the devices to be adapted to the different road scenarios. The selection of the interaction technologies, from a technical point of view, will be done taking into account the most relevant features, among which:

• communication issues (ranges, delays, robustness, etc.);


• electromagnetic, radiation characteristics (scenarios will be tested in specific testing environments);

• localisation capabilities;
• power consumption.

The following table summarises, at a high level, the WATCH-OVER system in terms of the actors involved, related means of transport, equipment and high level functions. The details of all the system components and units will be described and defined in the following sections, and they will satisfy the use cases, requirements and scenarios described above. It should be noted that the specifications involve certain expectations and predictions based on algorithms and hardware still under development. The final WATCH-OVER system capabilities may deviate from the specifications described in this document.


Actor: Ego-Vehicle driver
  Means of transport: Vehicle with pre-crash system (vision sensing only, Daimler demonstrator)
    - Vision sensor: detect and track VRUs in the frontal field of view
    - Processing system: calculate collision risk
    - HMI: driver warning and/or automatic vehicle braking
  Means of transport: Vehicle with full co-operative system for preventive safety (CRF demonstrator)
    - Vision sensor: detect and track VRUs in the frontal field of view
    - Radio frequency sensor / communication device: send/receive RF signals for detection and localization; send warning signal to VRU
    - Data fusion system: calculate collision risk
    - HMI: give warning to the ego vehicle driver

Actor: Vulnerable Road User (VRU)
  Means of transport: Powered Two Wheeler
    - Communication device: exchange RF signals with the ego-vehicle for localization (the latter is performed on the ego-vehicle); receive warning signal from the ego-vehicle
    - HMI (display + speaker): display warning to the VRU; give acoustic signal to the VRU
  Means of transport: Bike
    - Communication device: exchange RF signals with the ego-vehicle for localization (performed on the ego-vehicle); receive warning signal from the ego-vehicle
    - Basic HMI (audio, vibration): give acoustic signal and/or vibration to the VRU
  Means of transport: Pedestrian
    - Communication device: exchange RF signals with the ego-vehicle for localization (performed on the ego-vehicle); receive warning signal from the ego-vehicle
    - Basic HMI (audio, vibration): give acoustic signal and/or vibration to the VRU

Table 5: Classification and main features of WATCH-OVER actors

3.2. Vision sensing specifications

3.2.1. Functional specifications of vision sensing

The specifications of the vision sensing component are summarised in the following table. In the remainder of this section, comparisons are made several times with the SAVE-U system on pedestrian protection; the relevant Deliverables are [SAVE-U:D7] and [SAVE-U:D27].


VRU definition:
  Pedestrian – Size / Clothing / Age: all; Pose: standing or moving, upright
  Cyclist – Size / Clothing / Age: all; Pose: standing or moving, upright
  PTW – Size: all; Pose: standing or moving, upright
Object velocities:
  Vehicle – max. 14 m/s (~ 50 km/h), going approx. straight (no turning)
  Pedestrian – max. 2 m/s (~ 7 km/h) in any direction
  Cyclist – max. 4 m/s (~ 14 km/h) in any direction
  PTW – max. 4 m/s (~ 14 km/h) in any direction
VRU detection range (camera + algorithms): 3 – 25 m
VRU tracking range: 1 – 25 m
Detection area (see Figure 1):
  Field of view – horizontal: 41°
  Field of view – vertical: 27°
System positional accuracy:
  Longitudinal – inaccuracy within 20% of distance, but not better than 2 m
  Lateral – inaccuracy within 10% of distance, but not better than 1 m
Number of detectable VRUs: all, after applying the "group rule" (see description below)
VRU occlusion: max. 10%
Handling of non-VRUs: not considered
System processing rate:
  Target acquisition time (init.) – max. 300 ms
  Target update rate (tracking) – min. 10 Hz
System recognition performance:
  Correct detection percentage (tracked objects) – min. 80%
  False detection rate (tracked objects) – less than 1 per 3 minutes (urban driving)

Table 6: Specifications of vision sensing component

VRU definition

WATCH-OVER will consider pedestrians, cyclists and PTWs of all sizes, shapes and clothing. All viewpoints will be considered (frontal, sideways, diagonal, etc.), but the assumption is that VRUs are standing upright (or cycling), i.e. not lying down on the road.

Object velocities

WATCH-OVER will cover vehicle speeds up to 50 km/h. This represents a realistic upper bound on vehicle speed, given that accident statistics show that most of the benefit in injury reduction comes in the 30-50 km/h range, and that speed limits in urban environments are usually 50 km/h. In SAVE-U the maximum vehicle speed was 40 km/h. As in SAVE-U, WATCH-OVER will consider the vehicle driving approximately straight (no turning). WATCH-OVER will cover typical slow-to-moderately-fast movements of pedestrians (up to 2 m/s = 7 km/h) and cyclists (up to 4 m/s = 14 km/h). This is the same as in the EU project SAVE-U. Faster VRU movements are left for future work.


Detection Area

The detection area is shown below.

Figure 1: WATCH-OVER vision sensor detection area

Given the pre-crash scenario of WATCH-OVER, the available sensing processing is concentrated in the area directly in front of the vehicle. The detection range of 3 m – 25 m is extended compared to that of SAVE-U (5 m – 25 m) in order to come closer to the vehicle. Once a VRU is detected, tracking continues down to 1 m in front of the vehicle.

System Positional Accuracy

Longitudinal positional inaccuracy is specified to lie within 20% of distance, but not better than 2 m. For example, given a (true) VRU at 20 m distance, the sensing system will have to report it within the distance range of 16 m – 24 m. Given a (true) VRU at 5 m, the sensing system will have to report it within the distance range of 3 m – 7 m. Stronger demands are placed on lateral positional inaccuracy. The latter is specified to lie within 10% of distance, but not better than 1 m. For example, given a (true) VRU at 20 m distance on the vehicle axis, the sensing system will have to report a lateral offset of maximum 2 m. Given a (true) VRU at 3 m distance on the vehicle axis, the sensing system will have to report a lateral offset of maximum 1 m.

Number of Detectable VRUs – the "group rule"

Often, pedestrians walk or bicyclists ride in groups. The specific "group rule" states that it is acceptable for a VRU sensor system not to detect all individual VRUs within a group, as long as each undetected one is not farther away from a detected one than the sensor positional tolerance defined above; all other VRUs need to be detected within the defined WATCH-OVER detection area.
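The positional-accuracy rule above combines a relative tolerance with an absolute floor. A minimal sketch of the resulting acceptance check (illustrative helper functions, not part of the specification) reproduces the worked examples:

```python
def positional_tolerance(distance_m: float, fraction: float, floor_m: float) -> float:
    """Allowed inaccuracy: a percentage of the true distance, but never tighter than the floor."""
    return max(fraction * distance_m, floor_m)

def within_spec(true_m: float, reported_m: float, fraction: float, floor_m: float) -> bool:
    """True if a reported position lies inside the allowed tolerance band."""
    return abs(reported_m - true_m) <= positional_tolerance(true_m, fraction, floor_m)

# Longitudinal: 20 % of distance, floor 2 m -> a VRU at 20 m may be reported at 16-24 m,
# a VRU at 5 m at 3-7 m. Lateral: 10 % of distance, floor 1 m.
assert positional_tolerance(20.0, 0.20, 2.0) == 4.0
assert positional_tolerance(5.0, 0.20, 2.0) == 2.0
assert positional_tolerance(20.0, 0.10, 1.0) == 2.0
assert positional_tolerance(3.0, 0.10, 1.0) == 1.0
```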


VRU Occlusion

The WATCH-OVER vision sensing component tolerates a small amount of VRU occlusion (e.g. by a parked car), namely a maximum of 10%. Cases where the occlusion is larger are to be addressed by the WATCH-OVER communication component (see Section 3.3).

Handling of non-VRUs

As the primary WATCH-OVER sensing configuration involves monocular vision and no depth measurements by a secondary camera or by active sensing (LIDAR, RADAR), detection requires specific image classifiers. Therefore, recognition is limited to those VRUs for which image classifiers are actually developed within the project.

System Processing Rate

The WATCH-OVER sensor system update rate is at least 10 Hz. Assuming 3 detections before a VRU track is initialised, the target acquisition time is at most 300 ms. This is an improvement versus SAVE-U, which had a sensor system update rate of 5 Hz. A realistic VRU protection system is expected to operate at about 20-25 Hz. The remaining speed "gap" is expected to be bridgeable by a special-hardware implementation, yet this is outside the scope of WATCH-OVER.

System Recognition Performance

The WATCH-OVER sensing component is to reach a correct detection percentage of at least 80% on the VRU trajectory level. At the same time, the false detection rate is to be less than 1 in 3 minutes for driving in urban traffic. This represents a reduction of the false detection rate by a factor of 3-5 versus SAVE-U (see [SAVE-U:D27]), as specified in the WATCH-OVER Technical Annex.
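The timing and false-alarm figures above follow directly from the update rate; a short worked check, assuming (as in the text) three consecutive detections before a track is initialised:

```python
update_rate_hz = 10.0          # minimum specified update rate
detections_to_init = 3         # detections needed before a VRU track is initialised

acquisition_time_ms = detections_to_init / update_rate_hz * 1000.0
assert acquisition_time_ms == 300.0   # matches the "max. 300 ms" acquisition specification

# False-detection budget: less than 1 per 3 minutes of urban driving,
# i.e. at most 20 falsely tracked objects per hour.
max_false_per_hour = 60 / 3
print(acquisition_time_ms, max_false_per_hour)
```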

3.2.2. Camera specifications of vision sensing

The camera specifications have been derived from the WATCH-OVER system specifications and additionally from requirements coming from Bosch's in-house product line approach. This approach ensures that the developed and investigated camera will also fit into the road map and generation planning of Bosch's vision products, and enables the chance for a direct product development and early exploitation after the WATCH-OVER project.

The specification of the camera has to solve a general dilemma: on the one hand, the camera should have a horizontal field of view which is as wide as possible, to detect objects as early as possible at near distances; on the other hand, the camera also has to provide a certain resolution in terms of "pixels per degree" to detect objects at far distances. To resolve this conflict, an imager chip (the sensing part of the camera) with far more than 1000 pixels in the horizontal direction would be necessary. At the moment, however, high resolution imagers which are suitable for automotive applications have resolutions of only approx. 750 pixels in the horizontal direction. For this reason, specifications of cameras in the WATCH-OVER field of application must be a compromise for the time being.

The following table gives an overview of the final specifications of the imager and the lens system. Not listed, but deserving a short mention, are the electrical and mechanical specifications. Both are defined in such a way that the electrical interface (power and data) as well as the housing size are compatible with the existing 1st generation camera. In practice, the housing of the existing 1st generation camera will be used, and the data interface will be exactly the same. The environmental specifications can vary according to the application. In any case, the camera will be designed from the beginning to meet customers' requirements in a follow-up product development phase.


For example, usual requirements concerning operating temperature ranges are in the range of -40 to +85 degrees Celsius.

Imager:
  Type of imager: monochrome, CMOS
  Resolution h x v [pixel]: 750 x 480
  Frame rate [frames per second (FPS)]: 25
  Dynamic range [dB]: ≥ 110
  Intensity resolution [bit]: 12
Lens system:
  Type of lens: fixed focus, fixed aperture
  Focal length [mm]: 7.5
  Field of view (FOV), diagonal [degree]: ± 24.7 + 5%
Resulting parameters:
  Horizontal field of view [degree]: 41
  Vertical field of view [degree]: 27
  Angular resolution [pixel / degree]: 18

Table 7: Specifications of the imager and the lens system
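The resulting parameters in Table 7 can be cross-checked from the imager resolution and the field of view. The sketch below is approximate (it ignores lens distortion), and the pedestrian-size example at the end is purely illustrative, not a figure from the specification:

```python
import math

horizontal_pixels = 750
horizontal_fov_deg = 41.0
vertical_pixels = 480
vertical_fov_deg = 27.0

h_res = horizontal_pixels / horizontal_fov_deg   # ~18.3 pixels per degree (Table 7: 18)
v_res = vertical_pixels / vertical_fov_deg       # ~17.8 pixels per degree

# Rough apparent size of a 1.7 m tall pedestrian at the 25 m detection limit:
pedestrian_height_m = 1.7
distance_m = 25.0
angle_deg = math.degrees(math.atan(pedestrian_height_m / distance_m))  # ~3.9 degrees
pixels_on_target = angle_deg * v_res                                    # ~69 pixels
print(round(h_res, 1), round(v_res, 1), round(pixels_on_target))
```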

In addition to the above mentioned specifications, it is essential to minimize distortions from the camera, such as ghost images and stray light. Some examples of such undesired effects, which have to be avoided as much as possible, are shown in Figure 2.

Ghost images

Stray light


Figure 2: Undesired optical effects in an imager

3.3. Communication specifications

Upfront specification of the communication device is considerably harder than for the image processing part: the discipline of short-range wireless networking and localisation is much newer than image processing, which therefore benefits from far more experience regarding the accuracy and performance that can be achieved under real-world conditions. The specifications formulated below can therefore partly be seen as a vision that the project partners strive to fulfil. Based on this understanding of the situation, the following requirements can be set, which are summarised in Table 8. The various table entries are discussed thereafter.

VRU definition:
  Pedestrian – Size / Clothing / Age / Pose: all, provided the pedestrian wears the wearable device
  Cyclist – Size / Clothing / Age / Pose: all, provided the cyclist wears the wearable device
Object velocities:
  Vehicle – max. 14 m/s (~ 50 km/h), going approx. straight (no turning)
  Pedestrian – max. 2 m/s (~ 7 km/h) in any direction
  Cyclist – max. 4 m/s (~ 14 km/h) in any direction
  PTW – max. 12 m/s (~ 50 km/h) in the same direction as the vehicle
VRU detection range: 0 – 100 m
VRU tracking range: 0 – 100 m
Detection area:
  Field of view – horizontal: 360° (only 180° used, centered along the direction of motion)
  Field of view – vertical: 180°
System positional accuracy:
  Longitudinal / Lateral – typically within 15 m
Number of detectable VRUs: up to 24
VRU occlusion: up to 100% (full occlusion)
Handling of non-VRUs: not considered
Performance: latency < 100 ms (typically)
Communication:
  Directionality – two-way
  Bandwidth – > 100 kbit/s
Power consumption:
  Active mode – 30 mA (80 mA for GPS enabled systems)
  Passive mode – 1 µA (50 mA for GPS enabled systems)

Table 8: Communication device specifications

VRU definition

WATCH-OVER will consider pedestrians, cyclists and PTWs of all sizes, shapes, clothing and poses, provided they wear an operational wearable device.

Object velocities

Same as the video sensing specifications.

Detection Area

The detection field is a full 360°, with a range of up to 100 metres. To avoid ambiguity in the ranging process, the detection angle should be limited to 180° in the driving direction of the vehicle.

System Positional Accuracy

Positional accuracy depends both on the accuracy of the RF ranging and on the accuracy of the GPS localisation. The accuracy of the range estimation should be below 5 m without post-processing. The accuracy of GPS is typically considered to be within 15 metres.

Number of Detectable VRUs

Potentially, dozens of VRU and vehicle modules can be present in the surroundings of one node. In order to reduce the frame collision rate, channel access should be based on listen before talk (LBT) schemes. Additional communication-oriented collision avoidance (CA) techniques, e.g. p-persistency, shall help to support the scalability of the system.

VRU Occlusion

The communication should not stop when the line of sight is obstructed. It is in particular realistic to have metal objects in the surroundings. Therefore, the communication system is required to work under multi-path conditions without a direct beam, e.g. via diffraction.

Handling of non-VRUs

VRUs not wearing the wearable device cannot be detected.

Performance

The latency of the communication device from microcontroller unit (MCU) input to MCU output should typically be below 100 ms (see section 4.2.2 for more details).

Communication

In order to control the passive and active phases of the VRU module, two-way communication is envisaged. The data rate of the communication system should be sufficient to transmit self-position and kinematic data (GPS, odometer, etc.). In order to enable sufficient scalability, it should be taken into account that the channel is presumably a shared medium for all stations. Therefore, a gross data rate of some 100 kbps on the channel seems to be necessary. Interference between the WATCH-OVER communication system and other RF-based systems should be kept as low as possible. Coordination with 802.11-based communication systems for in-car or car-to-car communication might be required.
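The channel-access requirement above (listen before talk with p-persistency) can be illustrated with a minimal sketch. The persistence probability and the slot handling are placeholders, not project parameters:

```python
import random

def try_send(channel_busy: bool, p: float = 0.3) -> bool:
    """p-persistent listen-before-talk: sense the channel, then transmit with probability p.

    Returns True if a transmission is attempted in this slot. A real module would
    additionally back off and re-sense the channel in the next slot when deferring.
    """
    if channel_busy:            # LBT: never transmit on a busy channel
        return False
    return random.random() < p  # p-persistency: defer with probability 1 - p

# With many VRU and vehicle modules in range, a lower p reduces the chance that
# several nodes transmit in the same slot and collide.
attempts = sum(try_send(channel_busy=False) for _ in range(1000))
print(f"{attempts} of 1000 idle slots used")
```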


Power Consumption

The power consumption of the VRU module should be as low as possible. In particular, power-down modes should be supported. Current consumption should be below 30 mA in active phases and below 1 µA in passive phases for a communication device (RF transceiver & MCU). GPS-enhanced systems will consume an additional 50 mA; moreover, for such systems power-down modes cannot be supported because of the effort required for the time-to-first-fix (TTFF).
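The consumption targets above can be turned into a rough battery-life estimate. The sketch below assumes a 1% transmit/receive duty cycle and a 500 mAh battery; both values are illustrative, not project figures:

```python
def average_current_ma(active_ma: float, passive_ma: float, duty_cycle: float) -> float:
    """Time-weighted average current for a module alternating active and passive phases."""
    return duty_cycle * active_ma + (1.0 - duty_cycle) * passive_ma

# Plain communication device: 30 mA active, 1 uA passive, assumed 1 % duty cycle.
i_avg = average_current_ma(30.0, 0.001, 0.01)        # ~0.3 mA average
hours_on_500mah = 500.0 / i_avg                       # rough battery-life estimate

# GPS-enabled device: higher active draw and no power-down (TTFF),
# so the average stays in the 50-80 mA region regardless of duty cycle.
i_avg_gps = average_current_ma(80.0, 50.0, 0.01)
print(round(i_avg, 3), round(hours_on_500mah), round(i_avg_gps, 1))
```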

4. System architecture

4.1. Overall system architecture

The WATCH-OVER system is composed of different parts which cooperate in the identification of vulnerable road users in front of or surrounding a vehicle moving in a complex urban scenario. While the vehicle proceeds, the vision sensor focuses on the area in front of the car and recognises objects and their motion; the communication module gathers the responding signals in the area covered by the antenna and calculates their relative position; the on-board computer collects the different inputs and performs data fusion at a certain frequency, yielding the risk level for possible colliding trajectories. In case the risk level is above a certain threshold, there will be both an alert to the driver and a message sent to the VRU module. The different actors taking part in the project framework are a car and a vulnerable road user (a pedestrian, a bicycle or a Powered Two Wheeler). This section is dedicated to the architectural design, so only a brief summary of the actors and equipment is given in the following picture (referring to the description in chapter 2).

ON PEDESTRIANS / CYCLISTS: a wearable communication module (e.g. in rucksacks, watches, mobile phones, ...)

ON CARS / TRUCKS: a communication and localisation module; a vision based sensor; an on board unit for data fusion; an HMI to warn the driver

ON POWERED TWO WHEELERS: a communication module; an HMI to warn the driver

Figure 3: representation of WATCH-OVER actors
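The behaviour described at the start of this section – fuse vision and communication inputs at a fixed rate, derive a risk level, and act when a threshold is exceeded – can be summarised in a short control-loop sketch. All names, the threshold value and the fusion rule are illustrative placeholders, not the project's actual data fusion design (that is covered in Section 4.2.3):

```python
import time

RISK_THRESHOLD = 0.7   # illustrative value; the real threshold is a design parameter
CYCLE_S = 0.1          # fusion executed at a fixed rate (10 Hz assumed here)

def fusion_cycle(vision_tracks, comm_tracks) -> float:
    """Placeholder data fusion: combine vision and communication detections into a risk level."""
    # A real implementation would associate tracks, predict trajectories and
    # estimate the time to collision; here we simply take the highest reported risk.
    risks = [t.get("risk", 0.0) for t in vision_tracks + comm_tracks]
    return max(risks, default=0.0)

def on_board_loop(get_vision, get_comm, warn_driver, warn_vru):
    """Simplified on-board unit loop: fuse, compare against threshold, act."""
    while True:
        risk = fusion_cycle(get_vision(), get_comm())
        if risk > RISK_THRESHOLD:
            warn_driver(risk)   # HMI alert on the vehicle
            warn_vru(risk)      # message sent back to the VRU module
        time.sleep(CYCLE_S)
```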


While the previous Figure depicts the main actors that could be present in a scenario, the general architecture of the WATCH-OVER system shall consider all the building components of each actor. The CRF demonstrator vehicle with the co-operative system for preventive safety shall be equipped with the vision based sensor, the communication device, a GPS module for absolute localisation and an on-board unit that performs the data fusion and evaluates the objects' relative positioning. The Daimler demonstrator vehicle, equipped with the pre-crash system, shall also be equipped with vision sensing but will not have a communication system. The Piaggio Powered Two Wheeler shall be fitted with a communication system, a GNSS receiver and an on-board unit integrating an HMI capable of warning the PTW driver of an imminent danger. The VRU (pedestrian or bicyclist) shall use a wearable communication module for recognition by the WATCH-OVER vehicles. The following figure depicts the overall reference system architecture.

[Figure 4 diagram: car/vehicle with on-board unit, video camera sensor and communication device; motorbike with PTW unit and communication device; VRU module with communication device. The communication device is not present in the pre-crash vehicle system.]

Figure 4: Reference architecture and components selection

It has to be noted that no Human Machine Interface (HMI) is reported in Figure 4. Later, in the demonstrator description, the HMI module will be shown, but only at a very high level. Indeed, since a proper signalling of the potential risk situation to the drivers is a fundamental requirement for the overall system, the HMI layer deserves to be treated as an autonomous task and is thus analysed separately in the forthcoming Deliverable D3.3. The reference architecture is not repeated in all test sites, as each partner plays a different role in the project context.


4.2. Components architectural design

4.2.1. Vision based sensor

The vision sensor has the task of continuously providing images to a subsequent image processing unit. The following scheme gives an overview of the vision sensor components.

Figure 5: Architecture of the Vision sensor

The vision sensor is principally constructed like a commercially available camera. The lens system focuses the light onto the photosensitive elements (pixels) of a CMOS image sensor, which transforms the impinging light into electrical current. The contents of all pixels are sampled 25 times per second, therefore providing 25 images per second. The process of controlling the sampling of all pixels and performing corrections and/or pre-processing is handled by a separate controller. The final image is then transmitted at the aforementioned frame rate via the data interface. The power generation is responsible for providing a stable and well filtered power supply to all components of the vision sensor. The housing with additional fastening elements is needed for the mechanical installation on the vehicle and ensures a robust and secure mounting inside it. An important requirement in this context is vibration absorption, in order to guarantee stable imaging.
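For illustration only, the sketch below shows how an image processing unit could consume the 25 images per second delivered over the data interface, attaching a timestamp to each frame before forwarding it; the camera object and its grab() method are assumptions, not the actual sensor interface.

import time

FRAME_PERIOD_S = 1.0 / 25.0   # the sensor delivers 25 images per second

def acquisition_loop(camera, process_frame):
    """Pull frames from the (assumed) camera interface at the nominal frame
    rate and hand each one, with its acquisition timestamp, to the image
    processing unit."""
    while True:
        t0 = time.time()
        frame = camera.grab()            # one image from the data interface
        process_frame(frame, timestamp=t0)
        # Sleep for the remainder of the 40 ms frame period, if any.
        time.sleep(max(0.0, FRAME_PERIOD_S - (time.time() - t0)))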

4.2.2. Communication device

Base architecture

There are two types of communication modules: vehicle modules and VRU modules. Together they share a wireless short range communication channel and thus provide a means to determine the relative location of the VRU with respect to the vehicle from:

1. The physical propagation characteristics of the communication channel itself.
2. The exchange of self-localization parameters via the communication channel.

In addition, the communication channel is used to notify the VRU of warnings that are generated in the vehicle on-board unit. The vehicle communication modules (MCVe) will detect one or more VRUs, under the condition that those VRUs wear a communication module (MCVru, i.e. MCPed or MCPtw). The MCVes will provide the relative distance and the relative angle of the VRU with respect to the vehicle. If there is more than one VRU, the communication device computes the relative distance and the relative angle for each VRU.


Due to the physical characteristics of radio signal propagation, it is also possible to detect VRUs when visibility is heavily reduced, for example in case of fog, rain or dust. Also in darkness, where the vision system is limited, the radio communication device works with the same performance level. VRUs may also be detected when they are behind obstacles, such as trees, parked cars or lampposts. The communication system is divided into three modules. Two communication modules are mounted on board the vehicle (car) (MCVe), whereas the third module is carried by the VRU. The requirements for the two parts are different; therefore the communication module for on-board mounting and the wearable module may differ considerably in size, cost and power consumption. For the vehicle on-board communication module (MCVe) the base architecture is shown in the following block diagram (Figure 6).

Figure 6: Architecture of the on board communication device
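Since each MCVe reports a relative distance and a relative angle per VRU, the fusion stage needs them in vehicle coordinates. A minimal sketch of that conversion follows; the axis convention (x forward, y to the left) and the example values are assumptions made only for illustration.

import math

def polar_to_vehicle_frame(distance_m, angle_rad):
    """Convert a (distance, relative angle) pair reported by the
    communication module into x/y coordinates in the vehicle frame.
    Convention assumed here: x points forward, y to the left, and the
    angle is measured from the vehicle's longitudinal axis."""
    x = distance_m * math.cos(angle_rad)
    y = distance_m * math.sin(angle_rad)
    return x, y

# Example: a VRU reported 18 m away, 15 degrees to the left of the bumper.
print(polar_to_vehicle_frame(18.0, math.radians(15.0)))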

The interface unit provides the communication to the vision system and to the fusion system. The power for the on-board communication system is derived from the main vehicle power system. The power supply of the communication device can also be switched off or put into a power-down (sleep) mode, for example when the vehicle is standing still. Figure 7 shows the base architecture of the VRU communication system and its interfaces. In this section we refer to the communication device of the pedestrian and cyclist, the so-called “wearable device”. The PTW communication system has an analogous architecture except for the power supply, because it has less demanding constraints in terms of energy.


Figure 7: Architecture of the wearable communication device

The signal processing part receives the signal from the on-board communication system and replies with a transmitted signal. In case of no reception, the transmitting part should go into power-down mode to save energy. The interface unit enables the communication to an HMI of the VRU to signal a dangerous situation. An optional GPS interface can also be connected. The power supply of the wearable communication system is a technically challenging part because it is limited in size and cost, but it should have a long lifetime. It can be a battery, a solar cell or another system that provides electrical power to the interface and signal processing unit. In order to save energy it should have a power-down mode.

CSS-based communication system

The envisaged CSS-based system allows direct communication between two or more stations, and an estimation of the range between those two stations. Each node consists of the elements shown in Figure 8.

Both the VRU module and the vehicle module shown in Figure 8 consist of an antenna, an RF-transceiver, an MCU and a power supply; the vehicle module additionally provides the interface to the car subsystem and HMI.

Figure 8: System architecture of the CSS-based system for relative localization and communication

The setup of both communication nodes is basically identical. It consists of

• A microcontroller unit (MCU) for controlling the communication flow and running the first processing of the received data. In addition, it is also the MCU that controls the power supply.


The MCU should be selected such that the requirements concerning power consumption and computational resources are met.

• The RF-transceiver: Currently, it is envisaged to use the single-chip transceiver AN5TR1 “nanoLOC TRX Transceiver”.

• An antenna, through which the RF signal is transmitted. The selection of the antenna is dictated by cost, range and output power. As it also strongly depends on the placement of the modules in the car, the selection should be kept flexible.

• The power supply is crucial for both the VRU and the vehicle unit.
o The power supply of the VRU module should be as power conscious as possible. The operation should be switched off in all inactive phases. Fast wake-up operations should be supported.
o The power supply of the vehicle module is mostly oriented towards the generation of a stable, overvoltage-protected supply voltage.

The VRU module comes with an additional Human-Machine Interface (HMI), which can be kept quite simple. The main operations are:

• switching the device on and off;
• delivering a (sound) alarm in dangerous situations;
• switching the alarm off when the situation is no longer dangerous.
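Combining the power-down behaviour described earlier with these HMI operations, the behaviour of the wearable module can be summarised in the following sketch; the radio and hmi objects, their methods, the message layout and the timing values are assumptions made only for illustration.

import time

# Illustrative sketch of the wearable module's duty cycle: listen for an
# interrogation from a vehicle, reply while one is in range, otherwise
# drop into a low-power state. All interfaces are assumed.

LISTEN_WINDOW_S = 0.05   # assumed listening slot
SLEEP_INTERVAL_S = 0.5   # assumed low-power interval between slots

def wearable_loop(radio, hmi):
    while True:
        frame = radio.receive(timeout=LISTEN_WINDOW_S)
        if frame is None:
            radio.power_down()           # no vehicle in range: save energy
            time.sleep(SLEEP_INTERVAL_S)
            continue
        radio.reply(frame)               # answer the ranging request
        if frame.get("warning"):         # vehicle signalled a dangerous situation
            hmi.alarm()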

The vehicle module is envisaged to provide the following features:

• It shall interface to the other systems of the WATCH-OVER project and to the other safety-related devices in the car. It is planned to run this communication over a standard CAN bus. Due to this communication interface, no local HMI is needed.

• For triangulation, at least two modules must run in the car, both delivering their data to the detection module (see the sketch below).
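A minimal sketch of such a two-range lateration is given here: two on-board modules at known positions measure their distances to the same VRU module, and the intersection of the two range circles yields the candidate positions. The coordinates and example values are assumptions, and this is not the WATCH-OVER detection algorithm itself.

import math

def lateration_2d(p1, r1, p2, r2):
    """Estimate a VRU position in the vehicle frame from the ranges measured
    by two on-board communication modules mounted at known positions p1, p2.
    Returns the two circle intersections (the mirror ambiguity must be
    resolved, e.g. by keeping only the solution in front of the vehicle)."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return None                          # no geometric solution
    a = (r1**2 - r2**2 + d**2) / (2 * d)
    h = math.sqrt(max(r1**2 - a**2, 0.0))
    xm = x1 + a * (x2 - x1) / d              # foot point on the baseline
    ym = y1 + a * (y2 - y1) / d
    ux, uy = (y2 - y1) / d, -(x2 - x1) / d   # unit vector perpendicular to baseline
    return (xm + h * ux, ym + h * uy), (xm - h * ux, ym - h * uy)

# Example: antennas 1.5 m apart on the front bumper, ranges 12.3 m and 13.1 m.
print(lateration_2d((0.0, 0.75), 12.3, (0.0, -0.75), 13.1))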

Figure 9 extends this setup: the VRU module gains a GPS receiver with its own GPS antenna next to the 2.4 GHz antenna, RF-transceiver, MCU and power supply, while the vehicle module keeps its 2.4 GHz antenna, RF-transceiver, MCU, power supply and interface to the car subsystem and HMI.

Figure 9: System architecture of the CSS-based system for relative localization, self-positioning and communication

In addition, the VRU module can be enhanced with a GPS module, as the communication flow between the VRU and the vehicle module can carry data. The two antennas cannot be shared due to the different frequency characteristics: for civil GPS services the L1 frequency (1575.42 MHz) is used. The vehicle module will not be equipped with a GPS device, as it is anticipated that the vehicle already has a GPS module on board. The fusion of the data from the VRU module and the vehicle modules is a task for the detection algorithms. Three variants of the communication platform must be considered separately:


• the system for the vehicle, with two communication nodes;
• the system at the vulnerable road user (VRU), consisting of one communication node;
• the system at the powered two wheeler (PTW), consisting of one communication node.

Vehicle Communication Module (MCVe)
• Special issues: unlimited power resources; sufficient computing performance
• Main tasks: center of a network; distance measuring to VRU
• Additional tasks: network maintenance; bandwidth control

PTW Communication Module (MCPtw)
• Special issues: unlimited power resources; sufficient computing performance; no triangulation opportunities (single communication node)
• Main tasks: scanning for networks in range; self-localizing in networks found; warning generation in case of danger

Pedestrian Communication Module (MCPed)
• Special issues: limited power resources; limited computing performance; no triangulation opportunities (single communication node)
• Main tasks: scanning for networks in range; self-localizing in networks found; if possible, warning generation in case of danger
• Additional tasks: power-down modes to save battery power; wakeup mechanisms

Table 9: Communication tasks and special issues of the different radio modules

Self-localization communication system

This section further details the functional components and interfaces of the system for self-localization (see Figure 10):

• GNSS receiver (GNSS Rx), capable of determining the geometric location of its antenna, relative to an existing infrastructure of navigation satellites. This “self-location” is obtained periodically. Hence, it is also possible to derive the “self-velocity”.

• Half-duplex short range radio communication channel, capable of relaying self-location information from transmitter (Tx) to receiver (Rx). Transmitter and receiver are within short range of each other ( < 100 meters). The radio frequency (range) is selected such that line-of-sight between transmitter and receiver is not required.

• Since multiple transmitters and receivers are operating on a single physical communication channel, a digital ID, as well as contention and error management is provided. This technology is readily available, see the RFID chapter in [WATCH-OVER:D4.1].

• From the self-location and the locations received via the communication link, the relative location is determined. The accuracy of the relative location is much better than that of either the vehicle's or the VRU's self-location as measured with GNSS. This principle lies at the heart of differential GNSS technology: rather than relying on (geo)stationary reference stations, the participating vehicles and VRUs provide each other's reference (a numeric sketch is given after Figure 10).

• All information received from the communication channel is available in the vehicle’s on-board unit for further processing. Depending on the relative location and velocity, indications are provided to the vehicle’s driver (HMI).


• The VRU’s wearable unit is equipped with a radio receiver, and the vehicle with a radio transmitter. This provides the possibility to extend the vehicle’s HMI indications (especially warnings) to the VRU.

Figure 10: Logical view on GNSS-based system (dashed area is optional)
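A minimal numeric sketch of this differential idea, assuming both sides exchange plain latitude/longitude fixes in degrees, follows; the flat-earth conversion is adequate only for the short baselines considered here, and the field layout is an assumption rather than the project's message format.

import math

EARTH_RADIUS_M = 6371000.0   # mean Earth radius, sufficient for short baselines

def relative_position(veh_fix, vru_fix):
    """Relative VRU position (east, north, in metres) from the two
    self-locations exchanged over the radio link. Because both fixes are
    taken at the same time in the same area, common GNSS errors largely
    cancel, which is the differential effect described above."""
    lat_v, lon_v = veh_fix
    lat_u, lon_u = vru_fix
    north = math.radians(lat_u - lat_v) * EARTH_RADIUS_M
    east = math.radians(lon_u - lon_v) * EARTH_RADIUS_M * math.cos(math.radians(lat_v))
    return east, north

# Example: vehicle and pedestrian fixes a few tens of metres apart.
print(relative_position((45.07050, 7.68680), (45.07070, 7.68700)))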

4.2.3. Data fusion module

Data fusion techniques collect data from different inputs or related information sources and combine them to achieve more specific inferences than those achievable with a single or independent input or sensor. For example, when vision is limited by structures or external obstacles, other sensing capabilities can augment the “scenario awareness”, e.g. through the sense of hearing or, in other applications, through a communication link to a responding entity. Data fusion, or data integration, is the problem of combining data residing at different sources and providing the user with a unified view of these data. This introduction explains the approach used in the WATCH-OVER framework, where a video sensing camera and a short range communication module together collect wider and more precise data about the scenario surrounding a vehicle, and hence about the potential danger a vulnerable road user would face. Figure 11 shows the fundamental principle of the WATCH-OVER data fusion module. The measured data from the communication device, from the IR camera and additionally from the odometer sensors of the vehicle (ego motion) are synthesised by the Data Fusion & Tracking Module to create hypotheses regarding the classes of the detected objects and their expected behaviour. In the present case, the objects of interest are VRUs, especially pedestrians.


The core element of this module is a Kalman filter tailored to this application. In a recursive procedure, this filter uses the physical and error parameters of the sensors together with several time-discrete measurements and state estimates of the same objects detected around the vehicle, in order to produce a new estimate of the object attributes at a future time. This method allows the calculation of the time-continuous evolution of the object attributes. In each step of the data fusion process, the newly measured data are compared with the predicted values to calculate a more precise relative position of the detected objects and their movement.

Figure 11: WATCH-OVER data fusion principle
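The recursive predict/update cycle described above can be illustrated with a minimal constant-velocity Kalman filter; the state layout, matrices and noise values below are illustrative assumptions and not the parameters of the actual WATCH-OVER filter.

import numpy as np

# Minimal constant-velocity Kalman filter sketch for one tracked VRU.
# State x = [px, py, vx, vy]; measurements are relative positions from the
# fused sensors. All numeric values are assumptions.

def predict(x, P, dt, q=0.5):
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    Q = q * np.eye(4)                    # crude process noise
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, r=1.0):
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # only position is observed
    R = r * np.eye(2)                    # measurement noise (sensor dependent)
    y = z - H @ x                        # innovation: measurement vs. prediction
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

# One cycle: predict 100 ms ahead, then correct with a new measurement.
x, P = np.array([15.0, 2.0, -1.0, 0.2]), np.eye(4)
x, P = predict(x, P, dt=0.1)
x, P = update(x, P, z=np.array([14.8, 2.1]))
print(x)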

Measurements that fit already known objects with a certain probability are assigned to these objects. Tracks can also be initialised for temporarily unknown VRU objects, which are possible candidates. This may happen, for instance, when only incomplete patterns of VRUs are visible in the IR image or when the objects are outside the IR camera's viewing angle. In the latter case, the communication device helps to create a preliminary track that prepares the fusion algorithm for the calculation of a reliable classification result in time.
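The assignment step just described can be sketched as simple nearest-neighbour gating: a measurement is attached to an existing track only if it falls inside a plausibility gate around the track's prediction, otherwise it opens a new candidate track. The gate size and the data layout are assumptions; the real system may use a probabilistic association instead.

# Illustrative nearest-neighbour gating for measurement-to-track assignment.

GATE_M = 2.5   # assumed maximum distance between prediction and measurement

def associate(tracks, measurements):
    """tracks: list of dicts with a 'pred' (x, y); measurements: list of (x, y)."""
    new_tracks = []
    for z in measurements:
        best, best_d = None, GATE_M
        for track in tracks:
            px, py = track["pred"]
            d = ((z[0] - px) ** 2 + (z[1] - py) ** 2) ** 0.5
            if d < best_d:
                best, best_d = track, d
        if best is not None:
            best.setdefault("hits", []).append(z)       # confirm existing track
        else:
            new_tracks.append({"pred": z, "hits": [z]})  # candidate VRU track
    return tracks + new_tracks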

In the further steps of the fusion process, the hypotheses are either confirmed and outfitted with more precise parameters or rejected by means of a classification algorithm. Sometimes, originally different tracks can be unified into one common track denoting a single VRU.

The vision sensor module is able to collect information about the presence of VRUs within the opening angle and the range of the camera, while the communication module, depending on the architectural design, can cover a circular area surrounding the vehicle within the range of the communication means. To avoid ambiguity in the ranging process, the detection angle should be limited to 180° in the direction of vehicle movement.


The region covered by both sensors can provide simultaneous input to the on-board unit, potentially increasing the precision of the processing.

The contribution of the communication device leads to the definition of a region of interest (ROI) in the camera image, determined by localisation-based algorithms (see Figure 12). The ROI covers a limited area of the camera picture which contains the interesting objects, e.g. VRUs. This reduced region enables a substantially faster and more reliable detection of VRUs and hence a more trustworthy recognition of dangerous situations.
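A sketch of how the distance estimate can determine the ROI size, using a simple pinhole camera model, is given below; the focal length and the assumed pedestrian dimensions are illustrative values only.

# Sketch: distance estimate from the communication device sets the expected
# bounding-box size of a pedestrian in the image (pinhole camera model).

FOCAL_PX = 800.0            # assumed focal length in pixels
VRU_HEIGHT_M = 1.8          # assumed pedestrian height
VRU_WIDTH_M = 0.6           # assumed pedestrian width

def roi_size_px(distance_m):
    """Expected bounding-box size (width, height) in pixels of a pedestrian
    at the given distance in front of the camera."""
    scale = FOCAL_PX / distance_m
    return VRU_WIDTH_M * scale, VRU_HEIGHT_M * scale

print(roi_size_px(20.0))    # a pedestrian 20 m ahead -> roughly 24 x 72 px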


In Figure 12, the IR camera feeds the vision sensor module and the communication device feeds the communication module for relative localisation; both modules are synchronised and, together with the odometric data, feed the data fusion algorithm, which outputs the detected VRU and the corresponding ROI in the image.

Figure 12: WATCH-OVER data fusion for VRU detection

Finally, the distance information between the vehicle and the VRU generally delivers more confidence in estimating the actual traffic scenario and allows a relatively precise calculation of the time to a possible collision in order to generate a warning signal. Because of fundamental physical conditions, an acceptable distance accuracy and resolution of about 1 m is expected on the one hand, while on the other hand only a poor angle estimation is foreseen.


Therefore, it is anticipated that a sufficiently exact angle estimation is not possible for the definition of an angle-limited ROI. Nevertheless, the relatively precise distance estimation determines at least the size of the bounding box in the IR image according to the VRU pattern. This delivers an additional benefit in image processing. Odometer data are used to improve the estimation of the relative VRU position by applying an ego-motion compensation algorithm in order to equalise the influence of the vehicle's movement on the IR image. In the fusion system the VRU position in the vehicle coordinate system is corrected at each time step using these data. Thereby a more accurate ROI creation can be achieved. Furthermore, considering this information, a first prediction of the track the vehicle will probably drive can be calculated. In this way an additional useful limitation of the processing area in the image plane becomes possible. A significant task in a data fusion system is the collection and synchronisation of the data acquired by different sensors. Generally, the distributed sensor modules collect and send their data independently and asynchronously. In order to keep the correct time reference between the corresponding data, every data package from the sensors obtains a definite timestamp from the sensor module, stating the time of the recorded measurement. These packages are saved in a special data structure in memory. The structure allows searching for the data based on its measurement timestamp or based on the sensor type of the data.
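A minimal sketch of such a timestamped package store is shown below; the data layout and method names are assumptions chosen only to illustrate retrieval by measurement time or by sensor type.

from collections import namedtuple

# Each sensor package carries the time of its measurement and can later be
# retrieved by timestamp or by sensor type.

Package = namedtuple("Package", "timestamp sensor data")

class MeasurementBuffer:
    def __init__(self):
        self._packages = []

    def add(self, timestamp, sensor, data):
        self._packages.append(Package(timestamp, sensor, data))

    def closest_to(self, timestamp):
        """Package whose measurement time is nearest to the query time."""
        if not self._packages:
            return None
        return min(self._packages, key=lambda p: abs(p.timestamp - timestamp))

    def by_sensor(self, sensor):
        """All packages recorded by one sensor, in measurement-time order."""
        return sorted((p for p in self._packages if p.sensor == sensor),
                      key=lambda p: p.timestamp)

buf = MeasurementBuffer()
buf.add(0.040, "camera", {"roi": (120, 40, 60, 160)})
buf.add(0.052, "comm", {"distance_m": 17.8})
print(buf.closest_to(0.050))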

Table 10 summarises the expected effects and advantages of the data fusion concept for the WATCH-OVER project, based on the video sensing specifications of chapter 3.2 and on the first measurement results of the communication device.

Parameter: Single sensor (IR camera) / Data fusion
• VRU Detection Range: 25 m / 80 m
• VRU Tracking Range: 1–25 m / 1–60 m (initialisation of a track)
• Detection Angle: 41° / 180° (communication only)
• Position Inaccuracy, Longitudinal: 20%, not better than 2 m / 1 m, independent of distance
• Position Inaccuracy, Lateral: 10%, not better than 1 m / 10%, not better than 1 m
• VRU Occlusion: max. 10% / max. 10%
• Relative Detection Rate per time unit: 1.0 / 2.0
• Relative Correct Detection Rate: 1.0 / 4.0
• Absolute Correct Detection Rate: 70% / >70%
• False Alarm Rate: less than 1 per 3 minutes / less than 1 per 30 minutes
• Relative Precision of Time to Collision Calculation: 1.0 / 2.0

Table 10: Expected effects of data fusion

The last five parameters in Table 10 illustrate the expected performance improvement of the system in the area where the data fusion operates, compared to the situation when only one sensor (IR camera) is present. A Relative Detection Rate per time unit of 2.0 means that the image analysis process is predicted to be two times faster compared to the application of the IR camera alone.


The Relative Correct Detection Rate describes the expected improvement reached by reducing the error rate during the detection process. The factor 4 means that the error rate in the first rough detection of VRU patterns is expected to be lowered by at least this factor. The further in-depth investigation and tracking of the supposed VRU patterns during the data fusion process then lead to the False Alarm Rate specified in the table.

Finally, the precise distance measurement delivered by the communication device guarantees a much more accurate estimation of the Time to Collision if such a dangerous situation is detected. This is important for the generation of a reliable warning signal, and an improvement of this parameter by at least a factor of 2 is expected (see also the longitudinal position accuracy in the table). All the statements given above presuppose that each VRU is equipped with a wearable unit, so that the data fusion can function correctly.
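Under a simple constant closing-speed assumption, the Time to Collision follows directly from the measured distance; the sketch below is illustrative only and omits the trajectory prediction performed by the fusion module.

def time_to_collision(distance_m, closing_speed_mps):
    """Simple constant-speed time-to-collision estimate from the measured
    relative distance and the closing speed (vehicle speed minus the VRU's
    velocity component along the line of sight). Returns None when the gap
    is not closing."""
    if closing_speed_mps <= 0:
        return None
    return distance_m / closing_speed_mps

# Example: a 22 m gap closing at 8 m/s leaves 2.75 s to warn and react.
print(time_to_collision(22.0, 8.0))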

5. Conclusions

The main outcome of this Deliverable is the WATCH-OVER system design, in terms of the general reference architecture as well as the characteristics of the single components. As usual, the original use cases and requirements have to be reviewed and refined after investigating the enabling technologies and evaluating the actual feasibility. Yet, although some scenarios will be left to theoretical assessment, the most relevant use cases are going to be evaluated in the WATCH-OVER testing phases by means of the developed demonstrators. Concerning the system architecture, the reference model based on the cooperation of vision and communication components for the identification of VRUs by vehicles has been maintained and studied at a deeper level. Specifications are given for the following basic modules: the vision sensor, monitoring the frontal part of the car and recognising objects and their motion; the communication modules on the vehicle and on the VRU, exchanging signals for the calculation of the relative position; and the on-board computer performing the data fusion, yielding the risk level for possible colliding trajectories and generating both an on-board HMI alert and a message addressed to the VRU in case the risk level is exceeded. This phase also helped to identify the role of the system demonstrators, whose description in this document is still preliminary but already highlights the functionalities that are going to be tested in each of them.


References

• [SAVE-U:D1A] D.M. Gavrila, P. Marchal and M.M. Meinecke. Vulnerable Road User Scenario Analysis, EU SAVE-U project Deliverable 1A, February 2003.
• [SAVE-U:D7] E. Marc, P. Marchal, M. Töns, D.M. Gavrila, L. Letellier, J.J. Yon and M.M. Meinecke. Specifications of SAVE-U sensor platform and ECUs, EU SAVE-U project Deliverable 7, January 2004.
• [SAVE-U:D27] P. Marchal, M. Dehesa, D.M. Gavrila, M.M. Meinecke, N. Skellern and R. Vinciguerra. Final Report, EU SAVE-U project Deliverable 27, August 2005.
• [WATCH-OVER:D2.1] L. Andreone, E. Bekiaris, A. Guarise and A. Mousadakou. Requirements and use cases, EU WATCH-OVER project Deliverable 2.1, July 2006.
• [WATCH-OVER:D4.1] A. Sikora. Communication technologies specifications, EU WATCH-OVER project Deliverable 4.1, February 2007.

