- Beyond 720° scene assessment (360° outside and 360° inside the vehicle), adopting input from multiple heterogeneous sensors
- Raw sensor data fusion, prepared for possible deeper diagnosis in a scalable Cloud infrastructure
- Metadata extraction and local dynamic mapping for scene understanding and reconstruction, supporting further danger assessment
- Design following an elaborated methodology for safety and security validation and testing
- Efficient information propagation
- Situation-aware control transfer models that evaluate driver participation and the current execution state
- Warning and assistance in anticipation of dangerous situations
- Insurance point of view
- Auto liability vs. a possible increase in product liability
- Estimate of the resultant risk profiles
- A holistic liability analysis will be undertaken in the context of ethical and legal considerations.
- Necessary infrastructure for storing and analysing the traffic data at a large scale
- Help to fine-tune the recognition and understanding capabilities of the deployed ADAS algorithms
- Cooperative V2X solutions will be based on Cloud connectivity
- The Cloud will be used both for real-time situational updates to a central system and for batch transmission of key in-vehicle events; vehicle connectivity will be used for real-time data transfer to enhance the driver experience
- A standardised interface will be defined through which ADAS components will share data, metadata, configurations, processing logic, etc.
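As an illustration only, such a standardised interface could take the form of a common message envelope that every ADAS component serialises and parses. The sketch below is a minimal Python assumption of what that envelope might look like; the field names, the JSON encoding, and the `ADASMessage` class are hypothetical, not the VI-DAS specification.

```python
import json
import time
from dataclasses import dataclass, field
from typing import Any, Dict


@dataclass
class ADASMessage:
    """Hypothetical envelope for data shared between ADAS components.

    Field names are illustrative assumptions, not the project's actual
    interface definition.
    """
    source: str                                   # producing component, e.g. "driver_monitor"
    kind: str                                     # "data" | "metadata" | "config"
    payload: Dict[str, Any] = field(default_factory=dict)
    timestamp: float = field(default_factory=time.time)

    def to_json(self) -> str:
        # One common serialisation choice for component interoperability
        return json.dumps({
            "source": self.source,
            "kind": self.kind,
            "payload": self.payload,
            "timestamp": self.timestamp,
        })

    @staticmethod
    def from_json(raw: str) -> "ADASMessage":
        d = json.loads(raw)
        return ADASMessage(d["source"], d["kind"], d["payload"], d["timestamp"])


# Round-trip example: produce a message and parse it back
msg = ADASMessage("road_environment", "metadata", {"objects": 3})
decoded = ADASMessage.from_json(msg.to_json())
```

A shared envelope of this kind lets components evolve independently as long as they agree on the serialised form, which is one way the "standardised interface" goal could be met in practice.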
The testing activities will be focused on validating the driving manoeuvres performed by different drivers (with different profiles) and for several levels of automation.
To meet this objective, three horizontal actions shall be included for each use case, along with the overall risk evaluation and prediction:
- Individualised, real-time driver model that will be updated while driving.
- Unique, integrated and adaptive HMI.
- Monitoring of the driver’s response in order to act accordingly.
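To illustrate what an individualised driver model updated while driving could look like, here is a minimal sketch. The exponential-moving-average update rule, the reaction-time feature, and the thresholds are assumptions for illustration, not the model the project will actually develop.

```python
class DriverModel:
    """Toy individualised driver model.

    Keeps a running, per-driver estimate of reaction time and flags
    responses that are unusually slow for that driver. The smoothing
    factor and the slowness threshold are illustrative assumptions.
    """

    def __init__(self, alpha: float = 0.2, initial_rt: float = 1.0):
        self.alpha = alpha               # weight given to the newest sample
        self.reaction_time = initial_rt  # running estimate, in seconds

    def update(self, observed_rt: float) -> None:
        # Exponential moving average: blend the old estimate with the
        # newest observation, so the model adapts while driving
        self.reaction_time = (1 - self.alpha) * self.reaction_time \
            + self.alpha * observed_rt

    def is_slow(self, observed_rt: float, factor: float = 1.5) -> bool:
        # Flag a response much slower than this driver's own baseline
        return observed_rt > factor * self.reaction_time


# Feed in reaction-time samples gathered during a drive
model = DriverModel()
for rt in [0.9, 1.1, 1.0, 0.95]:
    model.update(rt)
```

The point of the sketch is the incremental update: each new observation nudges the driver-specific baseline, so the same observed behaviour can be judged differently for different driver profiles.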
The selected use cases will cover:
During manual driving, the challenge will be to inform/warn the driver about threats or inappropriate/risky driving, as assessed by the Road Environment and Driver Monitoring functions to be developed in WP2, WP3 and WP4.
During vehicle takeover by the automation and during automated driving phases, the challenge will be to support the drivers’ mode awareness (the current status of vehicle automation), helping them to know what the automated functions are currently doing, what the automation is in charge of, and what remains under the driver’s responsibility.
During vehicle handover to the human driver, after a Highly Automated Driving (HAD) phase, the challenge will be to support the driver in “rebuilding” their situational awareness according to the current driving context, and then to help them take control of the car in an appropriate way, in order to manually manage the situational risk and/or safely perform the driving task.
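The three phases above can be sketched as a simple automation-mode state machine. The states, transition names, and the reduction of “rebuilt situational awareness” to a boolean flag are illustrative assumptions, not the VI-DAS design; the sketch only shows the idea that handover back to the driver completes conditionally.

```python
from enum import Enum, auto


class Mode(Enum):
    MANUAL = auto()     # driver in control; system warns about risks
    TAKEOVER = auto()   # automation taking over; support mode awareness
    AUTOMATED = auto()  # highly automated driving (HAD)
    HANDOVER = auto()   # control being transferred back to the driver


class AutomationStateMachine:
    """Toy controller for control-transfer phases; transitions are assumptions."""

    def __init__(self):
        self.mode = Mode.MANUAL

    def request_automation(self):
        if self.mode is Mode.MANUAL:
            self.mode = Mode.TAKEOVER

    def takeover_complete(self):
        if self.mode is Mode.TAKEOVER:
            self.mode = Mode.AUTOMATED

    def request_handover(self):
        if self.mode is Mode.AUTOMATED:
            self.mode = Mode.HANDOVER

    def handover_complete(self, driver_aware: bool):
        # Only return control once the driver's situational awareness
        # has been "rebuilt" (reduced here to a single flag)
        if self.mode is Mode.HANDOVER and driver_aware:
            self.mode = Mode.MANUAL


sm = AutomationStateMachine()
sm.request_automation()
sm.takeover_complete()
sm.request_handover()
sm.handover_complete(driver_aware=False)  # stays in HANDOVER
sm.handover_complete(driver_aware=True)   # back to MANUAL
```

Making the handover transition conditional on driver state mirrors the requirement that the system first helps the driver rebuild awareness before returning control.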
User Centric and Incremental Cycle Development
Initial prototype ALPHA
The consortium will create an Alpha prototype from M9 to M12 of the project. The goal of this prototype is to integrate existing technologies and background knowledge of the consortium in a “rough
and ready” platform that can be used to scope and define the goals of end user requirements and verify platform designs.
First prototype platform BETA
The Beta version of the platform will be made available in M24 of the project and will contain the project results developed until that point.
It will implement basic connection between the considered modules, following an initial inter-communication protocol. This platform will serve as the basis for conducting initial testing.
The results of the Beta test will be fed back into the RTD tasks for the final version. This version of the platform will be viewed as feature complete.
Final prototype platform GAMMA
To be available and tested by M36. This will be the final prototype of the VI-DAS platform: functionally complete, containing all RTD deliverables together with the previously tested deliverables from the Beta platform.
At release time, the final prototype will have been tested (from M34 to M36) against the architecture and the user requirements.
From these in-depth accident and naturalistic dataset analyses, VI-DAS will identify typical accident scenarios and a set of selected driving-error cases (Step 1), define accordingly a set of Reference Use Cases relevant for VI-DAS (Step 2), and then specify (Step 3) the VI-DAS prototypes (Alpha, then Beta, then Gamma), starting from real drivers’ difficulties (i.e. accident scenarios and typical errors) that will ultimately be supported by VI-DAS technologies.