Last modified on February 20, 2024

CLUSTER 2: VEHICLE TECHNOLOGIES

Introduction

Cluster 2 “Vehicle Technologies” focuses on the development of on-board technologies for connected and automated vehicles (CAVs) to perceive the environment and make decisions, enabling safe interaction with other road users, providing protection in the event of an emergency and ensuring the comfort and well-being of vehicle occupants. Robust and accurate environment perception is essential for highly automated vehicles: it enables the extraction of reliable information for real-time driving decisions, which must be taken safely and unambiguously by combining the status of the system, the human and the environment within the framework of digital traffic rules. The imminent transformation of the electronic control architecture and of the embedded code and data will pave the way towards a decoupling of hardware and software that accelerates innovation, enables harmonised safety functionalities in automated and connected vehicles and supports their integration in the cloud.

Cluster 2 Objectives

  • Deliver vehicle technology ready for demonstration on public roads, including functional safety [1], safety of the intended functionality [2] and cybersecurity (to Cluster 1).
  • Provide recommendations for a European Statement of Principles (ESoP) for automated vehicles to enable conformity checks (to Cluster 3).
  • Provide improved vehicle technologies (e.g., sensor fusion, on-board decision making) contributing to better performance of vehicle-transport system integration also with respect to shared and regularly-updated high-definition maps of the environment (to Cluster 4).
  • Specify vehicle technology needs for advancing e.g., perception capabilities through key enabling technologies (to Cluster 5).
  • State requirements for the co-design of the electrical/electronic (E/E) architecture and operating systems in view of automated functionalities and cloud connection (to the KDT partnership / Chips partnership).

[1] ISO 26262-1:2018 Road Vehicles – Functional Safety, https://www.iso.org/standard/68383.html
[2] ISO/PAS 21448:2019 Road Vehicles – Safety of the intended functionality (SOTIF), https://www.iso.org/standard/70939.html

Indeed, safe and reliable CCAM solutions will not be feasible without robust and accurate environment perception technologies. Comprehensive environment perception capabilities are required that can reliably identify, track, and discriminate between benign and hazardous objects in the path of the vehicle under the full range of environmental conditions in which the vehicle is intended to operate (e.g., adverse weather and lighting conditions).

A significant technological challenge is presented by the wide variety of traffic participants, with very different characteristics, that may enter the field of view of the sensors; this drives the need for a range of sensing devices with different attributes, depending on the specific task. A variety of technical solutions have been adopted to date, with the result that a multitude of different sensor set-ups exist, varying considerably in the number and type of sensors and in their positioning on the vehicle.

Essentially, automated driving requires multiple systems to interact with each other within a “Sense-Think-Act” process: the vehicle needs to perceive all surrounding objects and localise itself in the environment (“environment perception”) before selecting the relevant objects to be considered in the subsequent motion-planning task (“decision-making”). The vehicle systems therefore need not only to take the current situation into account, but also to predict the relative movement of the ego-vehicle (the vehicle equipped with the automated functionality that performs the dynamic driving task autonomously) with respect to all relevant objects, while anticipating the trajectories of other road users.
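
As a purely illustrative aid to the “Sense-Think-Act” description above, the short Python sketch below structures the cycle as three functions. All names in it (TrackedObject, sense, think, act, the localiser and vehicle interfaces, the 3-second horizon and 5-metre relevance radius) are hypothetical assumptions for the example and not part of any Cluster 2 specification.

```python
# Minimal, illustrative sketch of a "Sense-Think-Act" cycle.
# All names (TrackedObject, sense, think, act, the localiser and vehicle
# interfaces) are hypothetical placeholders, not an actual CCAM specification.
from dataclasses import dataclass


@dataclass
class TrackedObject:
    object_id: int
    position: tuple   # (x, y) in metres, relative to the ego-vehicle
    velocity: tuple   # (vx, vy) in m/s
    relevant: bool = False


def sense(sensor_readings, localiser):
    """Perceive surrounding objects and localise the ego-vehicle."""
    objects = [TrackedObject(i, r["pos"], r["vel"])
               for i, r in enumerate(sensor_readings)]
    ego_pose = localiser.estimate_pose()   # (x, y, heading) from a localisation module
    return ego_pose, objects


def think(objects, horizon_s=3.0, safety_radius_m=5.0):
    """Select relevant objects by predicting their motion, then plan an action."""
    for obj in objects:
        # Crude constant-velocity prediction of where the object will be.
        px = obj.position[0] + obj.velocity[0] * horizon_s
        py = obj.position[1] + obj.velocity[1] * horizon_s
        obj.relevant = (px ** 2 + py ** 2) ** 0.5 < safety_radius_m
    # A real planner would output a full trajectory; here just a target speed.
    return 0.0 if any(o.relevant for o in objects) else 13.9   # m/s (~50 km/h)


def act(target_speed, vehicle):
    """Hand the planned command over to the (hypothetical) actuation interface."""
    vehicle.set_target_speed(target_speed)
```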

Another technological challenge is the need for a shared high-definition (HD) map of the environment, a prerequisite for the precise localisation of the highly automated vehicle and for high-level situation awareness; the HD map also requires regular updating, as the environment evolves frequently, especially in urban and peri-urban contexts (e.g., construction sites and public works). In general, the electric and electronic components of automated and connected vehicles need to interact with each other, and with the embedded software and artificial intelligence at the edge, in an efficient, reliable and fail-operational way. This is increasingly facilitated by more centralised vehicle control architectures, which also, importantly, allow over-the-air software updates.

Several different sensing sources (using technologies including radar, LIDAR, ultrasonic sensors, cameras, etc.) can be aggregated through “sensor fusion” to enable the vehicle to perceive its surroundings, before function-specific software uses this information as input to think and act appropriately, deciding which actions the vehicle shall take before executing them accordingly. A series of recent and ongoing EU projects have addressed environment perception, on-board decision-making and the development of the relevant technologies. Specifically, the projects RobustSENSE, DENSE, Dreams4Cars, ROADVIEW and EVENTS, details of which can be found at https://www.connectedautomateddriving.eu/projects/findproject/, focus on perception and are considered relevant to Cluster 2.
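
To make the idea of “sensor fusion” more concrete, the sketch below shows one simple and widely used fusion principle, inverse-variance weighting of independent estimates of the same quantity. The sensor names, distances and noise variances are assumed values chosen only for illustration; real CCAM perception stacks use far more sophisticated methods (e.g., Kalman-filter or learning-based fusion).

```python
# Illustrative sensor-fusion sketch: combining independent estimates of the
# distance to an object by inverse-variance weighting. The sensor names,
# distances and variances below are assumed values for the example only.

def fuse_estimates(measurements):
    """measurements: list of (value, variance) pairs from different sensors.

    Returns the fused value and its variance (inverse-variance weighting).
    """
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * val for (val, _), w in zip(measurements, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused_value, fused_variance


# Example: distance (in metres) to a lead vehicle from three sensor types.
radar = (24.8, 0.20)     # radar: robust in adverse weather, moderate noise
lidar = (25.1, 0.05)     # LIDAR: very accurate range in clear conditions
camera = (26.0, 1.00)    # camera: monocular range estimates are noisier

distance, variance = fuse_estimates([radar, lidar, camera])
print(f"fused distance: {distance:.2f} m (variance {variance:.3f} m^2)")
```

The fused estimate is dominated by the lowest-variance sensor while still benefiting from the others, which is the basic rationale for combining complementary sensing modalities.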

In addition to perceiving the surrounding environment, locating the ego-vehicle within it is an important step in the execution of automated driving functions and requires the use of advanced digital mapping technologies. As the amount of data and information increases with higher levels of automation, so too does the need for computational power, with new software required to handle and process the data, increasingly involving AI and machine-learning techniques. Such software-intensive systems must be designed to provide very high safety levels, so that the rate of errors in the requirements, specifications and coding is sufficiently low for the system to be effectively as safe as, or safer than, human driving.
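
As a minimal illustration of map-based localisation, the sketch below estimates the ego-vehicle position by comparing ego-relative landmark detections with the absolute landmark positions stored in a (hypothetical) HD map. The assumption that detections and map entries are already matched one-to-one, and the coordinates used, are simplifications for the example only.

```python
# Minimal map-matching sketch: estimate the ego-vehicle's absolute position by
# aligning ego-relative landmark detections with landmarks stored in an HD map.
# Assumes detections are already matched one-to-one with map entries and
# ignores heading; both are simplifications for the example.

def localise(detections, map_landmarks):
    """detections:    ego-relative landmark positions [(dx, dy), ...] in metres
    map_landmarks: absolute positions of the same landmarks [(x, y), ...]

    Returns the estimated absolute ego position as the mean offset between
    each mapped landmark and its ego-relative detection.
    """
    offsets = [(mx - dx, my - dy)
               for (dx, dy), (mx, my) in zip(detections, map_landmarks)]
    n = len(offsets)
    return (sum(ox for ox, _ in offsets) / n, sum(oy for _, oy in offsets) / n)


# Example: two landmarks detected ahead of the vehicle.
detections = [(10.0, 2.0), (12.0, -3.0)]
map_landmarks = [(110.2, 52.1), (112.1, 46.9)]
print(localise(detections, map_landmarks))   # approx. (100.15, 50.0)
```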

For future, highly automated vehicles in which the human driver cannot be considered a backup, systems must be “fail-operational”: when the limit of the Operational Design Domain (ODD) is reached, the vehicle must bring itself into a safe state through a minimum risk manoeuvre. The automated driving system must therefore be provided with comprehensive fault detection, identification and accommodation capabilities, so that malfunctions can be diagnosed immediately and the system can switch to a fall-back mode of operation to ensure safety.
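
A highly simplified sketch of this fail-operational behaviour is given below: a supervisory function switches between a nominal mode, a degraded fall-back mode and a minimum risk manoeuvre depending on diagnosed faults and ODD status. The mode names, fault labels and switching rules are hypothetical assumptions for illustration, not a prescribed architecture.

```python
# Illustrative fail-operational supervisor: switch to a fall-back mode or a
# minimum risk manoeuvre (MRM) when faults are diagnosed or the ODD limit is
# reached. Modes, fault labels and rules are hypothetical for this example.
from enum import Enum, auto


class Mode(Enum):
    NOMINAL = auto()        # automated driving within the ODD
    DEGRADED = auto()       # non-critical fault accommodated, reduced functionality
    MINIMUM_RISK = auto()   # bringing the vehicle into a safe state


def supervise(mode, faults, inside_odd):
    """Return the next operating mode given diagnosed faults and ODD status."""
    if mode is Mode.MINIMUM_RISK:
        return mode                       # the MRM is latched until a safe state is reached
    if not inside_odd or "critical" in faults:
        return Mode.MINIMUM_RISK          # ODD limit reached or a critical fault diagnosed
    if faults:
        return Mode.DEGRADED              # non-critical fault: fall-back operation
    return Mode.NOMINAL


# Example: a non-critical camera fault leads to degraded operation; leaving
# the ODD then forces the minimum risk manoeuvre.
mode = supervise(Mode.NOMINAL, faults={"camera"}, inside_odd=True)    # -> Mode.DEGRADED
mode = supervise(mode, faults={"camera"}, inside_odd=False)           # -> Mode.MINIMUM_RISK
```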

Active safety functions need to enable automated vehicles to navigate safely in both expected and unexpected scenarios, with advanced passive safety systems also required to protect passengers in new, unconventional seating positions. In practice, highly automated vehicles will bring new challenges in terms of traffic rules, and new regulations may be needed. In this direction, the Cluster 2 projects OSCAR, AWARE2ALL and HADRIAN are performing related R&I activities.

Freeing the driver from the need to remain continuously coupled to the vehicle controls, effectively becoming a passenger in the autonomous vehicle, increases the risk of motion sickness, which can also jeopardise driving performance if the driver has to take back control of the vehicle; automated vehicles should therefore minimise this risk by ensuring a smooth ride. To achieve high levels of acceptance and adoption of CCAM across a variety of private and public vehicle types, it is also essential to understand and optimise the on-board experience and overall satisfaction of all users in terms of comfort, well-being, safety and security. In this regard, the approach taken by the BRAVE project is to assume that the launch of automated vehicles on public roads will be successful only if a user-centric approach is adopted, in which the technical aspects go hand-in-hand with societal values, user acceptance, behavioural intentions, road safety and social, economic, legal and ethical considerations.

Indeed, automation will add new opportunities and challenges in terms of comfort and on-board physical well-being. At the same time, automated public transport vehicles that do not require a human driver on board will generate new challenges in ensuring the safety of passengers and their protection from possible threats and antisocial behaviour (e.g., criminal acts including assault, harassment, vandalism and theft), while meeting potential new users’ demands in terms of comfort and privacy.

Cluster 2 R&I Actions

Correspondingly, the specific R&I actions relating to Cluster 2 are the following:

  • Environment perception technologies for CCAM. Robust and accurate environment perception is essential for highly automated vehicles. Vehicle environment and traffic scenario perception systems, including sensor fusion, are required to enable the extraction of reliable information for real-time decision-making. On-board sensor systems are already available for partially automated driving in specific Use Cases and ODDs. Considering the more extensive safety requirements that accompany higher levels of automation, and the strong demand for multiple applications in less limited ODDs, significant further innovation is needed in environment perception technologies, including high-definition maps.
  • Safe and reliable on-board decision-making technologies. Advanced on-board decision-making functionalities are required to handle the diversity of Use Cases in their respective operational domains while guaranteeing the safety, reliability and conformity of future automated vehicles which will integrate complex in-vehicle systems-of-systems with advanced sensors, control and actuators, relying on extensive computational power.
  • Efficient, certifiable and upgradable functions integrated in the vehicle. To bring the most recent and powerful advances in software, data analysis and artificial intelligence to automated and connected vehicles, the innovation cycles of hardware and software need to be decoupled, taking advantage of the imminent transformation of in-vehicle control architectures and operating systems towards centralisation and generic functionality, and co-designing their interfaces and their integration in the cloud-edge continuum in a seamless, reliable and cybersecure way.
  • Preventive and protective safety for highly automated vehicles. To exploit the potential of CCAM in terms of improving the safety of the transport system as a whole, on-board systems need to anticipate risks reliably, prevent crashes and minimise the consequences of unavoidable collisions. Furthermore, provision of the required technologies must be accompanied by the development of suitable assessment tools for the newly developed prevention and protection systems while supporting the definition of standards as well as the potential needs for traffic rules adaptation.
  • Human Machine Interaction (HMI) development for on-board CCAM technology. The design of vehicle technologies must focus on enhancing acceptance by users and by other road users, and on generating trust in and reliance on automated systems through well-designed, user-relevant and informative human-machine interfaces which allow, amongst others, intuitive and seamless transfer of control between the driver and the vehicle, while also responding to the needs of all users. Research should also focus on developing recommendations for the commonality of HMI principles and the harmonisation of safety-related HMI designs for highly automated vehicles. This includes HMI design principles for remote operators, addressing amongst others the issue of situation awareness.
  • Addressing User-Centric Development of CCAM. As the driving task is gradually delegated to the car, and as shared mobility becomes increasingly accepted and practiced, understanding and optimising the on-board experience and overall satisfaction of users, particularly in terms of inclusiveness, comfort, well-being, privacy and security, through the user-centric design of future road vehicles will be paramount for the widespread adoption of CCAM.

Cluster 2 Expected Outcomes

  • Determination of appropriate, accurate, robust, reliable, cost-effective sensor-suite compositions (enabling safe and reliable Connected and Automated Vehicles, in all conditions and environments, with expanded ODDs); ability to perform advanced environment and traffic recognition/prediction (supported by big data and digital maps with dynamic real-time information), limiting false or non-detections of threats, focusing on VRUs; improved data fusion with infrastructure-based sensing and other vehicles; standardisation mandate for performance requirements.
  • Capacity to determine the appropriate course of action in an open-world context with a wide range of traffic scenarios, respecting traffic rules; ability to perform state prediction to take timely actions and prevent activation under unsuitable conditions; determination of the required control system performance and quality/quantity of data needed to describe complex traffic scenarios, considering also human behaviour, with remote software updates.
  • Reliable technologies for advanced safety systems to prevent crashes or minimise the consequences of unavoidable accidents, including improved minimum risk manoeuvres; protection systems designed for unconventional seating positions and body postures, ensuring inclusiveness, considering all situations/conditions while taking into account different crash configurations in mixed traffic; consistent design methodologies and tools for performance assessment of the new protection systems; evidence-based support to regulatory bodies for the potential adaptation of traffic rules.
  • Advanced HMI and HTI (Human-Technology Interaction) solutions enabling the safe, efficient co-existence and interaction of Connected and Automated Vehicles in mixed traffic (VRUs included); reliable, seamless interfaces, which are easy to understand, based on comprehensive knowledge and models of human behaviour/capabilities; monitoring systems and simulation models to assess/predict driver status; improved HMI functionalities to prepare the driver to take control (e.g., when the vehicle nears the ODD limit) and facilitate safe remote operation.
  • Vehicle technologies and solutions which assess/optimise the on-board experience in terms of well-being, security and privacy, with improved design, ensuring inclusiveness and preventing dangerous/uncomfortable situations; training and information campaigns.
  • Co-designed E/E architectures and operating systems ensuring decoupled upgradeability of hardware and software for the simplified control of connected and automated vehicles, providing harmonised safety functionalities in extended ODDs, increased energy efficiency of the control system and predictive maintenance features.

Cluster 2 Timeline (as of Feb 2024)

All planned R&I actions of this cluster start early in the partnership programme. The actions advance during the CCAM Partnership timeline towards testing and implementation in Cluster 1. The expected outcomes will also support the validation activities, facilitate the integration of the vehicle in the transport ecosystem and help define requirements for key enabling technologies. Innovation Actions will follow after the first phase, advancing technical maturity, progressing the state of the art and delivering mature results ready for testing and implementation. The following image illustrates this progression.
