Risk Assessment in Automated Driving for the Generation of Virtual Testing Scenarios

An Innovative Tool for Sensor and Driving Function Validation

The safety of vehicles using automated driving functions or an Advanced Driver Assistance System (ADAS) is a crucial factor for acceptance by users and approval by regulatory authorities. Driving millions of kilometers to generate a sufficient amount of data for the evaluation of ADAS functions is one possibility; however, the analysis requires a huge manual effort by experts, who must label the objects and incidents in the videos [4]. An incident is a critical situation in which a real-world driver must react to an unexpected situation. If a vehicle can handle these incidents in a virtual environment, that is a major step toward handling them in the real world as well.

In the project RELAI – Risk Estimation with a Learning AI, the companies IPG Automotive GmbH and EDI GmbH – Engineering Data Intelligence work together with two research institutes, the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB) and the Institute of Human Factors and Technology Management (IAT) of the University of Stuttgart, to solve this issue.

Using EDI’s approach, different video sources of real-world traffic, such as the Waymo and Berkeley DeepDrive datasets, are combined with several other driving context parameters of the involved objects. EDI’s AI-based Dynamic Risk Management (DRM) algorithm then identifies critical incidents in this data automatically. These incidents are labelled and described based on situational parameters (e.g., velocity of the ego vehicle), the involved objects (e.g., pedestrians), and their parameters (e.g., their velocity and distance to the ego vehicle). Fraunhofer IOSB labels the objects and analyzes their behavior, resulting in an understanding of how predictable the incident was. From the identified and labelled incidents, EDI generates a bird’s-eye view with the velocity and direction of the relevant objects, which can be exported as an object list in the IPG InfoFile format. For a clear overview of the incidents, a labelled list including the extracted snippets of the real-world driving videos showing the incidents is displayed in a web application, complemented by the visualized bird’s-eye view and the attached InfoFile.
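To make the kind of data handled in this step more concrete, the following Python sketch shows what an incident record and a simple criticality check could look like. The field names, the time-to-collision threshold, and the file path are illustrative assumptions for demonstration only; they do not reflect EDI's actual DRM schema, the real labelling pipeline, or the IPG InfoFile format.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    """A road user observed in a video snippet (hypothetical schema)."""
    obj_id: str
    obj_class: str          # e.g. "pedestrian", "car"
    velocity_mps: float     # speed along its heading, in m/s
    distance_m: float       # distance ahead of the ego vehicle, in m
    heading_deg: float      # direction of travel in the bird's-eye view

@dataclass
class Incident:
    """One critical situation extracted from a real-world drive."""
    video_snippet: str      # path to the extracted clip (illustrative)
    timestamp_s: float      # time of the incident within the source video
    ego_velocity_mps: float
    objects: list[TrackedObject] = field(default_factory=list)

def closing_ttc(ego_velocity_mps: float, obj: TrackedObject) -> float | None:
    """Time-to-collision for an object ahead, assuming a straight-line
    closing speed. Returns None if the gap is not closing."""
    closing_speed = ego_velocity_mps - obj.velocity_mps
    if closing_speed <= 0.0:
        return None
    return obj.distance_m / closing_speed

def is_critical(incident: Incident, ttc_threshold_s: float = 2.0) -> bool:
    """Flag an incident as critical if any object falls below the TTC threshold."""
    for obj in incident.objects:
        ttc = closing_ttc(incident.ego_velocity_mps, obj)
        if ttc is not None and ttc < ttc_threshold_s:
            return True
    return False

# Example: a pedestrian 12 m ahead of an ego vehicle driving at 10 m/s.
incident = Incident(
    video_snippet="clips/segment_0042.mp4",
    timestamp_s=37.4,
    ego_velocity_mps=10.0,
    objects=[TrackedObject("ped_01", "pedestrian", 1.2, 12.0, 90.0)],
)
print(is_critical(incident))  # True: TTC is roughly 1.4 s, below the 2.0 s threshold
```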

This InfoFile is imported by IPG Automotive as a raw scenario into CarMaker via the ScenarioRRR process and varied with respect to the environment (e.g., sun and lighting conditions, road lanes, trees, and other surroundings) as well as the parameter values of the involved objects, including the ego vehicle. Furthermore, the IAT of the University of Stuttgart evaluates criteria for how the autonomous ego vehicle should behave with regard to the expectations and acceptance of other traffic participants. This information is used to adapt the behavior of the ego vehicle in the created scenario. By using automatically generated scenarios derived from publicly available databases or individually recorded real-world driving videos, ADAS functions can now be validated in testing and simulation environments against realistic and reliable driving situations without the manual analysis of videos or driving millions of real-world kilometers.
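The parameter variation itself can be pictured as a sweep over the free parameters of the raw scenario. The Python sketch below is a minimal illustration of that idea; the parameter names and value ranges are assumptions chosen for demonstration and do not represent CarMaker's actual scenario format or the ScenarioRRR toolchain.

```python
import itertools
import random

# Hypothetical base scenario derived from one labelled incident (values are illustrative).
base_scenario = {
    "ego_velocity_mps": 10.0,
    "pedestrian_velocity_mps": 1.2,
    "sun_elevation_deg": 35.0,
    "lane_count": 2,
}

# Discrete variation axes for the environment and the involved objects.
variations = {
    "ego_velocity_mps": [8.0, 10.0, 12.0],
    "pedestrian_velocity_mps": [0.8, 1.2, 1.6],
    "sun_elevation_deg": [10.0, 35.0, 60.0],   # e.g. low sun vs. midday lighting
    "lane_count": [1, 2, 3],
}

def generate_variants(base: dict, axes: dict) -> list[dict]:
    """Build the full factorial set of scenario variants from the base scenario."""
    keys = list(axes)
    variants = []
    for values in itertools.product(*(axes[k] for k in keys)):
        variant = dict(base)
        variant.update(zip(keys, values))
        variants.append(variant)
    return variants

variants = generate_variants(base_scenario, variations)
print(f"{len(variants)} scenario variants")   # 81 combinations for these axes
print(random.choice(variants))                # inspect one sampled variant
```

In practice, such an exhaustive sweep would typically be complemented by targeted sampling around the parameter region that made the original incident critical, rather than treating all combinations as equally relevant.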