
Virtual Scenario Generation for ADAS Testing – Tools, Methodologies and Benefits

Advanced driver assistance systems (ADAS) can be defined as vehicle-based intelligent safety systems that improve road safety in terms of incident avoidance, mitigation of incident severity and occupant protection, and can also cover post-incident phases. They can be integrated in the vehicle or in the infrastructure. For example, intelligent speed adaptation and advanced braking systems have the potential to prevent an incident or mitigate its severity.

In recent years, automotive active safety systems have become more prevalent. Furthermore, these systems will serve as stepping stones towards fully Autonomous Driving (AD). With the rising level of automation on board vehicles, intelligent systems have to deal with an increasing number of complex traffic scenarios. In turn, the intelligent systems themselves are also becoming more complicated.

They consist of a plethora of different sensor technologies as well as increasingly advanced algorithms for sensor fusion, object tracking, classification, risk estimation, driver status recognition and vehicle control. As a result, it is rapidly becoming infeasible to check the performance of each new sensor or system in the traditional way by manually driving around, storing data, manually labeling the data for reference, and evaluating the results. This is where an integrated Virtual Scenario Generator can be leveraged for ease of testing and validation.

Prevalent ADAS Features and Themes

1. Features

There are a variety of features that encompass active safety, including: Adaptive Cruise Control (ACC), Forward Collision Warning (FCW), Lane Departure Warning (LDW), Automatic Emergency Braking (AEB), Electronic Stability Control (ESC), Lane Keeping Assistance (LKA), Pedestrian Avoidance (PA), Adaptive Headlights (AH), Automatic Park Assist (APA), Seat Belt Reminders (SBR) and Blind Spot Monitoring (BSM).

 2. Themes

Driver safety is a key theme in addressing road casualty reduction targets through ADAS. Vehicle safety addresses the safety of all road users and currently comprises measures for incident avoidance and injury prevention (primary safety), reduction of injury in the event of an incident (secondary safety) and systems which assist with post-impact care to reduce the consequences of injury.

2.1 Incident avoidance systems 

There is large scope for casualty reduction from driver assistance systems, provided development is prioritized to deliver the maximum casualty reduction. Since driver behavior can modify the performance of safety systems aimed at incident avoidance, assessment of regional driving patterns and of the human-machine interface (HMI), while complex, is essential.

2.2 Incident mitigation systems 

These refer to active onboard systems which aim to mitigate the severity of the incident. Examples include intelligent speed adaptation and advanced braking systems.

2.3 Incident protection systems 

Substantial improvements have been made in the recent past in crash protection, which aims to reduce injury severity during the impact phase. Examples include improvements in occupant restraint systems which better reflect the different human tolerance thresholds of male and female occupants and of a range of age groups.

2.4 Post-incident response systems

A new development is the deployment of systems which aim to alert emergency medical services and speed up their support in the event of an incident.

2.5 Integrated systems 

The prospect of in-vehicle systems integrating incident avoidance, incident protection and post-incident objectives is being increasingly researched, as are vehicle-to-vehicle and vehicle-to-grid communications (V2X).

Testing Methodologies

There are various testing methodologies in the automotive sector that are used to test and evaluate features that may or may not be specific to ADAS. The most widely used approaches include:

  1.  Model-in-the-Loop (MIL) Testing, where algorithms are developed and evaluated as models without involving dedicated hardware. This stage generally involves high-level abstraction frameworks (for example, Simulink) running on general-purpose systems (a simplified sketch follows this list).
  2.  Software-in-the-Loop (SIL) Testing, where the actual implementation of the developed model is evaluated on general-purpose hardware. This step requires a complete software implementation and is used to test production-ready source code derived from the model.
  3.  Hardware-in-the-Loop (HIL) Testing, which involves the target hardware running the final software, with inputs and outputs connected to a simulator. This process is very widely used in the automotive industry and has enabled the development of very high-quality components which are then integrated into larger systems or vehicles.
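
To make the distinction concrete, below is a minimal, purely illustrative sketch of a model-in-the-loop style check in Python: a toy controller model is exercised against a toy plant model entirely in software, with no target hardware involved. The function names, gains and pass/fail criterion are assumptions for illustration only; in practice the same logic, once implemented as production code, would be re-tested at the SIL stage and on the target ECU at the HIL stage.

# Illustrative MIL-style check (hypothetical and simplified): a toy speed
# controller model is run in a closed loop against a point-mass plant model.

def controller_model(target_speed, current_speed, kp=0.5):
    """Toy controller 'model': returns an acceleration command in m/s^2."""
    return max(-3.0, min(2.0, kp * (target_speed - current_speed)))

def plant_model(speed, accel_cmd, dt=0.1):
    """Toy vehicle plant: integrates the commanded acceleration."""
    return speed + accel_cmd * dt

def run_mil_test(target_speed=25.0, steps=600, dt=0.1):
    speed = 0.0
    for _ in range(steps):
        accel = controller_model(target_speed, speed)
        speed = plant_model(speed, accel, dt)
    # Simple pass/fail criterion: settle within 0.5 m/s of the target speed.
    return abs(speed - target_speed) < 0.5

if __name__ == "__main__":
    print("MIL test passed:", run_mil_test())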

Modern vehicles, however, integrate so many components that the integration phase has become more complex and also requires a multi-step validation process, followed by final integration tests performed on tracks or roads. While mandatory, these real-condition tests are limited by multiple factors and have a very high cost. Testing a complex system like a modern vehicle on a test track or on a real road involves complex and costly engineering.

First of all, to be testable the vehicle must be fully or nearly fully functional. This limits the testing opportunity to a very late stage in the development process and implies high engineering costs. Moreover, because the real-condition test is constrained in time and space, the test cases are not complete and only a very small pool of real-world conditions can be tested.

To address these limitations and lower the cost, modern ADAS testing uses virtual scenario generation, where a realistic scenario generation tool enables faster and less expensive tests with better coverage on complete vehicles.

A detailed note on virtual testing follows.

  1.  Virtual Scenario Testing

While test driving is still the main method for ADAS evaluation, the resulting data rarely contain events that truly contribute to active safety system analysis, even with thousands of miles driven. Crashes are rare and difficult to capture. Even close-range encounters are rare. Hence, much of the collected data may amount to uneventful driving with no difficult decisions to make.

Many of the drawbacks of hardware testing of ADAS are not present in a virtual test environment. Virtual testing with simulation software provides an efficient and safe environment to design and evaluate ADAS. Moreover, simulated scenarios are completely quantifiable, controllable and reproducible. As a result, the creation of virtual driving scenarios can be less time-consuming and laborious than on-road tests, in particular when real test data and conditions must be manually converted into analysis data for further testing.

Typical Scenario Generation Tools

Some tools are available as open-source (OS) platforms that can be leveraged at no cost and used for preliminary tests and by developers testing their own algorithms. Users can build their own APIs that integrate with the OS tools, as in the sketch below. These tools can also be used to test the deep learning models created for ADAS/AD requirements.
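
As an example of building on an open-source tool's API, the following is a minimal sketch using the Python API of CARLA (one of the open-source tools in Table 1). It assumes a CARLA server is already running locally on the default port; exact blueprint names and attributes can vary between CARLA versions.

import carla  # CARLA Python API; assumes a local CARLA server is running

# Connect to the simulator (default host/port; adjust to your setup).
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Spawn a host (ego) vehicle at one of the map's predefined spawn points.
blueprints = world.get_blueprint_library()
vehicle_bp = blueprints.filter("vehicle.*")[0]
spawn_point = world.get_map().get_spawn_points()[0]
ego = world.spawn_actor(vehicle_bp, spawn_point)

# Attach a front-facing RGB camera to the ego vehicle and record frames.
camera_bp = blueprints.find("sensor.camera.rgb")
camera_tf = carla.Transform(carla.Location(x=1.5, z=2.0))
camera = world.spawn_actor(camera_bp, camera_tf, attach_to=ego)
camera.listen(lambda image: image.save_to_disk("out/%06d.png" % image.frame))

# Let the built-in autopilot drive while camera data is collected.
ego.set_autopilot(True)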

The licensed tools provide a range of options for the user to generate a scenario and come with various interface models that can dovetail with the user's existing hardware or model configurations. These tools can be used for high-level simulation requirements and do not require much manual model creation, as various resources are already built into their libraries.

The following are a few of the widely used tools for virtual scenario generation. The current text classifies the tools into two groups depending on the delivery model.

*Note: The list is not exhaustive and features only selected tools.

Licensed Tools: Mechanical Simulation CarSim, IPG CarMaker, dSPACE ASM, Siemens Prescan, Ansys SCANeR, Cruden Panthera, rFpro, Vector DYNA4, Vires VTD, AVL VSM, MathWorks RoadRunner, Nvidia DRIVE Sim

Open Source Tools: CARLA, LGSVL, Autoware, Baidu Apollo, Microsoft AirSim, TORCS, OpenDS, Voyage Deepdrive, Udacity Simulator, Unity and Unreal Engines

Table 1: Virtual Scenario Generation Tools

Methodology for Virtual Scenarios

Creating Scenarios

The interactive road editors allow the user to design road networks in full detail, with unlimited numbers of lanes, complex intersections and comprehensive signs and signaling. They link and export logic and graphics data consistently from a single source, for example from Google Maps data.

Virtual scenes can be designed from scratch or compiled from existing databases in the given tool. Various import and export formats (for example, OpenDRIVE) as well as large libraries of 3D models and country-specific signs/signals can be used to accelerate the creation process. The export of the graphics data can also be customized.
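
As a simple illustration of how such an exchange format can be consumed outside the editor, the sketch below uses Python's standard library to summarize the roads in an OpenDRIVE (.xodr) file. The file name is hypothetical; the elements read (road, lane) and attributes (id, length) belong to the OpenDRIVE schema.

import xml.etree.ElementTree as ET

# Hypothetical path to an OpenDRIVE network exported from a road editor.
tree = ET.parse("exported_network.xodr")
root = tree.getroot()  # <OpenDRIVE> root element

for road in root.findall("road"):
    road_id = road.get("id")
    length_m = float(road.get("length", "0"))
    lane_count = len(road.findall(".//lane"))
    print(f"road {road_id}: {length_m:.1f} m, {lane_count} lane elements")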

Configuring Scenarios

Dynamic content can be defined with the scenario editor. Most tools have a database pool that allows the user to specify traffic as individually controllable objects or as autonomous swarms around the host entity. Various driving environments can be configured (for example, RHD/LHD).

The basic library of vehicles, pedestrians and driver properties can easily be customized. The user can define paths for individual entities, configure signal control programs, place objects and add events from a large set of actions. Real-time monitoring and command injection for the configured entities can also be performed during the simulation phase.
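
As an illustration of configuring an autonomous traffic swarm programmatically, the sketch below again uses CARLA's Python API and its traffic manager; it assumes a running CARLA server, and the number of vehicles and the speed offset are arbitrary example values.

import random
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# A swarm of autonomous traffic vehicles around the host entity.
traffic_manager = client.get_trafficmanager()
blueprints = world.get_blueprint_library().filter("vehicle.*")
spawn_points = world.get_map().get_spawn_points()

swarm = []
for spawn_point in random.sample(spawn_points, min(20, len(spawn_points))):
    actor = world.try_spawn_actor(random.choice(blueprints), spawn_point)
    if actor is not None:
        # Hand the vehicle over to the traffic manager's autopilot.
        actor.set_autopilot(True, traffic_manager.get_port())
        swarm.append(actor)

# Example of adjusting swarm behavior: drive 10% below the speed limit.
for actor in swarm:
    traffic_manager.vehicle_percentage_speed_difference(actor, 10.0)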

Simulating Scenarios

The generation tools integrate with a range of environments, such as X-in-the-Loop (XIL) setups, real-time or non-real-time execution, and co-simulation with various modelling tools (for example, MATLAB, LabVIEW). At any time, the user may take full control over the execution of the simulation, specify varying time steps and consume object, image and sensor data via a whole range of interfaces. Various externally computed simulation models can be injected, and multiple iterations can be run either in parallel or interconnected.
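
For example, in CARLA the execution mode and time step can be controlled from the client, switching from free-running real-time simulation to deterministic fixed-step execution. The sketch below assumes a running server; the step size and duration are arbitrary example values.

import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# Switch from free-running (real-time) to deterministic fixed-step execution.
settings = world.get_settings()
settings.synchronous_mode = True       # the client drives the simulation clock
settings.fixed_delta_seconds = 0.05    # 20 Hz simulation step
world.apply_settings(settings)

try:
    for step in range(200):            # simulate 10 s of scenario time
        world.tick()                   # advance exactly one time step
        snapshot = world.get_snapshot()
        # Externally computed models (e.g. a controller) would be called here.
        print(step, snapshot.timestamp.elapsed_seconds)
finally:
    # Restore asynchronous, real-time behavior when done.
    settings.synchronous_mode = False
    settings.fixed_delta_seconds = None
    world.apply_settings(settings)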

Customizing Scenarios

The user can customize the scenario generation tools at various levels, for example through SDKs along with ready-to-use templates for sensor simulation (object-list based and physics based), dynamics simulation, image generation and co-simulation with modelling tools. The open interfaces for run-time data and simulation control make it easy to integrate the tools into any environment (virtual, hardware or in-vehicle).
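
The exact SDK calls differ from tool to tool, so the sketch below only illustrates the general integration pattern with hypothetical class and method names (ToolAdapter, on_object_list): a thin adapter forwards run-time data published by the tool to the user's own algorithm.

class MyFusionStack:
    """Placeholder for the user's own perception/fusion implementation."""
    def update(self, object_list, timestamp):
        print(f"{timestamp:.2f} s: received {len(object_list)} objects")

class ToolAdapter:
    """Bridges a scenario tool's run-time interface and the user's stack."""
    def __init__(self, stack):
        self.stack = stack

    def on_object_list(self, object_list, timestamp):
        # Would be registered with the tool's SDK as a data callback;
        # the registration call itself is tool-specific.
        self.stack.update(object_list, timestamp)

# Standalone usage with a fabricated object list, for illustration only.
adapter = ToolAdapter(MyFusionStack())
adapter.on_object_list([{"id": 1, "range_m": 32.0}], timestamp=0.05)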

Steps for Creating Scenarios

General Maneuvers

The testing process follows a dedicated Design Verification Plan and Report (DVP&R) which defines the required test parameters such as the roads, lanes, traffic objects placed on the road/roadside, and host and traffic vehicle properties, among others.

The process may vary with each tool based on its dedicated structure; the following illustration shows the general steps that might be followed. The steps need not be followed in the order shown and depend on the priorities of the user and the test case requirements.

The generic steps include defining the road network and environment, placing the host vehicle, traffic and roadside objects, configuring maneuvers and events, attaching the required sensors, running the simulation and evaluating the results.

Sensor based Maneuvers

Figure 1 represents a simple workflow of the process. In addition to the basic scenario requirements such as roads, vehicles and objects, the scenario generation tools provide various sensors that can be added to the host vehicle for performing various tests.

Commonly Used Sensors

The sensors commonly used in ADAS and Autonomous Driving, and typically found in a scenario generation tool, include:

  • Light Detection and Ranging (LIDAR, ToF – Solid State/Mechanical)
  • Radio Detection and Ranging (RADAR, Short/Long Range)
  • Positioning/Navigation Sensors (GPS/GNSS)
  • Inertial Sensors (IMU/INS)
  • Ultrasonic Sensors
  • Camera
Figure 2: Sensors in Virtual Scenario (Image source: ipg-automotive.com)

The individual sensors mentioned above, or a combination of more than one sensor, can be used to test various ADAS scenarios such as those mentioned in Section 2.1.

Example Cases

  • A combination of Camera and LIDAR can be used for long-range route mapping and obstacle detection at highway speeds (a simplified detection sketch follows this list).
  • A combination of Camera and Ultrasonic sensors can be used for Park Assist.
  • A short-range RADAR can be used for detection of potholes and small obstacles (small animals, trash etc.) in close proximity.
  • Inertial sensors can be used for Stability Control scenarios, and so on.
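
As a simplified illustration of how such sensor outputs feed an ADAS function, the sketch below computes time-to-collision (TTC) from a hypothetical object list (as a RADAR or LIDAR model in a virtual scenario might publish it) and decides whether to warn or request emergency braking. The field names and thresholds are assumptions for illustration, not values from any specific tool or standard.

# Illustrative forward-collision check over a hypothetical sensor object list.

TTC_WARN_S = 2.5   # warn the driver below this time-to-collision
TTC_BRAKE_S = 1.2  # request emergency braking below this TTC

def time_to_collision(range_m, closing_speed_mps):
    """TTC = range / closing speed; None if the object is not closing."""
    if closing_speed_mps <= 0.0:
        return None
    return range_m / closing_speed_mps

def assess_objects(object_list):
    decisions = []
    for obj in object_list:
        ttc = time_to_collision(obj["range_m"], obj["closing_speed_mps"])
        if ttc is None:
            decisions.append((obj["id"], "ignore"))
        elif ttc < TTC_BRAKE_S:
            decisions.append((obj["id"], "brake"))
        elif ttc < TTC_WARN_S:
            decisions.append((obj["id"], "warn"))
        else:
            decisions.append((obj["id"], "monitor"))
    return decisions

# Example object list, as a sensor model in a virtual scenario might publish it.
objects = [
    {"id": 7, "range_m": 35.0, "closing_speed_mps": 10.0},  # TTC = 3.5 s
    {"id": 9, "range_m": 12.0, "closing_speed_mps": 11.0},  # TTC ~ 1.1 s
]
print(assess_objects(objects))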

Sensor Models

Each tool delivers sensor data through different models; these models represent how the sensory output is perceived and generated. They can be explained under two categories, namely physics-based and math-based models.

  • Physics Based Model

The physics-based model's capability is the real-time computation of the sensor data directly from the visual scene, with no offline pre/post-computation or storage requirements. This real-time model combines automatic scene classification of visual depth and RGB imagery with a physics-based data and sensor model.

The actual physics-based model takes full account of the local scene environment, factoring in information such as time of day, current traffic conditions, road/terrain and other sensor data as per requirements. The result is a physically accurate sensor scene derived from a visual spectrum database.

  • Mathematical Model

Unlike the physics-based model, a math-based sensor model does not take real-time scene data for vehicle simulation. Rather, these models make use of precomputed scenario conditions and formulae for maneuvers. The formula-based actions are triggered by specific maneuvers or commands, and in certain cases a dedicated Human Machine Interface (HMI) can also be used.
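
As a rough illustration of how an object-list style (non-physics) sensor model can be formulated, the sketch below gates ground-truth object positions by range and field of view and adds simple Gaussian measurement noise. The parameters and the noise model are assumptions for illustration only, not taken from any specific tool.

import math
import random

MAX_RANGE_M = 80.0
HALF_FOV_RAD = math.radians(30.0)
RANGE_NOISE_STD_M = 0.2
AZIMUTH_NOISE_STD_RAD = math.radians(0.5)

def ideal_sensor(ground_truth_objects):
    """ground_truth_objects: list of (x, y) positions in the sensor frame."""
    detections = []
    for x, y in ground_truth_objects:
        rng = math.hypot(x, y)
        azimuth = math.atan2(y, x)
        if rng > MAX_RANGE_M or abs(azimuth) > HALF_FOV_RAD:
            continue  # outside the modelled detection zone
        detections.append((
            rng + random.gauss(0.0, RANGE_NOISE_STD_M),          # noisy range
            azimuth + random.gauss(0.0, AZIMUTH_NOISE_STD_RAD),  # noisy bearing
        ))
    return detections

# Example: one object ahead within the FOV, one beyond the range limit.
print(ideal_sensor([(40.0, 5.0), (150.0, 0.0)]))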

Conclusion and Benefits

In conclusion, considering the simplicity of creating scenarios, virtual generation methods have found widespread acceptance. Especially in the startup environment, purchasing actual hardware for ADAS/AD requirements may not be practical considering the high costs, particularly for most of the sensors listed under Commonly Used Sensors. In these cases, the sensors provided by the virtual tools offer sufficient testing resources both for developing an ADAS feature and for testing and validating a specific feature. The major benefits are listed below:

  • Complex, generic and realistic test cases can be created precisely without major expenditure.
  • Cost and usage feasibility regardless of the type of sensors and architecture being used.
  • Extensive scenario and situation coverage in simulation.
  • Feasible testing times for larger test pools.
  • Minimum to no safety hazards.
  • Shorter development time.
  • Reusable scenarios.

Author:

Darshan has vast experience in scenario generation and in testing and validating ADAS and Autonomous Systems (Automobile & Marine) in conjunction with a range of Startups, Tier 1s and OEMs. He was an integral part of a startup that was one of the first teams to deploy Autonomous Vehicles on Singapore public roads. He designed an ADAS module actively used by a Tier 1, co-developed a custom Autonomous System Framework (ATON) and has authored 6 papers and 2 books based on the build.

Published in Telematics Wire
