Software developed to accelerate ADAS testing


According to specialist simulator software company rFpro, its latest Sensor_IG sensor testing software can generate realistic data feeds, in real time, for models of camera, radar, LiDAR and ultrasound sensors. The software is claimed to enable the entire ADAS (Advanced Driver Assistance Systems) toolchain to be tested on a driving simulator with a human driver in control, reducing development time and cost.

“When testing sensors used for ADAS on a simulator, the simulation software responsible for sending video images to the human driver must simultaneously send realistic data feeds to the sensor models: you need ‘real’ data, as though from actual sensors,” explained Chris Hoyle, technical director of rFpro.

“Sensor_IG generates this data, enabling sensor models and algorithms to be accurately tested. It means much more ADAS development can take place in a virtual environment, which gets you closer to the final solution quicker,” Hoyle added.

The software can be used to create multiple simultaneous real-time feeds, each at the correct refresh rate for the sensor being modeled. For example, a stereo pair of 32-bit HDR camera data feeds could run at 25Hz, while a number of 20Hz medium- and long-range radar feeds, along with feeds for short-range ultrasound sensors, are modeled around the perimeter of the vehicle.
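The multi-rate behavior described above can be pictured as a scheduler that fires each feed at its own refresh rate within a single simulation loop. The sketch below is illustrative only: the feed names, rates and structure are assumptions for the example and are not part of rFpro's actual API.

```python
from dataclasses import dataclass

@dataclass
class SensorFeed:
    # Hypothetical feed description; names and rates are illustrative only.
    name: str
    rate_hz: float
    next_due: float = 0.0  # simulation time (s) at which the next frame is due

def run_schedule(feeds, duration_s, tick_s=0.001):
    """Step simulation time and record which feeds emit a frame at each tick."""
    emissions = []
    t = 0.0
    while t < duration_s:
        for feed in feeds:
            if t >= feed.next_due:
                emissions.append((round(t, 3), feed.name))
                feed.next_due += 1.0 / feed.rate_hz
        t += tick_s
    return emissions

feeds = [
    SensorFeed("stereo_camera_left", 25.0),   # e.g. 32-bit HDR camera pair at 25Hz
    SensorFeed("stereo_camera_right", 25.0),
    SensorFeed("radar_front_long", 20.0),     # medium/long-range radar at 20Hz
    SensorFeed("ultrasound_rear", 10.0),      # short-range ultrasound
]
log = run_schedule(feeds, duration_s=1.0)
# Over one simulated second the 25Hz cameras each emit 25 frames,
# the 20Hz radar 20, and the 10Hz ultrasound 10.
```

Each feed keeps its own due time, so adding a new sensor model at a different rate is just another entry in the list; a real-time implementation would pace the loop against the wall clock rather than a simulated tick.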

“Our core business is real-time driver-in-the-loop (DIL) simulation, but what we are seeing at customer sites is that 90% of ADAS testing never gets beyond the offline software-in-the-loop (SIL) stage,” stated Hoyle. “A further 9% of testing reaches a more detailed hardware-in-the-loop (HIL) test stage, and maybe just 1% reaches the full DIL simulator. Sensor_IG means that your control systems and ADAS strategies can be tested by human drivers, in controlled conditions, without the need to carry out relatively expensive and time-consuming tests involving real cars and real sensors.”

Hoyle is keen to distance the product from actual sensor models. “We are not offering sensor models,” he said. “Our customers provide the sensor models themselves or source them from third parties. We are not experts in RF propagation or camera hardware, but we are very good at generating high-speed, high-bandwidth data feeds from very high-quality 3D worlds in real time.”

About Author

Adam divides his time as an editor between the worlds of aviation and motoring. These worlds may seem a little diverse today, but autonomous technology and future urban mobility are bringing them ever closer. Adam is also chairman of the Vehicle Dynamics International Awards.