Camera simulation – GPU is the only limit

Cars with lights on in aiSim software

Written by Zoltán Hortsin / Posted at 5/6/21


In 2020, aiMotive's aiSim™ simulator was certified to TCL 3 according to ISO 26262:2018 by TÜV NORD, making it the world's first ISO 26262-certified comprehensive simulator for the development of automated driving solutions. Let's go a layer deeper and see what's under the hood: what makes aiSim's rendering engine special and unique.

Game engines vs. sensor simulation – why "physically-based" is our motto

Most game engines focus on spectacular visuals, but they are not necessarily physically correct. In the development of aiSim, we aim for physically-based image synthesis because the main emphasis for automated driving solutions is not on the aesthetic experience but on being as realistic as possible. However, this does not mean that aiSim cannot handle spectacular effects: these effects can be turned on and off individually, and users can choose which of them to apply depending on the use case. Another element that is not visible at first glance is that our simulation is completely deterministic – even for sensor simulation, down to the pixel level (on the same GPU). This is a must-have requirement for automotive testing, as it enables the replayability of scenarios with consistent results.
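To make this concrete, below is a purely illustrative sketch; the type and function names are hypothetical and not aiSim's API. It shows the two ideas from the paragraph above: post-process effects modelled as explicit toggles, and pixel-level determinism verified by hashing two renders of the same scenario on the same GPU, which must be bit-identical.

```cpp
// Purely illustrative sketch; the names below are hypothetical, not aiSim's API.
#include <cstdint>
#include <cstdio>
#include <vector>

struct PostProcessConfig {
    bool bloom = false;                // each effect can be disabled for a
    bool motionBlur = false;           // "raw" physically-based output
    bool vignetting = false;
    bool chromaticAberration = false;
};

// FNV-1a hash of a frame buffer; identical runs must yield identical hashes.
uint64_t frameHash(const std::vector<uint8_t>& pixels) {
    uint64_t h = 1469598103934665603ull;
    for (uint8_t p : pixels) { h ^= p; h *= 1099511628211ull; }
    return h;
}

int main() {
    PostProcessConfig cfg;  // all effects off: raw physically-based render path
    // Stand-ins for two renders of the same scenario on the same GPU.
    std::vector<uint8_t> runA(1920 * 1080 * 3, 128);
    std::vector<uint8_t> runB(1920 * 1080 * 3, 128);
    std::printf("effects enabled: %d, deterministic: %s\n",
                cfg.bloom + cfg.motionBlur + cfg.vignetting + cfg.chromaticAberration,
                frameHash(runA) == frameHash(runB) ? "yes" : "no");
    return 0;
}
```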

Vulkan API

We use the Vulkan API (application programming interface) to create the most scalable and robust product possible. It helps us make the simulation a platform-independent application and lets us take advantage of multi-GPU execution. aiSim has a built-in scheduler that transparently distributes sensor evaluations between GPUs. The software provides the same features and the same deterministic behavior on all operating systems. Importantly, the Vulkan API lets us improve not only camera sensors but also helps in simulating other sensors – within the same application.
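As a rough illustration of how multi-GPU execution can be approached through Vulkan (a generic sketch, not aiSim's actual scheduler; the sensor names are made up), the available GPUs can be enumerated once and per-sensor render jobs assigned across them, for example round-robin:

```cpp
// Generic sketch of multi-GPU scheduling with Vulkan; not aiSim's scheduler.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo appInfo{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    appInfo.pApplicationName = "multi-gpu-sketch";
    appInfo.apiVersion = VK_API_VERSION_1_2;

    VkInstanceCreateInfo instanceInfo{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    instanceInfo.pApplicationInfo = &appInfo;

    VkInstance instance;
    if (vkCreateInstance(&instanceInfo, nullptr, &instance) != VK_SUCCESS) return 1;

    // Enumerate all GPUs visible through the Vulkan loader.
    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    if (gpuCount == 0) { vkDestroyInstance(instance, nullptr); return 1; }
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    // A scheduler could distribute per-sensor render jobs across the GPUs,
    // e.g. with a simple round-robin assignment.
    const char* sensors[] = {"front_camera", "rear_camera", "left_camera", "right_camera"};
    for (uint32_t i = 0; i < 4; ++i) {
        VkPhysicalDeviceProperties props{};
        vkGetPhysicalDeviceProperties(gpus[i % gpuCount], &props);
        std::printf("%s -> GPU %u (%s)\n", sensors[i], i % gpuCount, props.deviceName);
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```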

The Vulkan API is also ready for ray-tracing. This gives us the freedom to find the optimal way of simulating each sensor type using the same API. Camera, LIDAR and radar simulations can be executed within the same framework and application.
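For reference, a device's ray tracing support can be queried through the standard Vulkan extension mechanism. The sketch below checks for the Khronos VK_KHR_acceleration_structure and VK_KHR_ray_tracing_pipeline extensions; how aiSim selects and uses devices internally is, of course, not shown here.

```cpp
// Sketch: querying whether a physical device exposes the Khronos ray tracing
// extensions. The extension names are the standard Vulkan ones; everything
// else about device selection is illustrative.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool supportsRayTracing(VkPhysicalDevice device) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> extensions(count);
    vkEnumerateDeviceExtensionProperties(device, nullptr, &count, extensions.data());

    bool accelStruct = false, rtPipeline = false;
    for (const auto& ext : extensions) {
        if (std::strcmp(ext.extensionName, VK_KHR_ACCELERATION_STRUCTURE_EXTENSION_NAME) == 0)
            accelStruct = true;
        if (std::strcmp(ext.extensionName, VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
            rtPipeline = true;
    }
    return accelStruct && rtPipeline;
}
```

Such a check would typically run once per physical device during start-up, before deciding which sensors can be simulated with ray tracing on that GPU.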

Check back soon for another blog post on how we utilize the Vulkan Ray Trace extension for LIDAR and radar sensor simulation, but for now, let's talk about camera sensor simulation.

Rendering of camera sensors

As mentioned above, post-process effects can be turned off, but what happens if you turn them all off? You will be left with an HDR, physically-based render and the lens distortion. Our lens model is also mathematically correct. Users can also fetch the raw HDR color values without the lens distortion; this allows the integration of a more complex, higher-fidelity ISP (image signal processor) into the simulation. This feature opens the door for high-fidelity camera sensor simulation with unlimited possibilities.
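aiSim's exact lens model is not published in this post, so purely as a generic point of reference, here is what a simple radial (Brown-Conrady-style) distortion applied to normalized pinhole coordinates looks like; the coefficients are made-up illustrative values, not calibration results.

```cpp
// Generic radial lens distortion sketch, for illustration only;
// this is not aiSim's lens model and the coefficients are made up.
#include <cstdio>

struct RadialDistortion {
    double k1, k2, k3;  // radial distortion coefficients
};

// Maps ideal (undistorted) normalized pinhole coordinates to distorted ones.
void distort(const RadialDistortion& d, double x, double y, double& xd, double& yd) {
    const double r2 = x * x + y * y;
    const double scale = 1.0 + d.k1 * r2 + d.k2 * r2 * r2 + d.k3 * r2 * r2 * r2;
    xd = x * scale;
    yd = y * scale;
}

int main() {
    RadialDistortion d{-0.28, 0.07, 0.0};  // illustrative values only
    double xd = 0.0, yd = 0.0;
    distort(d, 0.5, 0.25, xd, yd);
    std::printf("(0.500, 0.250) -> (%.3f, %.3f)\n", xd, yd);
    return 0;
}
```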

aiSim also contains state-of-the-art physically based rendering in which we follow the "canonical" lighting equation and avoid artistic math modifications – unlike most game engines. That is also why we integrated specific component-level tests, such as image quality tests (color accuracy, dynamic range), into the camera sensor simulation. In aiSim, not only the materials are physically based; our atmosphere/weather and light representations are photometric as well. Both natural light sources (for example, the Sun or the sky), which we calculate from a physical model, and artificial light sources, which support IES photometric profiles, are available. This means that all LEDs and bulbs on vehicles, including turn indicators, brake lights, and front headlights, are modelled in a physically correct way. Going further, the same scene with a different atmosphere, time of day, or weather gives physically accurate illumination.
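For reference, the "canonical" lighting equation referred to above is the rendering equation: the outgoing radiance L_o at a surface point x in direction ω_o equals the emitted radiance L_e plus the incoming radiance L_i integrated over the hemisphere Ω, weighted by the material's BRDF f_r and the cosine term with the surface normal n:

```latex
L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o)
  + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\, L_i(\mathbf{x}, \omega_i)\,
    (\mathbf{n} \cdot \omega_i)\, \mathrm{d}\omega_i
```

A physically-based renderer evaluates this integral (for example via Monte Carlo sampling or real-time approximations) rather than replacing its terms with artistic tweaks.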

Building on our in-house rendering engine, we came up with a unique tool that brings unprecedented determinism and high-fidelity sensor simulation to the table. With the release of aiSim 3.0, we believe we have created a simulator that opens a new era in the validation of automated driving solutions.