
CES 2019: Showcasing the next generation of Self-Driving Technology

The biggest show of the year for the technology and autonomous driving industries is underway in Las Vegas, and the AImotive team is on hand at our booth in the Las Vegas Convention Center to answer questions about our products and showcase our latest innovations.

January 09, 2019
Author: Daniel Michael Seager-Smith

Over the last few months of 2018, we made several major announcements regarding aiWare, aiSim2, and aiDrive2. CES is the first event where you can see our next-generation ecosystem of self-driving platforms in action on the show floor. Visitors to booth #7538 will see aiDrive2 running on automotive-grade hardware, the scalability of aiSim2, and the power efficiency of aiWare through four live demos.

At AImotive we are working to create modular, scalable, and transparent technologies that enable our partners to achieve their self-driving goals. All of our products and development work are organically interconnected. aiDrive drove the development of aiSim2 by allowing our team to recognize the challenges of simulating autonomous systems; aiSim, in turn, continuously aids the development of aiDrive as a virtual test environment. The development of aiDrive also led to the creation of aiWare, our dedicated NN acceleration hardware IP core. As further challenges have arisen, our team has worked to overcome them, resulting in the next generation of self-driving technologies on display.

aiDrive2 is AImotive’s self-driving software stack. Its highly modular design allows our partners to select the elements of the system they need and easily integrate them into their existing solutions; in a sense, aiDrive2 can be considered an SDK that supports the creation of automated driving. To showcase the maturity of the solution, we have aiDrive running on a Drive PX2 platform in our booth. While its functionality is limited compared to the complete software stack, this implementation illustrates how our code can be optimized to run on automotive-grade development platforms and serve as the basis of, or contribute elements to, a production automated driving solution. The demo shows aiDrive networks performing recognition tasks in a highway scenario, processing four camera image streams: two fisheye cameras on the sides of the vehicle, one forward-facing camera, and one rear-facing camera. Beyond cameras, aiDrive is compatible with a wide range of sensors, including radars, LiDARs, IMUs, sonars, and GPS.

It is this wide range of compatible sensors that has driven the creation of dedicated sensor simulation modules in aiSim2. Achieving the highest possible realism in sensor simulation is one of the many reasons we developed our own rendering and physics engine. To simulate a single frame of a fisheye camera, six images have to be rendered and then distorted. This requires a huge amount of resources, especially when up to 24 camera feeds and other sensors also have to be simulated. The optimized utilization of multi-GPU and multi-CPU setups is a core feature of aiSim2 and can be seen in action at our booth: the same scenes are rendered on a four-GPU setup in real time and on a single GPU in fixed-time-step mode. The resolution and rendering quality are the same in both cases. This illustrates the deterministic rendering of the system, a characteristic vital for the proper testing of automated driving technologies. Like aiDrive2, aiSim2 has a modular design, meaning partners with existing simulation solutions can rely on selected modules of aiSim2 to enhance or augment the technologies they already have in place.
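The six-images-then-distort approach mentioned above is the classic cubemap-to-fisheye mapping: each pixel of the fisheye output corresponds to a viewing ray, and that ray selects one of six rendered cube faces to sample from. The sketch below illustrates the general idea only; it is not aiSim2's implementation, and the equidistant lens model and all function names are assumptions for illustration.

```python
import numpy as np

def fisheye_ray(u, v, width, height, fov_deg=180.0):
    """Map a fisheye output pixel (u, v) to a 3D viewing ray, assuming an
    equidistant projection model (radius proportional to angle theta)."""
    # Normalized image coordinates in [-1, 1], centered on the optical axis
    x = (2.0 * u / width) - 1.0
    y = (2.0 * v / height) - 1.0
    r = np.hypot(x, y)
    if r > 1.0:
        return None  # pixel lies outside the fisheye image circle
    theta = r * np.radians(fov_deg) / 2.0  # angle from the optical axis
    phi = np.arctan2(y, x)                  # azimuth around the axis
    # Ray in camera space, optical axis along +z
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def cubemap_face(ray):
    """Select which of the six rendered cube faces the ray falls on:
    the face is determined by the dominant axis of the ray."""
    axis = int(np.argmax(np.abs(ray)))       # 0 = x, 1 = y, 2 = z
    sign = '+' if ray[axis] >= 0 else '-'
    return sign + 'xyz'[axis]
```

A full distortion pass would run this lookup (or a precomputed version of it) for every output pixel and bilinearly sample the chosen face, which is why a single fisheye frame costs six renders plus a resampling step.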

The power consumption of the hardware required to accelerate AI-based automated driving solutions remains a major barrier to the deployment of these technologies. Following the announcement of aiWare3 in the fall, we can now showcase the optimized power consumption of our hardware IP when implemented in silicon. Our proof-of-concept aiWare ASIC consumes only 5 W while accelerating two proprietary AImotive neural networks, for free-space segmentation and obstacle bounding boxes, on a 720p video stream at 10 FPS. This implementation of aiWare could serve as the basis of smart sensor solutions such as blind-spot monitoring systems or intelligent reversing cameras, while more powerful implementations of aiWare could in the future power the central compute clusters driving autonomous systems.

The interconnected development of these three branches of technology means AImotive is uniquely positioned to understand the demands of self-driving, of simulation for autonomous systems, and of the hardware platforms that accelerate them. Our ecosystem forms a modular platform that we hope will facilitate collaboration in the self-driving industry and lead to the creation of the safest automated driving systems possible. To learn more, visit the AImotive booth #7538 in the Tech East North Hall of the LVCC; our team is always on hand and happy to answer any questions you may have. And if you want to stay up to date with the latest in self-driving news, make sure you subscribe to our newsletter.


For the latest updates, sign up for our newsletter.