
Bravery or Foolishness?

The culprits of consumers acting as beta testers for safety-critical software


October 26, 2020
Author: Gábor Pongrácz


Last week Tesla sent out a beta software update called "Full Self-Driving" to a select group of customers. The release is expected to comprise improvements to the company's Autopilot, including advanced driver-assist features designed for local, non-highway streets. While this is a remarkable engineering feat and a courageous move, it is also a dangerous one.

On the one hand, it proves that Tesla is rightfully the most highly valued carmaker in the world because its software technology is ahead of what anyone else can put on the road. On the other, it places untrained consumers behind the wheel of what is essentially a test vehicle. At AImotive, we believe that using customers to validate development software versions should never be a solution.

Fully self-driving versus fully autonomous
The National Highway Traffic Safety Administration has stated: "No vehicle available for purchase today is capable of driving itself." Before we even begin, we must agree with this statement.

The world seems to have become accustomed to Tesla naming certain features rather freely. In hindsight, it is no surprise that these often turn out to be marketing ploys rather than the real deal. In July, a Munich court ruled that Tesla cannot mention "full potential for autonomous driving" or "Autopilot" in its ads in the country, as these could mislead consumers into believing that the cars can actually drive themselves without human input. Soon after, South Korea followed the German example.
 

Three months later, on October 21, the company released a beta software update to selected Tesla owners called Full Self-Driving (FSD). The problem is the same as before: the name is misleading, as the vehicles can't drive themselves. A disclaimer on Tesla's website confirms this, saying that the Autopilot system doesn't make the vehicles autonomous, and drivers must still supervise it.

In 2020 Euro NCAP launched a comprehensive Assisted Driving (AD) grading system intended to inform consumers about the best AD systems currently available and, more importantly, to highlight their technological limitations. In the classification system, points are deducted for misleading names. This may be one of the reasons the Tesla Model 3 only finished in sixth place.

We can safely say that Tesla's definition of fully self-driving is very different from a fully autonomous vehicle. What they mean is that their cars are "able to be autonomous but require supervision and intervention at times." There is a fundamental difference between driver assistance and autonomy. Systems that require oversight from a human driver are not fully self-driving. Then why are they called that?

Using consumers to validate
According to some reports, up to 1 million Tesla users could receive the FSD update by the end of this year. This potentially means that 1 million untrained consumers will validate the beta software update for Tesla.

Public road testing is a serious responsibility. For example, at AImotive, a special (and very thorough) test driver exam is required before someone can drive one of our cars to validate self-driving software in traffic. What Tesla is doing is beyond industry norms and guidelines and raises the risk that someone might misuse the feature, ignore the warnings, and use the FSD update in a way that is dangerous to themselves and others.

These beta software updates are far from foolproof. There have been several incidents where drivers have crashed and died with Autopilot engaged.

Would you trust a system that—according to Tesla—"may do the wrong thing at the worst time"?
 

There are, however, much better ways to test than involving users. A secure, scalable, deterministic, and reliable way to validate software updates is to use simulators. aiSim, the world's first automotive-grade simulator certified to the ISO 26262 ASIL D standard, can run hundreds of thousands of scenarios in a matter of hours. Developers can run it on laptops, on servers, or in the cloud with a provider such as Microsoft Azure. The main advantage is that software bugs can be fixed without putting others at risk. Patched software can then be re-tested in traffic in a safe, controlled way with the help of trained test drivers.
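To make the idea of scenario-based validation concrete, here is a minimal sketch in Python. aiSim's actual interfaces are proprietary and not shown here; the scenario names, parameters, and the simple stopping-distance pass criterion below are all illustrative assumptions, not aiSim's API. The point is only the workflow: define a battery of scenarios, run each one deterministically, and collect the failures for developers to fix before any car meets real traffic.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    """One simulated test case (hypothetical parameters)."""
    name: str
    initial_speed_mps: float      # ego vehicle speed when the obstacle appears
    obstacle_distance_m: float    # distance to the obstacle at that moment

def run_scenario(s: Scenario) -> tuple[str, bool]:
    # Stand-in for a full simulator run: with an assumed constant
    # braking deceleration, does the vehicle stop before the obstacle?
    braking_decel = 6.0  # m/s^2, assumed emergency-braking capability
    stopping_distance = s.initial_speed_mps ** 2 / (2 * braking_decel)
    return s.name, stopping_distance < s.obstacle_distance_m

# A small battery of cut-in scenarios at increasing speeds (5..30 m/s).
scenarios = [
    Scenario(f"cut-in-{i}", float(speed), 40.0)
    for i, speed in enumerate(range(5, 31, 5))
]

# Deterministic batch run; a real simulator farm would parallelize this.
results = dict(run_scenario(s) for s in scenarios)
failures = [name for name, ok in results.items() if not ok]
print(failures)  # the high-speed cases exceed the 40 m budget
```

Because every scenario is fully specified by its parameters, a failing case can be replayed bit-for-bit after a software patch, which is exactly the repeatability that public-road beta testing cannot offer.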


Tesla is still leading the pack
After discussing the downsides of last week's software update from an industry player's point of view, we also have to say that Tesla being the world's most valuable automaker is no coincidence.

As we've mentioned several times, it is clear that software is the new king of the automotive industry. In this regard, Tesla is far ahead of any traditional OEM. The technical implementation of this new FSD update has to be praised because it is not only brave, but it truly is the most mature solution so far, even with all its faults.

With the scalable, modular, and holistic solutions AImotive offers, everyone has a real chance to catch up with Tesla—and even overtake them.

 
