The Fixation on Tesla’s Autopilot is a Problem for Germany

With the German Bundestag passing the law on autonomous driving a few days ago, attention is once again turning to the state of current technology development. Which manufacturers are the first candidates to benefit from this law starting in 2022? So far we have seen, on the one hand, the rather limited self-driving shuttle buses of providers such as Navya, EasyMile, or Local Motors, mostly tested at very low speeds with safety personnel on private property or at least in a very controlled environment. On the other hand, there are the test vehicles of the major German OEMs or the start-up Kopernikus Auto, which have been spotted only sporadically, and then mainly on highways.

The first name that comes to the public's mind is Tesla, which is well known and, above all, notorious for its Autopilot. Reports of accidents in which Autopilot was engaged and caused a crash attract immense attention. They range from fatal Autopilot accidents such as the first one involving Joshua Brown, whose Model S Autopilot misinterpreted a flatbed truck crossing the highway as an overhead sign, to the crash in which a Tesla Model 3 drove into an overturned truck in South Korea without braking. Other accidents, such as the one reported in China or the one in the U.S. in which two occupants burned to death in their Model S after crashing into a tree at high speed shortly after setting off, were attributable not to Autopilot but to driver error during manual driving.

Nevertheless, Autopilot keeps people busy, and for the layperson and non-Tesla owner it is admittedly difficult to understand how it works and how well. To make matters worse, Tesla has also put a pre-release (beta) version of its so-called Full Self-Driving (FSD) software on about 2,000 customer cars, where it is now being tested by customers. Add to that the fact that Tesla relies only on cameras for FSD (and, until recently, radar), while most other companies developing self-driving technology also use LiDAR and believe they cannot do without it, and the confusion is complete.

So many questions

So who is right? And that is only one of many questions.

Is Level 4 possible without LiDAR and radar?
Are Autopilot and FSD the same thing?
Are they Level 4 or Level 2 systems?
And didn't Tesla CEO Elon Musk promise us Tesla robotaxis last year?
Is Tesla's FSD nothing more than a big scam, because Tesla has been accepting payments for the feature for years but still hasn't delivered anything?

These are all perfectly legitimate questions that can lead to confusion. Coupled with superficial knowledge and the tendency to dismiss Tesla's efforts while overestimating one's own competence, this can lead to a dangerous situation for Germany as an industrial location.

Candles and light bulbs

First of all, we need to realize that Tesla's Autopilot and FSD are two different systems, like the candle and the light bulb. Autopilot is defined as a Level 2 system, which assumes a driver in the vehicle who can take control at any time. Autopilot is the candle: standing in the same room, surrounded by the same air, but built of completely different materials and operating on different principles. FSD is based on a different software architecture, developed to allow a Level 4 system. This means that no driver needs to be in the vehicle anymore; should the vehicle ever get stuck, it can bring itself to a stop in a controlled and safe manner. FSD is the light bulb or LED. The light bulb sits in the same space as the candle and seems to do a similar thing, namely emit light, but its construction and operation are completely different, as are its possible applications. A light bulb (or LED) can work in the most adverse conditions, in rain, wind, and cold, but also in a flashlight or as a backlight for controls.
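The key operational difference between the two levels, namely who provides the fallback when the system reaches its limits, can be sketched in a few lines of Python. The names here are purely illustrative, not any real automotive API:

```python
from enum import Enum

class SAELevel(Enum):
    """Two of the SAE automation levels discussed above (illustrative)."""
    LEVEL_2 = 2   # driver assistance: a human supervises at all times
    LEVEL_4 = 4   # high automation: no human fallback needed in its domain

def fallback(level: SAELevel) -> str:
    """Who handles a situation the automation cannot resolve?"""
    if level is SAELevel.LEVEL_2:
        # Level 2 (Autopilot today): control is handed back to the
        # driver, who must be ready to take over at any time.
        return "driver takes over"
    # Level 4 (the goal for FSD): the vehicle itself must come to a
    # controlled, safe stop without a driver.
    return "vehicle performs a safe stop"
```

The candle/light-bulb analogy maps onto the two branches: same outward behavior in the normal case, entirely different design once the fallback is needed.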

But didn't Tesla officially state a few weeks ago that FSD will only be a Level 2 system? How does that fit together? The reason is primarily a legal one, tied to liability. The California DMV checked with Tesla and wanted to make sure that the FSD beta on customer vehicles today is not being touted as a Level 4 system, mainly because official approval is still lacking to date. It is still a beta, meaning it is under development and being continuously upgraded and improved. That requires the presence of a driver in the vehicle who can take control at any time. However, this does not contradict Tesla's aspiration and goal of developing FSD into a Level 4 system.

So clearly Autopilot is not FSD. Autopilot is designed as a Level 2 system, FSD as a Level 4 system, even if it does not yet meet those requirements today. Anyone who claims that FSD is only Level 2 and will never become Level 4 is not doing themselves any favors and is fooling themselves.

With LiDAR or without LiDAR?

Can Tesla achieve all this without LiDAR and radar, with cameras alone? Here opinions are divided: the majority swears it is not possible, a minority says it is very possible, and some analyses point to possible hurdles. It seems certain that in the short term it could be difficult to achieve equivalent functional reliability and accuracy with cameras alone, but in the medium and long term, algorithms have historically won out. One only has to think of the ever-better compression algorithms that made movie streaming possible in the first place, even over poor connections.

Insiders become skeptical when considering the safety regulations and standards that manufacturers must comply with when homologating vehicles and approving new functions. Redundant systems, in which another system takes over if one fails, are mandatory for steering and brakes, and a similar requirement can be expected for self-driving technology. Because this technology is still under development, however, legislators cannot yet write specific regulations, since dominant and safe approaches have not yet emerged. In many cases the legislator also does not prescribe a specific technology but keeps the wording vague, since new developments can and should encourage other approaches.

There was a very concrete example of this with the rectangular steering yoke that Tesla introduced with the refreshed Model S and Model X. While the public's initial reaction was that the registration authorities would never approve such a steering wheel, a statement from the Dutch registration authority taught otherwise. In fact, the text of the law refers only to steering control; nowhere does it say anything about the shape, i.e. whether it has to be a steering wheel at all.

Vaporware?

Elon Musk announced Tesla robotaxis for last year, and customers still do not have FSD. Is this software therefore vaporware, i.e. something long promised that ultimately never comes to market? Anyone who has been watching Tesla for years will have noticed that ambitious goals are always stated and are often met only after a delay. But if you look at the ten-year master plans Elon Musk published in 2006 and 2016, Tesla has always ultimately made good on its promises and delivered, in the case of the Model 3 and Model Y even far ahead of schedule. If you can recognize one character trait in Tesla, it is that it does not give up or simply abandon a project. It gets delivered. Why should that be any different with the FSD beta, such a highly anticipated and important piece of functionality for Tesla?

But when is it really coming? Looking at the current state of development and the progress beta testers are posting, FSD could arrive, marked as a Level 2 system, on all customer vehicles whose owners have ordered it, including my Model 3, in the second half of this year. At that point we can expect more than 100,000 of today's more than 1.5 million Teslas with the appropriate hardware kit to be running it. For the next year or two, these Tesla owners will officially use it as a Level 2 system; the data sets generated will be uploaded to Tesla, fed into Tesla's machine-learning systems and processed, and each new version will be pushed to customer vehicles via over-the-air updates at intervals of a few weeks. Rapid improvement can be expected simply from the volume of vehicles and the traffic situations they experience.
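The improvement loop described above, collecting fleet data, retraining, and redeploying over the air, can be sketched schematically. All names here are hypothetical stand-ins for illustration, not Tesla's actual software:

```python
from dataclasses import dataclass

@dataclass
class FleetCar:
    """A hypothetical customer vehicle in the fleet (illustrative only)."""
    software_version: int = 0

    def collect_driving_data(self) -> dict:
        # Stand-in for the traffic situations the car records while driving.
        return {"situations_logged": 1}

    def install_ota_update(self, version: int) -> None:
        self.software_version = version

def fleet_learning_cycle(fleet: list, version: int) -> int:
    """One iteration of the loop: upload data, retrain, push an update."""
    data = [car.collect_driving_data() for car in fleet]   # 1. upload
    version += 1           # 2. retraining produces a new version (stubbed)
    for car in fleet:      # 3. over-the-air rollout every few weeks
        car.install_ota_update(version)
    return version
```

The point of the sketch is the scale argument: each cycle ingests data from every car in the fleet, so the pace of improvement grows with fleet size.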

This is a scenario well within reach for the U.S. and possibly some other countries (Canada, China…).

The simple reflex

The simple and quite understandable reflex of some is that Tesla's efforts and approaches are not to be taken seriously and that nothing can be learned from them. But that would be a serious mistake.

First of all, Tesla is only one of about five dozen companies developing self-driving technology. Currently, 55 companies in California alone hold a test license, and eight of them are even allowed to drive on public roads without a driver. Among these, Tesla is not currently one of the leading companies, if the data from California's Disengagement Report is used to compare and rank development progress. There you can see that several companies have already answered all of the criticisms leveled at Tesla above and are making rapid strides toward bringing this technology to market, probably still in the first half of this decade, and in any case much sooner than many in the DACH region think.

However, because Tesla has been taken as the yardstick, the message to the domestic industry, its management, the developers, and the public is that there is nothing to fear from this technology. That is not merely incorrect, it is almost negligently false. Such a message signals that there is no need to exert oneself in developing this technology. Far from it.

The way the law on autonomous driving mentioned at the beginning is reported and discussed in public already shows how little is known in this country, even among experts, about the state of technology development. For example, not only Federal Minister Scheuer but also other politicians and the media saw this law as proof that Germany had made itself number one in autonomous driving. Nothing is further from the truth. With this law we now have the best referee, but you do not become world champion with the best referee; you become world champion with the best team. And judging by the figures in the Disengagement Report, we are currently playing in the regional league, not the Champions League.

Nor is the near-total absence of autonomous test vehicles from German manufacturers on public roads in Germany a sign that (a) German manufacturers simply make less fuss and do less marketing than the Americans and Chinese, and (b) they are in fact already much further ahead. Such technology cannot be developed only in the laboratory and on closed test tracks, just as an airplane cannot be developed only in the wind tunnel; it must go out into the real world. And this should have been happening long ago, in large numbers, with passengers.

Healthy paranoia

Instead of talking disparagingly about Tesla and wasting our energy on why Tesla supposedly can never make its cars self-driving, we should be less complacent and more paranoid. Not in the morbid way, but in the healthy way.

However certain the skeptics are that Tesla will fail here, they (and everyone else) should still consider the one percentage point of probability that Tesla could pull it off after all. And then they should run through some thought exercises:

What if Tesla does manage to get FSD to Level 4 and approved?
What if, as we have already seen with COVID, laws and approval rules for this technology are completely redrafted and quickly established, and Tesla's approach proves to be approvable?
What if Tesla does get by without LiDAR and radar after all, and in one fell swoop several million Teslas can drive autonomously?
What does that mean for the domestic industry and its products?
What new business models would emerge for Tesla?
How would such a success affect Tesla's stock price?
What impact would something like this have on the opinion and behavior of our own people if they can experience autonomous driving from Tesla, but not from domestic companies?
What could domestic companies learn from Tesla, and which assumptions could probably be jettisoned then, or already today?

Such a thought exercise makes one thing clear above all: it lends urgency to one's own endeavors, because these questions challenge assumptions that are perceived as unchangeable. Those who engage with Tesla only in the pejorative manner mentioned at the outset are not doing their own country any favors. They underestimate the threat, help their countrymen lull themselves into a false sense of security, and thus endanger domestic industry and Germany as an industrial location.

Our own arrogance

The fixation on Tesla's Autopilot is not itself the problem; the problem is the pejorative way in which we fixate, denying Tesla's ability and capability. Instead, we should take Tesla's Autopilot, and especially FSD, as inspiration and incentive to create technology that is not only equal but superior to Tesla's under the same conditions. So far, our own arrogance has prevented us from doing so. And that is the real problem that will come down on our heads.

This article was also published in German.
