Tesla FSD 12: Commentary

A few days ago, Tesla CEO Elon Musk introduced the new FSD beta version 12 in a livestream, driving from Tesla’s Global Engineering headquarters in Palo Alto to a destination and filming the ride. I myself currently have version 11.4.4 on my Model 3, which I recently demonstrated to a Belgian delegation of transit agency and public transit representatives in San Francisco in half-hour drives through the city each way.

There are some things I can’t judge from the video, because I would have to be in the car to do so. These are mainly things like how “smoothly” the vehicle drives, or whether it occasionally hesitates, such as braking slightly while driving for no apparent reason, or braking when turning, or correcting the steering wheel.

Speed bumps seem to be recognized by the vehicle shown in the video, although it is not clear whether it recognizes them automatically or whether they are already entered in the system. I recall a situation a year ago when my Model 3 took such a speed bump in Los Altos at the posted speed (25 miles per hour, or just under 40 km/h), shaking the occupants.

In the video, Musk comments that the system detects these speed bumps itself, using video data pulled from Tesla vehicles. It is worth noting here that Tesla owners can allow Tesla to collect data from their vehicles and use it in its machine-learning systems to develop FSD.

And that seems to be the approach: Tesla's FSD 12 apparently learns from this video data, and thus from how real drivers behave. They slow down in front of speed bumps, increase their distance from cyclists, and roll through stop signs instead of coming to a full stop. Not all data from all drivers is used, though: Tesla excludes data from bad drivers from the training set, removes even mediocre driving data, and selects mainly data from good drivers to feed into the system.


I have not yet seen the FSD Beta in traffic circles, because they are quite rare in the USA. The Belgian delegation had asked about them because they are quite common in Belgium, France, and the UK. FSD 12 seems to handle traffic circles in Austin just fine.

Musk mentions that the FSD 12 he's showing runs on hardware version 3.0, the same version that was installed in my June 2019 vehicle, with the first AI chips designed by Tesla itself. FSD also doesn't need an internet connection while driving, because apart from the navigation data it can generate all other data in the vehicle itself. The eight cameras run at 36 frames per second, one and a half times the 24 frames per second of movies.

Apparently, there are already test drivers for FSD 12 around the world, in New Zealand, Thailand, Norway, Japan, and the USA. This also shows Tesla's approach, which is quite different from Waymo's or Cruise's. While the latter are "boiling pots," i.e. driving and perfecting their vehicles in geofenced parts of a region, Tesla is "boiling the ocean." In other words, Tesla tries to let Teslas drive autonomously everywhere Teslas already drive today: not in a geofenced area, and with cameras only, without lidar, radar, or ultrasound. This approach is certainly harder and takes longer, but over the time that I have had the FSD beta on my car, I could clearly see the progress. As a former software engineer, I find this achievement impressive.

However, with FSD 12 there also seems to have been a paradigm shift at Tesla. Instead of telling the system the traffic rules and programming in driving behaviors derived from driving data, Tesla shifted to analyzing only the driving data from good drivers and using that as training data for its deep learning system.
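The curation step described above can be sketched in a few lines of Python. This is purely illustrative: the class names, the driver score, and the 0.8 threshold are my assumptions, not Tesla's actual pipeline, but it shows the idea of filtering out bad and mediocre drivers before training an end-to-end imitation-learning model.

```python
# Hypothetical sketch of data curation for imitation learning:
# keep only clips from "good" drivers as training data.
# All names and thresholds are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class DrivingClip:
    driver_score: float          # 0.0 (bad) .. 1.0 (excellent)
    frames: list = field(default_factory=list)    # camera frames (placeholder)
    controls: list = field(default_factory=list)  # steering/brake/throttle (placeholder)

def curate(clips, min_score=0.8):
    """Drop clips from bad and mediocre drivers; keep only good ones."""
    return [c for c in clips if c.driver_score >= min_score]

clips = [
    DrivingClip(0.95),  # good driver  -> kept
    DrivingClip(0.60),  # mediocre     -> removed
    DrivingClip(0.20),  # bad          -> removed
]

training_set = curate(clips)
print(len(training_set))  # 1
```

The point of such a filter is that an imitation-learning system can only be as good as the behavior it copies, so discarding mediocre demonstrations raises the ceiling of the learned policy.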

This approach is similar to the evolution of Google DeepMind's Go programs, which started with AlphaGo and then evolved into AlphaGo Zero and AlphaZero. While AlphaGo was trained on 30 million positions from games played by humans, AlphaGo Zero was only given the rules of the game and then played nearly 5 million games against itself in 40 days. AlphaGo Zero then beat AlphaGo 100:0. AlphaZero, in turn, learned the game in only 24 hours, on new hardware and with improved algorithms, given nothing but the rules, and then beat AlphaGo Zero 60:40.

At 19:50, however, FSD 12 has a glitch: it interprets the traffic light for turning left as the one for going straight. Musk has to intervene here and prevent the vehicle from crossing the intersection on red.

What was new to me was that at the end of the drive, the vehicle doesn't just stop in the middle of the road and turn FSD off; instead, it drives toward a vacant parking spot on the side of the road and stops there (at 24:15 in the video).

One question that came up while driving (around 33:15 in the video) concerns poor visibility in rain or snow, or obstacles at intersections. According to the commentary, the vehicle then reduces its speed and feels its way forward at intersections. I have already noticed something similar with FSD 11.4.4 when it cannot see well enough to the left at intersections: it creeps slowly forward. In heavy rain, however, it hands control back to the driver and reports poor visibility. That, by the way, is why driving data from other countries is so important: in California the weather is very often sunny and therefore good, while in other countries it changes more often.

It is also very nice to see how the vehicle sometimes deliberately crosses solid lane lines in order to keep its distance from a cyclist while still making progress. Such a scene can be seen from 34:45. About a minute later, at 35:45, there is a nice handling of an unprotected left turn with cross traffic from both directions and oncoming traffic. Then, starting at 43:40, the vehicle enters the Tesla parking lot, for which there is no road map in the car. In other words, the car generates a road map automatically as it drives.

The drive itself takes place in Palo Alto and Stanford, which are rather small suburbs with well-marked roads and little pedestrian or other traffic. Unfortunately, the video doesn't really show the juxtaposition of what the screen displays versus what can be seen through the windshield, because Elon Musk is holding his phone in the driver's seat, switching between portrait and landscape mode over and over again.

Anyway, the ride itself is quite uneventful and almost boring, as you would expect from an autonomous car. In the video, the situations seem to be handled confidently, and there are some innovations over the current version 11.4.4 that I have on my Tesla. How well the FSD version 12 beta proves itself in practice, and how quickly shortcomings can be eliminated, will only become clear once I and others have it as well. However, one thing can already be said: FSD 12 surpasses everything that traditional manufacturers currently offer, or even expect to offer, before 2030.

Here is the video of the live stream:

This article was also published in German.
