
Tesla's Beta "Self-Driving" Software Sure Seems Heavy on the Beta

Regulators continue to be MIA as Elon Musk uses public roads as a laboratory.
Tesla FSD. Screenshot: YouTube
Moveable explores the future of transportation, infrastructure, energy, and cities.

Imagine you are grading a driving test. In the roughly half hour you're on the road, the driver fails to properly pick a lane, mistakes a parking lane for a travel lane and waits behind a parked car, tries to drive down railroad tracks, runs over a curb, enters a parking garage by mistake, runs over pylons separating the road from bicycle and pedestrian space, and cuts off another vehicle, nearly causing a crash. What do you think? Pass or fail?


The driver, in this case, was Tesla's "Full Self-Driving" (FSD) Beta version 8.2, shown in a video posted to YouTube by the user AI Addict. Everything above actually happened on a single uninterrupted drive around San Jose, except that the Tesla did not run over the curb or the pylons, because the driver intervened to prevent it.

They also did a test in Oakland, which, if anything, went even worse. At one point, the car drove on the wrong side of the road before FSD disengaged.

There is an entire genre of YouTube videos in which people "test" Tesla's FSD Beta software. Some are edited to make the software look better than it is; others, to make it look worse. I chose to highlight these videos specifically because they fall somewhere in the middle. The drivers are obviously impressed and think the technology works well, particularly on suburban roads, but they are also clearly perturbed by its behavior in more urban environments.

Now, for the obligatory disclaimer: Tesla's "Full Self-Driving Beta" is not self-driving at all. It is illegal for drivers to let the car drive itself, because drivers must be monitoring the vehicle and have their hands on the wheel at all times. Which is what makes the "Full Self-Driving Beta" all the more ridiculous, as the entire point of the "beta" is to demonstrate some bastardized, janky version of self-driving technology. 

The rollout of the "beta" makes even less sense considering we already know of fundamental problems with the regular "Autopilot" product that has been available in Teslas for years. We were served yet another reminder of this on Wednesday, when a Tesla crashed into a parked police vehicle that was investigating another crash. It is not yet known whether the car was running regular Autopilot, FSD, or the FSD Beta. This is not the first time a Tesla on Autopilot has plowed into a parked emergency vehicle, nor the first time the National Highway Traffic Safety Administration (NHTSA) has investigated such a crash. But if NHTSA decides to do anything about it, it would be the first time the agency took any sort of action.


And there are many things it could do. NHTSA exists for this very reason: to regulate unsafe vehicles on U.S. roads. Back in December, David Zipper outlined the various ways NHTSA could crack down on Tesla's semi-autonomous driver-assist offerings. Teslas are still required by law to have active driver monitoring regardless of the software they are running, but the company has dropped virtually all pretense that it still requires hands to be on steering wheels. At this point, it is practically the encyclopedia entry for "predictable abuse," a standard NHTSA itself implemented in 2016. As former NHTSA chief counsel Paul Hemmersbaugh told Zipper in Slate, "NHTSA has the authority to weigh in and pursue a remedy such as a recall on anything that causes an unreasonable risk."

Just last month, NHTSA made Tesla recall 135,000 vehicles over a safety issue that is marginal by comparison: faulty flash memory that could leave drivers unable to use the rearview backup camera. We already know Autopilot poses a bigger risk of injury and death on American roads than that. Which is why, after watching these FSD Beta 8.2 videos on YouTube, I remain baffled that this is being allowed on U.S. roads.


There are two common defenses of Tesla's approach, each unconvincing but revealing in its own way. One is the technocratic argument: Tesla's software mostly works great and is an impressive feat of engineering that will only get better with more real-world experience. I would give this argument more credence, except the saying is not "almost only counts in horseshoes and self-driving cars." Lives are ended, limbs paralyzed, families destroyed in the margins created by "mostly works." That is, quite simply, not good enough, and anyone arguing otherwise ought to think long and hard about how it would feel if their closest loved one were in the driver's seat at a time when the technology only "mostly" worked, let alone in another car, crossing the street, or biking when a Tesla malfunctioned.

The second argument is the more utilitarian one: human drivers are profoundly flawed, too, and we cannot possibly expect a computer to be flawless. Surely it is better to have a computer that, if not perfect, is at least better than a typical human driver? This is an argument Elon Musk himself has made. Again, I would give it more serious thought if we had any evidence not supplied by Tesla itself that the company's software is better than a human driver. We don't. In fact, as the videos above demonstrate, we have pretty incontrovertible evidence that, especially in cities, it is significantly worse, to the point where it could not pass a basic driving test.

But even if one of these arguments were more convincing, it would still ignore the most unforgivable aspect of Tesla's approach. In order to enroll in the beta program, Tesla owners must agree to a lengthy terms and conditions agreement. But unlike a beta version of some app on your phone, where the phone's owner is the only user, Teslas are driven on public roads and occupy public spaces. When one Tesla becomes a beta test, we are all beta testers.

I never signed any terms and conditions. I never consented to become a beta tester for Musk's not-self-driving cars that mostly work. And if anyone gave me the option, I would not consent, because it clearly doesn't work and driving is dangerous enough as it is.

Unfortunately, for all its flaws, FSD still mostly works, and that's more than anyone can say about the regulators at NHTSA who ought to have shut it down a long time ago.