I was asked to watch a bunch of videos of drivers using Tesla’s FSD Beta. This is what I thought.

Back in December, the Washington Post reporter Faiz Siddiqui asked me to respond to a number of videos showing Tesla’s Full Self Driving (FSD) beta technology in action. My comments (and a subsequent video) informed Faiz’s article, but ultimately didn’t find their way into print, so I thought I’d put them up here for posterity. [Update: the Washington Post article on the videos is now up, and available here].

I’m posting these with the proviso that they were very much on-the-fly responses!

Context

These are my on-the-fly notes (lightly edited) from watching several videos of people driving Teslas with recent iterations of the Full Self Driving (FSD) beta engaged.

I must admit that I am a fan of self-driving technologies, and hopeful that they will mature and substantially improve our lives in many ways — some obvious, others as-yet unseen. However, I am also personally and professionally concerned that irresponsible and unthinking development and testing in the transition between all-human and human-machine hybrid vehicles could have lasting adverse consequences, from injury and loss of life, to a loss of trust in a technology that, in the right hands, could be transformative.

I’ll confess up front that I’m critical of Tesla’s approach to developing and testing their Full Self Driving technology. The company is making a huge gamble that the technology will learn and improve faster than the liabilities and roadblocks associated with its beta deployment mount. It’s a gamble that may pay off — if there are few serious incidents involving drivers, passengers, and other road users, consumer opinion continues to support the company, and Tesla stays ahead of the regulators, I can see a point where the safety and utility of FSD far outstrip any concerns. But the gamble is highly risky, and depends on a number of factors outside of Tesla’s control.

One of those factors is the experience of early adopters and testers of FSD. In looking at the videos (below) of some early testers, I’m especially interested in four things:

  • Does FSD potentially inconvenience the driver;
  • Does it potentially inconvenience other road users;
  • Does it potentially place the driver in substantial danger; and
  • Does it potentially place others in substantial danger.

Inconvenience is important because, while it’s easy to criticize autonomous vehicle technologies, a degree of inconvenience is a natural part of developing any new and cutting-edge technology. It’s also why beta testers with a high tolerance for inconvenience are used to iron out early issues.

Substantial danger is trickier. Humans in cars are dangerous, although one of the amazing things about us is that, despite all of our cognitive and behavioral inadequacies, most of us are competent, if not necessarily good, drivers. However, there is a subset of drivers who are more dangerous on the road, and this includes student drivers.

Here, we make allowances on the understanding that drivers have to learn through experience, and will make mistakes along the way. As a car-driving society, we accept that the transition from non-driver to driver carries risk, both to drivers and to others.

This is closely analogous — in some ways at least — to what Tesla is doing with the FSD beta. As such, it’s fair to assess the technology in terms of a student driver that’s learning — albeit rather faster than most humans, I suspect. The question is then not so much how many mistakes the technology makes, but how severe those mistakes are in terms of placing the driver, other road users, and even pedestrians in jeopardy; and whether the learning process is worth the risk.

With that, here are my on-the-fly impressions from watching the videos:

Video Responses

[FSD Beta 10.6] San Jose Stress Test…Rail Roads, Pedestrians & Bad Drivers

https://www.youtube.com/watch?v=2ub2F-UnXIU

1:10 FSD seems to be struggling more than most people would on a left turn across rail tracks. Disconcerting, certainly, but nothing that an attentive driver couldn’t manage. This highlights the challenge of complex environments with unexpected and uncommon configurations. Testers refer to these as “edge cases,” but one of the challenges of driving safely is being able to adapt to situations and environments you haven’t encountered before.

3:09 There’s an odd glitch here where there seems to be a brief fight for control between the driver and the car. The road is narrow and there are parked vehicles on either side, but this is disconcerting. It’s something an experienced driver should be able to handle, but it could easily throw less experienced drivers for a loop. There appear to be moments where neither driver nor car is fully in control. This matters because it’s a glitch not in the car’s judgment, but in the human driver’s ability to ensure safety.

3:44 It’s notable that the second time down the same street, there is no problem. Either FSD is showing how sensitive it is to marginally different conditions, or it’s a fast learner. It’s worth noting that student drivers may also hesitate when navigating narrow roadways and parked vehicles, but in cases like this the behavior of FSD departs from simple inexperienced-driver hesitancy.

4:00 FSD shows hesitancy with a right turn. The behavior is frustrating, but more of an inconvenience than a danger, and it shows caution.

6:00 The driver takes over from FSD at a pedestrian crossing with pedestrians on it. It’s not clear how FSD would have navigated the pedestrians in this case, and it feels similar to a driving instructor taking over from a new student. More worrying than a student driver, though, FSD seemed to be behaving a little erratically. This is clearly concerning where pedestrians are involved. Again, this may be down to a confusing and complex environment, but there need to be extremely high safety standards around behavior that potentially places people in any proximity to danger, and it’s not clear that the current beta of FSD is fully trustworthy in this case (at this time).

6:30 FSD inches forward at a “no right turn on red.” The system is showing inappropriate behavior with respect to the rules of the road, and while it is not life-threatening, it suggests that the current beta is still capable of making poor driving decisions.

7:14 FSD stops at a red square on a building as if it’s a stop sign! This is good in that it shows an abundance of caution, but at the same time it’s frustrating behavior that would not be seen in a student driver, and it illustrates where FSD departs from the analogy with a human learning to drive.

8:01 The Tesla shows confusion over which route to take, and tries to go down the rail tracks. We’ve all been there (at least I have), but I would hope even beta tech would be able to navigate this. This instance demonstrates the gap that still exists, at times, between human interpretation of complex and messy road conditions and FSD’s.

8:17 A pedestrian starts to walk out in front of the car as it turns. It’s unclear whether or not the car reacted to their presence, but the driver is clearly shaken.

8:58 The car is still having problems with the railway tracks. This is illustrative of subtle environmental features that humans have learned to interpret and navigate, but that are extremely hard for a machine to distinguish. It’s impressive that FSD does as well as it does, but from a safety perspective, impressive isn’t enough.

9:44 The railroad tracks confuse FSD yet again.

My overall impression from this video: my anxiety levels were surprisingly high watching it. It felt like driving with a second- or third-time driver in the seat! There were a number of glitches that were easily corrected by the driver, but that would likely confuse a less experienced beta tester.

The two incidents involving pedestrians were concerning. Here, there’s no room for error in self-driving vehicles that are on the road. It’s understandable that there will be cases where FSD gets confused by complex road layouts and messy environments with pedestrians thrown into the mix, but this type of confusion raises the stakes considerably, as well as potentially putting people who have nothing to do with the car or the tech in danger.


FSD Beta 10.2 First Drive Trip 3 – 2021.32.25 – Tesla Vision AutoPilot – NO RADAR

https://www.youtube.com/watch?v=bchYnCqRqDI

1:29 FSD gets confused and disengages on the freeway during a left-hand turn. The driver manages this, but what is clear is that driving with the FSD beta adds another layer of things the driver needs to focus on to remain safe. This is OK for an engaged and experienced driver, but just how good are FSD beta testers at juggling more inputs and split-second decisions than the average driver faces?

The rest of the trip was fairly smooth and uneventful, but it also took place on relatively straightforward roads, with very few challenging conditions to navigate.


Tesla FSD Beta Update 10.5 ASSERTIVE Downtown Driving

https://www.youtube.com/watch?v=vqDOYq51AzE 

1:56 The Tesla handles what should be an easy right-hand turn poorly. It’s unclear why — it may be the subtleties of the road layout, but in any case the driver needed to intervene.


Tesla FSD Beta 10.4 Doesn’t Want To Stop For This Red Light

https://www.youtube.com/watch?v=WI8ihVmidf0 

1:57 The car seems confused, possibly thrown by traffic cones. This is not high risk, but it is disconcerting that unexpected and temporary road layouts can throw the car’s decision-making. Generally, there seem to be issues with the current iteration of FSD and cones — something that will most likely be corrected, but far from ideal in the meantime.

2:59 Again, the car is confused by roadworks. Not immediately dangerous, but not good driving.

4:14 The Tesla under FSD was flummoxed by an unprotected left turn. It felt like an early student driver who doesn’t have the experience or knowledge to navigate an intersection like this. In the end, the driver takes over. This wasn’t a safety issue — apart from how it might feed driver frustration — but it does show FSD struggling under conditions that aren’t easy, yet that experienced drivers learn to navigate successfully.

6:36 Here viewers can pick up on some of the uncertainty around how far the driver trusts the car. In general, these beta testers are responsible, diligent drivers who are taking great care to ensure the car drives safely while under FSD. Where they are uncomfortable with how FSD is performing, this should be taken seriously.

7:44 The car didn’t stop on red — this would have been serious without driver intervention.

8:24 The car didn’t pull into the left-turn lane, and as a result was blocked from turning left by another vehicle that pulled up next to it. This feels very much like the type of error a rookie driver would make. It isn’t serious, but it does indicate inadequacies in the technology that could lead to more serious errors.

9:10 FSD seems to ignore a blinking red crosswalk light. This is a potentially serious error — not so much with a diligent driver in place, but it could be problematic with an inattentive one. That said, the car did spot pedestrians using the crossing, and stopped to avoid them!

Concluding Thoughts

Overall, with experienced and diligent drivers behind the wheel, these videos show that FSD beta has a long way to go before it can be considered trustworthy, but also that there were only a few occasions where the technology placed the driver or others in danger.

In many ways, these clips indicate that FSD is like a student driver that requires very close supervision. In others, though, there are clear differences between how a human student driver would react to challenging conditions and how FSD reacts, suggesting that the student driver analogy should be treated with some caution. This is especially the case where pedestrians are involved, and here there may be a need for more checks and balances to ensure that FSD missteps never amount to more than frustration or inconvenience.

Of course, these clips don’t provide a full picture of FSD’s capabilities, even in beta, and the indications are that there have been thousands of miles driven under FSD without major incident. But when it comes to vehicle safety, it’s the times that things go wrong that matter, not the times that things go right. And clearly, there are still plenty of times when FSD beta doesn’t behave as well as many would like to see.

The question is, will the technology learn fast enough to avoid major incidents before it can qualify as anything other than a sometimes-unpredictable student driver?
