Tesla 'Robotaxi' adds 5 more crashes in Austin in a month – 4x worse than humans

electrek.co

193 points by Bender 2 hours ago


Veserv - an hour ago

It is important to note that this is with safety drivers. A professional driver plus their most advanced "Robotaxi" FSD version, under test and careful scrutiny, is 4x worse than the average non-professional driver alone, averaging 57,000 miles per minor collision.

Yet it is quite odd that Tesla also reports that untrained customers using old versions of FSD on outdated hardware average 1,500,000 miles per minor collision [1], roughly a 26x difference (see the arithmetic sketch below), when there are no penalties for incorrect reporting.

[1] https://www.tesla.com/fsd/safety
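
A quick sanity check on the gap between those two figures, as a minimal sketch in Python; both mileage numbers are simply the ones quoted above (the article's 57,000 and Tesla's self-reported 1,500,000) and are not independently verified:

```python
# Back-of-the-envelope comparison of the two miles-per-collision figures
# quoted above; the numbers come from the article and Tesla's own page [1]
# and are not independently verified.

robotaxi_miles_per_collision = 57_000          # supervised Robotaxi fleet, per the article
customer_fsd_miles_per_collision = 1_500_000   # Tesla's self-reported customer FSD figure

ratio = customer_fsd_miles_per_collision / robotaxi_miles_per_collision
print(f"Self-reported customer FSD claims ~{ratio:.0f}x more miles per minor collision,")
print(f"i.e. about {(ratio - 1) * 100:,.0f}% more than the supervised Robotaxi fleet.")
# -> roughly 26x, or a ~2,500% gap
```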

WarmWash - an hour ago

The problem Tesla faces, and that their investors are unaware of, is that just because you have a Model Y that has driven you around for thousands of miles without incident does not mean Tesla has solved autonomous driving.

Tesla needs their FSD system to drive hundreds of thousands of miles without incident. Not the 5,000 miles that Michael FSD-is-awesome-I-use-it-daily Smith posts about incessantly on X.

There is a mismatch: the overrepresented people who champion FSD say it's great and has no issues, while in reality none of them is remotely close to putting in enough miles to cross the "it's safe to deploy" threshold.

A fleet of robotaxis will do more FSD miles in an afternoon than your average Tesla fanatic will do in a decade. I can promise you that Elon was sweating hard during each of the few unsupervised rides they have offered.

lateforwork - an hour ago

Tesla's Robotaxis are giving a bad name to the entire field of autonomous driving. The average consumer isn't going to make a distinction between Tesla and Waymo. When they hear about these Robotaxi crashes, they will assume all robotic driving is crash-prone, dangerous, and irresponsible.

Traster - 2 hours ago

As I said on earlier reports about this, it's difficult to draw statistical comparisons with humans because there's so little data (rough sketch below). Having said that, it is clear that this system just isn't ready, and it's kind of wild that a couple of those crashes would have been easily preventable with the parking sensors that come standard on almost every other car.
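
To put a number on "so little data", here is a rough sketch of how wide the uncertainty on a crash rate is when it's estimated from only five events. The mileage is an assumption back-derived from the ~57,000 miles per minor collision figure quoted elsewhere in this thread; the fleet's actual mileage isn't public.

```python
# Rough illustration of why five events make rate comparisons with humans shaky.
# ASSUMPTION: fleet mileage is back-derived from the thread's "~57,000 miles
# per minor collision" figure; the real number isn't public.
from scipy.stats import chi2

crashes = 5
miles = crashes * 57_000  # ~285,000 miles, assumed for illustration only

# Exact 95% Poisson confidence interval on the crash count
lower = chi2.ppf(0.025, 2 * crashes) / 2
upper = chi2.ppf(0.975, 2 * (crashes + 1)) / 2

print(f"95% CI on the crash count: {lower:.1f} to {upper:.1f}")
print(f"Implied miles per crash: anywhere from {miles / upper:,.0f} to {miles / lower:,.0f}")
# With only 5 events, the implied rate spans roughly 24,000 to 176,000 miles
# per crash, which is why single-month comparisons against human baselines
# are so noisy.
```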

In some spaces we still have the rule of law - when xAI started doing the deepfake nude thing, we kind of knew no one in the US would do anything, but jurisdictions like the EU would. And they are now. It's happening slowly, but it is happening. Here, though, I just don't know if there's any institution in the US that is going to look at this for what it is - an unsafe system not ready for the road - and take action.

maxdo - 4 minutes ago

Electrek, as always.

> The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.

So in reality there's one crash with a fixed object; the rest are... questionable, and not crashes as portrayed. Such incidents wouldn't even make it into human crash statistics, since they'd be filed as non-driving incidents, parking lot mishaps, etc.

vessenes - an hour ago

Interesting crash list. A bunch of low-speed crashes, one where a bus hit the Tesla while the Tesla was stationary, and one at 17 mph into a static object (ouch).

For those complaining about Tesla's redactions - fair and good. That said, Tesla formed its media strategy at a time when gas car companies and short sellers bought ENTIRE MEDIA ORGs just to trash them and back their shorts. Their hopefulness about a good showing on the media side died with Clarkson and co faking dead batteries in a Roadster test -- so, yes, they're paranoid, but also, they spent years with everyone out to get them.

jackp96 - 2 hours ago

I'm not an Elon fan at all, and I'm highly skeptical of Tesla's robotaxi efforts in general, but the context here is that only one of these seems like a true crash?

I'm curious how crashes are reported for humans, because it sounds like 3 of the 5 examples listed happened at like 1-4 mph, and the fourth probably wasn't Tesla's fault (it was stationary at the time). The most damning one was a collision with a fixed object at a whopping 17 mph.

Tesla sucks, but this feels like clickbait.

ProfessorZoom - 18 minutes ago

Is there any place online to read the incident reports? For Waymo in CA, for example, there's a government page where you can read them; I read 9 of them and none were Waymo's fault, so I'm wondering how many of these crashes are similar (i.e., stopped at a red light and someone rear-ends them).

fabian2k - an hour ago

It's impressive how bad they are at hiring safety drivers. This isn't even measuring how good the Robotaxi itself is; right now it's only measuring how good Tesla is at running this kind of test. This is not inspiring any confidence.

Though maybe the safety drivers are good enough for the major stuff, and the software is just bad at the low-speed, short-distance collisions where the drivers don't notice as easily that the car is doing something wrong before it happens.

nova22033 - 20 minutes ago

He's going to fix this by having Grok redefine "widespread".

https://www.cnbc.com/2026/01/22/musk-tesla-robotaxis-us-expa...

Tesla CEO Elon Musk said at the World Economic Forum in Davos that the company’s robotaxis will be “widespread” in the U.S. by the end of 2026.

jeffbee - 12 minutes ago

Their service is way worse than you think, in every way. The actual unsupervised Robotaxi service doesn't cover a geofenced area of Austin, like Waymo does. It traverses a fixed route along South Congress Avenue, like a damned bus.

smileson2 - an hour ago

I'll stick to the bus.

hermitcrab - an hour ago

"Tesla remains the only ADS operator to systematically hide crash details from the public through NHTSA’s confidentiality provisions."

Given the way Musk has lied and lied about Tesla's autonomous driving capabilities, that can't be much of a surprise to anyone.

chinathrow - 40 minutes ago

Well, how about time to take them off the roads then?

pengaru - an hour ago

It's a fusion of jazz and funk!

BirAdam - an hour ago

At this point, I am really sick of both Elon supporters and Elon haters, of coverage of Elon's companies whether good or bad (it's always incredibly biased in one direction or the other), and of the current trends of both hyper-optimism and hyper-doomerism.

I know that it is irrational to expect any kind of balance or any kind of objective analysis, but things are so polarized that I often feel the world is going insane.

anonym29 - 15 minutes ago

This data seems very incomplete and potentially misleading.

>The new crashes include [...] a crash with a bus while the Tesla was stationary

Doesn't this imply that the bus driver hit the stationary Tesla, which would make the human bus driver at fault and the party responsible for causing the accident? Why should a human driver hitting a Tesla be counted against Tesla's safety record?

It's possible the Tesla was stopped somewhere it shouldn't have been, like in the middle of an intersection (as all the Waymos were during the SF power outage), but Electrek isn't sharing details about each of these incidents.

>The new crashes include [...] a collision with a heavy truck at 4 mph

The chart shows only that the Tesla was driving straight at 4 mph when this happened, not whether the Tesla hit the truck or the truck hit the Tesla.

Again, it's entirely possible that the Tesla hit the truck, but why aren't these details being shared? This seems like important data to consider when evaluating the safety of autonomous systems - whether the autonomous system or human error was to blame for the accident.

I appreciate that Electrek at least gives a mention of this dynamic:

>Tesla fans and shareholders hold on to the thought that the company’s robotaxis are not responsible for some of these crashes, which is true, even though that’s much harder to determine with Tesla redacting the crash narrative on all crashes, but the problem is that even Tesla’s own benchmark shows humans have fewer crashes.

Aren't these crash details / "crash narrative" a matter of public record and investigation, by, e.g., NHTSA or local law enforcement? If not, shouldn't they be? Why should we, as a society, rely on the automaker as the sole source of information about what caused accidents involving experimental new driverless vehicles? That seems like a poor public policy choice.

outside1234 - an hour ago

Just imagine how bad it is going to be when they take the human driver out of the car.

No idea how these things are being allowed on the road. Oh wait, yes I do. $$$$

xyst - an hour ago

Just take these fucking things off the road. "Robotaxi" needs to die in the same fashion as its predecessor, Cruise.

small_model - 2 hours ago

[flagged]

b8 - 2 hours ago

[flagged]

LightBug1 - an hour ago

Move fast and hospitalise people.

arein3 - an hour ago

A minor fender-bender is not a crash

"4x worse than humans" is misleading; I bet it's better than humans, by a good margin.

ArchieScrivener - 42 minutes ago

Good, who cares. Autonomous driving is an absolute waste of time. We need autodrone transport for civilian traffic. The skies have been waiting.

In before 'but it's a regulatory nightmare...'