Waymo robotaxi hits a child near an elementary school in Santa Monica
techcrunch.com · 403 points by voxadam 16 hours ago
From the Waymo blog...
> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
Yup. And to add
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
You're omitting the context provided by the article. This wasn't just a random scenario. Not only was this by an elementary school, but during school drop-off hours, with both children and double-parked cars in the vicinity. If somebody doesn't know what double parking is - it's when a car parks alongside cars that are already parallel parked, sitting in the roadway itself, making it difficult to see what's beyond them.
So you are around young children with visibility significantly impaired because of double parking. I'd love to see video of the incident, because driving 17 mph (27 km/h for metric types) in this context is reckless and not something a human would typically do, because a kid popping out from behind one of those cars is not only unsurprising but completely expected.
Another reason you also slow way down in this scenario is one of those cars suddenly swinging open their door which, again, would not be particularly surprising in this sort of context.
That's my thinking as well. Taken as some abstract scenario, all those steps seem very reasonable, and in that abstract scenario we can even say it did better than an average human would. But that misses the overall context that this was an elementary school during drop-off hours. That's when you crawl at 3 mph expecting kids to jump out from behind any car, not go 17 mph.
Driving is based so much on feel that my numbers may be off, but in the scenario you're describing 5 mph seems reasonable; 10 mph already seems like too much.
The wannabe engineer (really armchair engineer) in me says there's far too little safety engineering of this situation.
That school should not be on a busy roadway at all, and the child drop-off area should not be anywhere near one. Ideally there would be a slow loop where parents drop off their children and then proceed forward, away from the school, in a single flow.
If you drive in Sweden you will occasionally come across a form of speed-reduction strategy that may seem counterintuitive. They all aim to make driving harder and feel more dangerous, in order to force attention and lower speed.
One is to merge the two directions of travel into a single lane, forcing drivers to cooperate and take turns passing through, one car at a time.
For a combined car and pedestrian road (max speed 7 km/h) near where I live, they intentionally added large view-blocking objects on the road that limit visibility and make it harder to navigate. This forces drivers to drive very slowly, even when alone on the road, as they can't see whether a car or person may be behind the next object.
On another road they added several tight S-curves in a row, where if you drive anything faster than 20 km/h you will fail the turns and ride up onto the artificially constructed curbs.
On other roads they put a sign in the middle of the two-way road while drastically limiting the width to the curb, forcing drivers to slow down in order to center the car in the lane and squeeze through.
The common thread in each of these is that a human driver's fear of crashing makes them pay extra attention and slow down.
In Bulgaria we have a similar speed reduction strategy but we are a bit ahead of Sweden: we use medium-radius but very deep potholes. If you lose attention for even a split second, you are forced to a full stop to change a tire. Near schools it gets more "advanced": they put parked cars on both sides of the road, with the holes positioned so you can't bypass them. For example, two tire-sized holes on both sides of the road right next to the parked cars. You have to come to a complete stop, then slowly descend into the hole with the front wheels, climb back out, and repeat the process for the rear wheels. Occasionally, even though we (technically) have sidewalks, they are covered in mud or grass or bushes, so pedestrians are forced to walk in the middle of the road. This further reduces driving speed to walking pace and increases safety in our cities. Road markings are missing almost everywhere, and they put up contradicting road signs, so drivers are not only forced to cooperate but also to read each other's minds.
Same in India! We go one better, we let people drive in the opposite lane as well!
It's a runaway process of prioritizing safety over convenience -- and it's wrecking their road base just before self-driving cars would allow them to have both.
I was wondering how much convenience is worth one kid's life. This thread reminded me of some interesting terms like "value of statistical life." It appears that all those annoying low speed limits and purposeful obstructions in residential areas really do save lives.
> An evaluation of 20 mph zones in the UK demonstrated that the zones were effective both in reducing traffic speed and in reducing RTIs. In particular child pedestrian injuries were reduced by 70 per cent from 1.24 per year in each area before to 0.37 per year after the zones were introduced
https://www.rospa.com/siteassets/images/road-safety/road-saf...
The "Vision Zero" program was started in Sweden, and is becoming more widely adopted.
20 mph residential is pretty close to standard. Note the Waymo car was going slower than that. That's far from the 5 mph GP was reacting to, or the super tight curves.
If they're actually self-driving they should be able to drive around the obstacles as well as or better than a human.
What an American framing. My convenience at the cost of your eventual safety. I guess this is why we also have toddler death machines with 5-foot grills that we call “full size” vehicles.
If you've ever driven more than 5 miles an hour, you risked hurting someone for your convenience.
Acknowledging life has risk tradeoffs doesn't make you an American, but denying it can make you a self-righteous jerk.
Gosh, no, the self-driving cars will be forced to drive at safe speeds in pedestrian corridors as opposed to voluntarily driving at safe speeds in pedestrian corridors. How awful.
> prioritizing safety over convenience
this sounds like exactly the right tradeoff, especially since these decisions actually increase convenience for those not in cars
Of course it sounds right, because you cut off the word "runaway".
It is possible to go too far in either direction.
I recently visited a friend that lives in Sweden (couple hours south of Stockholm). Something he said while I visited stuck with me:
"Sweden hates cars."
There must be a happy medium somewhere in between.
It's true, Sweden isn't quite bike and pedestrian friendly enough yet, but they'll get that balance someday!
It's fairly common at least in the Netherlands, Germany, and Switzerland too. In Switzerland they also place street parking spots on alternating sides on narrow streets, which also makes you more attentive and lower your speed.
I've heard that that is why roundabouts are safer than their alternatives: counterintuitively, they're safer because they're less safe, forcing the user to pay more attention as a result.
>they're safer because they're less safe
Roundabouts are safer. They're safer because they prevent everybody from speeding through the intersection. And, even in case of an accident, no head-on collisions happen in a roundabout.
They're safer specifically for vehicles, as they convert many conflicts that would be t-bones (worst for passengers) into getting rear-ended (maximum crumple zone on both vehicles).
Roundabouts are worse for land use though, which impacts walkability, and the safety story for pedestrians and bike users with them is decidedly not great as well.
>Roundabouts are worse for land use though, which impacts walkability, and the safety story for pedestrians and bike users with them is decidedly not great as well.
They're much safer for pedestrians than intersections. You're only crossing and dealing with traffic coming from one direction, stopping at a median, and then crossing further over.
Unlike trying to navigate a crosswalk where you have to play guessing games as to which direction some vehicle is going to come at you from while ignoring the lights (people do the stupidest things, and roundabouts are a physical barrier that prevents a bunch of that)
In Waterloo Region I used to cycle through multiple intersections that were "upgraded" some years ago from conventional stoplights to roundabouts and imo it was a huge downgrade to my sense of safety. I went from having a clear right of way (hand signal, cross in the crosswalk) to feeling completely invisible to cars, essentially dashing across the road in the gaps in traffic as if I was jaywalking.
I could handle it as an adult just walking my bike but it would be a nightmare for someone pushing a stroller or dependent on a mobility device.
>from one direction, stopping at a median, and then crossing further over.
This assumes a median, which is not present at most smaller roundabouts in the US.
One-lane roundabouts are very safe. I lived in Hannover (Germany) in the 80s and 90s; they had 2 or 3 lanes in the roundabouts. There were large signs that counted the accidents (200+/year) to raise awareness, and during the trade fairs (anybody remember CeBIT?) the number of accidents peaked. Today they are all a lot safer because of a lot of traffic lights.
Same with driving in the winter. Anecdotally I always observe more accidents when the roads are clear.
Does it actually work though?
Many roads in London have parked cars on either side so only one car can get through at a time - instead of people cooperating you have people fighting: speeding as fast as they can to get through before someone else appears, or racing oncoming cars to a gap in the parked cars, etc. So when they should be doing 30 mph, they are more likely doing 40-45. Especially with EVs you have near-instant power to quickly accelerate and get to a gap first.
And putting obstacles in the road so you can't see if someone is there? That sounds really dangerous and exactly the sort of thing that caused the accident in the story here.
Madness.
> Does it actually work though?
Yes. They have made steady progress over the previous decades to the point where they can now have years with zero road fatalities.
> And putting obstacles in the road so you cant see if someone is there? That sounds really dangerous and exactly the sort of thing that caused the accident in the story here.
Counterintuitive perhaps, but it's what works. Humans adjust their behaviour to the level of perceived risk, the single most important thing is to make driving feel as dangerous as it is.
I think the humans in London at least do not adjust their behaviour for the perceived risk!
From experience they will adjust their behaviour to reduce their total travel time as much as possible (i.e. speed to "make up" for lost time waiting etc) and/or "win" against other drivers.
I guess it is a cultural thing. But I cannot agree that making it harder to see people in the road is going to make anything safer. Even a robot fucking taxi with lidar and instant reaction times hit a kid because they were obscured by something.
> I think the humans in London at least do not adjust their behaviour for the perceived risk!
The evidence is that they do though. E.g. the Exhibition Road remodelling (removing curbs/signs/etc.) has been a great success and effectively reduced vehicle speeds, e.g. https://www.rbkc.gov.uk/sites/default/files/media/documents/...
There are always going to be outlier events. If for every one person who still manages to get hit—at slow, easily-survivable speeds—you prevent five others from being killed, it’s a pretty obvious choice.
why not just put in speedbumps if all you're trying to do is slow people down? Are you sure this was the purpose of these designs? sounds a little too freakonomics to me.
Speed bumps suck for both the driver and passengers of the car and generate road noise.
> It's likely that a fully-attentive human driver would have done worse.
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big-picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to slow way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
As someone who lives on a residential street right by a primary school in the UK, the majority of drivers are going over 20mph even at the peak time when there are children everywhere.
While in theory human drivers should be situationally aware of the higher risks of children being around, the reality is that the majority will be in their own bubble of being late to drop their kid off and searching for the first free spot they can find.
A human driver would usually drive closer to the centerline of such a residential road. If the road is clear ahead I'd drive almost over the centerline, keeping enough clearance between my path and the parked cars for any such "jumper" to be visible long enough for me to react. If there is oncoming traffic I get back strictly into my lane and slow down much more if the parked cars are close and they block the sidewalk view, etc.
The autonomous cars have really gotten more aggressive recently, as I mentioned before:
https://news.ycombinator.com/item?id=46199294
Also Waymo handling road visibility issue:
I'd really like to see the video of the incident.
I have a similar school drop-off, and can confirm that the cars are typically going around 17-20mph around the school when they're moving. Also that yes, human drivers usually do stay much closer to the centerline.
However, Waymo was recently cleared to operate in my city, and I actually saw one in the drop-off line about a week ago. I pulled out right in front of it after dropping my kid off. And it was following the line of cars near the centerline of the road. Honestly its behavior was basically indistinguishable from a human other than being slightly more polite and letting me pull out after I put my blinker on.
> the human driver would usually drive more closely to the centerline of such a residential road
I certainly do this. But asserting that most humans would usually do this? Have you ever actually seen humans drive cars? This is absolutely not what they do. On top of that, they run stop signs, routinely miss pedestrians in blind spots, respond to texts on their phone, or scroll around on their display to find the next song they want to put on.
I vividly recall a shot within a commercial, in which a driver was shown in slow motion, chucking his coffee into the passenger foot well in order to have two hands on the wheel for an emergency. I don’t remember what was about to happen to the car or the world around it. I’m pretty sure that a collision occurred.
Your opinion of "most humans" is vastly overinflated. The median human driver would be going 5 over the speed limit, on their cell phone, and paying fuck all attention. Humans never drive as slow as 17 mph, even in the context of being directly in front of schools with visible children.
You're describing the median driver in America or India. This is not universal.
True, but it seems fair to evaluate Waymo against the median American driver. If they expand to whatever other countries you're thinking of, then it will be fair to recalibrate accordingly.
> But most humans would have been aware of the big picture scenario much earlier.
I wouldn't call it likely. Sure, there are definitely human drivers who are better than Waymo, but IME they're few and far between. Much more common to be distracted or careless.
When walking along a busy street facing traffic, I like to play a game of "who's using a phone?" I sometimes score in excess of 50% of drivers texting or otherwise manipulating a phone instead of actually driving.
It's amazing how much nonsense we let slide with human drivers, and then get uptight about with anything else. You see the same attitude with bicycles. Cars run stop signs and red lights all day long and nobody bats an eye, but a cyclist does it and suddenly they're a menace.
I don't think it makes sense to lump drivers into better-than-Waymo and worse-than-Waymo. A human brain automatically thinks through scenarios, where Waymo has pre-programmed ones (and some NN-based ones). So it's scenario by scenario.
Consider this scenario:
5 kids are walking on the sidewalk while you're driving past them. But suddenly a large dumpster blocks your view of them just as you pass. You saw them before the dumpster, but now your car's position and the dumpster completely block the view.
Does a human brain carry some worry that they'll suddenly decide to run and try to cross the street after the dumpster? Does Waymo carry that worry, or does it just continue driving at the exact same speed?
Again, it's not like every driver will think about this, but many drivers will (even the bad ones).
> A human brain automatically thinks of all the scenarios
I don't think this is true. There are infinitely many scenarios in a complex situation like a road with traffic, cars parked, pedestrians about, weather, etc. My brain might be able to quickly assess a handful, but certainly not all.
> like a road with traffic, cars parked, pedestrians about, weather
Not all of those need to be done "quickly". That's where LLMs fail
You note the weather when you leave. You understand the traffic five minutes ahead. You recognize pedestrians far ahead of time.
Computers can process a lot in fractions of a second. Humans can recognize context over many minutes.
The Waymo may have done better in the fraction of a second, but humans can avoid being in that situation to begin with.
Computers can take all of those things into account as well
Can, but don't.
It doesn't seem like self driving cars take into account the icy conditions of roads for one simple example.
There aren't infinitely many scenarios to consider, but even if that's a figure of speech, there aren't thousands or even hundreds.
If there's ten kids nearby, that's basically ten path scenarios, and that might be reduced if you have great visibility into some of them.
> My brain might be able to quickly assess a handful, but certainly not all.
What would you do if you can't assess all of them? Just keep driving same speed?
If the situation is too overwhelming you'll almost certainly back off, I know I would. If I'm approaching that school block and there's like 50 small kids running around in all directions, I have no idea what's going on and who is going where, so I'm going to just stop entirely until I can make some sense of it.
> There aren't infinitely many scenarios to consider, but even if that's a figure of speech, there aren't thousands or even hundreds.
There are a very, very large number of scenarios. Every single possible different state the robot can perceive, and every possible near future they can be projected to.
Ten kids is not 10 path scenarios. Every kid could do a vast number of different things, and each additional kid raises the number of joint states to another power.
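To put a rough number on that explosion (my own back-of-the-envelope figures, nothing from Waymo): with even a handful of plausible moves per kid, the joint state count grows as a power of the number of kids, not linearly.

    # Back-of-the-envelope only; "5 plausible moves per kid" is an assumption.
    def joint_scenarios(n_kids: int, moves_per_kid: int) -> int:
        return moves_per_kid ** n_kids

    print(joint_scenarios(1, 5))    # 5 scenarios for a single kid
    print(joint_scenarios(10, 5))   # 9765625 joint states for ten kids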
This is trivially true. The game that makes driving possible for humans and robots is that all these scenarios are not equally likely.
But even with that insight, it’s not easy. Consider a simple case of three cars about to arrive at an all-way stop. Tiny differences in their acceleration - potentially smaller differences than the robot can measure - will result in a different ordering of cars taking turns through the intersection.
It’s a really interesting problem.
It should be trivial for Waymo to implement a "drive carefully near schools" feature, and if feeling really spicy a "drive REALLY carefully near schools at these times" feature.
Safe driving starts with speed; lowering speed and informing the passengers seems like a no-brainer.
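A minimal sketch of what such a rule could look like (purely hypothetical on my part: the geofence radius, time windows, and speed caps are invented, and this says nothing about how Waymo's planner actually works):

    from datetime import datetime, time

    # Hypothetical school-zone caution rule; every constant below is an assumption.
    SCHOOL_RADIUS_M = 150                                     # assumed geofence radius
    DROPOFF_WINDOWS = [(time(7, 30), time(9, 0)),
                       (time(14, 0), time(16, 0))]            # assumed drop-off/pickup hours
    BASE_CAP_MPH = 15                                         # assumed cap near schools
    DROPOFF_CAP_MPH = 8                                       # assumed cap during those hours

    def school_zone_speed_cap(dist_to_school_m: float, now: datetime) -> float | None:
        """Return a speed cap in mph if the vehicle is near a school, else None."""
        if dist_to_school_m > SCHOOL_RADIUS_M:
            return None
        in_window = any(start <= now.time() <= end for start, end in DROPOFF_WINDOWS)
        return DROPOFF_CAP_MPH if in_window else BASE_CAP_MPH

    # Example: 100 m from a school at 8:15 -> cap of 8 mph.
    print(school_zone_speed_cap(100, datetime(2025, 1, 15, 8, 15)))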
It was a figure of speech, but I think you're undercounting. When you consider interactions between all the things, even with just a handful of variables (and I think there are many more than a handful) you get a huge number of scenarios.
This is the classical “Frame Problem” of AI. How do you consider, even if only to reject, infinite scenarios in finite time? Humans and other animals don’t seem to suffer from it.
God, I wish I'd re-read my statement. What I meant was that humans can think of an unlimited number of scenarios - not necessarily all of them. A computer will only think of pre-programmed ones.
The computer isn't pre-programmed though. These computers are trained similar to how human brains are (though obviously brains are still vastly, vastly, vastly superior to computers for tasks like this).
You are vastly overestimating most drivers. Most drivers aren't even looking out the window the majority of their time driving.
> A human brain automatically thinks of all the scenarios, ...
Patently, obviously false. A human brain will automatically think of SOME scenarios. For instance, if a collision seems imminent, and the driver is holding a cup of coffee, these ideas are likely to occur to the driver:
IF I GRAB THE STEERING WHEEL AND BRAKE HARD, I MIGHT NOT HIT THAT PEDESTRIAN IN FRONT OF ME.
IF I DON'T CONTINUE HOLDING THE COFFEE CAREFULLY, I MIGHT GET SCALDED.
THIS SONG ON MY RADIO IS REALLY ROCKING!
IF I YANK MY WHEEL TO THE LEFT, I MIGHT HIT A CAR INSTEAD OF A HUMAN.
IF I BRAKE HARD OR SWERVE AT ANY TIME IN TRAFFIC, I CAN CAUSE AN ACCIDENT.
Experiments with callosal patients (who have damaged the connective bridge between the halves of their brains) demonstrate that this is a realistic picture of how the brain makes decisions. It offers up a set of possible actions, and attempts to choose the optimal one and discard all others.
A computer program would do likewise, EXCEPT it won't care about the coffee cup nor the radio (remove two bad choices from consideration).
It still has one bad choice (do nothing), but the SNR is much improved.
I'm not being hyperbolic; self-preservation (focusing on keeping that coffee in my hand) is a vital factor in decision-making for a human.
> ...where Waymo has pre-programmed ones (and some NN based ones).
Yes. And as time goes on, more and better-refined scenarios will be added to its programming. Eventually, it's reasonable to believe the car software will constantly reassess how many humans are within HUMAN_RUN_DISTANCE + CAR_TRAVEL_DISTANCE in the next block, and begin tracking any that are within an unsafe margin. No human on Earth does that, continually, without fail.
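As a rough sketch of that kind of margin check (a toy version of my own, with made-up constants like a 3 m/s child sprint speed; not anything from Waymo's stack):

    # Toy "could this pedestrian reach my path before I pass?" check.
    # All constants are assumptions for illustration only.
    CHILD_SPRINT_MPS = 3.0      # assumed worst-case running speed
    HORIZON_S = 3.0             # assumed planning horizon

    def needs_tracking(ped_dist_to_path_m: float, car_speed_mps: float,
                       car_dist_to_ped_m: float) -> bool:
        """True if the pedestrian could enter the car's path before the car passes them."""
        time_until_car_passes = car_dist_to_ped_m / max(car_speed_mps, 0.1)
        horizon = min(time_until_car_passes, HORIZON_S)
        return ped_dist_to_path_m <= CHILD_SPRINT_MPS * horizon

    # A kid 5 m from the travel lane, car 20 m away at 7.6 m/s (~17 mph):
    print(needs_tracking(5.0, 7.6, 20.0))   # True -> keep tracking and pre-slow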
> Does a human brain carry some worry that they suddenly decide to run and try to cross the street after the dumpster? Does Waymo carry that worry or just continue to drive at the exact same speed.
You continue to imply that Waymo cannot ever improve on its current programming. Does it currently consider this situation? Probably not. Will it? Probably.
God, I wish I'd re-read my statement. What I meant was that humans can think of an unlimited number of scenarios - not necessarily all of them. A computer will only think of pre-programmed ones.
For what it's worth, that kind of lumping of drivers is more-or-less one of the metrics Waymo is using to self-evaluate. Perfect safety when multi-ton vehicles share space with sub-300-pound humans is impossible. But they ultimately seek to do better than humans in all contexts.
Have you been in a waymo? It knows when there are pedestrians around (it can often see over the top of parked cars) and it is very cautious when there are people near the road and it frequently slows down.
I have no idea what happened here but in my experience of taking waymos in SF, they are very cautious and I'd struggle to imagine them speeding through an area with lots of pedestrians milling around. The fact that it was going 17mph at the time makes me think it was already in "caution mode". Sounds like this was something of a "worst case" scenario and another meter or 2 and it would have stopped in time.
I think with humans, even if the driver is 100% paying attention and eyes were looking in exactly the right place where the child emerged at the right time, there is still reaction times - both in cognition but also physically moving the leg to press the pedal. I suspect that a waymo will out-react a human basically 100% of the time, and apply full braking force within a few 10s of milliseconds and well before a human has even begun to move their leg.
You can watch the screen and see what it can detect, and it is impressive. On a dark road at night in Santa Monica it was able to identify that there were two pedestrians at the end of the next block on the sidewalk obscured by a row of parked cars and covered by a canopy of overgrown vegetation. There is absolutely no way any human would have been able to spot them at this distance in these conditions. You really can "feel" it paying 100% attention at all times in all directions.
According to the article the car was traveling at 17 miles an hour before it began braking. Presumably this was in a 25 mph school zone, so it seems the Waymo was already doing exactly what you describe - slowing down preemptively.
This is close to a particular peeve I have. Occasionally I see signs on the street that say "Slow Down". I'm not talking about the electronic ones connected to radar detectors. Just metal and paint.
Here's my problem. If you follow the instructions on the sign, it still says to slow down. There's no threshold for slow enough. No matter how slow you're going, the sign says "Slow Down". So once you become ensnared in the visual cone of this sign, you'll be forced to sit stationary for all eternity.
But maybe there's a loop-hole. It doesn't say how fast you must decelerate. So if you come into the zone going fast enough, and decelerate slowly enough, you can make it past the sign with some remaining non-zero momentum.
You know, I've never been diagnosed on the spectrum, but I have some of the tendencies. lol.
Obviously a static sign is not aware of your current state, so its message can only be interpreted as relative to your likely state, i.e. the posted speed limit.
If you should slow down relative to the posted speed limit why not change the speed limit to reflect that directly?
Usually the reason is the "slow down" portion is very small, and it's confusing to shift down the actual speed limit for a 200 foot stretch of road then increase it again.
There are signs for that. Advisory speeds that don't change the actual limit. https://wisconsindot.gov/PublishingImages/doing-bus/local-go...
Much better to be specific than a vague "slow down". There's a road near me with two tight turns a couple blocks apart. One advises 25mph and the other advises 10mph.
FWIW, a short lowered-limit stretch seems less confusing to me than keeping the longer speed limit but posting "Slow Down".
Except we do that all the time in school zones... normally 35+, but from 7am-9am and again from 2pm-4pm the limit drops to 25 mph (which is still too fast if the kids are actually crossing the street or walking alongside en masse).
A lot of clickbait headlines have the same problem. "You're using too much washing powder!"
Everyone's replying to you as if you truly don't understand the sign's intention but I'm sure you do. It's just annoying to be doing everything right and the signs and headlines are still telling you you're wrong.
There was a driving safety ad campaign here: "Drive to the conditions. If they change, reduce your speed." You can imagine how slow we'd all be going if the weather kept changing.
We might have OCPD.
Yes. You have understood precisely the spirit in which I intended it.
In advertising: "Treat yourself. You deserve it!"
Me: What if someone who didn't deserve it heard this message. How can you possibly know what I deserve? Do all people deserve to treat themselves? Is the notion of deserving or treating really so vacuous?
Normies: jfc
There's a mental health awareness campaign going on around here at the moment with all the generic messages like that. "You're doing great" is completely devalued by the sign giving the same message to everyone, and the best one says something like "Don't push yourself too hard. If you want to rest, rest." Wondering if I can tell my boss the sign told me it's okay not to get any work done.
Humans are supposed to deal with this kind of ambiguity. Actually, that's one of our nicest abilities.
I hate when people pretend to be smarter than everyone else by pointing this kind of utterance and insisting that someone, somehow, will parse those statements in the most literal and stupid manner.
Then there are the ignorant misanthropes who can't waste a chance to repeat their reductionist speculations about human cognition. Just like the idiot Elon Musk, who wasted billions on an irrecoverably fucked self-driving system based on computer vision because he underestimated the human visual cortex.
Fucking annoying midwits.
Sorry I made you mad. I wasn't trying to seem smarter than everyone. Maybe dumber.
Think of it like they're saying "my children play on this street and my neighbors walk here. Please think about that when you decide how fast to go here."
You learn how to put those signs into context during your driving lessons, and fail your test if you don't apply that correctly.
My driving test was so thorough that I had to parallel park between two entirely fictional cars. There was certainly no consideration of eccentric signage.
I apologize if I gave the impression that I did not understand how to put them into context. Although I don't think my driving lessons ever mentioned it.
This is idle XKCD-style musing.
A 25mph school zone? That seems fast. 15mph would be more the norm, which is in line with the 17mph the car believed itself to be traveling.
FYI, unless you are a commercial truck, a cop, or a racer, your speedometer will read slightly fast, sometimes as much as 5 to 10%. This is normal practice for cars as it limits manufacturer liability. You can check this using an independent GPS, i.e. not an in-dash unit. (Just imagine the court cases if a speedo read slower than the actual speed and you can understand why this started.)
I mostly see 25 mph for school zones, though I'm in NC. Checking California, it sounds like 25 is standard there as well.[0] Some will drop to 15, but 25 is the norm as far as I can find.
[0] https://www.dmv.ca.gov/portal/handbook/california-driver-han...
I've lived all over California and I agree that 25mph is the norm here.
Edit: However, elsewhere in the thread someone linked this Street View image that shows that this particular school zone is 15mph: https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac
25 mph is typical non-school-zone residential around here, with school zones always slower.
Also, run a different wheel diameter than the speedometer was calibrated with and you will see a larger difference between actual velocity and the speedometer reading. The odometer will also not record the actual distance traveled.
It depends. I had a Honda motorcycle where the speedo was 10ish % fast (not unusual on bikes due to tire shape) but the odo was accurate. Same sensor, but the computer just counted wheel rotations slightly differently for each use.
Virtually all speedos read fast. The federal standards have a fairly high margin for being allowed to read high, and a zero margin for reading low. Thus speedos are more or less universally calibrated to read at least 5% high.
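If you want to check your own car against GPS, the relationship is just a ratio of rolling circumferences; the tire figures below are made up for the example:

    # Indicated speed scales with how far the car actually travels per wheel
    # revolution vs. what the speedometer was calibrated for.
    def actual_speed_mph(indicated_mph: float, actual_circ_m: float, calib_circ_m: float) -> float:
        return indicated_mph * (actual_circ_m / calib_circ_m)

    # Hypothetical: calibrated for a 1.97 m rolling circumference, but the fitted
    # (or worn) tires only roll 1.91 m per revolution.
    print(actual_speed_mph(70, 1.91, 1.97))   # ~67.9 mph actual at an indicated 70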
It does seem fast to me -- school zones are 20 mph in Seattle, at least when children are present. But Google suggests 25 is the norm in Santa Monica, where the incident occurred.
In Encinitas, California, that sign would have no more than 20 MPH. In adjacent Carlsbad, I believe 25 is normal.
In this situation, the car was already driving under the legal speed required for a school zone (25mph when children are present) [edit: some comments in the post suggest there is a 15mph sign, which is sometimes posted; to me, driving 17mph in a 15mph zone is acceptable].
I think any fair evaluation of this (once the data was available) would conclude that Waymo was taking reasonable precautions.
> was already driving under the legal speed
That's exactly part of the problem. If it is programmed to be over-cautious and go 17 in a 25 zone, that feels like it is safe. Is it?
It takes human judgment of the entire big picture to say meaningfully whether that is too slow or too fast. Taking the speed limit literally is too rigid, something a computer would do.
Need to take into account the flow of the kids (all walking in line vs. milling around going in all directions), their age (younger ones are a lot more likely to randomly run off in an unsafe direction), what are they doing (e.g. just walking, vs. maybe holding a ball that might bounce and make them run off after it), their clustering and so on.
Driving past a high school with groups of kids chatting on the sidewalk, sure 20mph is safe enough. Driving past an elementary school with a mass of kids with toys moving in different directions on the same sidewalk, 17mph is too fast.
And if I'm watching some smaller kids disappear behind a visual obstruction that makes me nervous they might pop up ahead of it on the street, I slow down to a crawl until I can clearly see that won't happen.
None of this context is encoded in the "25mph when children are present" sign, but for most humans it is quite normal context to consider.
But would be great to see video of the Waymo scene to see if any of these factors was present.
It was going 17 mph. That is rather slow.
To put it another way: if an autonomous vehicle has a reaction time of 0.3 seconds, its stopping distance from 17 mph is about the same as that of a fully alert human driver (1 second reaction time) driving 10.33 mph.
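That equivalence checks out with textbook stopping-distance math, if you assume roughly 0.8 g of braking (the deceleration figure is my assumption; the reaction times are from the comment above):

    # Stopping distance = reaction distance + braking distance.
    G = 9.81
    DECEL = 0.8 * G            # m/s^2, assumed hard braking on dry pavement
    MPH_TO_MPS = 0.44704

    def stopping_distance_m(speed_mph: float, reaction_s: float) -> float:
        v = speed_mph * MPH_TO_MPS
        return v * reaction_s + v * v / (2 * DECEL)

    print(stopping_distance_m(17.0, 0.3))    # ~6.0 m  (AV, 0.3 s reaction)
    print(stopping_distance_m(10.33, 1.0))   # ~6.0 m  (human, 1.0 s reaction)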
>It was going 17 mph. That is rather slow.
There's a case to be made that it wasn't slow enough.
I have a hard time believing that a human driver would be as slow as this Waymo, or even slower. I drive my kid to school where it's posted 20 mph and there are cameras (with plenty of warnings about the presence of said cameras), and I witness a constant string of flashes from the camera nailing people for speeding through there.
A small child jumped out in front of it, which is about the worst case scenario you can have... and the kid was fine. So it sounds like it was slow enough?
Considering the car hit the child at only 6 mph and the kid just got up and brushed themselves off, it was plenty slow enough.
Nobody was injured.
Maybe. That level of safetyism seems pretty unreasonable when humans are 100x worse and still allowed on the road.
Two things:
I've read studies saying that most drivers don't brake at max effort, even to avoid a collision. This may be at least one of the reasons that Waymo predicted that an attentive human would likely have been going faster than their car at the moment of impact. I've got a good idea of my fun-car's braking performance, because I drive it hard sometimes, but after reading that I started practicing a bit with my wife's car on the school run, and... Yeah: it's got a lot more braking power than I realized. (Don't worry, I brake hard on a long straight exit ramp, when no one's behind me, a fast slow-down is perfectly safe, and the kiddo loves it.) I've now got an intuitive feel for where the ABS will kick in, and exactly what kind of stopping distance I have to work with, which makes me feel like a safer driver.
Second, going off my experience of hundreds and hundreds of ride-share rides, and maybe thirty Waymo journeys, I'd call the best 10-15% of humans better drivers than Waymo. Like, they're looking further up the road to predict which lane to be in, based on, say, that bus two blocks away. They also drive faster than Waymos do, without a perceptual decrease in safety. (I realize "perceptual" is doing some work in that sentence!) That's the type of defensive and anticipatory urban driver I try to be, so I notice when it's done well. Waymo, though, is flat-out better, in every way, than the vast majority of the ride-share drivers I see. I'm at the point where I'll choose a Waymo any time it'll go where I'm headed. This story reinforces that choice for me.
> I've read studies saying that most drivers don't brake at max effort, even to avoid a collision.
Ha! It is unbelievable how difficult it is to make someone brake hard. You'd think it's the easiest thing possible in the age of ABS - just press hard as you can.
I have a lot of experience on this, I used to teach car control both to teens and adults. One of the frequent exercises was seemingly very simple: Drive at Xmph until this spot, then brake at maximum power.
The vast majority of people can't do it on the first or second try, they'll just meekly press on the brake like they're coasting to a stop. After more coaching that hard means hard, they start to get it, but it takes many many tries.
The reason attentive humans don't equal the Waymo here is reaction time. When a thing happens, the human takes a moment to process what it means and choose a reaction. It's not, by our standards, a long time, but it's way longer than it takes the Waymo.
Reacting early means you start slowing early, which means you also take longer to reach the child, and you're braking for all of that extra time, so you shed even more speed before contact.
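A toy calculation of impact speed vs. reaction time, with an assumed braking rate and an assumed distance at which the child appears (these numbers are mine and don't reproduce Waymo's model):

    import math

    DECEL = 6.9          # m/s^2, assumed hard braking (~0.7 g)
    V0 = 7.6             # m/s, ~17 mph initial speed

    def impact_speed_mps(gap_m: float, reaction_s: float) -> float:
        """Speed when the car reaches an obstacle that appears gap_m ahead."""
        braking_dist = gap_m - V0 * reaction_s     # distance left once braking starts
        if braking_dist <= 0:
            return V0                              # contact happens before braking begins
        v_sq = V0 * V0 - 2 * DECEL * braking_dist
        return math.sqrt(v_sq) if v_sq > 0 else 0.0   # 0.0 = stops short of contact

    # Child appears 6 m ahead (assumed):
    print(impact_speed_mps(6.0, 0.3) / 0.44704)   # ~5.7 mph with a 0.3 s reaction
    print(impact_speed_mps(6.0, 1.0) / 0.44704)   # 17 mph -- braking never starts in time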
Curiously enough, Google could have access to how fast humans usually drive through that street: if they record people's Google Maps trips, they can show the court that "Look, 80% of Google Maps users drive through here at 30 mph!"
Waymo itself has this as well. They record their drives after all which means they know the speed of the vehicles around them.
They even wrote a blog post about it:
https://waymo.com/blog/2023/07/past-the-limit-studying-how-o...
Google might even know how many drivers aren't obeying the speed limit or slow-rolling through stop signs. I wonder if they already have partnerships with law enforcement to detect areas where the traffic law is more ignored than others.
It would be nice to see the video (although maybe there are some privacy issues, it is at a school after all).
Anyway, from the article,
> According to the NHTSA, the accident occurred “within two blocks” of the elementary school “during normal school drop off hours.” The safety regulator said “there were other children, a crossing guard, and several double-parked vehicles in the vicinity.”
So I mean, it is hard to speculate. Probably Waymo was being reasonably prudent. But we should note that this description isn't incompatible with it being literally in an area where kids are getting out of their parents' cars (the presence of "several double-parked vehicles" brings this to mind). If that's the case, it might make sense to consider an even-safer mode for active student unloading areas. This seems like the sort of social context that humans might have and cars might be missing.
But this is speculation. It would be good to see a video.
> But most humans would have been aware of the big picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
Waymos do this and have for years. They know where the people are around them and will take precautionary action based on that.
Here's a video from 2019 of one understanding that a car in the bike lane means the cyclists may dart out into the lane it's in and taking action based on that. https://waymo.com/blog/2019/05/safety-at-waymo-self-driving-...
That video is nearly 7 years old at this point and they've gotten much, much better since then.
If you think a fully-attentive human driver would have done better, I think you're kidding yourself.
I know you didn't make this point, but if anyone thinks the average LA driver would have done better than this, I've got a bridge to sell you - and that's really what matters more. (I say that as someone who used to live like half a mile from where this happened.)
The car was driving 17mph before braking. I don’t think I’ve ever seen a human drive at 17mph in a school zone or other area children congregate.
Meaning you’ve never seen a human drive that slowly in such an area, or you've never seen a human exceed the speed limit in a school zone?
I live in an area where there are pedestrians stepping into the street without looking, all over the place, and you can drive / cycle without hitting them but have to slow down appropriately if you have to go near something that you can't see behind. Like you say it would be interesting to see the video.
> The car only slows down after seeing someone.
How do you know that? The article says it slowed from 17 mph. That’s cautious progress speed, not cruising speed.
There's a bus stop right behind my house. I routinely hear the driver honking and yelling at people who ignore when the stop sign is extended (which is a misdemeanor in my state). So forgive me for not assuming a human would have done better.
I drive like this too, but I think we’re a small minority. Especially here in LA.
It was already moving slowly. 17MPH is pretty conservative. Most human drivers going past my local school are doing at least 30.
In principle, attentive drivers, who have either somehow come independently to the appropriate understanding or have been trained in how to react to hazards ahead...
https://www.gov.uk/theory-test/hazard-perception-test
... could in some circumstances know that there's a likelihood that a child will emerge suddenly and reduce their speed in anticipation where circumstances allow.
Note that: If you cut speed but other drivers can't see why they may overtake, even unsafely, because you are a nuisance to them. Slowing in anticipation that a child will run out from behind the SUV, only for a car behind you to accelerate around you and smack straight into the child at even higher speed, is not the desired outcome even though you didn't hurt anybody...
And yes, we'd need to see the video to know. It's like that Sully scenario. In a prepared test skilled pilots were indeed able to divert and land, but Sully wasn't prepared for a test. You're trained to expect engine failure in an aeroplane - it will happen sometimes so you must assume that, but for a jet liner you don't anticipate losing both engines, that doesn't happen. There's "Obviously that child is going in the road" and "Where the fuck did they come from?" and a lot in between and we're unlikely to ever know for sure.
> But most humans would have been aware of the big picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
Waymos constantly track pedestrians nearby, you can see it on the status screen if you ride in one. So it would be both better able to find pedestrians and react as soon as one was on a collision course. They have a bit more visibility than humans do due to the sensor placement, so they also can see things that aren't that visible to a person inside the car, not to mention being constantly aware of all 360 degrees.
While I suppose that in theory, a sufficiently paranoid human might outdo the robot, it looks to me like it's already well above the median here.
Do they speculate about things like "we're near a school zone, kids are unloading, there might be a kid I've never seen behind that SUV"? (I'm legitimately asking; I've never been in a Waymo.)
It's not particularly meaningful to ponder the subjective experience of the waymo driving computer. Instead, focus on its externally visible behavior.
Asking whether an entity has modeled and evaluated a specific situation, using that evaluation to inform its decisions, is not about subjective experience.
If you're asking whether their training data includes situations like this, and whether their trained model/other pieces of runtime that drive the car include that feature as part of their model, the answer is yes. But not in the way a normal human driver would think about it; many of the details of its decision making process are based on large statistical collections, rather than "I'm in a school zone and need to anticipate children may be obscured and run out into traffic." There are many places where the car needs to take caution without knowing specifically it's within 50 feet of a school zone.
While the deep details are not public, Waymo has shared a fair amount of description of their system, from which you can glean some ideas about the world model it creates and the actions it takes in specific situations: https://waymo.com/blog/2024/10/ai-and-ml-at-waymo https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto... https://waymo.com/blog/2024/10/introducing-emma
I was being informal with “speculate,” sorry. They could identify that sort of situation as a high-risk area in some way.
Are they not using a ton of ML to take exactly this sort of context into account?
> The car can react faster than I can after seeing someone
and that can potentially allow the internal planning algorithm to choose riskier, more aggressive trajectories/behavior, etc., say to reach the target destination faster and thus deliver higher satisfaction to the passengers.
Anecdote, but I live next to an elementary school and also on a route frequented by Waymos. Human drivers routinely cruise down the 25mph roads at 40+ and blow stop signs, even during school intake and release. Waymo vehicles always seem a lot more cautious.
When thinking about these things you have to factor in the prior probability that a driver is fully attentive, not just assume they are.
If you’ve ever been in a Waymo you quickly realize their field of view is pretty good. You often see the vehicle sensing small pets and children that are occluded to a passenger or driver. For this reason and my experience with humans near aforementioned school, I doubt a human would out perform the Waymo in this particular incident and it’s debatable they even have more context to inform their decisions.
All that said, despite having many hours in a Waymo, it's not at all clear to me how they factor in sidewalk context. You get the sense that pedestrians' movement vectors are accounted for near intersections, but I can't say I've experienced something like a slow-down when throngs of people are about.
Precisely. Environmental context is not considered in Waymo's "peer-reviewed model" (I encourage reflexive commenters to first read it: https://waymo.com/safety/collision-avoidance-benchmarking), only basic driver behavior and traffic signal timings.
Note the weaselly "immediately detected the individual as soon as they began to emerge" in the puff piece from Waymo Comms. No indication that they intend to account for environmental context going forward.
If they already do this, why isn't it factored in the model?
How is "immediately detected the individual as soon as they began to emerge" worded weaselly?
Not OP but I interpret that as they are focusing exclusively on what happened after the car saw the kid.
And I completely agree that from that instant forward, the car did everything correctly.
But if I was the accident investigator for this, I would be far more interested in what happened in the 30 seconds before the car saw the kid.
Was the kid visible earlier and then disappear behind an obstruction? Or did the kid arrive from the side and was never earlier visible? These are the more important questions.
Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls with Waymos driving on the wrong side of the road recently, and I had a Waymo cut in front of my car at a one-way stop (t intersection) recently when it had been tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS brake to avoid an accident.
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
Certainly, I'm not against constructive criticism of Waymo. I just think it's important to consider the counterfactual. You're right too that an especially prudent human driver may have avoided the scenario altogether, and Waymo should strive to be that defensive.
> I'm not against constructive criticism of Waymo.
I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
> I just think it's important to consider the counterfactual
More than 50% of roadway fatalities involve drugs or alcohol. If you want to spend your efforts improving safety _anywhere_ it's right here. Self driving cars do not stand a chance of improving outcomes as much as sensible policy does. Europe leads the US here by a wide margin.
> I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
Yes, and I find it annoying that some people do seem to think Waymo should never be criticized. That said, we already have an astounding amount of data, and that data clearly shows that the experiment is successful in reducing crashes. Waymos are absolutely, without question already making streets safer than if humans were driving those cars.
> If you want to spend your efforts improving safety _anywhere_ it's right here.
We can and should do both. And as your comment seems to imply but does not explicitly state, we should also improve road design to be safer, which Europe absolutely kicks America's ass on.
>data clearly shows that the experiment is successful in reducing crashes.
That's fine. But crashes are relatively rare and what matters is accountability. Will Waymo be accountable for hitting this kid the way a human would? Or will they fight in court to somehow blame the pedestrian? Those are my big concerns when it comes to self driving vehicles, and history with tech suggests that they love playing hot potato instead of being held accountable.
And yes, better walkable infrastructure is a win for all. The minor concern I have is the notion that self driving is perfect and we end up creating even more car centric infrastructure. I'm not sure who to blame on that one.
Waymo is driving the car and should be held accountable like any other driver.
I assume that's how it works already.
I hope so too. I'll be keeping a close eye on how they handle this, though. My benefit of the doubt for tech was already long drained, and is especially critical for safety critical industries.
> and that data clearly shows that the experiment is successful in reducing crashes
I disagree. You need way more data, like orders of magnitude more. There are trillions of miles driven in the US every year. Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
> without question
There are _tons_ of questions. This is not even a simple problem. I cannot understand this prerogative. It's far too eager or hopeful.
> We can and should do both
Well Google is operating Waymo and "we" control road policy. One of these things we can act on today and the other relies on huge amounts of investments paying off in scenarios that haven't even been tested successfully yet. I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
> You need way more data, like orders of magnitude more. There are trillions of miles driven in the US every year.
Absurd, reductive, and non-empirical. Waymos crash and cause injury/fatality far less frequently than human drivers, full stop. You are simply out of your mind if you believe otherwise, and you should re-evaluate the data.
> Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
Yes. No one is claiming that Waymos are better drivers than humans in inclement weather, because they don't operate in those conditions. That does not mean Waymos are not able to outperform human drivers in the conditions in which they do operate.
> I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
What's madness is your attitude that Waymos' track record does not show they are effective at reducing crashes. And again, working on policy does not prevent us from also improving technology, as you seem to believe it does.
You're moving the goalposts. The claim is that Waymos are safer than human drivers in the areas and under the conditions where they currently operate.
Yeah, I'm sure Waymos would struggle in a blizzard in Duluth, but a) so would a human and b) Waymos aren't driving there. (Yet.)
> You're moving the goalposts
No. I'm not. I'm being realistic about the technology. You're artificially limiting the scope.
> so would a human
This is goalpost moving 101. The question isn't would a human driver also struggle but _would it be better_? You have zero data.