> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (and a huge portion of human drivers are distracted), it could have been catastrophic.
So you're around young children, with visibility significantly impaired by double parking. I'd love to see video of the incident, because driving 17 mph (27 km/h for metric types) in this context is reckless and not something a human would typically do: a kid popping out from behind one of those cars is not only unsurprising but completely expected.
Another reason you also slow way down in this scenario: one of those cars could suddenly swing open a door, which, again, would not be particularly surprising in this sort of context.
Indeed. Sure, the car knows the limit, it knows it is a school zone, and it can precisely track people within reach of its sensors (but not behind obstructions it can't see through).
But it is missing the human understanding of the situation. Does it know that tiny humans behave far more erratically than the big ones? Obvious to us humans, but does the car take that into account? Does it consider that in such a situation, a kid its sensors can't possibly detect is quite likely to suddenly dart out from behind an obstacle? Again, obvious to us because we understand kids, but does the car know?
Blows my mind how confidently you state this, as if that's the normal human behavior, when in reality it probably should be the norm - but actually isn't.
I am not sure what your definition of typical is. The reason we have lower speed zones in school districts at specific times is because humans typically drive fast. The reason police officers frequently target these areas and write a plethora of tickets is because humans typically ignore speed limits.
Your claim seems to be that a human would drive much slower than the posted speed limit considering the conditions, but the laws and the courtroom suggest otherwise.
Unfortunately, that's a vast overestimation of human danger recognition. Or of empathy; unsure which.
That school should not be on a busy roadway at all, and it should not have a child drop-off area anywhere near one. Ideally there would be a slow loop where parents drop off children and then proceed away from the school in a continuous flow.
Things are what they are. Driving situations are never perfect, and that's why we adapt. The Waymo was speeding in a school zone and made a dangerously fast overtake of a double-parked car. It's engineering safety failure after engineering safety failure on Waymo's part, nobody else's.
Source? The article doesn't list a speed limit, but highways.dot.gov suggests to me that the speed limit would be 25 mph in the school zone, in which case the Waymo was going significantly under the limit.
If only the same could be said for the other parents in the school zone. I’ve seen people roar by in similar scenarios at 30+ miles an hour.
lol I'm guessing you don't have kids. This is hilarious.
While I don't have kids, I guess you don't either, because kids usually don't drive cars. At least I didn't when I was in elementary school.
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big-picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to slow way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
While in theory human drivers should be situationally aware of the higher risks of children being around, the reality is that the majority will be in their own bubble of being late to drop their kid off and searching for the first free spot they can find.
The autonomous cars have really gotten more aggressive recently, as I mentioned before:
https://news.ycombinator.com/item?id=46199294
Also, Waymo handling a road-visibility issue:
I have a similar school drop-off, and can confirm that the cars are typically going around 17-20mph around the school when they're moving. Also that yes, human drivers usually do stay much closer to the centerline.
However, Waymo was recently cleared to operate in my city, and I actually saw one in the drop-off line about a week ago. I pulled out right in front of it after dropping my kid off. And it was following the line of cars near the centerline of the road. Honestly its behavior was basically indistinguishable from a human other than being slightly more polite and letting me pull out after I put my blinker on.
I certainly do this. But asserting that most humans would usually do this? Have you ever actually seen humans drive cars? This is absolutely not what they do. On top of that, they run stop signs, routinely miss pedestrians in blind spots, respond to texts on their phone, or scroll around on their display to find the next song they want to put on.
I have no idea what happened here, but in my experience of taking Waymos in SF, they are very cautious, and I'd struggle to imagine them speeding through an area with lots of pedestrians milling around. The fact that it was going 17 mph at the time makes me think it was already in "caution mode". Sounds like this was something of a "worst case" scenario; another meter or two and it would have stopped in time.
I think with humans, even if the driver is 100% paying attention and their eyes were looking in exactly the right place at exactly the right time, there is still reaction time - both in cognition and in physically moving the leg to press the pedal. I suspect that a Waymo will out-react a human basically 100% of the time, applying full braking force within a few tens of milliseconds, well before a human has even begun to move their leg.
Sometimes it detects something and I think "huh? Must be a false positive?" - but sure enough, it turns out that there really was someone standing behind a tree or just barely visible around a corner, etc.
Sure, none of those have run out in front of us, but the fact that it is spotting them and tracking their movement before I am even aware they're there is impressive and reassuring.
Correct. Human reaction time is, at its very best, ~250 ms. And that's when you're hyper-focused on a specific stimulus and actively trying to respond as fast as possible.
During normal driving, a focused driver will react on the order of 1 s - and that assumes actively paying attention to the road ahead. If you were, say, checking your mirrors or looking around for any other reason, this can easily stretch into multiple seconds. If you're, say, playing on your phone (consider how many drivers do this), forget it.
A machine, however, is 100% focused 100% of the time and is not subject to our poor reaction times. It can brake in <100 ms every time.
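To put those reaction times in perspective, here's a quick back-of-envelope sketch (Python) of the distance covered at the reported 17 mph before braking even begins. The reaction-time values are the ones discussed above; nothing here is from the actual incident data.

```python
# Distance traveled during the reaction window alone, before any braking.
MPH_TO_MS = 0.44704  # meters per second per mph

speed = 17 * MPH_TO_MS  # ~7.6 m/s

reaction_times = {
    "machine (<100 ms)": 0.1,
    "best-case human (~250 ms)": 0.25,
    "focused driver (~1 s)": 1.0,
    "driver glancing away (~2 s)": 2.0,
}

for label, t in reaction_times.items():
    print(f"{label}: {speed * t:.1f} m before braking starts")
# machine: 0.8 m; best-case human: 1.9 m; focused: 7.6 m; glancing away: 15.2 m
```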
On the other hand, a software fault could make it run into an obstacle that'd be obvious to a human at full speed.
I wouldn't call it likely. Sure, there are definitely human drivers who are better than Waymo, but IME they're few and far between. Much more common to be distracted or careless.
It's amazing how much nonsense we let slide with human drivers, and then get uptight about with anything else. You see the same attitude with bicycles. Cars run stop signs and red lights all day long and nobody bats an eye, but a cyclist does it and suddenly they're a menace.
Consider this scenario:
5 kids are walking on the sidewalk while you drive past them, but a large dumpster suddenly blocks your view of them just as you pass. You saw them before the dumpster; once you're alongside, the dumpster completely blocks the view.
Does a human brain carry some worry that they might suddenly decide to run and try to cross the street past the dumpster? Does Waymo carry that worry, or does it just continue at the exact same speed?
Again, it's not like every driver will think about this, but many drivers will (even the bad ones).
I don't think this is true. There are infinitely many scenarios in a complex situation like a road with traffic, cars parked, pedestrians about, weather, etc. My brain might be able to quickly assess a handful, but certainly not all.
Not all of those need to be assessed "quickly". That's where LLMs fail.
You note the weather when you leave. You understand the traffic five minutes ahead. You recognize pedestrians far ahead of time.
Computers can process a lot in fractions of a second. Humans can recognize context over many minutes.
The Waymo may have done better in the fraction of a second, but humans can avoid being in that situation to begin with.
Self-driving cars don't seem to take into account icy road conditions, for one simple example.
If there are ten kids nearby, that's basically ten path scenarios, and that might be reduced if you have great visibility into some of them.
> My brain might be able to quickly assess a handful, but certainly not all.
What would you do if you can't assess all of them? Just keep driving at the same speed?
If the situation is too overwhelming you'll almost certainly back off, I know I would. If I'm approaching that school block and there's like 50 small kids running around in all directions, I have no idea what's going on and who is going where, so I'm going to just stop entirely until I can make some sense of it.
There are a very, very large number of scenarios. Every single possible different state the robot can perceive, and every possible near future they can be projected to.
Ten kids is not 10 path scenarios. Every kid could do a vast number of different things, and each additional kid raises the number of joint states to another power.
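A toy illustration of that blow-up, assuming (arbitrarily) that each kid can take one of five plausible actions over the next second:

```python
# Joint scenarios grow as k**n: k actions per kid, n kids.
k = 5  # assumed actions per kid, for illustration only
for n in (1, 2, 5, 10):
    print(f"{n} kid(s) -> {k**n:,} joint scenarios")
# 1 -> 5; 2 -> 25; 5 -> 3,125; 10 -> 9,765,625
```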
This is trivially true. The thing that makes driving possible for humans and robots alike is that all these scenarios are not equally likely.
But even with that insight, it’s not easy. Consider a simple case of three cars about to arrive at an all-way stop. Tiny differences in their acceleration - potentially smaller differences than the robot can measure - will result in a different ordering of cars taking turns through the intersection.
It’s a really interesting problem.
This is the difference between computers and humans. The car will attempt to compute all possible path scenarios because it has no instinct, and since it might not be possible to compute everything in real time, it might fail.
But the human will easily deal with the situation.
Try running through a sports field in an elementary school during lunch, full of unpredictable kids running around. Can you make it from one side to the other without crashing into a whole bunch of kids? Of course you can. You didn't need to try to compute an exponential number of scenarios, you just do it easily. The human brain is pretty amazing.
And current practical approaches are mostly end to end (or nearly) ML systems that do not compute a lot of alternative paths, and they work in approximately constant time independent of the scenario.
You strongly imply that computers can’t drive, but you could have written that in a Waymo.
Safe driving starts with speed, lowering speed and informing the passengers seems like a no-brainer.
To fix that, you program the car to handle situations with obstructed vision, which covers not just this specific scenario but everything related to obstructed line of sight: basically, slow down enough to be able to stop in time if something jumps out from behind the obstacle.
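That rule can be written down directly. A minimal sketch in Python, solving v*t_react + v^2/(2a) <= d for the fastest speed that still stops within the visible gap; the deceleration and reaction-time values are my assumptions, not Waymo's numbers:

```python
import math

def max_safe_speed(d, a=7.0, t_react=0.3):
    """Largest speed (m/s) whose reaction distance plus braking distance
    still fits within d meters of clear road:
        v * t_react + v**2 / (2 * a) <= d
    """
    return a * (-t_react + math.sqrt(t_react**2 + 2 * d / a))

# e.g. an occlusion edge 6 m ahead of the bumper:
v = max_safe_speed(6.0)
print(f"{v:.1f} m/s ({v / 0.44704:.1f} mph)")  # ~7.3 m/s (~16.3 mph)
```

Interestingly, with these assumed values a 6 m sight line works out to roughly the 17 mph the Waymo was reportedly doing.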
Really though, this is less of an engineering problem and more of a social cost-benefit analysis one.
On one hand, I'd say hitting a kid at 6mph in the worst case scenario once in a blue moon probably isn't that big of a deal.
On another, someone here calculated that "even 1MPH slower would likely have resulted in no contact in this scenario".
So really, it's not possible to say whether this was handled properly without access to data only Waymo has, and without establishing some standard for how much injury we're okay with versus the impact on travel times. Remember, we're seemingly OK with ~40,000 Americans dying every year due to car transportation.
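That "1 mph slower" estimate is easy to sanity-check with basic kinematics. A sketch, assuming a 7 m/s^2 braking deceleration (~0.7 g) and a 0.3 s detection-to-braking delay, with the gap to the child reconstructed from the reported 17 mph detection speed and ~6 mph contact speed:

```python
import math

MPH = 0.44704   # m/s per mph
A = 7.0         # assumed braking deceleration, m/s^2 (~0.7 g)
T = 0.3         # assumed detection-to-braking delay, s

# Reconstruct the distance to the child from the reported speeds.
v0, vc = 17 * MPH, 6 * MPH
gap = v0 * T + (v0**2 - vc**2) / (2 * A)   # ~5.9 m

def contact_speed_mph(v_start):
    brake_dist = gap - v_start * T          # road left once brakes engage
    v_sq = v_start**2 - 2 * A * brake_dist
    return math.sqrt(v_sq) / MPH if v_sq > 0 else 0.0

for mph in (17, 16, 15):
    print(f"start {mph} mph -> contact at {contact_speed_mph(mph * MPH):.1f} mph")
# Under these assumptions, starting at 16 mph already stops short of the child.
```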
Patently, obviously false. A human brain will automatically think of SOME scenarios. For instance, if a collision seems imminent, and the driver is holding a cup of coffee, these ideas are likely to occur to the driver:
IF I GRAB THE STEERING WHEEL AND BRAKE HARD, I MIGHT NOT HIT THAT PEDESTRIAN IN FRONT OF ME.
IF I DON'T CONTINUE HOLDING THE COFFEE CAREFULLY, I MIGHT GET SCALDED.
THIS SONG ON MY RADIO IS REALLY ROCKING!
IF I YANK MY WHEEL TO THE LEFT, I MIGHT HIT A CAR INSTEAD OF A HUMAN.
IF I BRAKE HARD OR SWERVE AT ANY TIME IN TRAFFIC, I CAN CAUSE AN ACCIDENT.
Experiments with callosal patients (who have damaged the connective bridge between the halves of their brains) demonstrate that this is a realistic picture of how the brain makes decisions. It offers up a set of possible actions, and attempts to choose the optimal one and discard all others.
A computer program would do likewise, EXCEPT it won't care about the coffee cup nor the radio (remove two bad choices from consideration).
It still has one bad choice (do nothing), but the SNR is much improved.
I'm not being hyperbolic; self-preservation (focusing on keeping that coffee in my hand) is a vital factor in decision-making for a human.
> ...where Waymo has pre-programmed ones (and some NN based ones).
Yes. And as time goes on, more and better-refined scenarios will be added to its programming. Eventually, it's reasonable to believe the car's software will constantly reassess how many humans are within HUMAN_RUN_DISTANCE + CAR_TRAVEL_DISTANCE in the next block, and begin tracking any that are within an unsafe margin (a sketch of that check follows below). No human on Earth does that, continually, without fail.
> Does a human brain carry some worry that they might suddenly decide to run and try to cross the street past the dumpster? Does Waymo carry that worry, or does it just continue at the exact same speed?
You continue to imply that Waymo cannot ever improve on its current programming. Does it currently consider this situation? Probably not. Will it? Probably.
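For concreteness, here's a minimal sketch of the envelope check described above, in Python. Every constant and name (HORIZON_S, HUMAN_RUN_SPEED, the corridor width) is a made-up illustration, not anything from Waymo's stack:

```python
from dataclasses import dataclass

HORIZON_S = 3.0        # assumed look-ahead window, seconds
HUMAN_RUN_SPEED = 3.0  # assumed worst-case child sprint, m/s

@dataclass
class Pedestrian:
    x: float  # meters ahead of the car
    y: float  # meters left (+) or right (-) of the car's path

def should_track(p: Pedestrian, car_speed: float,
                 corridor_half_width: float = 1.5) -> bool:
    human_run = HUMAN_RUN_SPEED * HORIZON_S   # HUMAN_RUN_DISTANCE
    car_travel = car_speed * HORIZON_S        # CAR_TRAVEL_DISTANCE
    # Flag anyone who could reach the car's corridor before the car has passed.
    can_reach_path = abs(p.y) <= human_run + corridor_half_width
    still_relevant = 0 <= p.x <= car_travel + human_run
    return can_reach_path and still_relevant

print(should_track(Pedestrian(x=12, y=4), car_speed=7.6))  # True: track and slow
```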
Here's my problem. If you follow the instructions on the sign, it still says to slow down. There's no threshold for slow enough. No matter how slow you're going, the sign says "Slow Down". So once you become ensnared in the visual cone of this sign, you'll be forced to sit stationary for all eternity.
But maybe there's a loophole. It doesn't say how fast you must decelerate. So if you come into the zone going fast enough, and decelerate slowly enough, you can make it past the sign with some remaining non-zero momentum.
You know, I've never been diagnosed on the spectrum, but I have some of the tendencies. lol.
Much better to be specific than a vague "slow down". There's a road near me with two tight turns a couple blocks apart. One advises 25mph and the other advises 10mph.
Everyone's replying to you as if you truly don't understand the sign's intention but I'm sure you do. It's just annoying to be doing everything right and the signs and headlines are still telling you you're wrong.
There was a driving-safety ad campaign here: "Drive to the conditions. If they change, reduce your speed." You can imagine how slow we'd all be going if the weather kept changing.
We might have OCPD.
In advertising: "Treat yourself. You deserve it!"
Me: What if someone who didn't deserve it heard this message. How can you possibly know what I deserve? Do all people deserve to treat themselves? Is the notion of deserving or treating really so vacuous?
Normies: jfc
I hate when people pretend to be smarter than everyone else by pointing at this kind of utterance and insisting that someone, somehow, will parse those statements in the most literal and stupid manner.
Then there are the ignorant misanthropes who can't waste a chance to repeat their reductionist speculations about human cognition. Just like the idiot Elon Musk, who wasted billions on an irrecoverably fucked self-driving system based on computer vision because he underestimated the human visual cortex.
Fucking annoying midwits.
This is idle XKCD-style musing.
FYI, unless you are driving a commercial truck, a cop car, or a race car, your speedometer will read slightly fast, sometimes by as much as 5 to 10%. This is normal practice for cars, as it limits manufacturer liability. You can check this using an independent GPS, i.e. not an in-dash unit. (Just imagine the court cases if a speedo read slower than the actual speed and you can understand why this started.)
[0] https://www.dmv.ca.gov/portal/handbook/california-driver-han...
Edit: However, elsewhere in the thread someone linked this Streetview image that shows that this particular school zone is 15 mph: https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac
To put it another way: if an autonomous vehicle has a reaction time of 0.3 seconds, its stopping distance from 17 mph is about the same as that of a fully alert human driver (1 second reaction time) going 10.33 mph.
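That equivalence checks out under a reasonable braking assumption. A quick verification in Python; the ~0.8 g deceleration is my assumption, everything else is from the comment above:

```python
MPH = 0.44704  # m/s per mph
A = 7.8        # assumed braking deceleration, m/s^2 (~0.8 g)

def stopping_distance(mph, t_react):
    """Reaction distance plus braking distance, in meters."""
    v = mph * MPH
    return v * t_react + v**2 / (2 * A)

print(f"AV,    17 mph,    0.3 s reaction: {stopping_distance(17, 0.3):.2f} m")
print(f"human, 10.33 mph, 1.0 s reaction: {stopping_distance(10.33, 1.0):.2f} m")
# Both come out to ~6.0 m with these assumptions.
```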
There's a case to be made that it wasn't slow enough.
On that very same road with a 20mph limit, 40mph might be completely safe or 3mph might be extremely negligently dangerous. It all depends on what is going on in the area.
Nobody was injured.
I think any fair evaluation of this (once the data was available) would conclude that Waymo was taking reasonable precautions.
That's exactly part of the problem. If it is programmed to be over-cautious and go 17 in a 25 zone, that feels like it is safe. Is it?
It takes human judgment of the entire big picture to say meaningfully whether that is too slow or too fast. Taking the speed limit literally is too rigid, something a computer would do.
Need to take into account the flow of the kids (all walking in line vs. milling around going in all directions), their age (younger ones are a lot more likely to randomly run off in an unsafe direction), what are they doing (e.g. just walking, vs. maybe holding a ball that might bounce and make them run off after it), their clustering and so on.
Driving past a high school with groups of kids chatting on the sidewalk, sure 20mph is safe enough. Driving past an elementary school with a mass of kids with toys moving in different directions on the same sidewalk, 17mph is too fast.
And if I'm watching some smaller kids disappear behind a visual obstruction that makes me nervous they might pop up ahead of it on the street, I slow down to a crawl until I can clearly see that won't happen.
None of this context is encoded in the "25mph when children are present" sign, but for most humans it is quite normal context to consider.
But it would be great to see video of the Waymo scene to see whether any of these factors were present.
They even wrote a blog post about it:
https://waymo.com/blog/2023/07/past-the-limit-studying-how-o...
I've read studies saying that most drivers don't brake at max effort, even to avoid a collision. This may be at least one of the reasons that Waymo predicted that an attentive human would likely have been going faster than their car at the moment of impact. I've got a good idea of my fun-car's braking performance, because I drive it hard sometimes, but after reading that I started practicing a bit with my wife's car on the school run, and... Yeah: it's got a lot more braking power than I realized. (Don't worry, I brake hard on a long straight exit ramp, when no one's behind me, a fast slow-down is perfectly safe, and the kiddo loves it.) I've now got an intuitive feel for where the ABS will kick in, and exactly what kind of stopping distance I have to work with, which makes me feel like a safer driver.
Second, going off my experience of hundreds and hundreds of ride-share rides, and maybe thirty Waymo journeys, I'd call the best 10-15% of humans better drivers than Waymo. Like, they're looking further up the road to predict which lane to be in, based on, say, that bus two blocks away. They also drive faster than Waymos do, without a perceptual decrease in safety. (I realize "perceptual" is doing some work in that sentence!) That's the type of defensive and anticipatory urban driver I try to be, so I notice when it's done well. Waymo, though, is flat-out better, in every way, than the vast majority of the ride-share drivers I see. I'm at the point where I'll choose a Waymo any time it'll go where I'm headed. This story reinforces that choice for me.
Ha! It is unbelievable how difficult it is to make someone brake hard. You'd think it's the easiest thing possible in the age of ABS - just press as hard as you can.
I have a lot of experience with this; I used to teach car control to both teens and adults. One of the frequent exercises was seemingly very simple: drive at X mph until this spot, then brake at maximum power.
The vast majority of people can't do it on the first or second try; they'll just meekly press on the brake like they're coasting to a stop. After more coaching that hard means hard, they start to get it, but it takes many, many tries.
Going early means you slow early, which means you also take longer to reach the child, but you're braking for all of that extra time, so you're slowing down even more.
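The compounding is easy to see numerically. A sketch, assuming 17 mph and a 7 m/s^2 braking deceleration (both round assumptions): each extra meter of early braking cuts the speed at the child's position disproportionately.

```python
import math

A = 7.0               # assumed deceleration, m/s^2
V0 = 17 * 0.44704     # ~7.6 m/s

def speed_at_child_mph(brake_distance):
    """Speed remaining after braking over brake_distance meters."""
    v_sq = V0**2 - 2 * A * brake_distance
    return math.sqrt(v_sq) / 0.44704 if v_sq > 0 else 0.0

for d in (3.0, 4.0, 5.0):
    print(f"brake {d:.0f} m before the child -> "
          f"{speed_at_child_mph(d):.1f} mph on arrival")
# 3 m -> ~8.9 mph; 4 m -> ~3.0 mph; 5 m -> stopped before reaching the child
```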
Anyway, from the article,
> According to the NHTSA, the accident occurred “within two blocks” of the elementary school “during normal school drop off hours.” The safety regulator said “there were other children, a crossing guard, and several double-parked vehicles in the vicinity.”
So I mean, it is hard to speculate. Probably Waymo was being reasonably prudent. But we should note that this description isn't incompatible with being literally in an area where kids are leaving their parents' cars (the presence of "several double-parked vehicles" brings this to mind). If that's the case, it might make sense to consider an even-safer mode for active student unloading areas. This seems like the sort of social context that humans might have and cars might be missing.
But this is speculation. It would be good to see a video.
How do you know that? The article says it slowed from 17 mph. That’s cautious progress speed, not cruising speed.
Waymos do this and have for years. They know where the people are around them and will take precautionary action based on that.
Here's a video from 2019 of one understanding that a car in the bike lane means the cyclists may dart out into the lane it's in and taking action based on that. https://waymo.com/blog/2019/05/safety-at-waymo-self-driving-...
That video is nearly 7 years old at this point and they've gotten much, much better since then.
If you think a fully-attentive human driver would have done better, I think you're kidding yourself.
I know you didn't make this point, but if anyone thinks the average LA driver would have done better than this, I've got a bridge to sell you, and that's really what matters more. (I say that as someone who used to live about half a mile from where this happened.)
https://www.gov.uk/theory-test/hazard-perception-test
... could in some circumstances know that there's a likelihood that a child will emerge suddenly and reduce their speed in anticipation where circumstances allow.
Note that if you cut speed but other drivers can't see why, they may overtake, even unsafely, because you are a nuisance to them. Slowing in anticipation that a child will run out from behind the SUV, only for a car behind you to accelerate around you and smack straight into the child at even higher speed, is not the desired outcome, even though you didn't hurt anybody...
And yes, we'd need to see the video to know. It's like that Sully scenario. In a prepared test skilled pilots were indeed able to divert and land, but Sully wasn't prepared for a test. You're trained to expect engine failure in an aeroplane - it will happen sometimes so you must assume that, but for a jet liner you don't anticipate losing both engines, that doesn't happen. There's "Obviously that child is going in the road" and "Where the fuck did they come from?" and a lot in between and we're unlikely to ever know for sure.
Waymos constantly track pedestrians nearby, you can see it on the status screen if you ride in one. So it would be both better able to find pedestrians and react as soon as one was on a collision course. They have a bit more visibility than humans do due to the sensor placement, so they also can see things that aren't that visible to a person inside the car, not to mention being constantly aware of all 360 degrees.
While I suppose that in theory, a sufficiently paranoid human might outdo the robot, it looks to me like it's already well above the median here.
While the deep details are not public, Waymo has shared a fair amount of description of their system, from which you can glean some ideas about the world model it creates and the actions it takes in specific situations: https://waymo.com/blog/2024/10/ai-and-ml-at-waymo https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto... https://waymo.com/blog/2024/10/introducing-emma
And that can potentially allow the internal planning algorithm to choose riskier, more aggressive trajectories/behavior, etc., say, to reach the target destination faster and thus deliver higher satisfaction to the passengers.
When thinking about these things you have to factor in the prior probability that a driver is fully attentive, not just assume they are.
If you’ve ever been in a Waymo you quickly realize their field of view is pretty good. You often see the vehicle sensing small pets and children that are occluded to a passenger or driver. For this reason and my experience with humans near aforementioned school, I doubt a human would out perform the Waymo in this particular incident and it’s debatable they even have more context to inform their decisions.
All that said, despite having many hours in a Waymo, it's not at all clear to me how they factor in sidewalk context. You get the sense that pedestrians' movement vectors are accounted for near intersections, but I can't say I've experienced something like a slowdown when throngs of people are about.
Note the weaselly "immediately detected the individual as soon as they began to emerge" in the puff piece from Waymo Comms. No indication that they intend to account for environmental context going forward.
If they already do this, why isn't it factored into the model?
And I completely agree that from that instant forward, the car did everything correctly.
But if I were the accident investigator for this, I would be far more interested in what happened in the 30 seconds before the car saw the kid.
Was the kid visible earlier, only to disappear behind an obstruction? Or did the kid arrive from the side and was never visible at all? These are the more important questions.
One is to merge the two directions of a road into a single lane, forcing drivers to cooperate and take turns passing through it, one car at a time.
For a combined car-and-pedestrian road (max speed 7 km/h) near where I live, they intentionally added large view-obstructing objects that limit visibility and make the road harder to navigate. This forces drivers to drive very slowly, even when alone on the road, as they can't see whether a car or person may be behind the next object.
On another road they added several tight S-curves in a row, where if you drive anything faster than 20 km/h you will miss the turns and drive onto the artificially constructed curbs.
On other roads they put a sign in the middle of the two-way road while drastically limiting the width to the curb, forcing drivers to slow down in order to center the car in the lane and squeeze through.
The common thread in each of these is that a human driver's fear of crashing makes them pay extra attention and slow down.
In the UK, the cost of owning a car is high, yet our potholes, while frequent, are small enough to survive, making them more of an annoyance than a speed restriction.
Roundabouts are safer. They're safer because they prevent everybody from speeding through the intersection. And, even in case of an accident, no head-on collisions happen in a roundabout.
Roundabouts are worse for land use though, which impacts walkability, and the safety story for pedestrians and bike users with them is decidedly not great as well.
They're much safer for pedestrians than intersections. You're only crossing and dealing with traffic coming from one direction, stopping at a median, and then crossing further over.
Unlike trying to navigate a crosswalk, where you have to play guessing games as to which direction some vehicle is going to come at you from while ignoring the lights. (People do the stupidest things, and roundabouts are a physical barrier that prevents a bunch of that.)
I could handle it as an adult just walking my bike but it would be a nightmare for someone pushing a stroller or dependent on a mobility device.
IMO you are absolutely playing frogger with the gaps in the traffic.
Otherwise you risk either getting run over by a car exiting the roundabout without seeing you, or getting run over by the car that stopped but was rear-ended by another inside the roundabout.
This assumes a median, which is not present at most smaller roundabouts in the US.
The what now? Seriously, what in the world are you talking about? Roundabouts are heaven. They physically force drivers to slow down when approaching or leaving them, creating a safe environment for pedestrians and cyclists.
For example, there's no such thing as "running a red light at full speed" at a roundabout, no speeding up to "make the light", etc.
For cyclists specifically, they're amazing, because they eliminate the deadly left-turns. Every turn is a right turn, which is super safe.
Though I've never been in an accident either at a crossing or a roundabout, so I can't really judge how true my impression is.
"Sweden hates cars."
There must be a happy medium somewhere in between.
Street parking has mostly been turned into exclusive residential parking, so parking garages are often the only choice. As a result they are quite expensive, and you have to walk to the destination.
Parking and access are much better in the countryside, and the highways are fairly good, similar to those found in western Europe. They're not as straight or wide as the autobahn, but there's not as much traffic either.
"""
- Gavin, you know our shameful history of worker suicides. Since the renovation? Not a single one.
- Not even one? Ok. But there's gotta be like, a middle ground here...
"""
> An evaluation of 20 mph zones in the UK demonstrated that the zones were effective both in reducing traffic speed and in reducing RTIs. In particular child pedestrian injuries were reduced by 70 per cent from 1.24 per year in each area before to 0.37 per year after the zones were introduced
https://www.rospa.com/siteassets/images/road-safety/road-saf...
The "Vision Zero" program was started in Sweden, and is becoming more widely adopted.
Acknowledging life has risk tradeoffs doesn't make you an American, but denying it can make you a self-righteous jerk.
Taken literally, that's clearly not true.
For example you can easily drive 150mph in the flat desert where there is nothing for a hundred miles and you can see many miles ahead. You have zero risk of hurting anyone else unless they somehow teleport in front of you.
But driving 5mph in tight street full of elementary school kids running around can be extremely dangerous.
It's all about context.
There’s this techbro utopia mindset leaking through as well, just like it does for climate change topics, that pragmatic solutions that work for us today are deprioritized because some incredible technology is right around the corner. This is also distinctly American, specifically Silicon Valley, culture.
this sounds like exactly the right tradeoff, especially since these decisions actually increase convenience for those not in cars
It is possible to go too far in either direction.
And it's not "runaway", it's exactly the right prioritisation. I'd encourage you to spend some time on Not Just Bikes and then say whether you'd like to live in a Nordic or an American neighbourhood. The Nordic style is also about convenience, because car-centric infrastructure makes a lot of things less accessible and convenient.
Many roads in London have parked cars on either side so only one car can get through - but instead of people cooperating, you have people fighting: speeding as fast as they can to get through before someone else appears, or racing oncoming cars to a gap in the parked cars. So where they should be doing 30 mph, they are more likely doing 40-45. Especially with EVs, there's near-instant power to accelerate and reach a gap first.
And putting obstacles in the road so you can't see if someone is there? That sounds really dangerous, and exactly the sort of thing that caused the accident in the story here.
Madness.
Yes. They have made steady progress over the previous decades to the point where they can now have years with zero road fatalities.
> And putting obstacles in the road so you can't see if someone is there? That sounds really dangerous, and exactly the sort of thing that caused the accident in the story here.
Counterintuitive perhaps, but it's what works. Humans adjust their behaviour to the level of perceived risk; the single most important thing is to make driving feel as dangerous as it is.
From experience, they will adjust their behaviour to reduce their total travel time as much as possible (i.e. speeding to "make up" for time lost waiting, etc.) and/or to "win" against other drivers.
I guess it is a cultural thing, but I cannot agree that making it harder to see people in the road is going to make anything safer. Even a robot fucking taxi with lidar and instant reaction times hit a kid because they were obscured by something.
Sure they do, all humans do. Nobody wants to get hurt and nobody wants to hurt anyone else.
(Yes, there are a few exceptions, people with mental disorders that I'm not qualified to diagnose, but the vast majority of normal humans don't.)
Humans are extremely good at moderating behavior to perceived risk, thank evolution for that.
(This is what self-driving cars lack; machines have no self-preservation instinct.)
The key part is "perceived", though. This is why building the road to match the level of true risk works so well. No need for artificial speed limits or policing: if people perceive the risk as what it truly is, they adjust instinctively.
This is why it is terrible to build wide 4 lane avenues right next to schools for example.
The evidence is that they do though. E.g. the Exhibition Road remodelling (removing curbs/signs/etc.) has been a great success and effectively reduced vehicle speeds, e.g. https://www.rbkc.gov.uk/sites/default/files/media/documents/...
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
> I just think it's important to consider the counterfactual
More than 50% of roadway fatalities involve drugs or alcohol. If you want to spend your efforts improving safety _anywhere_ it's right here. Self driving cars do not stand a chance of improving outcomes as much as sensible policy does. Europe leads the US here by a wide margin.
Yes, and I find it annoying that some people do seem to think Waymo should never be criticized. That said, we already have an astounding amount of data, and that data clearly shows that the experiment is successful in reducing crashes. Waymos are absolutely, without question already making streets safer than if humans were driving those cars.
> If you want to spend your efforts improving safety _anywhere_ it's right here.
We can and should do both. And as your comment seems to imply but does not explicitly state, we should also improve road design to be safer, which Europe absolutely kicks America's ass on.
I disagree. You need way more data, like orders of magnitude more. There are trillions of miles driven in the US every year. Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
> without question
There are _tons_ of questions. This is not a simple problem. I cannot understand this certainty. It's far too eager or hopeful.
> We can and should do both
Well Google is operating Waymo and "we" control road policy. One of these things we can act on today and the other relies on huge amounts of investments paying off in scenarios that haven't even been tested successfully yet. I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
Absurd, reductive, and non-empirical. Waymos crash and cause injury/fatality far less frequently than human drivers, full stop. You are simply out of your mind if you believe otherwise, and you should re-evaluate the data.
> Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
Yes. No one is claiming that Waymos are better drivers than humans in inclement weather, because they don't operate in those conditions. That does not mean Waymos are not able to outperform human drivers in the conditions in which they do operate.
> I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
What's madness is your attitude that Waymos' track record does not show they are effective at reducing crashes. And again, working on policy does not prevent us from also improving technology, as you seem to believe it does.
Yeah, I'm sure Waymos would struggle in a blizzard in Duluth, but a) so would a human and b) Waymos aren't driving there. (Yet.)
No. I'm not. I'm being realistic about the technology. You're artificially limiting the scope.
> so would a human
This is goalpost moving 101. The question isn't would a human driver also struggle but _would it be better_? You have zero data.
It is not moving the goalpost to say "so would a human". Comparison to human drivers is exactly the stated goalpost (and it should be).
> You have zero data.
Outrageously uninformed take. We have mountains of data that show Waymos in aggregate are safer drivers than humans.
That's fine. But crashes are relatively rare and what matters is accountability. Will Waymo be accountable for hitting this kid the way a human would? Or will they fight in court to somehow blame the pedestrian? Those are my big concerns when it comes to self driving vehicles, and history with tech suggests that they love playing hot potato instead of being held accountable.
And yes, better walkable infrastructure is a win for all. The minor concern I have is the notion that self driving is perfect and we end up creating even more car centric infrastructure. I'm not sure who to blame on that one.
I assume that's how it works already.
That is, I expect that Waymo will be required to pay for accidents they cause, whether they want to or not.
Could you spell out exactly what "sensible" policy changes you were thinking of? Driving under the influence of drugs and/or alcohol is already illegal in every state. Are you advocating for drastically more severe enforcement, regardless of which race the person driving is, or what it does to the national prison population? Or perhaps for "improved transit access", which is a nice idea, but will take many decades to make a real difference?
FWIW, your first OWI in Wisconsin, with no aggravating factors, is a civil offense, not a crime, and in most states it is rare to do any time or completely lose your license for the first offense. I'm not sure exactly what OP is getting at, but DUI/OWI limits and enforcement are pretty lax in the US compared to other countries. Our standard .08 BAC limit is a lot higher than many other countries.
To be a bit snarkier, and not directed at you, but I wish these supposedly superior Europeans would tell us what they actually want us to do. Should we enforce OWI laws more strictly, or lower the prison population? We can't do both!
We already took their license, we can't double-take it to show we really mean it. Fining them seems a bit rough when they need to drive to get to the job to make the money to pay those fines. Or we're right back to jail time and an even higher prison population.
Unless the vehicle is stolen, seize and impound the vehicle. If the driver is the owner, auction it off and give them back the proceeds, minus costs.
I feel like I'm living in some different world where drunk driving is a-okay when I face these types of objections to actually enforcing the rules around it.
Like that would sound nuts if we applied it to other things - e.g. "take away the professional license of a mid-career pilot/surgeon/schoolteacher/engineer because he was drinking on the job and his life collapses".
Various people can't drive because of e.g. visual impairments, age, poverty, etc. - I find it an ugly juxtaposition to be asserting that we must allow people with DUIs to drive because otherwise their lives would "collapse" to the same point as those other people who can't drive.
The analogy is closer to "take away their ability to get any job" and then it sounds even more harsh.
> Various people can't drive because of e.g. visual impairments, age, poverty, etc. - I find it an ugly juxtaposition to be asserting that we must allow people with DUIs to drive because otherwise their lives would "collapse" to the same point as those other people who can't drive.
If you can't see well enough to drive, then life was unfair to you, and you can often get help with transportation that isn't available to someone that violated the law. For age, if you're young then your parents are supposed to care for you, if you're too old to drive you're supposed to have figured out your retirement by now. For poverty, you kinda still need a car no matter what, that's just how the US is set up in most areas. And it's not ugly to make the comparison to extreme poverty, to say that kicking someone down to that level is a very severe punishment.
> must allow
I wasn't saying what we should do, just that turning up the aggressiveness has serious unwanted consequences.
If you take away the license of a pilot mid-career, they may be able to pivot to something else, but have a huge sunk cost of education and seniority where they ground out poor pay/schedules and then never made it to the part of the career with better pay. For a substantial segment of them, the career impact would be comparable to taking away the ability to drive from a random person.
> For poverty, you kinda still need a car no matter what, that's just how the US is set up in most areas.
You really don't. If you don't already live somewhere with public transit, you'll probably have to move. You'll have to make some sacrifices. But it's workable, I lived without a car and relied on city busses for all my transportation for several years. (And while I wouldn't necessarily recommend it, prior to that, I lived in a small town of ~4k people without transit service. I walked everywhere, and took the inter-city bus when I needed to leave the town.)
There are kinds of human sensing that are better when humans are maximally attentive (seeing through windows/reflections). But there's also the seeing-in-all-directions, radar, superhuman reaction time, etc, on the side of the Waymo.
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively": look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag: slow the fuck down.)
At least it had already slowed to 17 mph to start. Remember that viral video of some Australian in a pickup ragdolling a girl across the road? Almost every comment was "well, he was going the speed limit, no fault of his!" No, asshole, you hit someone. It's your fault. He got zero charges and the girl was seriously injured.
But you hit a kid in daytime? It's your fault. Period.
It’s possible a driver turns a corner (not wearing sunglasses) and suddenly the sun briefly blinds them, while a kid darts into the street.
I've seen kids (and ADULTS!) walk on the side of the street at night in all-black or very, very dark clothing. It's extra amusing when they happen to be black (are they trying to get themselves killed?). It's not the driver's fault if they genuinely can't see a camouflaged person. I've had numerous close calls like this on rural and suburban roads, and I think I'm a cautious driver. Make sure you are visible at night.
Or if a kid riding a bicycle down a hill flies into the middle of an intersection (dumb? brakes failed? etc.), it's very possible to accidentally mow down the child.
HOWEVER, I do agree that 95% of the time it's the driver's fault if they hit a kid. Poor awareness and speed are the biggest factors. But it is certainly not the driver's fault 100% of the time. That's absurd. You really misunderstand how dumb some pedestrians (and parents) are.
But… it's all beside the point. A child who doesn't understand the importance of crosswalks and looking both ways is too young to be walking alone, period. Yes, even if they're "right". Being right isn't helpful if you're dead.
It is absurd, but that doesn't mean that the attitude can't be useful!
In teaching my teenager to drive, I drilled into him the fact that, in every accident, regardless of who is "at fault", there is almost always something that the other party could have done to mitigate it. I gave him plenty of situations as examples...
You're going down a street that has kids on the sidewalk? You better be prepared to have one of those kids come out in front of the car while rough-housing, playing, whatever.
You had right of way? Maybe you did, but did you even look at the opposing traffic to see if it was safe to proceed or did you just look at the traffic light?
I've driven, thus far in my life, roughly 600,000 km (maybe more) with two non-trivial accidents, both ruled not my fault. In hindsight, I could have avoided both of them (I was young and not so self-aware).
I'm paranoid when driving, and my stats are much, much better than Waymo's (I have never injured anyone; even my two accidents only had me injured), even though I drive in all sorts of conditions and on all sorts of roads (many rural, some without markings).
Most people don't drive like this though (although their accident rate is still better than Waymo's).
You have a responsibility to be cautious in heavy equipment no matter what the signage on the road says, and that includes keeping a speed at which you can stop safely if a person suddenly steps onto the road in situations where people are around. If you are driving past a busy bar in downtown, a drunk person might step out and you have a responsibility to assume that might happen. If you have to go slower sometimes, tough.
For instance, a sailboat must alter course if a collision can't be avoided by the give-way vessel alone:
Rule 17(b):
> When, from any cause, the vessel required to keep her course and speed finds herself so close that collision cannot be avoided by the action of the give-way vessel alone, she shall take such action as will best aid to avoid collision.
So if you sail your boat into a container ship and it tries to give way, but doesn't have the ability to do so quickly enough to prevent a collision, you're violating the rules if you don't also alter course as well.
Plus, to connect this back to pedestrians: if a sailboat suddenly cut in front of a container ship with zero concern for the ship's limited maneuverability and stopping ability, the sailboat would also violate Rule 2 by neglecting a precaution required by the ordinary practice of seamen and failing to consider the limitations of the vessels involved.
In the Coast Guard Auxiliary “Sailing and Seamanship” class that I attended, targeting would-be sailboat skippers, we were told the USS Ranger nuclear-powered aircraft carrier had the right-of-way.
That logic is utter BS. If someone jumps out when you're travelling at an appropriate speed and you do your best to stop, then that's all that can be done. Otherwise, by your logic, the only safe speed is 0.
Near my house, almost the entire trip from the freeway to my house is via a single lane with parked cars on the side. I would have to drive 10 mph the entire way (the speed limit is 25, so it would take 2.5x as long).
Remove the free parking if that's making the road unsafe. Or drive 10 mph. Done.
- Parked cars on the street.
- Drive somewhat fast.
- Avoid killing people.

Pick two.
A single lane residential street with zero visibility seems like an obvious time to slow down. And that's what the Waymo did.
It is a driver's responsibility to drive for the conditions. If conditions call for driving 40% slower, then that's what you do; suck it up.
If too many roads have conditions that require that, take that up with your municipality to fix the situation. Or, even better, advocate for better public transit and trains, and designing cities to move people, not move cars.
If there are cars parked on the side constantly, and that's supposed to slow you down significantly, it should be baked into the speed limit.
From what I'm aware of, you're not actually expected to slow down drastically for parked cars.
I know it's not the reality in most cases, but it goes to show how poor and underinvested our infrastructure is.
Car culture in the US is toxic, and a lot of accidents and fatalities are a result of how poorly designed our infrastructure is. We design for cars, not for people. (Just one more lane, bro, it will totally fix traffic. Never mind that a train can move double the capacity of that entire line of traffic.)
Cars are the wrong solution, particularly in urban areas. A self driving car is still a car, and comes along with all the same problems that cars cause.
Stopped buses are similar: people get off the bus, whip around the front, and step straight into the street. So many times I've spotted someone's feet under the front of the bus before they come around into the street.
Not to take away from Waymo here; I agree with the thread sentiment that their handling seems exemplary.
Or you look for reflections in the cars parked around it. This is what I was taught as “defensive“ driving.
Lmao, most drivers I see on the roads aren't even capable of slowing down for a pedestrian crossing when the view of the second half of the crossing is blocked by traffic (i.e. they cannot see if someone is about to step out, especially a child).
Humans are utterly terrible drivers.
Duh. Driving is, essentially, a specialized profession. It's kind of unreasonable to think that everyone could learn to do it well.
Good thing we have public transport! :D
In low traffic, of course, it can be different. But it's unrealistic to expect anybody to drive in the expectation that behind every single car they pass there may be a child about to jump right in front of them. That could easily be thousands of cars, every day, for your whole life. Impossible.
We don't read about the 99.9% of cases where even a semi-decent driver handles it safely; the rare cases make the news.
Does Waymo claim that? If so I haven't seen it. That should of course be the goal, but "better than the average human driver" should be the bar.
They don't look far enough ahead to anticipate what might happen and put themselves in a position to prepare for that possibility. I'm also not sure they benefit from accumulated knowledge (maybe Waymo does; that's an interesting question). I.e., I know that my son's elementary school is around the corner, so as I turn I'm already anticipating the school zone (which starts a block away) rather than only detecting it once I've made the turn.
This certainly may have been true of older Teslas with HW3 and older FSD builds (I had one, and yes you couldn't trust it).
It's much more of a competition than I suspect a lot of people realize.
Obviously the distances are different at that speed, but if the person steps out so close that you cannot react in time, you're fucked at any speed.
10 mph will still do serious damage, so please, for the sake of the children, slow yourself and your daughters' driving down to 0.5 mph wherever there are pedestrians or parked cars.
But seriously, I think you'd be safer to both slow down and put more space between the parked cars and your car, so that you are not scooting along with 30 cm of clearance - move out and leave lots of room so there are better sight-lines for both you and pedestrians.
> The woman told police she was “eating yogurt” before she turned onto the road and that she was late for an appointment. She said she handed her phone to her son and asked him to make a call “but could not remember if she had held it so face recognition could … open the phone,” according to the probable cause statement.
> The police investigation found that she was traveling 50 mph in a 40 mph zone when she hit the boys. She told police she didn’t realize she had hit anything until she saw the boys in her rearview mirror.
The Waymo report is being generous in comparing to a fully-attentive driver. I'm a bit annoyed at the headline choice here (from OP and the original journalist) as it is fully burying the lede.
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
Yep. Driving safe isn't just about paying attention to what you can see, but also paying attention to what you can't see. Being always vigilant and aware of things like "I can't see behind that truck."
Honestly I don't think sensor-first approaches are cut out to tackle this; it probably requires something more akin to AGI, to allow inferring possible risks from incomplete or absent data.
When reading the article, my first thought was that only going at 17mph was due to it being a robotaxi whereas UK drivers tend to be strongly opposed to 20mph speed limits outside schools.
I'm not sure how much of that Waymo's cars take into account, as the law technically takes into account line of sight things that a person could see but Waymo's sensors might not, such as children present on a sidewalk.
Are you sure? The ones I've seen have usually been 20 or 25mph.
Looking on Image Search (https://www.google.com/search?q=school+zone+speed+limit+sign) and limiting just to the ones that are photos of real signs by the side of the road, the first 10 are: 25, 30, 25, 20, 35, 15, 20, 55, 20, 20. So only one of these was 15.
Does Waymo have the same (combined) object permanence and trajectory prediction as a human?
Once the video evidence is out, it might become evident.
Generally, Waymo seems to be a responsible actor, so maybe that is the case, and this can help demonstrate the potential benefits of autonomous vehicles.
Alternatively, if even they can't get this right, it may cast doubt on the maturity of the entire ecosystem.
On this note specifically, I've actually been impressed. E.g., when driving down Oak St in SF (a fast road with tightly parked cars), I've often observed it slow if someone on a scooter on the sidewalk turns to look toward oncoming traffic (as if to start riding), or slow when passing parked box trucks (which block vision of potential pedestrians).
Good technical question
Maybe. It depends on the position of the sun and shadows. I'm teaching my kids how to drive now and showing them that shadows can reveal human activity that is otherwise hidden by vehicles. I wonder if Waymo or other self-driving systems pick up on that.
As described in the NHTSA brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" means that waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
You will get honked at by aggro drivers if you slow down to the school zone speed limit of 25mph. Most cars go 40ish.
And ofc a decent chunk of those drivers are on tiktok, tinder, Instagram, etc
Your median human driver? Sadly, I think not. Most would be rushing, or distracted, or careless.
> waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
I don't think we can say at all that the Waymo was driving recklessly with the data we currently have
Why is it likely? Are we taking the vendor's claims in a blog post as truth?
Please please remember that any data from Waymo will inherently support their position and can not be taken at face value. They have significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
It is also crazy that this happened 6 days ago at this point and video was NOT part of the press releases. LOL
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
> a huge portion of human drivers
What are you basing any of these blind assertions on? They are not at all borne out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line, but here on Hacker News you should absolutely challenge that, in particular because it's very far out of line with real-world data provided by the government.
>It's likely that a fully-attentive human driver would have done worse.
Is based on the source I gave in my comment, the peer-reviewed model.
> a huge portion of human drivers
Is based on my experience and bits of data like 30% of fatal accidents involving alcohol
Like I said, if you have better data I'm glad to see it
The data completely disagrees with you.
> Like I said, if you have better data I'm glad to see it
We all have better data. It's been here the entire time:
https://www.nhtsa.gov/research-data/fatality-analysis-report...
Again, I welcome you to point to data that contradicts my claims, but it seems you are unable to.
The "Persons Killed, by Highest Driver Blood Alcohol Concentration (BAC) in the Crash"[1] report shows that in 2023, 30% of fatal crashes involved at least one driver with a BAC > 0.08 (the legal limit), and 36% involved a BAC > 0.01.
Interesting that "Non-motorist" fatalities have dropped dramatically for everyone under the age of 21, but increased for everyone between 21 and 74.[2] Those are raw numbers, so it'd be even more interesting to display them as a ratio of the group's size. Are fewer children being killed by drivers because there are fewer children generally? Changes in parents' habits? Backup cameras?
1: https://www-fars.nhtsa.dot.gov/Trends/TrendsAlcohol.aspx 2: https://www-fars.nhtsa.dot.gov/Trends/TrendsNonMotorist.aspx
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's, still it's not being included in their "peer-reviewed model." That comparison is intentionally comparing Waymo performance against the oldest vehicles on the road -- not the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
The UK driving theory test has a part called Hazard Perception [0]: not reacting to children milling around would be considered a fail.
[0] https://www.safedrivingforlife.info/free-practice-tests/haza...
> No person shall drive a vehicle upon a highway at a speed greater than is reasonable or prudent having due regard for weather, visibility, the traffic on, and the surface and width of, the highway, and in no event at a speed which endangers the safety of persons or property.
The speed limit isn't supposed to be a carte blanche to drive at that speed no matter what; the basic speed law is supposed to "win." In practice, enforcement is a lot more clear cut at the posted speed limit and officers don't want to write tickets that are hard to argue in court.
And at the same time, if you were traveling at some speed and no damage was caused, it's harder to say that persons or property were endangered.
Having an understanding for the density and make up of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment though large will do no damage and doesn’t justify a swerve.
Getting in the mind of a car in front of you, by seeing subtle hints of where the driver is looking down, and recognizing that they’re not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they’re not quite there yet.
Or in this case, perhaps hearing the sounds of children playing, recognizing that it’s 3:20 PM, and that school is out, other cars, double parked as you mentioned, all screaming instantly to a human driver to be extremely cautious and kids could be jumping out from anywhere.
IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
How many humans drivers would pass it, and what proportion of the time? Even the best drivers do not constantly maintain peak vigilance, because they are human.
> IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
In practice, this isn't reasonable, because "hey we're slightly better than a population that includes the drunks, the inattentive, and the infirm" is not going to win public trust. And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
I think "better than the average performance of a 75th or 90th percentile human driver" might be a good way to look at things.
It's going to be a weird thing, because odds are the distribution of accidents that do happen won't look much like human ones. It will have superhuman saves (like that scooter one), but it will also crash in situations that we can't really picture humans doing.
I'm reminded of airbags; even first generation airbags made things much safer overall, but they occasionally decapitated a short person or child in a 5MPH parking lot fender bender. This was hard for the public to stomach, and if it's your kid who is internally decapitated by the airbag in a small accident, I don't think you'll really accept "it's safer on average to have an airbag!"
Then you said, "this isn't reasonable", and the bar shouldn't be "slightly better" or "barely better". It should be at least better than the 75th percentile driver.
It sounds like you either misread the parent comment or you're phrasing your response as disagreement despite proposing roughly the same thing as the parent comment.
A 20% lower fatal crash rate compared to the average might be a significant improvement-- from a public health standpoint, this is huge if you could reduce traffic deaths by 20%.
But if you don't get the worst drivers to replace their driving with autonomous, that "20% less than average" might actually make things worse. That's my point. The bar has to be pretty dang high to be sure that you will actually make things better.
Sadly, you're right, but as rational people, we can acknowledge that it should. I care about reducing injuries and deaths, and the %tile of human performance needed for that is probably something like 30%ile. It's definitely well below 75%ile.
> > And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
It's only if you get the habitually drunk (a group that is overall impoverished), the very old, etc, to ride Waymo that you reap this benefit. And they're probably not early adopters.
You also solve for people texting (or otherwise using their phones) while driving, which is pretty common among young, tech-adopting people.
Yes, but the drivers who are 5th percentile drivers who cause a huge share of the most severe accidents are "special" in various ways. Most of them are probably not autonomy early adopters.
The guy who decided to drive on the wrong side of a double yellow on a windy mountain road and hit our family car in a probable suicide attempt was not going to replace that trip with Waymo.
Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.
But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
There are definitely times and situation where the right speed is 7MPH and even that feels "fast", though, too.
Only with instant reaction time and linear deceleration.
Neither of those is the case. It takes time for even a Waymo to recognize a dangerous situation and apply the brakes, and the deceleration of vehicles is not actually linear.
Reaction time makes the math even better here. You travel v1 * reaction_time no matter what, before entering the deceleration regime. So if v1 gets smaller, you get to spend a greater proportion of time in the deceleration regime.
> linear deceleration.
After reaction time, stopping distance is pretty close to v^2. There are weird effects at high speed (contribution from drag) and at very low speed, but they have pretty modest contributions.
Without that these vehicles could only start braking when certainty crossed some arbitrary threshold.
In any case, with zero reaction time and linear deceleration, stopping distance is proportional to velocity squared. With reaction time, stopping distance is that plus the velocity times the reaction time.
So the two cases we're comparing are 17 * r + (17^2 - 5^2) vs. 16 * r + (16^2), or 17 * r + 264 vs 16 * r + 256. As long as reaction time isn't negative, a vehicle that could only slow to 5MPH starting at 17MPH could slow all the way to 0MPH starting at 16MPH in the same distance.
(There are weird things that happen at <2.5MPH reducing deceleration to sublinear, but the car moves only a few inches at these speeds during a panic stop).
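To make that concrete, here's a minimal sketch of the same kinematics in Python. The deceleration (8 m/s^2, roughly 0.8g) and the reaction time (0.25 s) are illustrative assumptions, not Waymo figures; only the proportionality argument above comes from the thread.

```python
import math

MPH = 0.44704  # metres per second per mph

def impact_speed_mph(v0_mph, gap_m, decel=8.0, reaction_s=0.25):
    """Speed (mph) at contact for a car going v0_mph with gap_m metres
    of warning, assuming constant deceleration after a fixed reaction."""
    v0 = v0_mph * MPH
    braking_m = gap_m - v0 * reaction_s    # distance left once brakes bite
    if braking_m <= 0:
        return v0_mph                      # contact before braking starts
    v2 = v0 ** 2 - 2 * decel * braking_m   # v^2 = v0^2 - 2*a*d
    return math.sqrt(v2) / MPH if v2 > 0 else 0.0

# Back out the gap that turns a 17 mph approach into a 5 mph contact,
# then rerun the same gap at slightly lower starting speeds.
v0, v_hit = 17 * MPH, 5 * MPH
gap = v0 * 0.25 + (v0 ** 2 - v_hit ** 2) / (2 * 8.0)
for v in (17, 16, 14, 12):
    print(f"{v} mph start -> {impact_speed_mph(v, gap):.1f} mph at contact")
```

Under these assumptions, 16 mph and below stop clear of the pedestrian over the same gap, matching the 17^2 - 5^2 > 16^2 argument.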
Edit: Not 'allowed' but people do it constantly. Regular drivers, delivery drivers, city workers, construction trucks, etc. There may be laws but very little enforcement.
I would not race at 17 MPH through such an area. Of course, Waymo will find a way to describe themselves as the heroes of this situation.
These giant SUVs really are the worst when it comes to child safety
From a tactical PR standpoint, it would be a disaster. Muh big truuuucks is like a third rail because Americans are car obsessed as a culture. They already hit a kid, best to save some energy for the next battle.
Besides if Waymo wins (in general) private car ownership will decrease which is a win regardless. And maybe Waymo can slowly decrease the size of their fleet to ease up the pressure on this insane car size arms race.
That day I learned why it was so.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits and when there are kids about people may go even slower. At the same time, the general rule in CA for school zone is 25 mph. Clearly the car had some level of caution which is good.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
The UK is such a situation, and this vehicle would have failed a driving test there.
Sure, but also throw in whether that driver is staring at their phone, distracted by something else, etc. I have been a skeptic of all this stuff for a while, but riding in a Waymo in heavy fog changed my mind when questioning how well I or another driver would've done at that time of day and with those conditions.
This is the fault of the software and company implementing it.
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
It isn't me vs them. It is just me being self-aware. Clearly, you had a problem with what I said so I must have struck a nerve.
Welcome to the real world bro.
look at what I was replying to. if you still don't get it, then yeah I'm just proving my point and you can keep crying about it.
The fact that you go around asking dumb questions in bad faith to people is enough for me, last time I engage with you.
Have a good life!
What else do you expect them to do, only run in grade-separated areas where children can't access the roadway? Blare sirens so children get scared away from roads? Shouldn't human-driven cars do the same thing then?
So by that logic, if we cured cancer but the treatment came with terrible side effects it wouldn't be considered a "success"? Does everything have to perfect to be a success?
The less cynical set of goals would be safer than the mean human, then safer than the median human, then safer every year indefinitely.
The real killer here is the crazy American on-street parking, which limits visibility of both pedestrians and oncoming vehicles. Every school should be a no-street-parking zone. But parents are going to whine that they can't load and unload their kids close to the school.
Plenty of American cities regulate or even eliminated, in various measures, on-street parking.
Result? No more safe places to cross, and drivers not stopping for pedestrians where there is no crosswalk. They added parking zones right up to the old crosswalks that pedestrians still use (since those were the safest places), where vans are regularly parked and obscure one entire side of the road. Even outside of these shared zones, there are lots and lots of places where a parking space obscures the crosswalk, where huge vans park right on it, even though legally you have to be 5+ meters away. Never seen a cop give a ticket for that in my life.
One day someone will get killed right there, I'm sure of it, and it'll be mainly the city's fault.
I guess you could keep doing that until kids just walk to and from school?
It doesn't stop all on street parking beside the school, but it cuts it down a noticeable amount.
> What else to you expect them to do, only run on grade–separated areas where children can't access?
No, I expect them to slow down when children may be present.
The simple fact is that it hit a child and even though it wasn't a serious issue due to their safety policies, there's still room for improvement in these technologies.
And since it's a robot, and not a human, you can actually make changes and have them stick. For example, routing away from schools during certain hours.
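Purely as an illustration of that kind of change "sticking", here's a sketch of time-dependent routing penalties around a school zone. Everything in it (the graph, the edge weights, the drop-off window, the penalty factor) is invented for the example; it is not Waymo's planner.

```python
import datetime
import networkx as nx

def route(g, src, dst, school_edges, now, penalty=5.0):
    """Shortest path where school-zone edges cost extra during drop-off."""
    drop_off = datetime.time(7, 30) <= now.time() <= datetime.time(8, 30)

    def weight(u, v, data):
        base = data["minutes"]
        return base * penalty if drop_off and (u, v) in school_edges else base

    return nx.shortest_path(g, src, dst, weight=weight)

g = nx.DiGraph()
g.add_edge("A", "School", minutes=2)
g.add_edge("School", "B", minutes=2)
g.add_edge("A", "Bypass", minutes=3)
g.add_edge("Bypass", "B", minutes=3)

school = {("A", "School"), ("School", "B")}
print(route(g, "A", "B", school, datetime.datetime(2026, 1, 5, 8, 0)))   # bypass
print(route(g, "A", "B", school, datetime.datetime(2026, 1, 5, 14, 0)))  # direct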
https://www.theverge.com/2022/1/28/22906513/waymo-lawsuit-ca...
Indeed. Waymo is a much more thoughtful and responsible company than Cruise, Uber, or Tesla.
"Cruise admits to criminal cover-up of pedestrian dragging in SF, will pay $500K penalty" https://www.sfgate.com/tech/article/cruise-fine-criminal-cov...
The Waymo blog post refused to say the word "child", instead using the phrase "young pedestrian" once.
The Waymo blog post switches to "the pedestrian" and "the individual" for the rest of the post.
The Waymo blog post also consistently uses the word "contact" instead of hit, struck, or collision.
The Waymo blog post makes no mention of the injuries the child sustained.
The Waymo blog post makes no mention of the school being in close proximity.
The Waymo blog post makes no mention of other children or the crossing guard.
The Waymo blog post makes no mention of the car going over the school zone speed limit (17 in 15).
It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.
German perspective, but if I told people I've been going 30km/h next to a school with poor visibility as children are dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.
There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.
I suspect many human drivers would've driven slower, law or no law.
I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.
However, I am just one example of human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.
A pretty cheap instant human sensor might be Bluetooth/BLE noting phones/devices in near range. Pop a sensor in each wing mirror and on the top and bottom. The thing would need some processing power but probably nothing that the built in Android dash screen couldn't handle.
There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.
I gather that Tesla insists on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.
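For what it's worth, the BLE idea above could be prototyped cheaply. A minimal sketch using the Python `bleak` library; the RSSI cutoff is a guess, and of course many pedestrians, especially small children, carry no radio at all, so at best this is a weak extra signal.

```python
import asyncio
from bleak import BleakScanner

async def nearby_device_count(rssi_floor=-70, window_s=3.0):
    """Count BLE advertisers with a strong signal as a crude proxy for
    'someone carrying a device may be close by'."""
    found = await BleakScanner.discover(timeout=window_s, return_adv=True)
    # RSSI is a poor distance proxy, but stronger generally means nearer.
    return sum(1 for _, adv in found.values() if adv.rssi >= rssi_floor)

if __name__ == "__main__":
    print("possible nearby devices:", asyncio.run(nearby_device_count()))
```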
Isn't the speed limit normally 15 mph or less in a school zone? Was the robotaxi speeding?
I'll just remind anyone reading: they're under no obligation to tell the unvarnished truth on their blog.
Even if the NHTSA eventually points out significant failures, getting this report out now has painted a picture of Waymo only having an accident a human would have handled worse.
It would be wise to wait and see if the NHTSA agree. Would a driver have driven at 17mph in this sort of traffic or would they have viewed it as a situation where hidden infant pedestrians are likely to step out?
Love him or hate him, releasing the video is something I can see Elon doing, because assuming a human driver would have done worse, it speaks for itself. Release a web video game where the child sometimes jumps out in front of the car, and see how fast humans respond, like the "land Starship" game. Assuming humans would do worse, that is. If the child was clearly visible through the car or somehow else avoidable by humans, then I'd be hiding the video too.
Human reaction times are terrible, and lots of kids get seriously injured, or killed, when they run out from between cars.
I was very very lucky.
Those kinds of neighborhoods, where the outer houses face the fast, large roads, are I think less common now, but lots of them are left over from 50+ years ago.
That incident still gives me the willies.
It's monumentally stupid to be in the middle of a narrow road around a blind corner. People speed around blind corners all the time.
Consider a crosswalk. The law says traffic must stop for anyone setting foot in the crosswalk. But it's crazy to step into a crosswalk assuming drivers will stop for you. I always wait until they actually stop before I step into it.
I remember a public service commercial from the 60s advocating the idea that one shouldn't be right, dead right, and to use common sense.
I talked to a cyclist once who told me about his legal rights to ride a bike in traffic. I asked him but what if a car hits you? He smugly replied that then he'd win a massive lawsuit. I then asked him what good would that do him if he was paralyzed? He then looked startled.
Placing yourself somewhere pedestrians are not expected (a non-residential road), mostly hidden from oncoming traffic, for an extended period is putting yourself at undue risk.
I walk that road many times. I hug the side on the outside of the turn (there's no room on the inside) and so I can see (and be seen) from further away. I listen for cars coming. I watch for them. I am prepared to jump over the railing.
It's just common sense.
BTW, do you know that if you rear-end someone, it's your fault? I once was in heavy traffic, and the traffic in front of me stopped abruptly. I hit the brakes hard. I also glanced in the rearview mirror and realized the truck behind me was not going to stop in time. So I quickly pulled onto the shoulder. The truck hit the car in front of me.
But fortunately, that gave the truck a precious few more feet of stopping distance, and the collision with the car in front was minor.
OMG
Stopped or moved? Is it allowed in CA to move the car at all after a serious accident happens?
I am personally a fan of entirely automated but slow traffic. 10mph limit with zero traffic is fast enough for any metro area.
Importantly, Waymo takes full ownership for something they write positively: Our technology immediately detected the individual.... But Waymo weasels out of taking responsibility for something they write about negatively.
the "Waymo Driver" is how they refer to the self-driving platform (hardware and software). They've been pretty consistent with that branding, so it's not surprising that they used it here.
> Importantly, Waymo takes full ownership for something they write positively [...] But Waymo weasels out of taking responsibility for something they write about negatively
Pretty standard for corporate Public Relations writing, unfortunately.
One better than: We investigated our own system and found ourselves to be at no fault?
> From the Waymo blog
Yeah, like, no shit Sherlock. We'd better wait for some videos before making our opinions.
The vast vast vast majority of human drivers would not have been able to accomplish that braking procedure that quickly, and then would not have been able to manage the follow up so quickly.
I have watched other parent drivers in the car pick-up line at public schools for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it's absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.
> The vast vast vast majority of human drivers ... would not have been able to manage the follow up so quickly
You are saying the "vast vast vast majority of human drivers" wouldn't pull over after hitting a child?
I remember similar blind faith in and unlimited advocacy for anything Tesla and Musk said, and look how that has turned out. These are serious issues for the people in our communities, not a sporting event with sides.
If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
[1] Yes, I'm an AV safety expert
[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
(edit: verbiage)
I wonder whether Waymo’s model notices that small children are present or likely to be present and that it should leave extra margin for error.
(My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Now you're asking interesting questions... Technically, in CA, the speed limit in school zones is 25 mph (which local authorities can change to 15 mph, as needed). In this case, that would be something the investigation would check, of course. But regardless of that, 17 mph per se is not a very fast speed (my gut check: turning around intersections at > 10-11 mph feels fast, but going straight at 15-20 mph doesn't feel fast; YMMV). But more generally, in the presence of child VRUs (vulnerable road users), it is prudent to drive slowly just because of the randomness factor (children being the most unaware of critters). Did the Waymo see the kids around in the area? If so, how many and where? And how/where were they running/moving to? All of that is investigation data...
My 2c is that Waymo already took all of that into account and concluded that 17 mph was indeed a good speed to move at...
...which leads to your observation below:
> (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Yes, I have indeed made that same observation. The Waymos of 2 years ago were very cautious; now they seem much more assertive, even a bit aggressive (though that would be tough to define). That is a driving policy decision (cautious vs assertive vs aggressive).
One could argue if indeed 17 mph was the "right" decision. My gut feel is Waymo will argue that (but likely they might make the driving policy more cautious esp in presence of VRUs, and child VRUs particularly)
Legally a speed limit is a 'limit' on speed, not a suggested or safe speed. So it's never valid to argue legally that you were driving under the limit, the standard is that you slow down or give more room for places like a school drop-off while kids are being dropped off or picked up.
Well, people are doing a lot of what-about-ism in this situation. Some of that is warranted, but I'd posit that analyzing one "part" of this scenario in isolation is not helpful, nor is this the way Waymo will go about analyzing this scenario with their tech teams.
Let's consider, for argument's sake, if the Waymo bot had indeed slammed on the brakes with max decel, and had come to a complete (and sudden) stop barely 5cm in front of the kid. Would THAT be considered a safe response??
If I'm a regulator, I'd still ding the bot with an "unsafe response" ticket and send that report to Waymo. If YOU were that pedestrian, you'd feel unsafe too. (I have definitely seen such responses in my AV testing experience.) One could argue, again, that that would have been legally not-at-fault, but socially it would be completely unacceptable (as one would rightly guess).
And so it is.
The full behavior sequence is in question: When did Waymo see the kid(s), where+ how were they moving, how did it predict (or fail to) where they will move in the next 2s, etc. etc. The entire sequence -- from perception to motion prediction to planning to control -- will be evaluated to understand where the failure for a proper response may have occurred.
As I mentioned earlier, the proper response is, under ideal conditions, one that would have caused the vehicle to stop at a safe distance from the VRU (0.5m-1m, ideally). Failing which, to reduce the kinetic energy to a minimum possible ("min expected response")... which may still imply a "contact" (=collision) but at reduced momentum, to minimize the chance of damage.
I suspect (though I dont know for sure) that Waymo executed the minimum expected response, and that likely was due to the driving policy.
We won't know until we see the full sequence from inside the Waymo. Everything else is speculation.
[Disclaimer: I dont work for Waymo; no affiliation, etc etc]
That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads. (Yes, those are my qualifiers...)
Here's their main stash [1] but notably, three papers talk about comparison of Waymo's rider-only (i.e. no safety driver) performance vis-a-vis human driver, at 7.1 million miles [2], 25 million miles [3], 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).
I haven't read thru all their papers, but some notable ones talk about the difficulty of comparing AV vs human drivers [5], and various research on characterising uncertainty / risk of collision, comparing AVs to non-impaired, eyes-on human driver [6]
As one may expect, at least one of the challenges is that human-driven collisions are almost always very _lagging indicators_ of safety (i.e. collision happened: lost property, lost limbs, lost lives, etc.)
So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better than human driving behavior, but they are showing that their AVs are better-than-humans on certain high-risk (potential) collisions.
As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...
---
[1] https://waymo.com/safety/research
[2] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[3] https://waymo.com/research/do-autonomous-vehicles-outperform...
[4] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[5] https://waymo.com/research/comparative-safety-performance-of...
[6] https://waymo.com/blog/2022/09/benchmarking-av-safety/
[edit: reference]
Why? This is only true if they weren't supposed to be on the road in the first place. Which is not true.
If I program a machine and it goes out into the world and hurts someone who did not voluntarily release my liability, that's on me.
Indeed, it is, and that is exactly why Waymo will have to accept some responsibility. I can bet that internally Waymo's PR and Legal teams are working overtime to coordinate the details with NHTSA. We, the general public, may or may not know the details at all, if ever. However, Waymo's technical teams (Safety, etc) will also be working overtime to figure out what they could have done better.
As I mentioned, this is a standard test, and Waymo likely has 1000s of variations of this test in their simulation platforms; they will sweep across all possible parameters to make this test tighter, including the MER (minimum expected response from the AV) and perhaps raise the bar on MER (e.g. brake at max deceleration in some cases, trading off comfort metrics in those cases; etc.) and calculate the effects on local traffic (e.g. "did we endanger the rear vehicles by braking too hard? If so, by how much??" etc). All these are expected actions which the general public will never know (except, perhaps via some technical papers).
Regardless, the PR effects of this collision do not look good, especially as Waymo is expanding their service to other cities (Miami just announced; London by EOY2026). This PR coverage has potential to do more damage to the company than the actual physical damage to the poor traumatized kid and his family. THAT is the responsibility only the company will pay for.
To be sure, my intuition tells me this is not the last such collision. Expect to see some more, by other companies, as they commercialize their own services. It's a matter of statistics.
I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine 100% liable. It's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.
> I would say that Waymo's response, per their blog post [2] has been textbook compliance.
Remember Tesla's blog posts? Of course Waymo knows textbook compliance just like you do, and of course that's what they would claim.
Most likely, yes, the NHTSA investigation will be a credible source of info for this case. HOWEVER, Waymo will likely fight tooth-and-nail to keep it from being made public. They will likely cite "proprietary algorithms / design", etc. to protect it from being released publicly. So, net-net, I dunno... Will have to wait and see :shrug.gif:
But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.
> Remember Tesla's blog posts?
You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made public their investigation data, which would be a welcome and pioneering move).
On the other hand, the other company you mentioned, the less said the better.
As I suspected, legal scholars are already calling for "voluntary disclosure" from Waymo re: its annotated videos of the collision [2]. FWIW, my skepticism about Waymo actually releasing it remains...
[1] https://static.nhtsa.gov/odi/inv/2026/INOA-PE26001-10005.pdf
[2] https://www.linkedin.com/posts/matthew-wansley-62b5b9126_a-w...
Cringe. Stop it. Simping for Google stopped being cool nearly two decades ago.
“The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under six mph before contact was made,” a statement from Waymo explains.
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile, in my area of the world, parents are busy, stressed, on their phones, and pressing the accelerator hard because they're time-pressured and feel like that will make up for the 5 minutes late they are on a 15-minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.
A story, my grandpa drove for longer than he should have. Yes him losing his license would have been the optimal case. But, pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
However, the child pedestrian injury rate is only an official estimate (it is possible it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
Until then, it is only prudent to defer snap judgements, but increase caution, insist on rigor and transparency, and demand more accurate information.
No we should not. We should accept that we don't have any statistically meaningful number at all, since we only have a single incident.
Let's assume we roll a standard die once and it shows a six. Statistically, we only expect a six in one sixth of the cases. But we already got one on a single roll! Concluding Waymo vehicles hit 2 to 4 times as many children as human drivers is like concluding the die in the example is six times as likely to show a six as a fair die.
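The die analogy can be made quantitative. Here's a hedged sketch of the exact Poisson 95% confidence interval for a rate estimated from a single event; the mileage figure is the thread's rough midpoint, not an official number.

```python
from scipy.stats import chi2

k = 1                 # one observed child-pedestrian contact
miles = 150e6         # midpoint of the thread's ~100-200M mile estimates

# Exact (Garwood) 95% CI for a Poisson count, scaled to a per-mile rate.
rate_lo = chi2.ppf(0.025, 2 * k) / 2 / miles
rate_hi = chi2.ppf(0.975, 2 * (k + 1)) / 2 / miles

print(f"95% CI: one event per {1/rate_hi:,.0f} to {1/rate_lo:,.0f} miles")
# Roughly one per 27M miles up to one per 5.9B miles. The human baseline
# quoted elsewhere in the thread (~1 per 470M miles) sits comfortably
# inside that interval, so a single event can't distinguish the rates.
```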
And for not totally irrational reasons, like: the machine follows its programming and does not fear death; or, with 100% certainty, the machine has bugs which will eventually end up killing someone for a really stupid reason, and nobody wants that to be them. Then there's just the general https://xkcd.com/2030/ problem of people rightfully not trusting technology because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
Are they? It is now clear that Tesla FSD is much worse than a human driver and yet there has been basically no attempt by anyone in government to stop them.
No one in the _US_ government. Note that European governments and China haven't approved it in the first place.
Surprised at how many comments here seem eager to praise Waymo based off their PR statement. Sure, it sounds great if you read that the Waymo slowed down faster than a human. But would a human truly have hit the child here? Two blocks from a school with tons of kids, crossing guards, double-parked cars, etc.? The same Waymo that is under investigation for passing school buses illegally? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.
Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.
There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?
It can feel principled to take the critical stance, but ultimately the authorities are going to have complete video of the event, and penalizing Waymo over this out of proportion to the harm done is just going to make the streets less safe. A 6mph crash is best avoided, but it's a scrap, it's one child running into another and knocking them over, it's not _face jail time_.
> Who at Waymo will face similar consequences or risk for the same outcome?
I'd argue that the general pushback against self-driving cars and immense PR and regulatory attention makes the consequences of accidents much more severe for the company than for a driver. (For comparison: How many kids do you think were hit by human drivers in the past month in the same general area, and how many of them made international news?)
I highly doubt a non-distracted driver going at/below the speed limit hitting a child that darted into the road would be at any realistic risk of facing jail time, especially in the US.
Really? My impression is that, for the most part, HN consistently sides with the companies. I say this in the most neutral way possible.
Look at Waymo's history in the space, meet some of the people working there, then make a decision.
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and ran right out into the street.
If those rumors are correct, I'll say the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.
This matches exactly what they said.
That kid is lucky it was a Waymo & not a human driven car.
It is never a 6 year old's fault if they get struck by a robot.
If you want to see an end to this nonsensical behavior by parents, pressure your local city into having strict traffic enforcement and ticketing during school hours at every local school, so that the parent networks can’t share news with each other of which school is being ‘harassed’ today. Give license points to vehicles that drop a child across the street, issue parking tickets to double parkers, and boot vehicles whose drivers refuse to move when asked. Demand they do this for the children, to protect them from the robots, if you like.
But.
It’ll protect them much more from the humans than from the robots, and after a few thousand tickets are issued to parents behaving badly, you’ll find that the true threat to children’s safety on school roads is children’s parents — just as the schools have known for decades. And that’s not a war you’ll win arguing against robots. (It’s a war you’ll win arguing against child-killing urban roadway design, though!)
There are always systemic factors that can be improved, for example working on street design to separate dangerous cars from children, or transportation policy by shifting transportation to buses, bikes, and walking where the consequences of mistakes are significantly reduced.
Cars are the #2 killer of children in the US, and it’s largely because of attitudes like this that ignore the extreme harm that is caused by preventable “accidents”
And it is not the child’s or their parents’ fault either:
Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)
This is why low speed limits around schools exist.
So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.
Not sure where this is coming from, and it's directly contradicted by the article:
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.” The company did not release a specific analysis of this crash.
The comment I originally replied to makes the claim a human's brain wouldn't have even responded fast enough to register the child was there. That's going WAY further than how Waymo is claiming a human would have responded.
I don't see how that's a more reasonable assumption than a human driver actually being "fully attentive", and I'm not sure Waymo's definition of that term is the same as what you're using.
I get what you are trying to say and I definitely agree in spirit, but I tell my kid (now 9) "it doesn't matter if it isn't your fault, you'll still get hurt or be dead." I spent a lot of time teaching him how to cross the street safely before I let him do it on his own, not to trust cars to do the right thing, not to trust them to see you, not to trust some idiot to not park right next to cross walk in a huge van that cars have no chance of seeing over.
If only we had a Dutch culture of pedestrian and road safety here.
https://www.yahoo.com/news/articles/child-struck-waymo-near-...
+/- 2 mph is acceptable speedometer and other error. (15 mph doesn’t mean never exceed, under any legal interpretation I know.)
It’s reasonable to say Waymo would reduce speed in a 12 mph zone versus a 15 in a way most humans would not.
We're not though. Drivers are allowed to kill as many people as they like as long as they're apologetic and weren't drinking; at most they pay a small fine.
Also, where I live that's manslaughter, a serious felony that can put you in jail.
Oh also, that video says "kid ran out from a double parked suv". Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
The 15 mph speed limit starts on the block the school is on. The article says the Waymo was within two blocks of the school, so it's possible they were in a 25 mph zone.
Can you imagine being dumb enough to think that exceeding a one size fits all number on a sign by <10% is the main failing here?
As if 2mph would have fundamentally changed this. Pfft.
A double-parked car, in an area chock-full of street parking (hence the double park), near "something" that's a magnet for pedestrians, with probably a bunch of pedestrians around, should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign to tell you that this is a particular zone that warrants a particular magic number.
The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.
According to https://news.ycombinator.com/item?id=46812226 1mph slower might have entirely avoided contact in this particular case.
The fact that it’s hard to turn this into a formula is exactly why robot drivers are bad.
In a school zone, when in a situation of low visibility, the car should likely be going significantly below the speed limit.
So, it's not a case of 17mph vs 15mph, but more like 17mph vs 10mph or 5mph.
Please pass this message on to the 99.999% of human drivers who think the speed limit is the minimum speed.
The car clearly failed to identify that this was a situation it needed to be going slower. The fact that it was going 17 instead of 15 is basically irrelevant here except as fodder for moral posturing. If the car is incapable of identifying those situations no amount of "muh magic number on sign" is going to fix it. You'll just have the same exact accident again in a 20 school zone.
If the car is going slower than the speed limit in this scenario, it is difficult to tell over the internet if that speed was appropriate. If the car is going over the speed limit, it is obviously inappropriate.
Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:
* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,
* safe, separated lanes for biking/walking when that's an option.
Most commenters here are ignoring the structural incentives. The long-term threat of Waymo isn't safety, it's the enclosure of public infrastructure. These companies are building a permission structure to lobby personal vehicles and public transit off the road.
Transportation demand is inelastic. If we allow a transition where mobility is captured by private platforms, the consumer loses all leverage. The endgame is the American healthcare model: capture the market, kill alternatives, and extract max rent because the user has no choice. We need dense urban cores and mass transit, not a dependency on rent-seeking oligopolies.
The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.
They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.
This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.
An uncorrelated approach improves sensitivity at the cost of specificity. Early sensor fusion might improve both (maybe at the cost of somewhat lower sensitivity).
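A toy numerical illustration of that tradeoff, with invented per-sensor numbers (not real camera/lidar figures): OR-fusing two independent detectors raises the detection probability and the false-alarm probability together.

```python
# Two statistically independent detectors, OR-fused: fire if either fires.
p_cam, p_lidar = 0.90, 0.85   # per-sensor detection probabilities (made up)
f_cam, f_lidar = 0.02, 0.03   # per-sensor false-alarm probabilities (made up)

p_or = 1 - (1 - p_cam) * (1 - p_lidar)   # P(at least one detects)
f_or = 1 - (1 - f_cam) * (1 - f_lidar)   # P(at least one false-alarms)

print(f"OR fusion: detect {p_or:.3f}, false alarm {f_or:.4f}")
# Detection rises to 0.985 but false alarms rise to 0.0494:
# higher sensitivity, lower specificity, as the comment says.
```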
About 5x more kinetic energy.
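A quick check on that "5x" figure. Assuming it compares the modeled fully-attentive-human contact speed (~14 mph) with the Waymo's (<6 mph), which the comment doesn't spell out, kinetic energy scales with the square of speed:

```python
v_human, v_waymo = 14.0, 6.0  # mph; mass cancels in the ratio
print(f"energy ratio: {(v_human / v_waymo) ** 2:.1f}x")  # ~5.4x
```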
So if we're going to have cars drive irresponsibly fast near schools, it's better that they be piloted by robots.
But there may be a better solution...
In my experience in California, always and yes.
But that depends on reliability, especially in unforeseen (and untrained-upon) circumstances. We'll have to see how they do, but they have been doing better than expected.
Jumping out of a plane wearing a parachute vs jumping off a building without one.
But acceleration is hard to calculate without knowing time or distance (assuming it's even linear), and you don't get that exponent over velocity yielding a big number that's great for heartstring-grabbing and appealing to emotion, which is why nobody ever uses it.
US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. With this incident, that means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.
However, the child pedestrian injury rate is only an official estimate (possible undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way), but absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:
A. Their systems are not actually robustly 10x better than human drivers. Waymo's claims are incorrect or non-comparable.
B. There are child-specific risk factors that humans account for that Waymo does not that cause a 20-40x differential risk around children relative to normal Waymo driving.
C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
[3] https://www.therobotreport.com/waymo-reaches-100m-fully-auto...
[4] https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...
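The arithmetic above, restated as a sketch (all figures are the thread's rough estimates, not verified):

```python
human_miles_per_year = 3.3e12     # US vehicle miles travelled, approx. [1]
child_injuries_per_year = 7000    # estimated child pedestrian injuries [2]
human_rate = child_injuries_per_year / human_miles_per_year  # ~1 per 470M mi

for waymo_miles in (100e6, 200e6):
    waymo_rate = 1 / waymo_miles  # one observed incident over those miles
    print(f"{waymo_miles:.0e} miles: {waymo_rate / human_rate:.1f}x human rate")
# Prints roughly 4.7x and 2.4x, i.e. the "~2-4x" range quoted above
# (subject to the single-event uncertainty discussed elsewhere).
```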
1. The NHTSA data is based on police-reported crash data, which reports far fewer injuries than the CDC reports based on ED visits. The child in this case appeared mostly unharmed and situations like this would likely not be counted in the NHTSA data.
2. Waymo taxis operate primarily in densely populated urban environments while human driver milage includes highways and rural roads where you're much less likely to collide with pedestrians per mile driven.
Waymo's 90% crash reduction claim is at least an apples-to-apples comparison.
If this incident had happened with a human driven vehicle would it even have been reported?
I don't know exactly what a 6mph collision looks like, but I think it's likely the child had nothing more than some bruises, and if a human had done it they would have just said sorry, made sure they were ok, and left.
I'm not aware of any statistics for how often children come into contact with human-driven cars.
              | Miles         | Injuries to | Undesired Ped. | Feline
              | Driven        | Children    | Contacts       | Fatalities
--------------+---------------+-------------+----------------+------------
 U.S. Drivers | ~3e12         | ~7000       | ?              | ?
 Waymo        | 100e6 - 200e6 | 0*          | 1              | 1

* For all we can tell, this incident doesn't rise to the level of injury that results in a reporting event captured in the 7,000 number.
See past examples:
https://youtube.com/watch?v=hubWIuuz-e4 — first save is a child emerging from a parked car. Notice how Waymo slows down preemptively before the child starts moving.
https://www.reddit.com/r/waymo/s/ivQPuExwNW — detects foot movement from under the bus.
https://www.reddit.com/r/waymo/s/LURJ8isQJ6 — stops for dogs and children running onto the street at night.
That’s probably how they do it, which is again very clever stuff, chapeau. But they do it like that because they can’t really predict the world around them fast enough. It might be possible in the future with AI world models, though.
Even if you detect “fast enough”, there are physical limits for braking and coming to a stop.
Had any other car been there, probably including a Tesla, the poor kid would have been hit with 4-10x more force.
Cheap shots. If this was Tesla there would be live media coverage across every news outlet around the world and congressmen racing to start investigation.
Look at any thread where Tesla is mentioned and how many Waymo simps are mansplaining lidar.
The issue is that I don’t trust a private company’s word. You can’t even trust the president of the USA nowadays… release the video footage or get lost.
Waymo hits a kid? Ban the tech immediately, obviously it needs more work.
Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
> Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
These can be true at the same time. Waymo is held to a significantly higher standard than human drivers.
They have to be, as a machine can not be held accountable for a decision.
Failure to acknowledge the existence of tradeoffs tends to lead to people making really lousy trades, in the same way that running around with your eyes closed tends to result in running into walls and tripping over unseen furniture.
We can't blindly trust Waymo's PR releases or apples-to-oranges comparisons. That's why the bar is higher.
In fact, if you substitute "company providing self-driving solution (integrated software + hardware)" for "company renting out commercial drivers" (or machine operators), then self-driving cars already fit well into existing legal framework. The way I see it, the only change self-driving cars introduce here is that there is no individual operator we could blame for the accident, no specific human we could heavily fine or jail, and then feel good about ourselves because we've issued retributive justice and everything is whole now. Everything else has already long been worked out.
This logic applies equally to all cars, which are machines. Waymo has its decision makers one more step removed than human drivers. But it’s not a good axiom to base any theory of liability on.
Autonomous driving systems disrupt this by directly assuming the driving function, forcing liability upstream (where it's significantly more difficult to navigate).
So now it's driver vs injured party. Self-driving makes it trillion dollar company vs injured party. Night and day difference.
It’s historically been my insurance company versus your insurance company or an uninsured driver. (Or I’m Apple Paying you $5k so my fender bender doesn’t show up on insurance.)
Now it’s my insurance company versus the insurance company of a client who can pay damages. Cases where drivers litigate individually are relatively rare, and that avenue is preserved against e.g. Waymo.
Not to mention that "driver error" becomes a much murkier argument when the driver is a black box.
The argument questioning "would a human be driving 17mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones, and human drivers routinely drive above 17mph (in some cases, over the typical 20mph or 25mph legal limit). In deconstructing these incidents, critics seem to imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.
In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.
(That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)
Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.
For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.
Although note that, in one Washington State study [1], ~70-80% of drivers drove 6 or more mph over the speed limit in a school zone, while the claim in some of these comments is that a human driver would drive at less than 17 mph out of an abundance of caution.
[1] https://wtsc.wa.gov/wp-content/uploads/dlm_uploads/2023/12/1...
I can't tell if you are using sarcasm here or are serious. I guess it depends on your definition of normal person (obviously not average, but an idealized driver maybe?).
As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?
The object could be a paper bag flying in the wind, or leaves falling from a tree.
Autonomous vehicles won't be perfect. They'll surely make different mistakes from the ones humans currently make. People will die who wouldn't have died at the hands of human drivers. But the overall number of mistakes will be smaller.
Suppose you could wave your magic wand and have a The Purge-style situation where AVs had a perfect safety record 364 days of the year, but for some reason had a tricky bug that caused them to run over tiny Spidermen and princesses on Halloween. The number of fatalities in the US would drop from 40,000 annually to 40. Would you wave that wand?
This strawman is bordering on off-topic fiction. Only the 40k yearly deaths figure is based on reality. https://en.wikipedia.org/wiki/Motor_vehicle_fatality_rate_in...
I suspect the cars are trying to avoid running into anything, as that's generally considered bad.
> The Waymo Driver braked hard...
By Waymo Driver, they don't mean a human, do they?
> In October 2025, a Waymo autonomous robotaxi struck and killed KitKat, a well-known bodega cat at Randa's Market in San Francisco's Mission District, sparking debates over self-driving car safety
It's a child now. All I want to ask is: what needs to happen before they stop killing pets and people?
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.
Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
I can see that, prioritize obstacle predictability over transit time. A school zone at certain times of day is very unpredictable with respect to obstacles but a more car congested area would be easier to navigate but slower. Same goes for residential areas during Halloween.
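As a sketch of what that prioritization could look like in a route planner (the function, the penalty factor, and the time windows here are all hypothetical, not anything Waymo has disclosed):

    from datetime import time

    # Inflate the cost of school-zone segments during drop-off/pickup
    # windows so a shortest-path search prefers modest detours.
    def edge_cost(travel_seconds, is_school_zone, now,
                  penalty=3.0,
                  windows=((time(7, 30), time(8, 30)),
                           (time(14, 30), time(15, 30)))):
        if is_school_zone and any(a <= now <= b for a, b in windows):
            return travel_seconds * penalty
        return travel_seconds

    print(edge_cost(60, True, time(8, 0)))   # 180.0 -> detour if it costs < 3x
    print(edge_cost(60, True, time(11, 0)))  # 60 -> no penalty mid-morning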
Great job, Waymo, for maybe hitting a little kid less hard than your study assumes a human would have! Is that study legit? Who cares, we trust you!
If this had been Tesla, HN would have crashed from all the dunking.
"Our car hits better" is a win, I guess?
Glad the child is okay.
If Waymo has fewer accidents where a pedestrian is hit than humans do, Waymo is safer. Period.
A lot of people are conjecturing how safe a human is in certain complicated scenarios (pedestrian emerging from behind a bus, driver holds cup of coffee, the sun is in their eyes, blah blah blah). These scenarios are distractions from the actual facts.
Is Waymo statistically safer? (spoiler: yes)
Imagine that there are only 10 Waymo journeys per year, and every year one of them hits a child near an elementary school, while there are 1000000 non-Waymo journeys per year, and every year two of them hit children near elementary schools. In this scenario Waymo has half as many accidents but is clearly much more dangerous.
Here in the real world, obviously the figures aren't anywhere near so extreme, but it's still the case that the great majority of cars on the road are not Waymos, so after counting how many human drivers have had similar accidents you need to scale that figure in proportion to the ratio of human to Waymo car-miles.
(Also, you need to consider the severity of the accidents. That comparison probably favours Waymo; at any rate, they're arguing that it does in this case, that a human driver in the same situation would have hit the child at a much higher and hence more damaging speed.)
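To make the normalization concrete, here's the arithmetic on the toy numbers above:

    # Exposure-adjusted comparison using the hypothetical counts
    # from the comment above (journeys as the exposure measure).
    fleets = {"Waymo": (1, 10), "Humans": (2, 1_000_000)}

    for name, (incidents, journeys) in fleets.items():
        print(f"{name}: {incidents / journeys:.6f} incidents per journey")
    # Waymo:  0.100000  (50,000x the human per-journey rate)
    # Humans: 0.000002

Real comparisons do the same division with car-miles as the exposure measure.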
Spoiler: we definitely don't know yet whether Waymo is statistically safer
But if a human driver has FSD on, are they liable when FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail while I'd be at fault. In other words, I trust my own driving more than FSD (I could be right or wrong, but I think most people will feel the same way).
> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries
Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized. Still, it is worth examining the situation (when we have sufficient information) before accepting Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate given the context of a busy school dropoff: many human drivers are extra cautious in that context and may never have reached that speed, and depending on the end-to-end route, some would have avoided the street with the school altogether at that time of day. It certainly seems like a good result for the premise of a child unexpectedly appearing from between large parked vehicles, but maybe that premise should have been the expectation.
[1] https://www.iihs.org/news/detail/vehicle-height-compounds-da...
A child is probably more likely than an adult to die in a collision at the same speed.
But really, did you seriously read my post as meaning people literally can't go slower than 20 so just plough into whatever is in the way? I'm obviously talking about an open road situation. Hardly any human drivers go under 20 by choice.
John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.
John probably (at least where I live) does not have insurance. Maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care either: they aren't going to throw him in jail or even take away his license (if he has one), and they aren't going to impound his car.
Honestly, I'd much rather be hit by a Waymo than John.
If you are hit by an underinsured driver, the government steps in and additional underinsured motorist protection (e.g. hit by an out of province/country motorist) is available to all and not expensive.
Jail time for an at-fault driver here is very uncommon but can be applied if serious injury or death results from a driver's conduct. This is quite conceivable with humans or AI, IMO. Who will face jail time as a human driver would in the same scenario?
Hit and run, leaving the scene, is also a criminal offence with potential jail time that a human motorist faces. You would hope this is unlikely with AI, but if it happens a small percentage of the time, who at Waymo faces jail as a human driver would?
I'm talking about edge cases here, not the usual fender bender. But this thread was about policy/regs and that needs to consider crazy edge cases before there are tens of millions of AI drivers on the road.
Waymo has deep pockets, so everyone is going to try and sue them, even if they don't have a legitimate grievance. Where I live, the city/state would totally milk each incident from a BigCo for all it was worth. "Hit and run" by a drunk waymo? The state is just salivating thinking about the possibility.
I don't agree with you that BigCorp doesn't have any skin in the game. They are basically playing the game in a bikini.
You do know that insurance being mandatory doesn't stop people from driving without insurance, right?
> If you are hit by an underinsured driver, the government steps in and additional underinsured motorist protection (e.g. hit by an out of province/country motorist) is available to all and not expensive.
Jolly good for you.
If I don't carry underinsured coverage, and someone totals my car or injures me with theirs, I'm basically fucked.
Ah great, so there's a lower chance of that specific John Smith hitting me again in the future!
The general deterrence effect we observe in society is that punishment of one person has an effect on others who observe it, making them more cautious and less likely to offend.
Most humans would be halfway into the other lane after seeing kids near the street.
Apologists see something different than I do.
Perception.
Better reporting would have asked real people for the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it can be found under https://www.nhtsa.gov/search-safety-issues
"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."
We're impatient, emotional creatures. Sometimes when I'm on a bike the bike lane merges onto the road for a stretch, leaving no choice but to take up a lane. I've had people accelerate behind me and screech their tyres, stopping just short of my back wheel in a threatening manner, then do it repeatedly as I rode the short distance in the lane before the bike lane re-opened.
To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like I describe above. It also disregards that every time I see an automated taxi it seems to drive on the cautious side already.
Give me the unemotional, infinite patience, drives very much on the cautious side automatic taxi over humans any day.
> “Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo wrote in the post.
Other accident reports I've seen (NTSB, etc) often seem to take a similar approach - is it a bad thing?
Or what kind of language wouldn't make you 'want not to use their service'?
Big vehicles that demand respect and aren't expected to turn on a dime, known stops.
A: It thought it saw a child on the other side.
It was formerly known as the Google Self Driving Car Project
Any accident is bad. But accidents involving children are especially bad.
But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.
Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)
You also have to drive much more slowly in a school zone than you do on other routes, so depending on the detour, it may not even be that much longer of a drive.
At worst, maybe Waymo eats the cost difference involved in choosing a more expensive route. This certainly hits the bottom line, but there’s certainly also a business and reputational cost from “child hit by Waymo in school zone” in the headlines.
Again, this all seems very solvable.
I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
As for more data, there is a chicken-and-egg problem. A phased rollout of Waymo over several years has revealed many potential issues, but is also remarkable for the low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.
Tesla has some ways to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe, there would be some numbers on that. Normal statistics in the US are measured at ~17 deaths per 100K drivers per year, 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison, of course. But the bar for safety is pretty low as soon as you include human drivers.
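As a quick sanity check on those two figures (the licensed-driver count is my rough assumption, in the ballpark of the US total):

    deaths_per_100k_drivers = 17
    licensed_drivers = 233_000_000          # assumed, roughly the US figure
    print(deaths_per_100k_drivers * licensed_drivers / 100_000)  # ~39,610

which is consistent with the 40K+ total fatalities cited above.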
Liability weighs higher for companies than safety. It's fine by them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for miles driven with and without autonomy, there's very little doubt that autonomous driving is already much safer. We can get more data, at the price of more deaths, by simply dragging out the testing phase.
Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the amount of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.
I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are), I don't think it can be representative of the risks of driving in general. Naturally, robotaxis will benefit from better infra outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. fewer drunk drivers.
I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the amount of disengagements (errors):
https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...
Do you mean like this?
I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
If that's too annoying, then ban parking near school areas so the situation doesn't happen.
And why would you make Waymos go slower than human drivers, when it's the human drivers with worse reaction times? I had interpreted the suggestion as applying to all drivers.
Nuanced disagree (I agree with your physics), in that an element of the issue is design: kids running out between cars happens on streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.
One simple change could be adding a chain link fence / boundary between parked cars and driving cars, increasing the visibility and time.
Also the point isn't the specifics, the point is that the current design is not optimal, it's just the incumbent.
We would really need to see the site to have an idea of the constraints. Santa Monica has some places where additional roadway can be accommodated and some places where that's not really an option.
It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?
In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.
> You take the population of vehicles in the field (A) and multiply it by the probable rate of failure (B), then multiply the result by the average cost of an out-of-court settlement (C). A times B times C equals X. This is what it will cost if we don't initiate a recall. If X is greater than the cost of a recall, we recall the cars and no one gets hurt. If X is less than the cost of a recall, then we don't recall.
-Chuck Palahniuk, Fight Club
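Spelled out as code, with entirely made-up inputs:

    # The quote's recall formula: X = A * B * C vs. the recall cost.
    def should_recall(fleet_size, failure_rate, settlement_cost, recall_cost):
        x = fleet_size * failure_rate * settlement_cost  # expected payouts
        return x > recall_cost

    # 1e6 * 1e-5 * 5e5 = 5e6 < 2e7, so settlements are cheaper: no recall.
    print(should_recall(1_000_000, 1e-5, 500_000, 20_000_000))  # False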
So would you pick situation 1 or 2?
I would personally pick 1.
That is not my experience here in the Bay Area. In fact here is a pretty typical recent example https://www.nbcbayarea.com/news/local/community-members-mour...
The driver cut in front of a person on an e-bike so fast that the rider couldn't react and hit the car. Then, after being hit, the driver stepped on the accelerator and went over the sidewalk on the other side of the road, killing a 4-year-old. No charges filed.
This driver will be back on the street right away.
But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.
There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.
It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.
All data indicates that Waymo is ~10x safer so far.
"90% Fewer serious injury or worse crashes"
How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.
I also assume a human took over (called the police, moved the car, etc) once it hit the kid.
If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or that I am better than a drunk driver does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate - probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that waymo gets a pass.
Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think wtf that's too fast. It's one thing when there are no cars around. But if you've got cars or people around, the appropriate speed changes. Let's audit waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.
The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.
But that fact does mean that we should encourage alternatives that reduce fatalities, and that not doing so results in fatalities that did not need to occur.
> The real solution? Get rid of cars.
I also support initiatives to improve public transit, etc. However, I don't think "get rid of cars" is an idea the general public will accept right now, so let's encourage everything that improves the situation: robot drivers if they kill people less often than humans, public transit, and so on. Let's not put off changes that will save lives on the hope that humanity will "get rid of cars" any time soon. Or when do you think humanity will "get rid of cars"?