The crash wasn't loud, but the ripple effect across the self-driving industry has been deafening. A Waymo robotaxi recently struck and killed a small dog in San Francisco’s Richmond District. It happened in a split second. The dog ran behind a parked car and straight into the path of the Jaguar I-PACE. The car’s sensors saw the animal. The software even identified it. But the physics of the situation—the speed, the distance, and the timing—meant the car didn't stop in time.
If you're looking for a simple story about a glitch, you won't find it here. This isn't about a computer "forgetting" how to brake. It’s about the brutal reality of the "long tail" in autonomous driving. We've spent years hearing that these cars are safer than humans. In many ways, they are. They don't get drunk. They don't text. They don't get sleepy. But they also don't have the "sixth sense" that a seasoned human driver develops over decades of navigating chaotic city streets. When a Waymo hits a dog, it forces us to ask whether "better than a human" is actually good enough for the public to accept.
The technical breakdown of the Richmond District incident
According to the report Waymo filed with the California Department of Motor Vehicles, the vehicle was in autonomous mode with a safety driver behind the wheel. The car was traveling at a steady clip when a small dog darted out from the sidewalk.
Waymo’s sensor suite is impressive. It uses a mix of LiDAR, cameras, and radar to build a 360-degree map of the world. In this specific case, the system detected the dog. However, the software's "path prediction" failed to anticipate the dog's erratic movement early enough to initiate an emergency stop before the impact. The safety driver didn't have time to intervene either.
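The gap between "the sensors saw it" and "the car stopped in time" comes down to simple physics. Here's a minimal sketch of that math; every number below (the latency, the deceleration rate, the distances) is an illustrative assumption, not a Waymo specification or a detail from the DMV report.

```python
# Illustrative only: rough stopping-distance math showing why detection
# alone doesn't guarantee a stop. All numbers are assumptions.

def stopping_distance_m(speed_mps: float,
                        reaction_s: float = 0.5,   # assumed perception + planning latency
                        decel_mps2: float = 6.0) -> float:
    """Distance covered during reaction time, plus braking distance (v^2 / 2a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

speed = 11.0          # roughly 25 mph, in meters per second
needed = stopping_distance_m(speed)
dog_appears_at = 8.0  # hypothetical: meters ahead when the dog clears the parked car

print(f"stopping distance: {needed:.1f} m, dog appeared at: {dog_appears_at} m")
print("impact unavoidable" if needed > dog_appears_at else "stops in time")
```

Under these made-up numbers the car needs roughly 15 meters to stop but only has 8, so perfect perception still can't save the dog. That's the "physics of the situation" in the incident report.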
This brings up a massive hurdle for AI. Humans are predictable-ish. We follow crosswalks and obey lights. Dogs don't. Animals represent a chaotic variable that software struggles to quantify. If a car slams its brakes every time a pigeon flies near the hood, it becomes a road hazard for the cars behind it. If it doesn't brake, things like this happen. It’s a literal trolley problem playing out in real-time on Geary Boulevard.
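The pigeon-versus-dog dilemma above is, at its core, a thresholding problem: brake only when the predicted collision risk clears some bar, and the choice of bar trades phantom braking against missed hazards. This toy sketch makes that trade-off concrete; the probabilities and threshold values are invented for illustration and have nothing to do with Waymo's actual planner.

```python
# Hypothetical sketch of the planner's dilemma: a low brake threshold causes
# phantom stops for pigeons; a high one misses the darting dog. All values invented.

def should_brake(collision_prob: float, threshold: float) -> bool:
    """Emergency-brake only when predicted collision probability clears the bar."""
    return collision_prob >= threshold

# Made-up predicted collision probabilities for a mix of street events.
events = {"pigeon": 0.05, "plastic_bag": 0.10, "darting_dog": 0.55}

for threshold in (0.3, 0.7):
    braked = [name for name, p in events.items() if should_brake(p, threshold)]
    print(f"threshold {threshold}: brakes for {braked}")
```

With the cautious 0.3 threshold the car brakes for the dog (and risks rear-end collisions on false alarms); with the conservative 0.7 threshold it brakes for nothing. Neither setting is "correct" — that's the trolley problem on Geary Boulevard.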
Why San Francisco is the ultimate stress test
San Francisco is a nightmare for autonomous systems. I've driven those streets. You have steep hills that mess with LiDAR line-of-sight. You have thick fog that can degrade camera performance. You have a population that is, frankly, tired of being treated like a giant laboratory.
This incident didn't happen in a vacuum. It happened amidst a backdrop of rising tension.
- Emergency vehicles have been blocked by stalled robotaxis.
- Construction zones have left driverless cars spinning their wheels in confusion.
- Local politicians are fighting to regain control over their streets from state regulators.
When a dog dies, it’s emotional. It’s not just a data point in a safety report. It’s a pet. For a lot of people in the city, this was the last straw. It shifted the debate from "Can these cars drive?" to "Do we want them here?"
The myth of the perfect driver
We need to be honest about human drivers. Humans kill thousands of animals every day. We hit pedestrians. We cause multi-car pileups because we're looking at Instagram. Statistically, Waymo's fleet has a lower rate of injury-causing crashes than human-driven ride-hail services.
But humans are weird. We forgive human error more easily than we forgive machine error. If a person hits a dog, we call it a tragic accident. If a robot hits a dog, we call it a system failure. Waymo is fighting an uphill battle against psychology. They aren't just building a car; they're trying to build a brand of "trust," and trust is incredibly fragile.
One thing the industry doesn't talk about enough is the "uncanny valley" of safety. As these cars get better, we expect them to be perfect. We stop paying attention. We stop expecting the unexpected. Then, when the 0.001% edge case happens—like a dog sprinting from behind a Tesla—the shock is magnified.
Comparing Waymo to Cruise and others
It's worth noting that Waymo has generally been the "adult in the room." While Cruise had its permit suspended after dragging a pedestrian in a separate, much more horrific incident, Waymo has maintained a relatively clean record. They move slower. They test more.
But even the "safe" player isn't immune to the chaos of reality. This dog accident shows that no amount of simulated miles can fully prepare a machine for the sheer randomness of a living, breathing city. You can run a billion miles in a computer sim, but the sim doesn't always account for the specific way a leash might break or a dog might spook.
What happens to the data now
Every time a Waymo has a "disengagement" or an accident, the data is gold. The engineers are likely poring over the logs right now. They'll look at the "perception" layer to see if the dog was classified correctly. They'll look at the "planner" layer to see why the car didn't swerve or stop sooner.
They will update the code. They will run the "Richmond Dog" scenario through their simulators tens of thousands of times with slight variations.
- What if the sun was lower?
- What if the road was wet?
- What if there was a cyclist next to the car?
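Mechanically, a scenario sweep like the one described above is just a cross product of environmental variables, with each combination replayed in simulation. Here's a toy version; the parameter names and values are illustrative, not Waymo's actual tooling.

```python
# Toy "Richmond Dog" scenario sweep: permute environmental variables and
# treat each tuple as one simulated replay. All parameters are illustrative.
from itertools import product

sun_angle_deg = [10, 30, 60]        # what if the sun was lower?
road_surface = ["dry", "wet"]       # what if the road was wet?
adjacent_cyclist = [False, True]    # what if a cyclist was next to the car?
dog_speed_mps = [2.0, 4.0, 6.0]     # how fast does the dog dart out?

scenarios = list(product(sun_angle_deg, road_surface, adjacent_cyclist, dog_speed_mps))
print(f"{len(scenarios)} variants of one incident")  # 3 * 2 * 2 * 3 = 36

# In a real pipeline each tuple would parameterize one simulated replay;
# here we just show the first few combinations.
for scenario in scenarios[:3]:
    print(scenario)
```

Four variables with a handful of values each already yields 36 replays of a single incident; add more dimensions and finer steps, and "tens of thousands of variations" stops sounding like hyperbole.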
This is how the technology improves. It learns from tragedy. But for the residents of San Francisco, being the "learning material" is getting old. There's a growing movement of people who feel that the streets are being sold to big tech without the consent of the people who live on them.
Regulation is the next battlefield
Expect the California DMV and the CPUC (California Public Utilities Commission) to face massive pressure. There are already calls to limit the number of autonomous vehicles allowed on the streets at night or during peak hours.
The city attorney has been vocal about wanting more local oversight. Right now, San Francisco has very little power to say "no" to Waymo or Zoox. The state holds the keys. That's a political powder keg. If more incidents like this occur—even if they are technically "unavoidable"—the political cost might become too high for the state to keep protecting the industry.
Steps you should take as a pedestrian or pet owner
If you live in a city where these cars operate, you can't just rely on the tech to see you. It sounds backwards, but you have to be more defensive.
- Short leashes are mandatory. If you're near a street with autonomous traffic, keep your pet close. Sensors can struggle with small objects that move quickly and unpredictably.
- Don't assume eye contact. When you're at a crosswalk with a human driver, you look them in the eye to make sure they see you. You can't do that with a Waymo. You're looking at a spinning laser on the roof. Wait for the car to come to a complete stop before you step off the curb.
- Report weird behavior. If you see a robotaxi acting jittery or blocking a lane, report it to the city. Data from the public is often the only way local officials can build a case for better regulation.
The tech is here to stay, but the "move fast and break things" era of autonomous driving is hitting a wall of public sentiment. We’re moving into a phase where "safety" isn't just about avoiding crashes—it's about proving that the machine values life as much as we do. It’s a high bar. And right now, the industry is struggling to clear it.
Check the local DMV autonomous vehicle collision reports if you want to see the raw data yourself. It's public record, and it's eye-opening to see just how often these minor "touches" happen. Stay aware, keep your dog close, and don't assume the sensors see everything.