Unión Rayo EN
Goodbye to confidence in autonomous cabs – NHTSA investigates Waymo after one of its cars was recorded passing a stopped school bus with children getting off in Atlanta – the vehicle violated school traffic law and has put the U.S. driverless driving system under fire.

by Laura M.
November 4, 2025
If we thought autonomous vehicle companies had everything under control… well, wait. This chapter stars Waymo (Google): one of its cars failed to stop next to a stopped school bus from which children were getting off.

Even worse, the moment was caught on video and went viral (of course). The NHTSA (National Highway Traffic Safety Administration) immediately announced that it would investigate the incident.

What happened?

The incident took place in a school zone in Atlanta, where traffic laws have been especially strict since 2024.

The first report explains that the Waymo car did not come to a complete stop next to the school bus: it first slowed down a bit, then swerved to the left to go around the bus, continuing on its path and ignoring the extended stop sign. Meanwhile, children were getting off right next to the vehicle…

The NHTSA is now trying to determine the reason for this failure: whether it was a technical fault, a programming error, or a human issue. In any case, Waymo will have to review how its vehicles interpret traffic signals.

Lawmakers’ reaction

The authors of Addy’s Law are angry, of course. They created the law in 2024 after the death of Addy Pierce, an 8-year-old girl who was hit while crossing the street to board a school bus.

“Autonomous cars don’t have driver’s licenses. So, who gets penalized when they commit an infraction?” asked Clint Crowe.

This incident raises many questions, which is why Crowe proposed that fines be directed straight at manufacturers, and that they be much more severe when the mistake endangers children or pedestrians.

“If driverless cars are not completely safe, they should be removed from the streets until they are. We cannot put lives at risk for a machine,” said Senator Rick Williams.

Waymo responds (partially)

For now, Waymo has only said that it is cooperating with authorities and reviewing the vehicle’s records, but nothing more. No public apology or clarification about what happened… Perhaps a pact of silence applies to the entire industry, as with Tesla.

But this noise doesn’t help Waymo much: the company is currently in an expansion phase and wants to launch its robotaxis in Europe. Oops.

“These cars should have a driving license that can be suspended in case of serious traffic violations, thus stopping all vehicles using the same software. In addition, the manufacturer should be required to release a new version of the system that fixes the detected failure,” said one user.

Even though statistics often show that autonomous cars make fewer mistakes than human drivers, the truth is that each error has an enormous impact, especially when it involves children or school zones, and even more so now that the technology is just beginning.

“Thank God humans would never do that,” said another.

Artificial intelligence behind the wheel

AI still does not fully understand human traffic signals or real-world traffic contexts. That is to be expected (not acceptable, to be clear, but these are not conscious beings who can appreciate danger, just machines without feelings, right?).

One mistake, one warning

The incident is not just a technological failure; it’s a reminder for all of us (people and brands) that innovation must not and cannot move faster than public safety.

Now, the NHTSA must determine the cause of the failure, whether it was a software fault or something else, and then create new, stricter federal regulations for these vehicles…

Let’s remember that the safety of our children is not an experiment!

© 2025 Unión Rayo