We always knew that one day cars would drive themselves… And well, that future is already here. Waymo's cars are proof of it, although right now the company is going through a complicated moment. The NHTSA has opened an investigation into Waymo (Alphabet's autonomous vehicle company, part of the Google family) because one of its robotaxis failed to stop for a school bus from which children were getting off… can you imagine the chaos that could have caused? And of course, the robotaxi skeptics have started talking again: can a machine really replace human attention and instinct behind the wheel?
The debate returns
According to the initial report, the vehicle involved (equipped with the fifth-generation Automated Driving System, or ADS) briefly stopped when it detected the school bus but then moved forward. And it did so just as the children were getting off, while the bus still had its red lights flashing and its "stop" arm extended. The logical thing would have been to come to a complete stop, right?
Waymo acknowledged the failure (it had little choice) and said it has already updated its software to prevent it from happening again. A company spokesperson explained that the car "was approaching from an angle where the lights were not visible" and that the system did not correctly interpret the situation… Aha.
Of course, the robotaxi was operating without a human backup driver, which makes this incident a key point in the debate about the real safety of autonomous vehicles… are we at risk?
“The safety of children is a priority”
The company has been keen to make clear that protecting children and pedestrians is among its core values, and it says its next update will include specific improvements for detecting school buses and passenger loading zones.
Currently, Waymo operates more than 1,500 robotaxis in cities such as Phoenix, Los Angeles, San Francisco, and Austin, and plans to expand to Tokyo and London in the coming years. But this incident could slow down its progress a bit.
Problems with driverless cars
The NHTSA has reviewed several incidents in recent months as this type of driving has rolled out. Tesla and Cruise have attracted the most attention, because in many cases their systems did not react correctly to pedestrians or other vehicles… Maybe we are still not entirely ready for a driverless future.
“Machines learn on the street”… really?
Waymo's cars drive entirely on their own: there is no one behind the wheel, no operator, and no emergency fallback mode either. Everything depends on the machine, and although the algorithms learn from each experience, the margin of error can cost human lives.
The NHTSA sets limits
For now, the agency's position is that innovation cannot run ahead of safety. The purpose of this investigation is to verify whether Waymo's vehicles comply with traffic laws and school-zone safety regulations.
If a systematic violation or structural failure is proven, the company could face multimillion-dollar fines…
What is at stake?
Waymo is a pioneer in the sector and one of Alphabet's biggest technological bets, and the dream of autonomous driving is right there, promising safer roads and fewer accidents… But for now, the road ahead is still not entirely clear.
The NHTSA does not want to stop innovation; it just wants to make sure no lives are put at risk. That's not hard to understand… Are we ready to share the road with this technology? Or are there still many steps to take before we can truly relax in a robotaxi?
