This commenter must misunderstand the situation. School buses regularly stop to pick up and drop off students on streets near where they live, and there are generally schools all around. If Waymos can't properly respond to school bus signals, they need to not operate in areas where these pickups and drop-offs happen, which is not exclusively near schools.
This was on El Camino in Santa Clara. I was quite surprised, as I had assumed they were pretty much production-ready given how much they have been expanding their service area.
Source: https://www.vazirilaw.com/faqs/whos-liable-in-a-waymo-self-d...
It still isn't clear who, if anyone, is liable when traffic laws are broken:
https://web.archive.org/web/20251025055924/https://www.nytim...
Often, they are simply getting away with it.
I always felt this was just a strategy, and that soon enough fleet operators would turn up the dials on speed and aggressiveness. After all, the only people who can complain are the people outside the car, and they will be dead.
I don't know how Waymo is going to square that circle.
You literally cannot drive on public roads unless you match the speed, flow, and maneuvering of other traffic.
I'm not sure how you can earnestly make this claim while reading people complaining about the speed and aggressiveness. Do you suspect you're replying to ghosts?
Where at? I'm curious, because I see a lot of people say this, but I've never seen them go more than 1 mph over the limit when riding in them, and I watch them do 65 on the freeway every day, even when people are passing.
(Funny story: I was in Ottawa over the winter. There, snow plows, ambulances, and fire trucks all use blue flashing lights. I thought I was being pulled over by a giant police truck ... it was a snow plow that really did not appreciate me stopping on the side of the road. Yet another special-case vehicle.)
Also, lol at this quote in the article: "Six vehicles passed the school bus while it was stopped, the agency said. It is still investigating." What it doesn't note is that the other five seem to have been human-driven passenger vehicles. From the NTSB report: "located in Novi, Michigan, replied “No” to the prompt. The ADS-equipped vehicle then resumed travel and passed the school bus while its stop arms were still extended. A passenger vehicle following the ADS-equipped vehicle similarly passed the school bus. In total, six vehicles passed the school bus while it was stopped. A crash did not occur." So it sounds to me like four people passed it, the Waymo flagged it ("wtf, I'm pretty sure that's a stopped bus"), a human remote operator incorrectly identified it as not a stopped bus, the Waymo passed it, and then one more person passed after the Waymo.
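To make that failure mode concrete, here's a minimal sketch of a remote-assist gate where the human answer overrides onboard perception. This is my guess at the shape of the flow based on the NTSB account, not Waymo's actual architecture; every name here is hypothetical:

    from enum import Enum

    class OperatorAnswer(Enum):
        YES = "yes"   # operator confirms an active school-bus stop
        NO = "no"     # operator says it is not an active school-bus stop

    def should_resume(onboard_says_stopped_bus: bool, answer: OperatorAnswer) -> bool:
        """Hypothetical remote-assist gate: onboard perception flags a
        probable stop-arm event and escalates to a remote operator; the
        human reply is treated as authoritative, even over a correct
        onboard detection."""
        if not onboard_says_stopped_bus:
            return True   # nothing flagged; normal driving
        # Per the NTSB account, the operator replied "No" and the
        # vehicle resumed travel past the bus.
        return answer is OperatorAnswer.NO

    # The reported incident, roughly:
    print(should_resume(True, OperatorAnswer.NO))   # True -> passes the stopped bus

If the flow looks anything like that, the fix is as much policy as perception: "onboard says stopped bus, human says no" should arguably resolve to staying stopped, not proceeding.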
I will let you judge for yourself what the "right" thing for the Waymo to do was... but let's think critically about how Waymos work in the real world, benchmarked against other real drivers dealing with real-life issues.
What if the bus driver took a break and forgot to turn off the sign? What if it had been 10 minutes and the driver was obviously dealing with some kind of behavioral problem?
A human is not going to put their life on hold forever just for a flashing light. Part of being a smart AI is figuring out when the rules have broken down just a bit and you have to adapt.
"A School District Tried to Help Train Waymos to Stop for School Buses. It Didn’t Work."
The fact that it is passing stopped school buses does rather suggest that perhaps, as cautious as it is, it still isn't smart enough to be cautious in the right ways.
Self-driving vehicles that are much better than human drivers aren't enough.
It's similar to building alternative software to take on an entrenched incumbent. At a minimum, the disruptor needs to add enough value to overcome the friction of switching, and then more on top of that to make it worthwhile.
The problem is that there is zero enforcement. We know the vehicle is not safe around schoolchildren, so the appropriate incentive needs to be applied to get the issue addressed.
Why do you apply a different standard to waymos than to humans?
Show me Waymo's driving license and the test it passed to get it.
Bear in mind that $1,000 per incident [1] is not enough money to justify paying a software developer to fix it.
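Quick back-of-envelope to illustrate; aside from the $1,000 figure above, every number here is mine and purely illustrative:

    # Fines vs. engineering cost -- all figures illustrative.
    fine_per_incident = 1_000      # the per-incident figure cited above [1]
    incidents_per_year = 50        # hypothetical fleet-wide count that actually gets cited
    annual_fines = fine_per_incident * incidents_per_year    # $50,000/year

    engineer_cost = 400_000        # hypothetical fully loaded cost of one senior engineer
    team_size = 4                  # hypothetical team across perception, policy, testing
    fix_cost = engineer_cost * team_size                     # $1,600,000

    print(fix_cost / annual_fines)  # 32.0 -- years of fines to equal the fix

At those (made-up) rates, the fines never come close to forcing the fix; the incentive has to come from somewhere else.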
[1] https://texas.public.law/statutes/tex._transp._code_section_...