The models are predicated on a specific set of hardware (which camera, what fidelity of LiDAR; even something as simple as “what’s the frame rate” can make a difference).
https://maps.app.goo.gl/UNUzQn686ESYWbfo7
Further south of the pin, by the section of the alley with stripes on the pavement behind the garages for 842 N 6th Ave (but not on 6th or 7th, on an unnamed alley between the two).
Let me repeat - it's an alley. The Google Maps car didn't even go down that road (though it looks to have been under construction when the Maps car went through).
Without excusing Waymo (they had their car do something dangerous and stupid), this is the kind of pseudo-off-road parking lot/driveway/construction zone nav stuff that's really hard to get right, and almost requires AGI.
I think the real error was not the damage score but the planning algorithm that directed it to drive down and to continue through that alley.
I think we'll soon get to (if we're not there already) a form of level 2 driver aids or level 3 geofenced self-driving (highway only?) that's safer than average human drivers. I think we're a long way from self-driving cars that will correctly assign a low damage score and drive over an empty cardboard box in an unmarked, unmapped private alley, and we may never get there. But that doesn't mean Waymo can't or shouldn't exist; it means they need to shut down the car and delegate to a human when they're stuck and not on public, mapped, confirmed-clear roads. Maybe that means it can't pick you up from the back of the Chick-fil-A parking lot or from the entrance to a mall that's an island in a quarter mile of private parking lots, and you have to go to the nearest parking spot on the actual road, but if the alternative is assigning damage scores to stuff in alleys, that's probably for the best.
To emphasize, I'm genuinely curious: I don't understand how the recall notice process works if your product isn't owned by anyone but you.
> Waymo’s recall was deployed by the company’s engineers at the central depot where the vehicles return for regular maintenance and testing. It was not through an over-the-air software update, like some of Tesla’s recent recalls.
I'd be interested to learn more about why the updates are manual, and also whether the map data is fully local to the vehicle. Tesla obviously does the polar opposite of this, and it seems to have at least some degree of success, but Tesla's approach has always seemed like it would be subject to some bad potential failure modes in my mind.
How much data does this amount to? Gigs? Terabytes?
On the same note, I'm curious about what data gets pulled from the map versus sensor data. The car seems to have used map data instead of sensor data (unless I'm misunderstanding?). Whether there's a curb seems to be exactly the sort of thing you could rely on sensors for, mostly because you also already need to look for obstructions which necessarily can't be in map data.
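To make the question concrete, here's a toy sketch of the priority rule I'd expect: treat the map as a prior, and let a confident live detection override a map entry that claims the space is clear. All names, types, and thresholds here are invented for illustration; nothing below reflects Waymo's actual stack.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str          # e.g. "curb", "pole", "debris" (hypothetical labels)
    confidence: float  # 0.0 - 1.0 from the perception stack

def drivable(map_says_clear: bool, detections: list[Detection],
             threshold: float = 0.5) -> bool:
    """The map is only a prior: any live detection above threshold
    overrides a map entry claiming the space is clear."""
    blocked = any(d.confidence >= threshold for d in detections)
    return map_says_clear and not blocked

# A mapped-clear alley with a confidently detected pole is not drivable:
print(drivable(True, [Detection("pole", 0.9)]))   # False
# A low-confidence artifact (sensor glare) doesn't stop the car:
print(drivable(True, [Detection("glare", 0.1)]))  # True
```

Under a rule like this, the sensors would have had the last word on the pole regardless of what the map said, which is why the reported behavior is confusing.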
In fact, it's probably fastest to have a small bank of hard drives that you can physically pull from the vehicle and swap out for a fresh one. You could probably pack a few hundred TB of storage into a unit that could fit in a briefcase and have it loaded with fresh map data and lots of room for logs.
What did it get classified as? What's a Waymo allowed to hit?
I'm excited for self driving cars, but I have reservations about a system that has to hard code "don't hit a telephone pole". It reminds me of this skit https://www.youtube.com/watch?v=3m5qxZm_JqM
It would be extremely silly to lie about an implementation detail that doesn't matter in an official recall filing with an agency currently investigating you for your reporting of incidents like this.
It could also just be everything gets a damage score, including reflection artifacts that aren't really there or paper garbage in the road, and in this case the pole was identified incorrectly, given a low score, and the car thought it could keep driving.
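If that theory is right, the failure mode is easy to illustrate: every classified object maps to a damage score, anything under a threshold is treated as safe to drive through, and a misclassification silently opens the gate. The labels, scores, and threshold below are all made up for the example, not taken from any filing.

```python
# Hypothetical damage scores per object class (invented numbers).
DAMAGE_SCORES = {
    "cardboard_box": 0.05,
    "plastic_bag": 0.02,
    "telephone_pole": 0.99,
}

DRIVE_THROUGH_THRESHOLD = 0.2  # invented cutoff for "safe to proceed"

def can_drive_through(classified_as: str) -> bool:
    # Unknown classes default to the maximum score, i.e. never drivable.
    return DAMAGE_SCORES.get(classified_as, 1.0) < DRIVE_THROUGH_THRESHOLD

# Correctly classified, the pole blocks the car...
print(can_drive_through("telephone_pole"))  # False
# ...but misclassify it as a cardboard box and the car keeps going:
print(can_drive_through("cardboard_box"))   # True
```

The scoring logic itself can be perfectly sound here; the car still hits the pole because the error happened one step earlier, in classification.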
The Verge seems to quote the filing; it would be nice if they had linked it. I can't find it by searching.
Well-behaved vehicles are lawful good. Skateboarders are chaotic good. Drunk drivers are chaotic neutral. Road ragers are chaotic evil. Chaotic users are less predictable, and evil users require active evasion.
P.S.: I've got nothing against "recall" being recalled so that it can be fixed ; )
While I have previously argued that it should not be necessary to change the term, it is unfortunately confusing to laypeople and intentionally misconstrued by bad actors to downplay safety faults. I propose it be replaced with "Safety Defect Notice," which is a fairly precise description of the underlying concept and cannot be misconstrued to be about the "fix."
Whereas many recalls these days are just "Your car performed an over-the-air update while you were sleeping."
Maybe we need the concept of "physical recall" and "software recall".