There WILL be bugs and un-modellable sources of error. The real hedge in these situations is the safety driver. The death of Elaine Herzberg is very regrettable, but the fault ultimately lies with the safety driver and the training that was offered to her. She was on her phone, like thousands of drivers are right now.
I don't think we're talking about the same thing.
I mean build a machine that, in the real world, can't hurt people.
Make it light.
Make it soft.
Program it to limit its speed such that it can always stop before colliding with whatever (whoever) might leap out in front of it.
If the top speed is five miles per hour, so be it.
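The "always able to stop" constraint above can be made concrete: cap speed so that reaction distance plus braking distance never exceeds the sensing range. A minimal sketch, where the deceleration and reaction-time figures are illustrative assumptions, not measured values for any real robot:

```python
import math

def max_safe_speed(sensing_range_m, decel_mps2=4.0, reaction_s=0.5):
    """Largest speed v (m/s) such that the distance covered while
    reacting plus the braking distance fits inside the sensing range:

        v * reaction_s + v**2 / (2 * decel_mps2) <= sensing_range_m

    Solving the quadratic for v and taking the positive root.
    decel_mps2 and reaction_s are assumed example parameters.
    """
    a, t, d = decel_mps2, reaction_s, sensing_range_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

# A robot that can reliably sense 10 m ahead:
v = max_safe_speed(10.0)
# Sanity check: reaction distance + braking distance equals the range.
stop_dist = v * 0.5 + v ** 2 / (2 * 4.0)
```

With pessimistic parameters (short sensing range, gentle braking) this formula naturally drives the cap toward walking pace, which is the point being made above.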
The safety driver was wrong too. But she was only there because Uber wanted to put car-shaped robots onto public streets.
Really, the insane thing is that we mix car and pedestrian traffic at all in the first place. Oddly enough, it's the result of a deliberate campaign of propaganda: https://www.youtube.com/watch?v=-AFn7MiJz_s "Adam Ruins Everything - Why Jaywalking Is a Crime"
> Adam reveals the derogatory origins of jaywalking and explains how the auto industry made it illegal.
You are not really testing, though. The whole point is to build a machine that can go the speed limit. You can test all you want at 5 mph, but let me assure you, most of the real issues will show up when you go 45 mph in the real world.
I assume much of the training is done in simulation or within a controlled environment, but unfortunately the only way to train for city driving is to gather as much real-world data as possible, and that means "testing in production" with hopefully alert humans (one as backup) behind the wheel.
Really, the problem is the rush to market, not the idea itself.