I've seen pilots complain numerous times that their employer requires the autopilot to be used wherever possible, while they would occasionally like to fly the plane themselves.
If we extended the autopilot to also handle take-off and taxi, and installed it in every plane in service, I'm pretty sure the entire aviation industry could be fully automated.
Without consulting the literature, it just seems easy to believe that machine-like consistency 99.9% of the time is more important to safety than confusion in the 0.1%.
If the car fails to detect one in a thousand cars in front of you, or one in a thousand corners, I can see that being highly dangerous.
Let's say you change lanes 5 times each way on your daily commute. That's 50 times per week, or about 2,600 per year, so a one-in-a-thousand failure rate means a failure roughly twice a year. Are you going to pay attention well enough to catch it both times, just before the car pulls out right into another vehicle?
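That "twice a year" figure can be sanity-checked with a quick back-of-envelope calculation; the numbers here are the illustrative ones from the text, not measured data:

```python
# Back-of-envelope check of the "twice a year" estimate.
# Assumed inputs, taken from the text above: 50 lane changes per
# week and a 1-in-1000 chance the system misses a given event.
lane_changes_per_week = 50
failure_rate = 1 / 1000

changes_per_year = lane_changes_per_week * 52
expected_failures_per_year = changes_per_year * failure_rate

print(changes_per_year)            # 2600 lane changes a year
print(expected_failures_per_year)  # about 2.6 misses a year
```

The point of writing it out is how fast a "rare" per-event failure rate compounds over routine driving: even one miss in a thousand turns into multiple dangerous moments per driver per year.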
Perhaps likening it to having another person drive you is a better way to picture it. If you were given a perfect chauffeur, except you knew they would drive straight through one in a thousand red lights with no warning, are you confident you'd catch them every time?
Perfectly safe 99.9% of the time and highly dangerous the other 0.1% may be overall more hazardous than human-level safety 100% of the time. Or it may not. I'm not making a claim either way. But the idea that it might be isn't crazy.
More data is needed. I’m cool with that data being collected in the real world with real drivers. Gotta crack some eggs...