Even if it could avoid a sudden hazard (a child running into the street, etc., as you mention)... what if it can't avoid disaster at all and has to choose between two disastrous outcomes? What then?
Disclosure: I'm definitely on the "maybe not" side of autonomous vehicles (safety-wise), though I wouldn't rule them out if the margin of error were reduced "enough". What counts as "enough"? I'm not sure, personally, but this isn't it.
Humans aren't great at any of this either, so maybe that's an irrational take?