A lot of patients don't know who they are dealing with or what their history is. And it can be really hard to find out or get a good evaluation. Many people put too much faith in authority figures, who may not have their best interests in mind or who are not the experts they claim or appear to be.
Which isn't to say that they even should, really. It's complicated. You don't want a doctor to be so afraid of making a mistake that they do nothing, after all.
Killing people with AI is only a lateral move.
So where does guilt come in? It's not like you expect a band saw to feel guilt, and it's unclear how that would improve the tool.
The advantages of humans are:
* They can give a bullshit explanation of why they made a mistake. My guess is that in the future AI will gain introspection and/or learn to bullshit excuses.
* You can hang them in the public square (or send them to jail). Sometimes the family and/or the press want someone to blame. This is more difficult to solve and will need a cultural change or the creation of Scapegoats as a Service.