This isn't racism at all. It's just bad PR, because humans read "calling Black people monkeys" as calling them stupid, since that's the inference you would draw if a person did it.
An algorithm doing that is just recognizing that humans and gorillas are both primates:
http://www.aquilaarts.com/bushmonkey.html
At that point it's just a bug, in the same way that recognizing a black balloon as a balloon but misclassifying a white balloon as a light bulb is a bug. It has nothing to do with race; the algorithm isn't racist against white balloons. The fix is more training data, which is what you want in every case anyway.
> if the decision-makers unconsciously favored whites over blacks, the algorithm could wind up weighing skin color or stereotypically Black or Latino names negatively, meaning that the final model is explicitly racist, just because there is a correlation in the training data.
Except that this is exactly the thing that a paperclip optimizer will smash to bits because it interferes with the goal of making more paperclips.
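The proxy effect the quoted passage describes is easy to reproduce. Here's a minimal sketch under assumed conditions: a made-up approval dataset where historical decision-makers unconsciously penalize applicants with a flagged "proxy" feature (standing in for a stereotyped name), and a plain logistic regression trained on those labels. The feature names, penalty size, and setup are all hypothetical, but the mechanism is the real one: the model never sees the bias directly, yet it learns a negative weight on the proxy because that's what best predicts the biased labels.

```python
import math
import random

random.seed(0)

# Hypothetical setup: each applicant has a true qualification score in [0, 1]
# and a binary proxy feature (e.g. a stereotypically Black or Latino name).
# The historical decision-makers approve on merit, but with an unconscious
# penalty whenever the proxy is present. The penalty of 0.3 is an assumption.
def make_example():
    qualified = random.random()
    proxy = 1 if random.random() < 0.5 else 0
    approved = 1 if qualified - 0.3 * proxy > 0.5 else 0
    return (qualified, proxy), approved

data = [make_example() for _ in range(5000)]

# Plain batch-gradient logistic regression. The model only sees the two
# features and the biased labels -- the bias itself is never an input.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
n = len(data)
for _ in range(200):
    gw = [0.0, 0.0]
    gb = 0.0
    for (x, y) in data:
        p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
        err = p - y
        gw[0] += err * x[0]
        gw[1] += err * x[1]
        gb += err
    w = [w[0] - lr * gw[0] / n, w[1] - lr * gw[1] / n]
    b -= lr * gb / n

# The proxy weight comes out negative: the model has absorbed the
# decision-makers' bias as if it were a genuine signal.
print(f"qualification weight: {w[0]:.2f}, proxy weight: {w[1]:.2f}")
```

Note that nothing here requires the optimizer to "know" anything about race; it's just minimizing prediction error against labels that already encode the bias, which is why more data alone doesn't fix this particular failure mode.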