I have had little exposure to black people in my life. The few I have encountered in school or professionally have not demonstrated any aggressive nature. I have no first-hand experience of aggression by any black man. My parents did not teach me that black men are aggressive, and I did not learn this in school or from my friends. The only way I can think of that I might have gotten this impression is from movies, television, music, and media. These are the same industries whose figures are now professing and demonstrating how woke and concerned they are, rather than apologizing for their roles as the primary perpetuators of a stereotype.
Therefore, when I see media or entertainment figures (I include politicians here) on the anti-racist soapbox, it all falls rather flat for me. They are the ones fanning the flames on one side while demanding atonement from everybody else on the other.