Your first paragraph describes a simple prompt. The second implies a "jailbreak" prompt.
The Bible paragraph is just you being snarky (and failing at it).
Your examples don't help your case.
I stand on the side that wants to restrict AI from generating triggering content of any kind.
It's a safety feature, in the same way that seat belts in cars are not censorship of the driver's movement.