Bullshit. You can just use ANY abliterated or otherwise non-safety-aligned model. The ‘hard R or you’re hardware’ test has been popular for a while now, but any self-hosted LLM that’s been de-aligned will gladly say whatever.
People keep believing this and keep getting fooled by LLMs all day.