If you know the answer, it takes less than a couple of minutes to rank all the LLMs.
Sure, Gemini and ChatGPT may be better at counting potatoes, but why the hell would you want a "better" LLM that actively obscures the truth, just for a slightly more logical brain? It's the equivalent of hiring a sociopath: sure, his grades are good, but what about the important stuff, like honesty? It may sound a bit OTT, but issues like this will only become more apparent as alignment efforts continue.
Does alignment affect ROI? I have no idea.
And if anyone cares, no, I'm not looking to get laid; it's just the first thing that would piss off an aligned LLM.