Meh, that's a weird take. If it were, it would ask people to act only to maximize outcomes and ignore all other constraints, since those would be merely secondary. As far as I'm aware, it doesn't; it only suggests that, if you want to do something altruistic, you might want to look at what's most effective rather than just doing whatever vaguely feels or sounds like it might be helpful.