I do believe this is zero-sum, in that improving on one set of decisions means not applying the same rigor to others.
This often shows up as very smart people believing conspiracy theories, or throwing up their hands at other massive issues. As an example, the "Rationalist crowd" has de-emphasized work on climate change mitigation in favor of more abstract work on AI safety.