>Using the most commonly used version of the product, on the most commonly used hardware, at least 2 days a week should be a prerequisite for every product owner.

I am a firm believer that the software should also be developed on commonly used hardware.
Your average user isn't going to have a top-of-the-line MacBook Pro, and your program isn't going to be the only thing running on their machine.
It may run fine on your beefed-up monstrosity, so you won't feel the need to care about performance (worse: you may justify laggy performance with "it runs fine on my machine"). Your users will pay the price for the bloat, which becomes an externality.
Same for websites. Yes, you are going to have a hundred tabs open while working on your web app, but guess what - so will your users.
Performance isn't really the product team's domain, in the sense that they would always be happier with things being snappier; they have to rely on the developers' word as to what's reasonable to expect.
So the expectation becomes that the software can only run well on whatever hardware the developer has, taking all the resources available, and that any optimization beyond that is costly and unnecessary.
Giving the devs more modest hardware to develop with (limited traffic, cloud compute, CPU time, ...) solves this problem preemptively: the developers themselves feel the discomfort of a slow product, and so have the motivation to improve performance without the product team demanding it.
The product team, of course, should also use the same modest hardware; otherwise, they'll deprioritize performance improvements.
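You don't even need to buy old laptops to get part of this effect: resource caps can be imposed at the process level. Here's a minimal Python sketch (Unix-only, and the 1 GiB figure is an arbitrary assumption, not a recommendation) that makes runaway allocations fail loudly during development instead of silently eating a user's machine:

```python
import resource

# Cap this process's address space to mimic a memory-constrained machine,
# so memory bloat hurts during development rather than in production.
# 1 GiB is an arbitrary assumption, not a recommendation.
LIMIT = 1 * 1024 ** 3  # 1 GiB
resource.setrlimit(resource.RLIMIT_AS, (LIMIT, LIMIT))

hit_limit = False
try:
    waste = bytearray(2 * 1024 ** 3)  # try to grab 2 GiB under a 1 GiB cap
except MemoryError:
    hit_limit = True  # the cap turned silent bloat into a loud failure

print("allocation refused under the cap" if hit_limit else "allocation succeeded")
```

CPU and bandwidth can be capped similarly at the OS level (cgroups, VM settings, browser devtools throttling); the point is that the constraint lives on the dev machine, not in a checklist.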
----
TL;DR: overpowered dev machines turn bloat into an externality.
Make devs use 5+-year-old commodity hardware again.