I halfway agree with the first part, and I like the statement by the maintainers.
But I think other libraries have done this too. React is a great example: it prioritized making the reactivity model really strong and treated everything else (like performance or bundle size) as less important, to the point that the official answer to performance issues is a compiler that auto-optimizes your components, because they don’t want to compromise on the reactivity model itself.
To me, that feels like a subjective choice, driven more by vibes than by concrete metrics. (Not making a value judgement, just describing what I’ve heard at conferences.)
In other words, I don’t think “objective BS” got us into this place. Shops that care a lot about performance set “performance budgets” for things like TTFB or Time to Interactive. That’s a more objective approach, one that actually caps how many megabytes go over the wire.
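For what it’s worth, those budgets can be enforced in tooling rather than just stated in a doc. A sketch of what a Lighthouse `budget.json` looks like (the numbers here are invented for illustration, not recommendations):

```json
[
  {
    "path": "/*",
    "timings": [
      { "metric": "interactive", "budget": 5000 }
    ],
    "resourceSizes": [
      { "resourceType": "script", "budget": 300 },
      { "resourceType": "total", "budget": 1000 }
    ]
  }
]
```

Timings are in milliseconds and resource sizes in KB; wire it into CI (e.g. Lighthouse CI) and a PR that blows the script budget fails the build instead of quietly shipping.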