And honestly we need better mechanics for machine-intended references regardless. It's ridiculous that resolving and loading a webpage then requires you to resolve a bunch more human-readable (i.e., non-decentralized) names just to load its subresources. For example, going to a bookmarked page shouldn't result in any DNS queries for human-readable names.
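To make the subresource point concrete, here's a minimal sketch that parses a page and collects the distinct hostnames its subresources reference; each one typically costs its own DNS lookup on top of the page's. The hostnames in the sample HTML are made up for illustration.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class SubresourceHosts(HTMLParser):
    """Collect the distinct hostnames referenced by a page's subresources."""

    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).hostname
                if host:
                    self.hosts.add(host)

# Hypothetical page with third-party subresources (hostnames invented).
page = """
<html><head>
  <link rel="stylesheet" href="https://cdn.example-assets.net/site.css">
  <script src="https://js.example-tracker.io/t.js"></script>
</head><body>
  <img src="https://images.example-cdn.org/logo.png">
</body></html>
"""

parser = SubresourceHosts()
parser.feed(page)
# Each distinct hostname here generally means one more DNS resolution,
# even when the page itself came from a bookmark.
print(sorted(parser.hosts))
```

So even this tiny page drags in three extra human-readable names to resolve before it renders fully.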
And why should that distinction be made? "New gTLDs" have been active since 2013. That's nearly 10 years of usage you're still arguing should be invalidated.
To be clear, I'm not picking on things like .biz etc. Even though those were obvious cash grabs, they're at least widely applicable and thus widely adopted. But things like .christmastrees, .business, .companyname, .morebiglongwords, and so on? Over "nearly 10 years of usage" I can still count on my fingers the number of times I've seen TLDs like these actually used.