What services? Web crawlers? I'm sure the ones I would care about are smart enough to know how this works. There are many ways to mint infinite valid URLs: query params, subdomains, and hash routes, to name a few.
> if the title changes the URLs distributed are now permanently wrong as they stored part of the content (and if you redirect to correct, can lead to temporary loops due to caches),
You don't redirect. The server doesn't even look at the slug part of the URL for routing purposes. You can change the URL with JavaScript post-load if it bothers you (as Stack Overflow does). Cache loops are an entirely avoidable problem here.
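To make that concrete, here's a minimal sketch of slug-agnostic routing (the function name and path pattern are hypothetical, not Stack Overflow's actual code): only the numeric ID matters, so a wrong or outdated slug still resolves to the same page, with no redirect involved.

```typescript
// Extract the numeric question ID from a path like
// /questions/35998629/any-slug-here — the slug part is ignored entirely.
function questionIdFromPath(path: string): number | null {
  const match = path.match(/^\/questions\/(\d+)(?:\/.*)?$/);
  return match ? Number(match[1]) : null;
}

// All of these resolve to the same question, stale slug or not:
questionIdFromPath("/questions/35998629/typescript-constructor-overload");
questionIdFromPath("/questions/35998629/totally-outdated-slug");
questionIdFromPath("/questions/35998629");

// After load, the client can quietly swap in the canonical slug
// without any navigation or redirect:
//   history.replaceState(null, "", `/questions/${id}/${canonicalSlug}`);
```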
> the URL is now extremely long and since most users don't know if a given website does this weird "part of the URL is meaningless" thing there are tons of ways of manually sharing the URL that are now extremely laborious
Extremely long and extremely laborious seem a bit of an exaggeration. Most users copy and paste, no? Adding a few characters of a human-readable tag doesn't warrant this response, I feel. Especially since the benefit is that if I paste a URL somewhere, I can quickly error-check it to make sure it's the title I mean. And when using the share button, the de-slugged URL can be given.
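The tag is also cheap to generate, since it's derived mechanically from the title. A plausible sketch (the function name and exact rules are my assumption, not any particular site's implementation): lowercase, drop punctuation, hyphenate whitespace.

```typescript
// Turn a title into a URL-safe slug: lowercase, strip punctuation,
// collapse whitespace runs into hyphens, trim stray hyphens.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "")
    .replace(/\s+/g, "-")
    .replace(/^-+|-+$/g, "");
}

slugify("TypeScript constructor overload with empty constructor");
// → "typescript-constructor-overload-with-empty-constructor"
```

Since the slug is a pure function of the title, the server never needs to store it, which is exactly why it can ignore it for routing.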
> users who share the links will think "the person can read the URL, so I won't include more context" and the person receiving the links thinks "and the URL has the title, which I can trust more than what some random user adds".
I guess? I won't bother with a rebuttal because this issue seems so minor. The benefit far outweighs some users maybe providing less context because the link URL made them do it. If someone says "My TypeScript won't compile because of my constructor overloading or something, please help", I can send links like:
stackoverflow.com/questions/35998629/typescript-constructor-overload-with-empty-constructor
stackoverflow.com/questions/26155054/how-can-i-do-constructor-overloading-in-a-derived-class-in-typescript
which I think is so much more useful than just IDs.
> Many web browsers don't even show the URL anymore: the pretense that the URL should somehow be readable is increasingly difficult to defend
Most do. Even so, the address bar is not the only place a URL is seen. Links in text all over the internet have URLs, particularly when shared as unformatted text (i.e. not anchor tags). And URLs should be readable to some extent. Would you suggest that all pages might as well be unique IDs? A URL like:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...
is much better than
https://developer.mozilla.org?articleId=10957348203758
> how about doing it using a # on the page, so at least everyone had a chance to know that it is extra
Fair enough - I think that's a fine idea.