One of the core ideas of working with LTS is that you can build your software on an LTS release and ship it to somebody else on the same LTS release, either as source or as a binary.
If you want the latest GCC, that's fine, you're not forced to use the default compiler distributed with your OS. But it doesn't make sense to update the default compiler used in an LTS release. If you want that, then you don't want LTS.
If you want to develop an application, you use your own toolchain. But yes, I know most C++ people don't do this, because C++ tools don't easily support it. That's on C++ for not having an equivalent of pyenv, rustup, multiruby, etc.
IIRC there's still an issue with gethostbyname (and the other NSS functions), which require glibc to be dynamically linked.
This needs to be emphasized.
Yes, and updating compilers doesn't prevent that at all. You can use GCC 10 to ship code that will build and run on Ubuntu 12.04 without issues. Xcode 11 can ship code that works back to macOS 10.6, and Visual Studio 2019 can still optionally target Windows fucking XP!
This is incorrect. In practice, for larger code bases, upgrading to a newer version of GCC or Clang is something that must be done purposefully, and you must test.
Sometimes it turns out that your code relies on some compiler behavior which has changed. Sometimes newer compilers are stricter than older compilers. There are plenty of real-world cases of these problems!
> Xcode 11 can ship code that works back to macOS 10.6 [...]
There are a number of features that are specific to the macOS toolchain which make this possible. Take a look at the "-mmacosx-version-min" flag on the macOS compiler. This selectively enables and disables various APIs. These features don't solve all the compatibility problems, either.
> Visual Studio 2019 can still optionally target windows fucking XP !
We're talking about Linux here. The Windows toolchain is radically different.
In practice this can be undefined behavior like dangling pointers or data races. Maybe the new version of the compiler happens to reorder a couple of instructions (which it's perfectly within its rights to do), which turns a "benign" race into a crash or an exploitable security issue. Is it your fault for writing these bugs? Sure. But if you're a big organization, and you know you have bugs like this, is this a reason not to upgrade your compiler? Absolutely. All the real-world testing you've done on your current binaries has value, and losing some of that value needs to be weighed against the benefits of upgrading.
So if you hit issues, do what you do on every other system, which is installing an older Xcode / Visual Studio? That wouldn't be an issue at all if the toolchain weren't vendored as part of the distro; you'd just have something like rustup that lets you use whatever version of the toolchain your project requires.
> There are a number of features that are specific to the macOS toolchain which make this possible. Take a look at the "-mmacosx-version-min" flag on the macOS compiler. This selectively enables and disables various APIs.
Yes? https://gcc.gnu.org/onlinedocs/gcc/C_002b_002b-Dialect-Optio...
> We're talking about Linux here. The Windows toolchain is radically different.
What I'm saying is exactly that the Linux desktop world would be in a far better place if it followed the Windows / macOS way of vendoring toolchains.
The only actual issue on Linux if you want backward compatibility is glibc, which does not have an easy way (AFAIK) to say "I want to target this old glibc version". But that's not relevant to what we are talking about, which is getting newer compilers onto a given distro - Red Hat & derivatives manage this without issue with the various devtoolsets, for instance.
Or you could switch to LTS, which achieves the same thing.
You do understand that ABI backward compatibility is not ensured, don't you?
https://gcc.gnu.org/onlinedocs/libstdc++/manual/abi.html
Some software packages even break between distro releases.
The primary value of a distro is to provide a fixed platform that application developers and users can safely target. Risking ABI breakage just because a small number of users wish to be on the bleeding edge, without wanting to do any of the work to install their own software, is something that's very hard to justify.
> The GNU C++ compiler, g++, has a compiler command line option to switch between various different C++ ABIs. This explicit version switch is the flag -fabi-version.
If you want to target a given distro, you set -fabi-version to that distro's ABI, just like you set -mmacosx-version-min on macOS or _WIN32_WINNT on Windows.
That is also mind-numbingly absurd to force upon the vast majority, who couldn't care less about the bleeding edge and want a stable platform to act as a fixed target without risking random ABI breakage.
I should not be forced to endure a brittle, fragile, overly complex compilation process just because some random guy somewhere had a whim about taking a new compiler out for a spin.
The world expects stability. If you wish to try out some stuff, just download the compiler and build the damn thing yourself. Hell, odds are that there's already a PPA somewhere. So where's the need to screw over everyone?