Say you have a function like

int foo_do(struct foo *, int);

but to fix a bug and/or tweak the API you change it to

long foo_do(struct foo *, int, int);
then ELF symbol versioning allows you to do

#include <assert.h>
#include <limits.h>

/* v1.1 is the default version (@@), bound by newly linked programs. */
__asm__(".symver foo_do_v1_1,foo_do@@v1.1");
long foo_do_v1_1(struct foo *F, int arg1, int arg2) {
	...
}

/* v1.0 (single @) is a compatibility alias for old binaries. */
__asm__(".symver foo_do_v1_0,foo_do@v1.0");
int foo_do_v1_0(struct foo *F, int arg1) {
	long rv = foo_do_v1_1(F, arg1, 0);
	assert(rv >= INT_MIN && rv <= INT_MAX);
	return (int)rv;
}
where the runtime linker will bind foo_do to foo_do_v1_0 for programs originally linked against the v1.0 release, while programs built against v1.1 will be bound to foo_do_v1_1. You can do this as often as you want, though you generally can't go back further than the point at which you first began using ELF symbol versioning to compile and release libfoo. You only need to add an ELF .symver alias for functions that have multiple versions, but you do need to at least enable versioning (usually by specifying a version script with a catch-all "*" entry that tags functions not explicitly aliased) at the point you begin maintaining a stable ABI.

glibc is pretty much the only major library that makes use of this capability, despite the fact that it's been around for well over a decade. Most developers simply don't have the foresight or interest to provide rigorous forward and backward ABI and API compatibility. Partly that's because in the open source world, recompiling packages is much easier than in the proprietary world. And especially in the Windows world (where the CRT was never forward or backward compatible), you often packaged dependencies with your software, even if dynamically linked. And so newer languages like Go and Rust are being built on the presumption that both recompiling and bundling dependencies are the norm--it's what people are doing anyhow, and it simplifies the compiler and its runtime. That it's sad this is the norm is beside the point.
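For what it's worth, the version script part looks roughly like this -- a minimal sketch; the file name libfoo.map is my own invention, and the version node names just need to match the ones in the .symver directives:

/* libfoo.map */
v1.0 {
	global:
		foo_do;
	local:
		*;	/* catch-all: tag everything not explicitly listed */
};

v1.1 {
} v1.0;		/* v1.1 inherits from v1.0 */

You then pass it at link time with something like gcc -shared -Wl,--version-script=libfoo.map, and can confirm the versioned symbols with readelf --dyn-syms on the resulting libfoo.so.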