But "safe" and "unsafe" are not a binary distinction.
Maybe for line-of-business (LOB) applications it would be better if languages defaulted to arbitrary-precision arithmetic. But scientific computing is also a huge field that often uses the same languages, and there arbitrary precision is often the completely wrong tool: with exact rational arithmetic, the intermediate values in certain key algorithms (like Gaussian elimination) can blow up in size, making them exponentially more expensive.
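To make that cost concrete, here's a minimal sketch (hypothetical helper names, strictly diagonally dominant matrices assumed so no pivoting is needed) of Gaussian elimination over Python's `fractions.Fraction`. Note that `Fraction` reduces via gcd, which tames the worst-case blow-up, but the bit-size of the intermediate entries still grows with the matrix dimension, whereas a float entry stays 64 bits regardless:

```python
from fractions import Fraction
import random

def gauss_eliminate(rows):
    """Forward elimination over exact rationals; returns the largest
    bit-size (numerator + denominator) seen in any intermediate entry."""
    n = len(rows)
    a = [[Fraction(x) for x in row] for row in rows]
    max_bits = 0
    for k in range(n):
        for i in range(k + 1, n):
            factor = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= factor * a[k][j]
                bits = (a[i][j].numerator.bit_length()
                        + a[i][j].denominator.bit_length())
                max_bits = max(max_bits, bits)
    return max_bits

def random_dominant(n, rng):
    # strictly diagonally dominant => every pivot stays nonzero,
    # so plain elimination without row swaps is safe
    return [[100 if i == j else rng.randint(-2, 2) for j in range(n)]
            for i in range(n)]

rng = random.Random(0)
for n in (5, 10, 20):
    print(n, gauss_eliminate(random_dominant(n, rng)))
```

Each exact operation also gets slower as the operands grow, so the total cost climbs well past the nominal O(n^3) you'd pay with fixed-width floats, and the result is usually precision you didn't need anyway.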
I feel like this is just one of those things developers should know about so they can make the right choice, just like DB indices.