It goes a bit deeper than that.
Most statically typed languages (C, Java, Haskell, etc.) are memory-representation-centric in their primitive types. Ada is usage-centric.
You see this in two ways.
First, in most languages any item with underlying implementation X can hold any legal value of X, even when it makes no sense for the quantity being modeled. If you want to represent a quantity whose maximum value is one million, putting it in anything shaped like a 32-bit int will let it contain the illegal value "one million and one". Contrast Ada, which lets you cap the legal range in the type itself.
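A small C sketch of that first point: the names `quantity_t` and `store_quantity` are hypothetical, chosen just for illustration. The typedef documents the intended cap, but C never checks it; the Ada declaration in the comment is the standard way to state the cap in the type.

```c
/* Intent: a quantity in the range 0 .. 1,000,000.
   In C the typedef records the intent but enforces nothing. */
typedef int quantity_t;

/* Returns whatever we assign, with no range check anywhere. */
quantity_t store_quantity(int v) {
    quantity_t q = v;   /* "one million and one" is accepted silently */
    return q;
}

/* In Ada you can state the cap in the type itself:

     type Quantity is range 0 .. 1_000_000;

   and an out-of-range assignment raises Constraint_Error. */
```

So `store_quantity(1000001)` compiles and runs without complaint, even though the value is illegal by intent.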
Second, in most languages any item with underlying implementation X can contain anything else with the same implementation. Your typedef'd int can be put into anything else int-sized. Contrast Ada, which won't permit you to mix apples and oranges if they are defined as distinct types with int storage.
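And a sketch of the second point, again in C with hypothetical names (`apples_t`, `oranges_t`, `add_apples`, `mix_fruit`): typedefs are only aliases for `int`, so the compiler happily mixes them. The Ada fragment in the comment shows the distinct-type declaration the text is contrasting with.

```c
/* Two "distinct" unit types in C — but typedef only creates an
   alias, so both are plain int to the type checker. */
typedef int apples_t;
typedef int oranges_t;

apples_t add_apples(apples_t a, apples_t b) { return a + b; }

/* Oranges pass where apples are expected; this compiles cleanly. */
int mix_fruit(oranges_t o) {
    return add_apples(o, o);
}

/* In Ada, two derived types with the same integer storage do not mix:

     type Apples  is new Integer;
     type Oranges is new Integer;

   Passing an Oranges value where Apples is expected is a compile-time
   error unless you convert explicitly, e.g. Apples (My_Oranges). */
```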