Yeah, but not so you'd notice, coming from an ML or Haskell background. /sarcasm
More seriously, Java and C++ (and C and Algol before them) fell into the trap of defining data-size specifications as types: the discontinuity between "long long int" and "int" is one example, as is the discontinuity between "float" and "double".
The real type difference is semantic: "numeric value used to represent age in years, rounded to an integral value", "numeric value used to represent age in days, rounded to an integral value", and "numeric value used to represent weight in pounds, rounded to the nearest hundredth" are genuinely different types, even when they share a machine representation. Foogol (Algol-derived) languages largely only got this when they embraced Smalltalk-style OO, and even then they kept their size-specification pseudo-types, just to muddy the issue.
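To make that concrete, here's a sketch in Rust using the newtype pattern, with made-up names (AgeYears, AgeDays, WeightPounds) for the three example types. All three wrap an ordinary machine number, but the compiler refuses to confuse them:

```rust
// Hypothetical newtype wrappers: identical machine representations,
// but distinct types as far as the compiler is concerned.
#[derive(Debug, Clone, Copy, PartialEq)]
struct AgeYears(i64);

#[derive(Debug, Clone, Copy, PartialEq)]
struct AgeDays(i64);

#[derive(Debug, Clone, Copy, PartialEq)]
struct WeightPounds(f64);

impl WeightPounds {
    // Enforce the "rounded to nearest hundredth" rule at construction.
    fn new(raw: f64) -> Self {
        WeightPounds((raw * 100.0).round() / 100.0)
    }
}

// Conversions exist, but they must be explicit.
impl From<AgeYears> for AgeDays {
    fn from(y: AgeYears) -> Self {
        AgeDays(y.0 * 365) // ignoring leap years for the sketch
    }
}

fn can_vote(age: AgeYears) -> bool {
    age.0 >= 18
}

fn main() {
    let age = AgeYears(21);
    let days = AgeDays::from(age);
    println!("{:?} -> {:?}, can vote: {}", age, days, can_vote(age));
    // can_vote(days) is a compile error: expected AgeYears, found AgeDays.
    println!("{:?}", WeightPounds::new(150.456));
}
```

The representation is still just an i64 or f64 underneath; the type carries the meaning, which is exactly what "int" versus "long long int" fails to do.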
Size specifications are useful when defining a wire protocol, where the overhead of reading and writing an ASCII format with non-trivial syntax would kill your application and you need a network-header-style bits-on-the-wire encoding, and in file formats with similar constraints. But even there, they're ultimately a low-level implementation detail.
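That's the one place where fixed widths are the point. A sketch, using a made-up 8-byte header (2-byte version, 2-byte message type, 4-byte payload length, big-endian), where u16 and u32 earn their keep because the field widths are the protocol:

```rust
// Hypothetical wire header: version (u16), message type (u16),
// payload length (u32), all big-endian, 8 bytes total.
fn encode_header(version: u16, msg_type: u16, payload_len: u32) -> [u8; 8] {
    let mut buf = [0u8; 8];
    buf[0..2].copy_from_slice(&version.to_be_bytes());
    buf[2..4].copy_from_slice(&msg_type.to_be_bytes());
    buf[4..8].copy_from_slice(&payload_len.to_be_bytes());
    buf
}

fn decode_header(buf: &[u8; 8]) -> (u16, u16, u32) {
    (
        u16::from_be_bytes([buf[0], buf[1]]),
        u16::from_be_bytes([buf[2], buf[3]]),
        u32::from_be_bytes([buf[4], buf[5], buf[6], buf[7]]),
    )
}

fn main() {
    let wire = encode_header(1, 42, 1024);
    println!("{:02x?}", wire);
    assert_eq!(decode_header(&wire), (1, 42, 1024));
}
```

Note that even here the sizes belong at the serialization boundary; the rest of the program is better off handling an AgeYears-style semantic type and letting the encoder worry about how many bytes it occupies.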
This also matters for tooling like editor autocomplete: an editor can do a lot more for you when it knows what kind of data you're passing around, as opposed to just how big it is and other such trivialities.