After dealing with JavaScript and Lua, I am ready to call this single-number-type design a complete anti-pattern. To be a good language, you must support at least one machine-sized float type and one machine-sized int type. To be really good, you should let me choose any machine-supported width of float or int. To be great, you should also support rationals, fixed-point, and complex numbers out of the box.
Giving me floats but not ints just doesn't cut it. It works, in a kind of shoddy way, but … it's tasteless.
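To make the shoddiness concrete, here is a quick sketch of where float-only "integers" fall apart. It's written as TypeScript, but plain JavaScript behaves identically, since both use IEEE-754 doubles as their one number type:

    // Above 2^53, a double can no longer represent every whole number,
    // so adjacent "integers" silently collide.
    const big = 2 ** 53;            // 9007199254740992
    console.log(big === big + 1);   // true — the +1 rounds away to nothing

    // And the classic decimal-rounding surprise:
    console.log(0.1 + 0.2 === 0.3); // false

Neither line throws, warns, or fails loudly; the language just quietly gives you the wrong arithmetic. That is exactly the shoddy-but-working behavior I mean.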
If you don't provide me with bitwise operations (earlier versions of Lua, I'm looking at you), then you don't get to call yourself a real language.
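For the record, Lua only gained the bit32 library in 5.2 and native bitwise operators (plus real integers) in 5.3. Before that, code that needed bit manipulation had to grind out each bit with division and modulo. The sketch below shows roughly what that emulation looked like; it's in TypeScript to match the example above, and the function name band is just my own illustration, not any particular library's API:

    // Hypothetical sketch: a 32-bit bitwise AND built from nothing but
    // floating-point arithmetic, the way pre-bitwise Lua code had to do it.
    function band(a: number, b: number): number {
      let result = 0;
      let bit = 1;
      for (let i = 0; i < 32; i++) {
        // If the lowest bit is set in both operands, set it in the result.
        if (a % 2 >= 1 && b % 2 >= 1) {
          result += bit;
        }
        // Shift both operands right by one bit, arithmetic-style.
        a = Math.floor(a / 2);
        b = Math.floor(b / 2);
        bit *= 2;
      }
      return result;
    }

    console.log(band(0b1100, 0b1010)); // 8, i.e. 0b1000

Thirty-two iterations of modulo and floor division to do what one CPU instruction does. That is what "not a real language" looks like in practice.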
For a database, though, I suppose one could always store integers as their string representation. But please: no language should ever do this again.