Oh, absolutely. This actually shows that floats are (in some sense) more rigorous than idealised mathematical approaches, because they explicitly deal with finite memory.
Oh, I remembered! There's also interval arithmetic, and variants like affine arithmetic. At least then you know when you're losing precision. Why don't these get used more? They seem more ideal, somehow.
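A minimal sketch of the idea in Python (the `Interval` class here is illustrative, not any library's API; real implementations such as MPFI or mpmath's `iv` context use directed rounding throughout):

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        # Round outward with nextafter so the true result is always enclosed.
        return Interval(math.nextafter(self.lo + other.lo, -math.inf),
                        math.nextafter(self.hi + other.hi, math.inf))

    def __sub__(self, other):
        # [a,b] - [c,d] = [a-d, b-c], again rounded outward.
        return Interval(math.nextafter(self.lo - other.hi, -math.inf),
                        math.nextafter(self.hi - other.lo, math.inf))

    def width(self):
        return self.hi - self.lo

# Adding a tiny value to 1.0 and subtracting 1.0 back: in plain floats the
# tiny value is silently absorbed; with intervals the bounds widen instead,
# so the precision loss is visible in the interval's width.
x = Interval(1.0, 1.0)
y = (x + Interval(1e-16, 1e-16)) - x
print(y.lo <= 1e-16 <= y.hi)   # the true answer 1e-16 is enclosed
print(y.width())               # width far exceeds 1e-16: cancellation exposed
```

The outward rounding is the key design choice: each bound is nudged one ulp in the pessimistic direction, so the interval is guaranteed to contain the exact real-number result even though the endpoints themselves are floats.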