I think in the worst case it can be uniform: you can have an average delta value of 100 where the values range uniformly from 0 to 200. That case is fine, since a uniform value in 0-200 carries log2(201) ≈ 7.65 bits and can therefore be encoded in under eight bits on average. What worries me is that some worst-case spacing of the numbers will waste bits in whatever encoding scheme you choose, and that the waste will be enough to ruin the result.

You have to pick an encoding scheme in advance, but you don't know the distribution. Say you encode using 8-bit words: when the high bit is clear, the word holds a value in 0-127; when the high bit is set, the full 27 bits follow. This fails for certain distributions. For example, if a quarter of the numbers are spaced 257 apart and the rest are spaced 47 apart, every one of those 257s trips the expensive escape path and you'll go over budget.

Perhaps an adaptive scheme can be proven to work in all cases, but I don't know what that scheme is. Or maybe there's a simple scheme that can be proven for all distributions.
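A quick sketch of the arithmetic, under my reading of the scheme (the exact framing of the escape path is an assumption; the cost accounting is the point):

```python
# Escape-word scheme as described: a delta below 128 fits in one 8-bit
# word; anything larger sets the high bit and the full 27-bit value
# follows, for 8 + 27 = 35 bits in all.
def encoded_bits(delta):
    return 8 if delta < 128 else 8 + 27

# The adversarial mix from the example: a quarter of the deltas are 257
# (each one trips the escape path), the rest are 47.
deltas = [257] * 250 + [47] * 750

avg_bits = sum(encoded_bits(d) for d in deltas) / len(deltas)
print(avg_bits)  # 0.25 * 35 + 0.75 * 8 = 14.75 bits per delta
```

So this mix costs 14.75 bits per delta, far above the roughly eight bits that gaps averaging about 100 should need, even though only two distinct delta values occur.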
Arithmetic coding avoids exactly the entropy "wastage" you're describing: symbols aren't rounded up to whole bits, so there are no wasted bits in the middle of the stream the way there are with a fixed-width encoding.
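To put a number on that: an arithmetic coder's output for the whole stream is within about two bits of the Shannon ideal, the sum of -log2 p over the symbols. Assuming the coder's model knows the two delta probabilities from the 257/47 example above (an idealization, since in practice the model must adapt), the accounting looks like this:

```python
import math

# Ideal code length under a model that knows the symbol probabilities.
# Arithmetic coding emits roughly sum(-log2 p(s)) bits for the whole
# stream; nothing is rounded up per symbol as with fixed-width words.
probs = {257: 0.25, 47: 0.75}
deltas = [257] * 250 + [47] * 750

ideal_bits = sum(-math.log2(probs[d]) for d in deltas)
per_symbol = ideal_bits / len(deltas)
print(per_symbol)  # binary entropy H(0.25) ~= 0.81 bits per delta
```

Under one bit per delta for a distribution where the byte-aligned escape scheme needs 14.75, which is the whole case for entropy coding here.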