$30 / million tokens to $5 / million tokens since GPT-4's original release = 6X improvement
4000 token context to 128k token context = 32X improvement
5.4-second voice mode latency to 320 milliseconds = ~17X improvement.
I guess I got a bit excited by including cost, but that's close enough to an order of magnitude for me. And that's ignoring the fact that it's now literally free in ChatGPT.
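For anyone who wants to check my arithmetic, here's a quick sanity-check of those ratios (the before/after numbers are the ones quoted above):

```python
# Improvement factors for the GPT-4-era numbers cited above.
metrics = {
    "cost ($ / million tokens)": (30.0, 5.0),      # launch price vs. current
    "context window (tokens)":   (4_000, 128_000), # original vs. current
    "voice latency (seconds)":   (5.4, 0.320),     # old voice mode vs. new
}

for name, (old, new) in metrics.items():
    # Improvement is the ratio of the larger value to the smaller,
    # regardless of whether "better" means bigger (context) or smaller
    # (cost, latency).
    factor = max(old, new) / min(old, new)
    print(f"{name}: {factor:.1f}X")
```

The latency ratio comes out at 5.4 / 0.320 ≈ 16.9, so "~17X" is the honest rounding.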