Talking to experts in the field, we're doubling the number of functional qubits (i.e., accounting for error-correction requirements) about every 12 months right now. The current number looks like 16 qubits, while many algorithms get "really interesting" around 1000 qubits (and are useful even before that). So 5-6 years until a potential total transformation in computing paradigms. Quantum is in a similar place to deep learning in 2012.
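The 5-6 year figure falls out of the doubling math. Here's a quick back-of-the-envelope check, taking the comment's numbers (16 qubits now, 1000 as the threshold, doubling yearly) at face value:

```python
import math

current_qubits = 16      # current count quoted above
target_qubits = 1000     # "really interesting" threshold
doubling_time_years = 1  # doubling roughly every 12 months

# Number of doublings needed to go from 16 to 1000 qubits
doublings = math.log2(target_qubits / current_qubits)
years = doublings * doubling_time_years
print(round(years, 1))  # ~6 years, if the trend holds
```

Of course, that "if the trend holds" is doing all the work, which is exactly the point the replies below make.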
It's anyone's guess as to how far we'll go, but generally, extrapolating a curve far beyond its current reading is a case of innumeracy.
More generally: nothing is inevitable, especially when we're talking about orders of magnitude. Maybe it happens as quickly as we'd hope, maybe it doesn't. Maybe it doesn't happen in the near-to-intermediate future at all. That goes for self-driving cars, quantum computers, general artificial intelligence, or whatever else.
Moore's law was inevitable, until it wasn't.
I highly doubt we'll get there in 5 years, myself.