I am not sure how awesome it is. It looks like something you could stumble upon by accident on a whiteboard during an interview and use successfully even without knowing exactly why it works.
Honestly, I always thought it was common knowledge, and that it was simply too simple to make it into books.
People in CS/IT tend not to spend much time on algorithms with bad complexity, so I am used to people discounting algorithms with useful properties just because another one is slightly more efficient as n goes to infinity.