This completely misses the point of Big-O notation. If you start trying to analyze how long individual integer operations take, your complexity analysis falls apart for any algorithm once the integers get sufficiently large. That is precisely why such operations are treated as constant time: it becomes impossible to do any meaningful platform-independent analysis otherwise.
That's not to say you should never take it into account, but the algorithm he described is, in fact, O(log n). It just is; that's how the math works. You should note that the O(log n) bound does not account for arbitrary integer sizes, which will cause progressively larger constant factors to drag down the algorithm's real-world efficiency... but it is still an O(log n) algorithm.
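The original algorithm isn't quoted here, so as a hypothetical stand-in, here is a classic O(log n) algorithm where this exact caveat applies: exponentiation by squaring, which performs O(log n) multiplications under the unit-cost model, even though each multiplication on arbitrary-precision integers gets more expensive as the numbers grow.

```python
def power(base: int, n: int) -> int:
    """Compute base**n using O(log n) multiplications (n >= 0)."""
    result = 1
    while n > 0:
        if n & 1:          # odd exponent: fold in one factor of base
            result *= base
        base *= base       # square the base
        n >>= 1            # halve the exponent
    return result

# The loop runs about log2(n) times, so under the unit-cost model this
# is O(log n). With Python's arbitrary-precision ints, each multiply
# grows more expensive as the operands grow -- the "progressively
# larger constant factors" caveat -- but the operation count is still
# O(log n).
print(power(3, 13))  # 1594323
```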