Starting with bits and bytes and hardware would be pretty goddamn boring, I think. You'd lose a lot of students that way.
That said, in our third year we had a course in analog electronics, which was basically transistors and how you build logic gates out of them.
Following that course was one in digital electronics, where we all built our own little toy 8-bit computer, wiring the CPU ourselves and writing the microcode ourselves. I'll never forget the a-ha moment when you realize that your instruction set is just a bunch of binary patterns saying which wires to put a current on, which units to toggle on and off. The instruction to move a value from a register to an address in RAM has to look the way it does because you need to toggle the read line on the correct register and the write line on the RAM unit, and everything else has to be off. Blew my mind at the time.
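If you've never seen it spelled out, here's a rough sketch in C of what that amounts to. The control-line names are made up for illustration, not the actual lab hardware, but the idea is the same: each control line gets one bit, and an instruction is just the OR of the lines you want high on that tick.

    /* Hypothetical control lines for a toy 8-bit machine -- names are
       invented for illustration, not taken from any real course build. */
    #include <stdio.h>
    #include <stdint.h>

    enum {
        REG_A_OUT = 1 << 0,  /* register A drives the bus             */
        REG_B_OUT = 1 << 1,
        RAM_READ  = 1 << 2,  /* RAM drives the bus                    */
        RAM_WRITE = 1 << 3,  /* RAM latches whatever is on the bus    */
        PC_STEP   = 1 << 4,  /* advance the program counter           */
        HALT      = 1 << 5,
    };

    int main(void) {
        /* "Store register A to memory" is nothing more than: A drives
           the bus, RAM latches the bus, the program counter moves on. */
        uint8_t store_a = REG_A_OUT | RAM_WRITE | PC_STEP;
        printf("STA control word: 0x%02X\n", store_a);
        return 0;
    }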
The clock could be driven manually if you wanted, so you could step through and watch your little CPU run a program, or you could set it to like 1 Hz and watch the thing go. And from there you can sort of get how a modern computer works: it's just a matter of going from 1 Hz to 1 GHz, wider buses, wider instruction sets. But it's no longer "magic" how the CPU works. It's all ones and zeroes, for a reason, and you now know that reason.
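For what it's worth, here's a tiny C sketch of what that hand-cranked clock feels like in software. The program and the steps are invented for illustration, but the point carries over: nothing happens until you pulse the clock, and each pulse does exactly one thing.

    /* A rough software analogue of stepping the clock by hand: one
       getchar() per tick, each tick executes one step of a made-up
       program. Not the real lab machine, just the feel of it. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint8_t ram[16] = {5, 3};  /* two operands sitting in "memory" */
        uint8_t a = 0, b = 0;      /* registers */

        const char *steps[] = {
            "load A from ram[0]",
            "load B from ram[1]",
            "add B into A",
            "store A to ram[2]",
        };

        for (int tick = 0; tick < 4; tick++) {
            printf("tick %d: %s  (press Enter to pulse the clock)\n",
                   tick, steps[tick]);
            getchar();  /* the "manual clock" */
            switch (tick) {
                case 0: a = ram[0]; break;
                case 1: b = ram[1]; break;
                case 2: a = (uint8_t)(a + b); break;
                case 3: ram[2] = a; break;
            }
        }
        printf("ram[2] = %d\n", ram[2]);
        return 0;
    }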