Strings of 1s and 0s don’t really do anything useful until we agree on some conventions, such as
- byte and word size,
- endianness,
- the semantics of a string of bits, i.e. what means what (think Two’s Complement, IEEE 754, ASCII, ISO-8859-1, Shift-JIS, Unicode),
- what it means to do certain operations on bits (Boolean algebra),
- how different binary operations can be constructed from transistors / logic gates (ALU design),
- how information can be retained in and recalled from memory (basically just flip-flops),
- how said memory is laid out with respect to internal/external devices and program regions (conventions!),
- how said memory can be addressed, and how information can be transferred between CPU and memory (bus architectures),
- how the computer architecture can be programmed to do things (processor instruction sets),
- and whatever I forgot just now…
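
To make the interpretation point concrete, here’s a small Python sketch (mine, not part of the list above) showing how the very same bits mean different things depending on which convention you pick — endianness, Two’s Complement, or IEEE 754:

```python
import struct

data = b"\x42\x28\x00\x00"  # the same four bytes throughout

# As an unsigned 32-bit integer, byte order alone changes the value.
big = struct.unpack(">I", data)[0]     # big-endian:    0x42280000
little = struct.unpack("<I", data)[0]  # little-endian: 0x00002842

# As an IEEE 754 single-precision float (big-endian), it's 42.0.
as_float = struct.unpack(">f", data)[0]

# Two's Complement: the byte 0xFF is -1 signed, 255 unsigned.
signed = struct.unpack("b", b"\xff")[0]
unsigned = struct.unpack("B", b"\xff")[0]

print(big, little, as_float, signed, unsigned)
```

Same bits, five different answers — which is exactly why the conventions have to come first.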
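
The Boolean-algebra and ALU points can be sketched too. Below is a toy (my own illustration, nothing hardware-accurate): a full adder, and from it a 4-bit ripple-carry adder, built from a single `nand` function — NAND alone is functionally complete, which is roughly how real ALUs bottom out in gates:

```python
def nand(a, b):
    """The one primitive gate; everything else is built from it."""
    return 0 if (a and b) else 1

def xor(a, b):
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

def full_adder(a, b, cin):
    """One-bit add: returns (sum bit, carry out)."""
    s1 = xor(a, b)
    return xor(s1, cin), or_(and_(a, b), and_(s1, cin))

def add4(x, y):
    """Ripple-carry addition of two 4-bit numbers, LSB first."""
    carry, result = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result  # wraps modulo 16, like real 4-bit hardware

print(add4(5, 6))  # 11
```

Note that `add4(9, 9)` gives 2, not 18 — overflow silently wraps, another thing the conventions (word size, Two’s Complement) have to pin down.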
And then some people go and design and build ternary computers, imagine that.