Slower, in a different complexity class from the classical ones (e.g. O(n^2) instead of O(n), though I don't know the exact complexities of the current top candidates as a function of key length).
Larger messages, in that a signature takes tens or hundreds of MB instead of a few kB.
Larger public keys, again hundreds of MB instead of a few kB.
Larger private keys, a few to tens of MB instead of half a kB to a few kB. (For the classical baseline, see the sketch below.)
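The classical baseline is easy to measure yourself. A minimal sketch using the Python cryptography package, showing that an RSA-3072 public key, private key, and signature all land in the few-hundred-bytes-to-a-few-kB range:

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=3072)

    # Serialize to DER to get realistic on-the-wire sizes.
    pub = key.public_key().public_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    priv = key.private_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    )
    sig = key.sign(b"hello", padding.PKCS1v15(), hashes.SHA256())

    print(f"public key:  {len(pub)} bytes")   # ~400 bytes
    print(f"private key: {len(priv)} bytes")  # ~1.7 kB
    print(f"signature:   {len(sig)} bytes")   # 384 bytes for RSA-3072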
But I guess the most relevant "much worse" is that people have much less confidence that those algorithms are secure, and they are complex enough to hide huge traps that make implementing RSA look like a walk in the park.
Some of the other candidates aren't as slow but are far bigger; McEliece systems, for example, have enormous public keys.
There are lots of opportunities to make different trade-offs, at least if all of them survive a bit more scrutiny by smart, motivated opponents, but they're all generally worse than what we have now, except that they resist Shor's algorithm.
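To make "resists Shor's algorithm" concrete: Shor's quantum step finds the period r of a^x mod N in polynomial time, and then ordinary classical number theory turns that period into a factor of N, which breaks RSA. A toy sketch in Python, where order() is the part a quantum computer would do fast (here it's brute force, so only tiny N are feasible):

    import math, random

    def order(a, n):
        # Brute-force multiplicative order of a mod n. This is the
        # step Shor's algorithm speeds up exponentially on a quantum
        # computer via period finding.
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def factor_via_period(n):
        # Classical post-processing of Shor's algorithm: turn the
        # period r of a^x mod n into a nontrivial factor of n.
        while True:
            a = random.randrange(2, n)
            g = math.gcd(a, n)
            if g > 1:
                return g            # lucky: a already shares a factor
            r = order(a, n)
            if r % 2 == 0:
                y = pow(a, r // 2, n)
                if y != n - 1:      # avoid the trivial square root
                    f = math.gcd(y - 1, n)
                    if 1 < f < n:
                        return f

    print(factor_via_period(15))    # prints 3 or 5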
I don't think that's a problem for end users; you aren't constantly generating keys. It will be a problem for servers handling thousands of connections per second, but I'm sure dedicated HSMs will appear if there's a need for them.
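If you want to sanity-check that intuition, here's a quick sketch using the Python cryptography package to measure the per-operation cost of classical key generation; the same harness could wrap a PQC library's keygen for comparison (assuming you have one installed):

    import timeit
    from cryptography.hazmat.primitives.asymmetric import rsa, x25519

    # Per-handshake ephemeral keygen, the kind a busy TLS server does
    # thousands of times per second.
    n = 100
    t = timeit.timeit(x25519.X25519PrivateKey.generate, number=n) / n
    print(f"X25519 ephemeral keygen: {t * 1e6:.0f} us")

    # Long-term keygen, done rarely, so it can afford to be slow.
    n = 10
    t = timeit.timeit(
        lambda: rsa.generate_private_key(public_exponent=65537, key_size=2048),
        number=n,
    ) / n
    print(f"RSA-2048 keygen: {t * 1e3:.1f} ms")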
In any case, I'm not an expert in crypto, just a poor sysadmin-by-accident who likes reading about the latest security developments so the bad guys don't pwn my servers. And as you said, engineering is always full of trade-offs; let's see what the NIST PQC standardization process decides.