"like a yubikey, some physical manifestation that can't be easily cloned"
And this is what I meant by "things you have" being, in practice, just "things you know" wrapped in obscurity. If you know the contents of a yubikey, you could store them in your password manager and use the password manager to emulate it.
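A minimal sketch of this point, using TOTP (RFC 6238) rather than a yubikey's actual protocols: the "thing you have" is, underneath, a shared secret, and any software that knows that secret produces exactly the same codes a hardware token would. The secret below is the RFC 6238 test key, not a real credential.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """RFC 6238 TOTP: anyone who knows the secret computes the same codes."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret; a hardware token provisioned with the same bytes
# would display exactly this code at Unix time 59.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # → 287082
```

The "possession" property lives entirely in the assumption that the secret never leaves the device; once it is known, possession is fully emulated.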
Mind you, it can be good, solid obscurity. It's fun and educational to read about all the security in a yubikey, and to me in practice it certainly is a "thing I have", because I'm thousands of dollars' worth of hardware and weeks, months, or years short of the requisite skills to penetrate one.
But there is still a sense in which it fails to be the platonic manifestation of a true "thing you have", because under the hood it's still a thing you know. At scale this matters.
At scale, biometrics also has the problem of becoming a thing you know. Again, in the platonically perfect world where, I dunno, authentication mechanisms have access to Star Trek transporters and can analyze you down to the atomic level to be sure you are you (though even Star Trek had trouble with the shapeshifters in Deep Space 9!), then yes, it would truly be a "thing you are". But in the real world, where biometric auth still involves presenting a sensor with some sort of input that it will agree is you, it degenerates into a "thing you know" as you scale the system up. You can make the sensor harder and harder to fool, but that raises both the price of the sensor and the risk of false negatives, and both of those hurt as you scale up.

Which is why I think biometric authentication is very powerful, but should generally be reserved for very important things and used in combination with other methods, or, alternatively, used for things that hardly matter at all; in the vast middle, I think it's quite dangerous. I would be very concerned if arbitrary operations could be performed on my bank account just by presenting my fingerprint.
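A toy model of why this degenerates (this is not a real biometric pipeline; the vectors and threshold are made up for illustration): the system stores a feature "template" as data, and authentication is just checking whether presented input lands within a tolerance of it. Anything that produces close-enough data passes, so knowing the target data is effectively knowing the credential.

```python
import math

THRESHOLD = 0.15  # looser: more spoofs pass; tighter: more false rejects

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def authenticate(template, presented):
    """Accept any input whose features fall within THRESHOLD of the template."""
    return distance(template, presented) <= THRESHOLD

enrolled = [0.82, 0.31, 0.57, 0.44]  # the "thing you are", stored as plain data
spoof    = [0.80, 0.33, 0.55, 0.46]  # a good-enough replica of that data

print(authenticate(enrolled, spoof))  # → True: knowing the target suffices
```

Tightening THRESHOLD is the software analogue of a more discriminating sensor: it shrinks the space of accepted spoofs, but also starts rejecting the legitimate user on noisy readings, which is exactly the cost/false-negative tradeoff above.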
I don't actually mean this as "criticism" of things you have and things you are, because, as I've said in both cases, they do have their uses in the real world. I just think that if you want to deeply understand authentication, you have to see that as these systems scale up, they all turn into a "thing you know" for a sufficiently motivated attacker, and in the discussions we have on HN we are generally talking about the largest possible scales, so this matters. That's an important aspect of understanding these systems, using them for security, understanding the attack surfaces and likelihoods, and modeling them properly. I see a lot of people making bad cost/benefit analyses because, for instance, they don't realize that biometrics are in the end a "thing you know", that fingerprints can be faked, faces can be faked, and so on, and that you can't model them as the platonic "thing you are" you'd really like them to be. They degenerate into a "thing you know" at quite practical scales, depending on what goodies you are keeping behind those authentication barriers.