And where is this objective metric for consciousness? Last I checked we didn't even have a sensible definition for it.
It seems to me you're just kicking the can.
Setting that issue aside: while I certainly don't believe LLMs to be conscious (an entirely subjective and arbitrary take on my part, I admit), I don't see any reason that concepts such as "intelligence" and "understanding" should require it. Considering how we apply those terms to humans, it seems to me they are results-based and highly contextual (i.e., largely arbitrary).