Or another way: reasoning is a socially constructed concept, developed by humans. Having defined reasoning, humans must therefore know how to reason.
Or a third way: I experience reasoning, you experience reasoning. I am currently reasoning. You are currently reasoning. I am human, as are you. Therefore humans reason.
Or another way: an LLM will respond the same way whether you wait one second or a decade between prompts. The only way it is interacted with is through a stream of tokens. There is exactly one stream at all times, and each time the stream is input again, it differs only slightly from the previous input. The LLM does not behave differently depending on anything other than the contents of the stream. It may produce the exact same token for two different streams. It may also encounter the same stream twice, and will act the same in both cases. If it were a "subject" "experiencing", say, a discussion, it would draw on its past "experience". But it does not.
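The property described above can be sketched as code. This is a toy stand-in, not a real model: the point is only that a stateless function of the token stream gives the same output no matter how much time passes between calls, or how many times the same stream is fed in.

```python
import time

def toy_llm(stream: tuple[str, ...]) -> str:
    """A hypothetical stand-in for a deterministic forward pass:
    the output is a pure function of the stream's contents.
    No hidden state, no memory of prior calls, no clock."""
    return max(stream, key=len) if stream else "<empty>"

stream = ("the", "cat", "sat")

first = toy_llm(stream)
time.sleep(1)            # waiting between "prompts" changes nothing
second = toy_llm(stream)  # same stream encountered a second time

assert first == second   # identical behavior in both cases
```

A real LLM with sampling disabled (or a fixed seed) behaves the same way: the stream in, a token out, and nothing about elapsed time or prior invocations enters the computation.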