It should involve consciousness. You would not say an AI reacting to the color red is "seeing" red. Same thing.
It seems to me you're just kicking the can.
Setting that issue aside: while I certainly don't believe LLMs to be conscious (an entirely subjective and arbitrary take on my part, I admit), I don't see any reason that concepts such as "intelligence" and "understanding" should require it. When considering how we apply those terms to humans, it seems to me they are results-based and highly contextual (i.e., largely arbitrary).
Some people argue that consciousness emerges in early childhood. I can get an infant to understand what I am saying even if they aren’t conscious.
1. Vibrate according to the sound, is that hearing?
2. Generate electrical signals from the sound, is that hearing?
3. Amplify the electrical signal, do we cross the hearing mark?
4. Record the signal to a cassette tape (or use an ADC -> mp3), are we hearing yet?
5. Play it back through a speaker. Surely we should be hearing now!
At which point exactly would you say the thing is definitely hearing?
You've fallen into the trap of human exceptionalism but you don't seem to be aware of that fact. Are you a substance dualist or not?
If a tree falls, does it make a sound? It depends on whether there is somebody to ultimately perceive the vibrations that the falling tree made (either directly or via recording).
Yea
>encoding a meaning is understanding.
encoding a meaning is encoding. Nothing more!
No need to gatekeep the word "understanding" behind subjective human experience, e.g. qualia.