Yes, I understand that it can appear to synthesize something new, and no, I'm not looking for some inner experience.
I'm looking for it to show an ability to wield not just a set of strings (with language associations), but something more like Platonic ideals: objects, with properties and relations.
A few errors quickly show that no such concepts are being wielded.
>> I saw a fine example of this failure the other day: "Mike's mom has four kids. Three are named Danielle, Liam, and Kelly. What is the fourth kid's name?" ChatGPT's reply is an explanation of how there isn't enough information in the question to tell. Told "The answer is in the question.", ChatGPT just doubles down on that answer, when the fourth kid is, of course, Mike. (Sorry, couldn't find the original example)
>> "My sister was half my age when I was six years old. I'm now 60 years old. How old is my sister?" ChatGPT: "Your sister is now 30 years old". [0]
>> Or this one, where ChatGPT entirely fails to understand the order/sequence of events. [1]
Or any of the plethora of math-problem failures people have found...
Similarly, the image "AI"s fail to understand relationships between objects (or parts of one object), and cannot abstract a particular person's image from a photo, showing they have no understanding of what a body is... (I can look those up if necessary).
And, of course, the answers are entirely untethered from reality: whether the answer is correct or just wrong is completely a matter of chance. The output is run through a grammatical filter/generator at the end, so it's usually grammatical, but there is no truth filter (or, for that matter, ethical filter).
I don't expect some abstract experience; I expect it to be able to break its work down into fundamental abstract concepts and then construct an answer, and this it cannot do, or it would not be making these kinds of errors.
[0] https://twitter.com/Bestie_se_smeje/status/16210919157469184...
[1] https://twitter.com/albo34511866/status/1621608358003474432