> Would you settle for "behave exactly as if they had some form of intelligence"?
Sure, it behaves as if it has some form of intelligence in the sense that it can take external input, perform actions in reaction to this input, and produce outputs dependent on the input.
This historically has been known as a computer program.
Never fails: when a techbro is told LLMs aren't what he thinks they are, he falls back on a field he certainly has more authority on: the human brain and intelligence.
The issue here is that the "LLMs have intelligence" side of the argument can lay out a simple mainstream conception of intelligence (general problem solving) and explain directly how LLMs meet that definition. The other side, at least here in this thread, seems to be an empty insult or two and... nothing else?
Again, just say what you think intelligence is and why you think LLMs don't have it. If you can't do that then you have no business expressing an opinion on the subject. You really aren't expressing an opinion at all.
Brother, if I could get people who believe ChatGPT is intelligent to post something more than "oh, and aren't you just an autocomplete?" then I would be so god damn happy.
This fantasy land you live in, where people with no formal training in the matter are making high-brow, elegantly reasoned arguments, doesn't exist. And the reason you think the "other side of the argument" is just being insulting is that the burden of proof is not on us.
It doesn't help that half the time you guys post, you directly contradict the assertions of the very researchers building these models.