I'm not assuming that, that's literally the definition of mimicry: to imitate closely.
You might say I'm assuming that it is mimicking and not actually thinking, but there's no evidence that it's actually thinking, and we know exactly what it IS doing, because we wrote the code that we used to build the model. It's not thinking; it's doing math: mathematical transformations of data.
Whatever thinking fundamentally is, it also has an equivalence as a mathematical transformation of data. You're assuming the conclusion by saying that the two mathematical transformations of data are not isomorphic.
A simulation of information processing is still information processing, just like running Windows in a QEMU VM is still running Windows.
Do not confuse the mathematical description of physical processes with the world being made of math.
> You're assuming the conclusion by saying that the two mathematical transformations of data are not isomorphic.
Correct. They're not isomorphic. One is simple math that runs on electrified sand, and one is an unknown process that developed independently over a billion years. Nothing we're doing with AI today is even close to real thought. There are a billion trivial proofs that make the rounds as memes, like miscounting the Rs in strawberry, or being unable to count at all, etc.
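For what it's worth, the letter-counting task those memes refer to is trivial for ordinary code; a commonly cited reason models stumble on it is that they see subword tokens rather than letters. A minimal sketch (not any model's actual tokenizer):

```python
# The meme in question: counting letters is a one-liner for a program
# that actually operates on characters.
word = "strawberry"
r_count = word.count("r")  # counts occurrences of the letter "r"
print(r_count)  # prints 3
```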
I know I am a mind inside a body, but I'm not sure about anyone else. The simplest explanation is that most other people are minds as well, considering we're the same species and I'm not special. You'll have to take my word on that, as my only proof is that I refuse to be seen as anything else.
In any case, LLMs are most likely not minds, for the simple reason that most of their internal state is static. What looks like a thoughtful reply is just the statistically most likely combination of words, produced by a function with a huge number of parameters. There's no way for this construct to grow, or to wither, which is something we know minds definitely do. All it knows is the sequence of symbols it has received and how that maps to an output. It cannot develop itself in any way and is trained by a wholly separate process.
Yes, debated and refuted. There are many well-known and accepted rebuttals of the Chinese Room; per the systems reply, the Chinese Room as a whole does understand Chinese.
How would the mind know which one it is?
Maybe your mind is being simulated right now.
I'm not assuming it is without hard proof - that's my only argument.
> Maybe your mind is being simulated right now.
I'm experiencing consciousness right now, so that would have to be a damn good simulation.