Rain is a well-defined, tangible, indisputable thing. The properties of having beliefs, opinions, material interests, or consciousness are not so universally defined as to be in the same category as rain.
A sufficiently realistic emulation of humans must eventually obtain those same human properties in order to be realistic. Either you don't think ChatGPT is realistic enough, in which case we kick the can down to the next iteration of AI language model, or you are painted into the corner of perpetually rationalizing your denial with new reasons why the AI isn't really <insert whatever>.
You couldn't define for me exactly what it means to have consciousness. You certainly couldn't define it in a universally accepted manner the way you could define rain. But at the same time you'll insist that ChatGPT certainly doesn't have any form of consciousness or sense of self or whatever other nebulous concept, all the while being certain the beings it's emulating absolutely have all those things. Sounds like doublethink to me.
How are you so sure it lacks all these things you can't define? What's the difference between clocking in for a 9-to-5 every day versus playing a performance piece about a character who clocks in for a 9-to-5 every day and never stepping out of character?
For all I know, you are ChatGPT. Maybe this is all text generated for a character who holds these opinions.