Projecting human characteristics onto a probabilistic black box modeled on human behavior is a trap. It's borderline "Finder smiles so my computer is happy" logic. We built these things to closely model (but not replicate) human behavior. This is distinctly a high-level emulation, with zero consideration for human concepts like persistent memory or physical sensation.
I would say that emotion is something more intangible, something you can't simulate by taking shortcuts with math and language. If I tell ChatGPT "I shot you dead!" and it says "ow!" back, nothing has transpired. The machine "felt" nothing; it just predicted what a human might say in that situation.