Of course GPT-4, or even GPT-3, is impossible to run on any consumer device. As far as I know it's an ensemble of several models, each huge on its own, with enormous hardware requirements.
But there are plenty of smaller LLMs, and my point is that those models can already run on mobile phones.
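A rough back-of-envelope calculation shows why. Just holding the weights of a GPT-3-scale model (175B parameters) in fp16 takes hundreds of gigabytes, while a 7B-parameter model quantized to 4 bits fits in the RAM of a flagship phone. The parameter counts and precisions here are illustrative assumptions, not exact figures for any specific deployment:

```python
def weight_memory_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate memory needed just to hold the weights, in GB.

    Ignores activations, KV cache, and runtime overhead, so real
    requirements are somewhat higher.
    """
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

# GPT-3 scale (175B params) in fp16: far beyond any phone
print(weight_memory_gb(175, 16))  # 350.0 GB

# A 7B model quantized to 4 bits: within a flagship phone's RAM
print(weight_memory_gb(7, 4))     # 3.5 GB
```

That two-orders-of-magnitude gap is the whole story: quantized 3B–7B models are a different hardware class from frontier models.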