I'm watching people host models on services like LangSmith and OpenRouter for a fraction of the cost you're talking about. We also have people reporting that a 24 GB M4 Mac gives them performance close to ChatGPT and Claude, entirely locally. We already spend money on laptops; I can put in a ticket with IT for an M4 MacBook Pro right now.