story
It’s not that running in the cloud is more expensive. It’s that people already have a $2000 laptop or maybe even a $1600 RTX 4090. If I’ve got that, I don’t want to pay $20/month to 6 different AI services.
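The math behind that feeling is simple. A quick back-of-envelope sketch, using the figures above (the $20/month and 6-service numbers are from the note; the break-even framing is my own illustration):

```python
# Hypothetical break-even: local GPU purchase vs. cloud AI subscriptions.
SUBSCRIPTION_COST = 20   # $/month per cloud AI service (figure from the note)
NUM_SERVICES = 6         # services you'd otherwise subscribe to
GPU_COST = 1600          # one-time RTX 4090 cost (figure from the note)

monthly_cloud_spend = SUBSCRIPTION_COST * NUM_SERVICES
breakeven_months = GPU_COST / monthly_cloud_spend

print(f"${monthly_cloud_spend}/month in subscriptions")
print(f"GPU pays for itself in ~{breakeven_months:.1f} months")
```

At $120/month, the card pays for itself in just over a year, and that's before counting the laptop you already own.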
Sam Altman said ChatGPT costs like 2 cents per message. I’m sure they can get that way down. Their bills are astronomical. But the data they’re collecting is more valuable than the money they’re spending.
Stable Diffusion isn’t super fast. It takes 30 to 60 GPU-seconds per image. There’s minimal consumer advantage to running it in the cloud. I’d run them all locally if I could.