- OpenAI, etc. will go bankrupt (unless one of them manages to capture search from a struggling Google)
- We will see a new AI winter, with a corresponding research slowdown like the 1980s, once funding dries up
- Open-source LLM instances will be deployed to properly address privacy concerns.
You think we have these crazy valuations because the market believes OpenAI will get joe-schmoe to buy enough of their services? (Their introduction of "shopping" into the service honestly feels like a bit of a panicky move to target Google.)
We're prototyping some LLM-assisted products, but right now the cost model isn't entirely there: we need the more expensive models to get good results, and that leaves a small margin. Spinning up a moderately sized VM would probably be the more cost-effective option, and more people will likely run into this and start creating easy-to-set-up model/service VMs (maybe not just yet, but it'll come).
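The trade-off above can be sketched as a back-of-envelope break-even calculation. Every number here is a made-up placeholder, not a real vendor rate: the point is just that metered API cost scales linearly with volume, while a flat-rate VM amortizes with throughput.

```python
import math

# All prices and throughput figures below are hypothetical placeholders,
# not real vendor rates -- adjust them to your actual quotes.
API_COST_PER_1K_TOKENS = 0.01   # metered API pricing per 1k tokens
VM_COST_PER_HOUR = 1.50         # flat hourly rate for a GPU VM
VM_TOKENS_PER_HOUR = 500_000    # throughput of a self-hosted open model

def api_cost(tokens: int) -> float:
    """Cost of serving `tokens` through a metered API (linear in volume)."""
    return tokens / 1000 * API_COST_PER_1K_TOKENS

def vm_cost(tokens: int) -> float:
    """Cost of serving `tokens` on a flat-rate VM, billed per started hour."""
    hours = math.ceil(tokens / VM_TOKENS_PER_HOUR)
    return hours * VM_COST_PER_HOUR

# With these placeholder numbers the VM wins once volume is high enough:
for tokens in (100_000, 1_000_000, 10_000_000):
    print(f"{tokens:>10} tokens: API ${api_cost(tokens):.2f} vs VM ${vm_cost(tokens):.2f}")
```

With these assumed rates the API is cheaper at low volume (no idle VM hours), but past roughly a million tokens the VM pulls ahead, which is exactly the "good enough and cheaper" dynamic described above.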
Sure, they could start hosting things themselves, but what's stopping anyone from finding a cheaper but "good enough" alternative?