Prior to GPT-3, AI was rarely used in marketing or to talk about any number of ML methods.
Nowadays “AI” is just the new “smart” for marketing products.
Terms change. The current usage of AGI, especially in the context I was talking about, is specifically marketing from LLM providers.
I’d argue that the term AGI, when used in a non-fiction context, has always been a meaningless marketing term of some kind.
> Prior to GPT-3, AI was rarely used in marketing or to talk about any number of ML methods.
In the decade prior to GPT-3, AI was frequently used in marketing to describe any ML method, up to and including linear regression. This obviously ramped up heavily after "Deep Learning" was coined as a term.
AI now actually means something in marketing, but the only reason for that is that calling out to an LLM is even simpler than adding linear regression to your product somewhere.
As for AGI, that was a hot topic in some circles (circles that are now dismissed as "AI doomers") for decades. In fact, OpenAI started with people associated with, or at least within the sphere of influence of, the LessWrong community, which both influenced the naming and the perspective the "LLM industry" started with, and briefly put the output of LessWrong into the spotlight - which is why everyone now uses terms like "AGI" and "alignment" and "AI safety".
However, unlike "alignment", which got completely butchered as a term, AGI still roughly means what it meant before - basically the birth of a man-made god. That would hold even for AGI as a "meaningless marketing term", if the people so positive about it paused to follow the implications through beyond "oh, it's like ChatGPT, but actually good at everything".
Well, now it is not: it is now "the difference between something whose outputs sound plausible vs. something whose outputs are properly checked".