> See this definition of AGI (link shamelessly stolen from someone else in this comment section) from before the latest AI hypecycle started warping things.
Every definition on that page, both theoretical and operational, matches my definition and not yours. Notice that none of them would exclude an AGI with an IQ around 90, provided its intelligence is general.
> Approximately that an unaided agent must, with no outside assistance, be able to solve ~90% of the most difficult tasks that we throw at it with a ~90% success rate. It's not a precise definition but that's approximately where I stand on the matter.
This isn't a definition. How hard are these "most difficult tasks"? Can 50% of humans solve them? 10%? If they were literally the most difficult problems, they would be ones that 0% of humans have solved.
> Said race as a class would presumably be capable of meeting or exceeding my above criteria
Some members of the race might, but this hypothetical alien race as a whole does not. Do you still consider them to have general intelligence if they can merely do everything a 95 IQ person could?
> Your attempt to compare to individual humans is an error. AGI applies on the class level
According only to you. LLMs are benchmarked individually. No one runs a benchmark where Claude gets half the questions right, GPT gets the other half right, and the result is reported as a combined perfect score for the class. Instead, they each score 50%. (Not that I think current AIs can solve the harder benchmark problems. The point is that they are measured individually.)
No one else ascribes general intelligence only to a class. You can talk to one average person (or alien), give them some tests, and determine that they have general intelligence. That is how everyone else uses the term.