The idea is that eventually we build something that, when it plateaus, builds its own successor. That’s the singularity: the thing in question builds its successor, that successor builds the next, and the process runs far outside our ability to understand or keep up.
Can GPT9 build GPT10, with zero human input?
I’d give 50/50 odds it can.
Can GPT15 build something that isn’t a large language model and is far superior in every way?
I’d give 50/50 odds it can.
Can both the above steps happen within one solar rotation of each other?
I’d give 50/50 odds they can.
Because at some point these models won’t need humans to interact with them. Humans are very slow; that’s the bottleneck.
They’ll simply interact with their own previous iterations, or with custom-instantiated training models they design themselves. No more human-perceptible timescale bottlenecks.