Neither do you? None of us do; in fact, I'd imagine the people trying for AGI right now have a better guess than you or I.
> there would be an immediate and total power vacuum caused by the advancements. these advancements would be so huge that it would change the geopolitical equation beyond recognition.
This sounds like you're assuming someone will flip a switch one day and the most powerful mind in history will be let loose. I'm not sure AGI will advance that fast. We might have a lot of incredibly "stupid" iterations of AGI first, for many years, before a clever one rolls around.
> this is intrinsic and unavoidable. it cannot be disproved or denied.
We're all just making assumptions here; I don't think yours get to be called "intrinsic and unavoidable".
I understand the concerns here, but if you're willing to claim the end of the world, I would suggest basing your claims on something, or at least making your assumptions explicit. E.g. "assuming we achieve AGI, and it's equipped to rapidly become more powerful/intelligent than the whole of the human population…"