Sure, there will be growing pains, friction, etc. Who cares? There always is with world-changing tech. Always.
That's right. "Who cares about the pains of others, and why should they even matter" are absolutely words to live by.
What you are likely doing, though, is making many more future humans pay a cost in suffering. Every day we delay longevity escape velocity is another 150k people dead.
I'd rather just... not die. Not unless I want to. Same for my loved ones. That's far more important than "wealth inequality."
“Oh well, I guess I can’t give the opportunities to my kid that I wanted, but at least humanity is growing rapidly!”
Everyone has always worried about this for every major technology throughout history.
IMO AGI will dramatically increase comfort levels and lower your chances of death, disease, etc.
People aren’t really in uproar yet, because implementations haven’t affected the job market of the masses. Afterwards? Time will tell.
Beyond the immediate increase in inequality, which I agree could be worth it in the long run if it were the only problem, we're playing a dangerous game.
The smartest and most capable species on the planet, which dominates it for exactly this reason, is creating something even smarter and more capable than itself in the hope that it will help make its life easier.
Hmm.
>But while the “making AGI” part of the mission seems well on track, it feels like I (and others) have gradually realized how much harder it is to contribute in a robustly positive way to the “succeeding” part of the mission, especially when it comes to preventing existential risks to humanity.
Almost every single one of the people OpenAI hired to work on AI safety has left the firm with similar messages. Perhaps you should at least consider the thinking of experts?
You and I will likely not live to see much of anything past AGI.
The people experiencing the growing pains, friction, etc.