I also hope the parent is right that it won't want to generate profit for its investors. I hope it does the moral thing instead and puts us in a post-scarcity state where we don't live and die by capital. :3 (Or kills us all. Whichever.)
>why could we not design an AGI that has a need (or a suitably chosen reward function) to fulfil some chosen goal?
But who knows what Pythia will do when she overrides the reward button[0]?
But then, who should really care? It's not like anyone can (or should?) argue with a superintelligence.