I feel like the second part is a bit exaggerated. Humans also aren't inherently "made human" by something; there's no universal standard for morals and behaviors. You could also get reasonable "murder instructions" from an average person - it's not exactly forbidden knowledge, given how commonly it's depicted in media. Hell, I'm pretty sure there are detailed instructions for building a nuclear bomb available online - the reason they're not viewed as some extreme threat is that the information itself isn't dangerous; having access to the required machines and materials is.
As for the last paragraph - if the effects truly keep scaling up as much as people expect, I'd want society to be restructured to accommodate wide-reaching automation, rather than bowing to a dystopian "everybody must suffer" view of the future.