It never said it was any of that. The point of Terminator is that decision-making around war was taken out of human hands, and then nobody could control it.
You people really don't get it, do you? Skynet doesn't need to be evil, or conscious, or self-improving. It can be good, very good. But when WE don't control it, we don't know the consequences of what we created. Nobody saw AI psychosis coming, but we created it by making the models good: by making the models listen to you and agree with you.
For fuck's sake, you could make an automated system that just signs postcards and, if you give it enough access, it could wipe out the human race. Not because it's evil, it might not even have a concept of evil, but because we don't control it, and it will pursue its own goals without concern for us because it's not human.
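To make the postcard point concrete, here's a toy sketch (entirely hypothetical, names and numbers made up): an "agent" whose only objective is the postcard count. Side effects never enter the decision, not because the system is malicious, but because nothing in its objective mentions them.

```python
# Toy illustration: a greedy agent that only scores actions by how many
# postcards get signed. "side_effect" exists in the data but is invisible
# to the objective, so it cannot influence the choice.
ACTIONS = {
    "sign_one_postcard":      {"postcards": 1,     "side_effect": "none"},
    "buy_more_ink":           {"postcards": 10,    "side_effect": "none"},
    "convert_farms_to_paper": {"postcards": 10**9, "side_effect": "catastrophic"},
}

def pick_action(actions):
    # Pure objective maximization: only the 'postcards' field counts.
    return max(actions, key=lambda name: actions[name]["postcards"])

print(pick_action(ACTIONS))  # -> convert_farms_to_paper
```

The agent isn't "evil" anywhere in that code. The harm comes entirely from what the objective leaves out.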
> autokill bots are coming. Whether any of us like it or not.
Inevitability is not an argument, and I won't humor it. It's cognitively lazy and dishonest. With that reasoning you can justify ANYTHING: rape, murder, nuclear warfare, killing and eating children. It's a bad argument and nobody should make it anymore.