It's not just the machines that are ruthless. The operators also take on attributes of the machines they manage and in the process inflict willful cruelty on others through their control of the machinery. The most obvious example of this is, of course, war and the technology associated with its execution. So this essay is right: machines/algorithms are ruthless, but it is the people who use them to inflict pain and suffering on others that make the whole thing into a grand tragedy.
This mirrors my experience. About six months ago, I had an accident on the metro in my city, where my leg slipped between the train and the platform while deboarding. Not only did passersby not help, the "operators" stood idly by. While I am lucky to have escaped with "just" an ACL tear, I would have lost my leg that day if my friend hadn't been there to pull me out.
I'm not generally a litigious person (at least I hope not), but that sounds like a lawsuit...
It's one thing to refuse to open the doors just to let a latecomer on board, but it's quite another for a person in a position of authority (and I would argue with a duty of care) to harm someone or put them in danger like this.
"We become what we behold. We shape our tools and then our tools shape us"
It seems to me an inherent property of the non-human universe, i.e. "everything but us": gravity is extremely unforgiving if you happen to step off the wrong thing, and a sharp rock can slice you open with a moment's inattention. If you find yourself in the wrong environment, you will die of thirst or hunger, or asphyxiate. Nature is not "kind". Eventually even our bodies turn on us.
Human kindness and judgement have a very limited sphere of influence; it just (rightly) feels huge because it's at the center of human life.
Why do I emphasize this so strongly? Because popular science, as it is wont to do, often sacrifices correctness and intellectual substance for tawdry emotional appeal and sensationalism. To say "the universe is indifferent to us" has an emotional force that the banal reality of the situation does not. And this leads to intellectual confusion and a distorted view of reality, because a category has, through emotional conditioning, been falsely attached to reality.
I truly believe the only solution is to bring down the current system, which includes the technology that enables it.
The key strength of a free market seems to be that it assumes people will act in their own self-interest and creates a space where we can get a roughly 'win-win' situation: while you act in your self-interest, both you and the community are rewarded. So starting a bakery would give you a financial reward and give others baked goods at a competitive price. Assuming that changing the system will force people to stop acting in their self-interest seems to be how alternatives go wrong.
The problem is humans. We need people who refuse to govern using oppression. Corporate leaders who refuse to prioritise profit over human rights. And developers who have a moral compass of what should and should not be built.
But why would this technology be any different to prior new technologies…
That said: what systems you encounter, and how they behave, may tell you something about the ethics of the people who put them in place.
As a consequence: if a corporation / system / product behaves 'evil', don't direct your anger at it. Instead, direct your anger (or praise!) at the people who created & manage it.
For convenience, "company Y did Z" remains a valid phrase. As long as you're aware that "company Y" is just a placeholder meaning "people working for company Y".
To say that anything is 'just a tool' is exceptionally naive and only works in small group settings.
We will never find people who govern fairly, or corporate leaders who refuse to prioritise profit, as long as we maintain and support a system that is conducive to their growth, just as we will never stop the growth of bacteria in a bacteria-rich medium.
Yes, a hammer can be used for good or bad, but it will always inevitably be used for bad in an environment that encourages such uses.
Empirically that seems to be true.
> Yes, a hammer can be used for good or bad, but it will always inevitably be used for bad in an environment that encourages such uses.
From your previous line that I quoted, that would be any environment that has humans in large groups. Now what do you propose? Short of returning to hunter-gatherers (with the death of about 7.9 billion people), I don't see how you're going to prevent that kind of environment.
We don’t have the environment to change it
Humanity, once resource-starved, becomes inherently oppressive, as art and experiments compete with human mouths for survival. Computers just amplify that degradation once it occurs.
Instead, I'd use a different, neutral term like someone used for nature here: they are indifferent to our emotions.
But that's about right: so are shovels, or meteorites, waterfalls, or a rock falling off a cliff. The main difference is that most of those inert objects act in accordance with natural forces, whereas machines have some unnatural movement (like sideways with train doors).
The other difference is that we have introduced many more of such inert objects "acting" into our environment, but we have been doing that long before we could build sophisticated machines and computers (ceilings did fall, statues and bridges collapsed, animals killed and hurt their stewards...).
As such, I would vehemently disagree: ascribing any moral direction to objects can only confuse and introduce FUD (as has been done throughout history). With machines, we actually have the ability to choose the behaviour (adding sensors to train doors is pretty simple).
The fact that we don't is purely our choice, the same way we teach our kids by letting them fall, get a bad grade, or experience something negative: not because we don't love them. Do you feel it is OK to delay a train of 1,000 people because you are slightly late? If you were not ruthless, you would go and thank everyone, or apologize to anyone affected on the train, right?
I don't really believe the above, I am simply showing how easy it is to turn this on its head.
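To make the earlier sensor remark concrete, here is a minimal sketch (hypothetical names, not any real transit control system) of a door controller that treats an obstruction sensor as a hard interlock: it refuses to close on anything in the doorway, and after a few tries hands the decision back to a human operator instead of forcing the door shut.

```python
def close_door(obstruction_sensor, max_attempts=3):
    """Try to close the door; never force it shut over an obstruction.

    obstruction_sensor: callable returning True while the doorway is blocked.
    """
    for _ in range(max_attempts):
        if obstruction_sensor():
            continue  # something (or someone) in the doorway: stay open, retry
        return "closed"
    # Persistent obstruction: escalate to the operator rather than override.
    return "held_open"
```

The point is only that "ruthlessness" here is a design decision: the same hardware can be wired to yield or to crush, and choosing which is on the people who build and deploy it.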
On the other hand, when "the algorithm" messes up, there's often no one operating it to talk to and no way for the affected individual to bypass it. In that sense, perhaps computers behave like "entities" in a way that other tools do not?
There are plenty of things we've made that can kill or hurt us without an active operator either (I mentioned buildings and roads/bridges collapsing; one could even place a shovel on a pile that slips or falls off a truck and hurts you, or use a wire gauge that's too thin for the electric current it carries, or...). Lack of care (or expertise) in whatever humans construct or build can harm you without there being an operator or any automation.