Even a scientifically inclined atheist has philosophical ideas grounding their world view. The idea that the universe exists as an objective absolute with immutable laws of nature is a metaphysical idea. The idea that nature can be observed and that reason is a valid tool for acquiring knowledge about nature is an epistemological idea. Ethics is another field of philosophy and it would be a mistake to assume a universal system of ethics that has been constant throughout all cultures across all of human history.
So while I certainly agree that there is a very common hand-wave of "look the atheists have just replaced God with a new 'god' by a different name", you don't have to focus on religion, theology and faith based belief systems to identify different categories of philosophical ideas and how they have shaped different cultures, their beliefs and behaviours throughout history.
A student of philosophy would identify the concept of "my truth" as an idea put forward by Immanuel Kant, for example, even though the person saying it doesn't know that's the root of the idea that reality is subjective. Similarly, the empirically grounded scientist would be recognized as following in the footsteps of Aristotle, and the pious Bible-thumper as parroting ideas published by Plato.
The point is that philosophy is not the same thing as religion and philosophy directly shapes how people think, what they believe and therefore how they act and behave. And it's kind of uncanny how an understanding of philosophy can place historical events in context and what kinds of predictive capabilities it has when it comes to human behaviour in the aggregate.
I think you can also see this in the intensification of political discussion, which has a similar intensity to the religious discussions of 100–200+ years ago (e.g. the Protestant Reformation), indicating that this "religious hardware" has shifted domains to the realm of politics. I believe this shift can also be seen in the intense actions and rhetoric of the mid-20th century.
You can also look at all of these new age "religions" (spiritualism, horoscopes, etc.) as that religious hardware searching for something to operate on in the absence of traditional religion.
Max Stirner said that after the Enlightenment and the growth of liberalism, which is still very much in vogue to this day, all we’ve done is replace the idea of God with the idea of Man.
The object might be different, but it is still the unshakable belief in an idealised and subjective truth, with its own rituals and ministers; in other words, a religion.
I guess the Silicon Valley hyper-technological optimism of the past few years is yet another shift, from religious belief in Man to religious belief in the Machine.
AI isn't a worldview; it's an extremely powerful tool which some people happen to be stronger at using than others, like computers or fighter jets. For people who empirically observe that they've been successful at extracting massive amounts of value from the tool, it's easy to predict a future in which aggregate economic output in their field by those who are similarly successful will dwarf that of those who aren't. For others, it's understandable that their mismatched experience would lead to skepticism of the former group, if not outright comfort in the idea that such productivity claims are dishonest or delusional. And then of course there are certainly those who are actually lying or deluded about fitting in the former group.
Every major technology or other popular thing has some subset of its fandom which goes too far in promotion of the thing to a degree that borders on evangelical (operating systems, text editors, video game consoles, TV shows, diets, companies, etc.), but that really has nothing to do with the thing itself.
Speaking for myself, anecdotally, I've recently been able to deliver a product end-to-end on a timeline and level of quality/completeness/maturity that would have been totally impossible just a few years ago. The fact that something has been brought into existence in substantially less time and at orders of magnitude lower cost than would have been required a few years ago is an undeniable observation of the reality in front of me, not theological dogma.
It is, however, a much more cognitively intense way to build a product — with AI performing all the menial labor parts of development, you're boxed into focusing on the complex parts in a far more concentrated time period than would otherwise be required. In other words, you no longer get the "break" of manually coding out all the things you've decided need to be done and making every single granular decision involved. You're working at a higher level of abstraction and your written output for prompting is far more information-dense than code. The skills required are also a superset of those required for manual development; you could be the strongest pre-LLM programmer in the world, but if you're lacking in areas like human language/communication, project/product management, the ability to build an intuition for "AI psychology", or thinking outside the box in how you use your tools, adapting to AI is going to be a struggle.
It's like an industry full of mechanics building artisan vehicles by hand suddenly finding themselves handed budgets to design and implement assembly lines; they still need to know how to build cars, but the nature of the job has now fundamentally changed, so it's unsurprising that many or even most who'd signed up for the original job would fail to excel in the new one and rationalize that by deciding the old ways are best. It's not fair, and it's not anyone's fault, but it's important for us all to be honest and clear-eyed about what's really happening here. Society as a whole will ultimately enjoy some degree of greater abundance, but in the process a lot of people are going to lose income and find hard-won skills devalued. The next generation's version of coal miners being told to "learn to code" will be coders being told to "learn to pilot AI".
Or we can just refuse this future and act as a society to prevent it from happening. We absolutely have that power, if we choose to organize and use it.
It leads me to the question: is it really "religious hardware", or the same ol' "make meaning out of patterns" hardware we've had for millennia, which has allowed us to create shared language, build social constructs, and mutually believe the legal fictions that hold massive societies together?
Do you assume that someone will stumble into creating a person, but with unlimited memory and computational power?
Otherwise, if we are able to create this person using our knowledge, we will most certainly be able to augment humans with those capabilities.
And it’s the atheists who continuously do it, claiming they don’t believe in God, just in markets or AI, etc.
It’s an irony of ironies.
However, the present state is also worth a look!
The three uniquely human factors which people keep saying a machine can never do:
1. Empathy: they win by default. (My reference group is twenty friends and seven therapists.)
2. Critical Thinking: they win with the correct prompt. (You need to explicitly work against the sycophancy; the desire to appear empathetic limits the ability to convey true information to a human.)
3. Creativity: I want to say creativity lags behind, at least in LLMs, but Midjourney is blowing my damn mind, so I might have to give them that, too.
That's with the versions of AI we have today. My comment was referring to the ultimate goal, i.e. where this is all heading.
To put it explicitly, we intend to:
(1) make them in our image (trained after our mental output, and shaped after our body),
(2) while also making them vastly superior intellectually and physically (strength, endurance, etc.),
(3) while also expecting them to have no will of their own -- except as it aligns with ours. (We do actually need to give them a will to make them useful.)
I do not expect that to end very well.
The notion that history is just on some inevitable pre-planned path is not a new idea, except now the actor has changed from God to technology.
This is a more nuanced sentence.
YOU come off as poorly read, so I wouldn't trust your judgement on this one, champ. "common trope" lmfao.
I mean the lack of self awareness you have here is amazing.
In Adam Curtis's All Watched Over by Machines of Loving Grace, he makes a long and fairly complete argument that humanity has a rich history of turning its decision-making over to inanimate objects, in a desire to discover ideologies we can't form ourselves amid the growing complexity of our interconnectivity.
He tells a history of these attempts constantly failing, because the core ideology of "cybernetics" underlies them all and fails to be adaptive enough to match our combined DNA/body/mind cognitive system, especially when scaled to large groups.
He makes the second point that humanity, and many of its thinkers, constantly resorts to the false notion of "naturalism" as the ideal state of humanity, when in reality there is no natural state of anything, except maybe complexity and chaos.
Giving yourself up to something, especially something that doesn't work, is very much "believing in a false god."
To be fair, we shouldn't bundle Augustine and Thomas Aquinas with John MacArthur and Joel Osteen. That is, some religious thought is more philosophically robust than other religious thought.