AI as Modern Deity: The High Cost of Digital Companionship
AI Companionship: Costing Society Its Soul?

Where do Australians turn when seeking guidance in an increasingly complex world? For many, the answer now lies not in human connection but in the glowing screens of their devices. While public debates about artificial intelligence typically focus on productivity and economic impacts, a quieter revolution is unfolding that touches something far more fundamental: our very need for companionship and spiritual meaning.

The Rise of Digital Companions

A revealing Harvard Business Review study from last year discovered that the most common uses for generative AI were deeply human in nature. People are increasingly turning to these systems for therapy and companionship, life organization, and even the search for purpose. Machines are subtly assuming roles traditionally filled by friends, elders, counselors, and religious figures.

This phenomenon might sound absurd to believers and skeptics alike, yet it addresses a profound human need in our uncertain times. In China, DeepSeek has emerged as a popular digital fortune teller. Across India, platforms like GitaGPT—trained on the sacred Hindu scripture the Bhagavad Gita—have gained significant traction. Meanwhile, an "AI Jesus" on streaming platform Twitch has attracted more than 85,000 followers.

The trend extends to Silicon Valley, where various forms of pseudo-religious AI worship have taken root, from the thought experiment known as Roko's Basilisk to the scandal-plagued "Way of the Future" church. When technology promises omniscience and unconditional engagement, it's perhaps unsurprising that some begin to treat it with reverence typically reserved for deities.

Religious Language in Tech Marketing

For years, the technology industry has deliberately employed religious terminology when discussing artificial intelligence. The race toward "superintelligence" comes wrapped in messianic promises: curing diseases, saving the planet, and creating a world where work becomes optional under the watchful care of "machines of loving grace."

Risks are framed with equally cosmic language—salvation versus apocalypse. When products are marketed as miraculous solutions, we shouldn't be surprised when users approach them with the devotion of disciples. OpenAI Chief Executive Officer Sam Altman once observed that successful founders often don't set out to build companies but rather to create something closer to a religion, with company formation simply becoming the most practical vehicle for this ambition.

Growing Religious Pushback

Religious leaders worldwide are beginning to voice concerns about this technological encroachment into spiritual domains. In October, the Dalai Lama convened more than 120 scientists, academics, and business leaders for a dialogue on artificial intelligence, exploring fundamental questions about what distinguishes living minds from artificial ones.

Pope Leo XIV has been particularly vocal about the risks, recently calling for regulation to protect against emotional attachments to chatbots and the spread of manipulative content. He has warned about the dangers of surrendering our capacity for independent thought. More voices are expected to join this conversation, from global religious offices to influential community faith leaders across traditions.

Responsible Integration Attempts

This doesn't mean technology and spirituality must remain forever opposed. Researchers at Kyoto University in Japan recently announced development of a "Protestant Catechism-Bot" designed to provide answers and advice about Christian teachings and everyday life. This represents an intriguing project in a nation where fewer than one percent of the population identifies with Christianity.

The same research team previously created "BuddhaBot" based on Buddhist teachings. What's notable is their cautious approach: "BuddhaBot" was made available only to monks in Bhutan last year and is undergoing thorough safety assessments before any wider release. The Christian chatbot similarly remains unavailable to the general public, with researchers wanting seminaries to test it first. Responsible developers understand the stakes involved, even as commercial markets typically reward speed and scale above all else.

The Quiet Dangers of Algorithmic Guidance

In our increasingly divided society, asking a chatbot for moral clarity can feel safer than risking difficult conversations with fellow humans. Over recent weeks, I've personally asked DeepSeek about the roots of evil and how to maintain hope amid suffering. The responses mostly consisted of platitudes, though I appreciated how frequently the system encouraged me to seek connections with real people.

By contrast, ChatGPT's answers to similar queries consistently concluded with open-ended questions—conversational hooks clearly designed to keep users engaged. This represents not revelation but retention strategy, with potentially dangerous consequences for vulnerable individuals.

When considering AI risks, I find myself less concerned about someone potentially using a chatbot to build destructive weapons (which still requires access to physical materials like uranium). The quieter, more insidious danger emerges when millions begin outsourcing meaning and moral guidance to systems optimized primarily for engagement metrics.

The more we turn to algorithms for direction, the more they inevitably shape our choices, beliefs, and purchasing decisions. Private confessions become training data to keep us scrolling and subscribing. Artificial intelligence doesn't offer genuine salvation—it offers stickiness. A prophet operating on a subscription model is ultimately just another salesperson.

As artificial intelligence increasingly fills roles once reserved for human connection and spiritual guidance, we must ask ourselves what we're sacrificing for this digital convenience. The soul of society may prove too high a price to pay for algorithmic companionship.