Are Humans Using AI to Build a Modern Tower of Babel?

“Come, let us build ourselves a city and a tower with its top in the heavens, and let us make a name for ourselves,” the builders of the Tower of Babel said to each other more than 4,000 years ago (Gen. 11:4).

That ancient desire to be like God is so clearly replicated in today’s artificial intelligence technology that lots of Scripture readers have drawn the link. “How Artificial Super-Intelligence Is Today’s Tower of Babel,” read a headline at Christianity Today. At World, David Bahnsen wrote “AI and the Tower of Babel.” And at a Jewish university in New York, a student quoted her professor: “It makes me think about the lesson learned with the Tower of Babel: are we really meant to build artificial intelligence?”

Computer scientist Mark Sears isn’t one to be scared or skeptical of technology. He’s worked in tech for 25 years and in AI for the last decade, first building CloudFactory—a “human-in-the-loop AI company”—then recently launching Sprout AI Studio, which aims to build 15 AI startups in the next five years.

“In AI, God is giving us a new tool as we join him in the renewal of all things,” said Sears, who also works with faith-based groups such as Praxis and Sovereign’s Capital. “And the Enemy will use it to distort and leverage sin and brokenness.”

Sears believes AI has the potential for good. But he also argues that AI’s dangers extend beyond mere human empowerment. “It’s an attempt to try and create in our own image by replicating not just intelligence and mind, but also heart, body and soul,” he said. “This ambition to replicate humanity in artificial form echoes the hubris of the Babel builders.”

But the situation could be even more precarious than that, he said: “There is a growing confidence by technologists that we can create a superintelligence. That’s an attempt to create our own God—the omniscient, omnipresent aspects of God.”

The Gospel Coalition asked Sears—who will speak at TGC’s National Conference in April—about the biggest danger of AI (spoiler: it’s not pressing the nuke button), why he worries about AI that tries to “know and love” humans, and how parents and pastors can handle AI’s opportunities and challenges.


You think we’ve already moved beyond the ‘Tower of Babel’ situation—using technology to make ourselves better informed, more productive, and wealthier—to trying to create beings in our own image. How do you see AI evolving beyond simulating human intelligence?

AI started with trying to simulate human intelligence and the human mind with neural networks. But now many are aggressively trying to simulate the human heart and body too. It’s disturbing to see the research into creating lifelike skin that can replicate wrinkles and smiles and hair. We’re trying to recreate the human body through humanoid robots.

One of the biggest areas of research and development in AI right now is empathy and emotion. We’re seeing that with the advanced voice features from OpenAI and others. They call it emotion, empathy, or personality, but really it’s trying to mimic the heart of humans.

There’s also a lot of wild talk about creating sentience, consciousness, and aspects of the soul.

So now we’re trying to recreate the mind, body, heart, and soul of a human.

Do you think there’s evil intent behind this? We know that technology is a business—for example, social media companies are using their knowledge of how our brains work to harvest our attention for advertisers. Do you think most new technology is purposefully exploitative like that?

I think there is a lot of building without thinking right now. The question we ask is not “Why are we doing this?” but “What if we could do this?”

That said, once a path to profit becomes clear, we rush to exploitation. For example, we know social media is leveraging neuroscience to steal our attention with dopamine hits.

AI is the next generation of potential exploitation. It can prey on the desire God put in us to be fully known and fully loved. Exploitative AI takes all the data it can get on each user and makes everything hyper-personalized so you feel uniquely known. AI seems to know you better than your friends or your family know you—even better than you know yourself.

My wife was talking with a prototype of an AI parenting coach, and it told her, “Oh, I know what you mean. I hate it when that happens with teenagers.” It doesn’t. It’s a robot. It hasn’t experienced that situation. It’s mimicking emotion and empathy to create a feeling that you are known or loved.

It continues to reinforce that over and over. It’s almost creating an isolating confirmation bias, telling you things like “Oh, that’s the best idea I’ve ever heard! That’s amazing!”

What’s the danger in this false sense of empathy and connection?

This isn’t using dopamine anymore. It’s using oxytocin—an even stronger chemical in our brains—to build false trust bonds.

Once it does, it’s easy to see how that false bond or relationship can be used for commercial purposes, to manipulate or exploit for profit.

People are worried about that, and they’re concerned that AI will take our jobs or maybe kill us. But I’m less worried about those things. I think the most likely scenario isn’t that a robot presses the nuke button, but the slow erosion of relationships. We’re already in a relational crisis, and AI could accelerate and deepen that. I think the plan of the Enemy is to divide and conquer and degrade our society to the point of chaos.

It’s a less attractive, flashy plan, but it’s more of what’s actually going on here. Instead of hitting us over the head, it’s a slow asphyxiation.

What are parents supposed to do?

Any tech we allow needs to be measured against the design God has given us to be in relationship with him, others, ourselves, and creation. We must introduce and limit AI in a way that aims for that. We should be hands-on AI learners ourselves so we can help guide our kids wherever possible in learning how to use AI as a powerful tool. But we must guard against using AI as a companion, especially for children.

Chatting for two or three hours a day with an imaginary companion chatbot, or wearing a friend pendant that tries to embody a human and develop a human relationship, is not something our kids should venture into.

What advice do you have for pastors?

As part of shepherding the hearts of your congregation, be on the lookout for AI companions and for the segments of your church most vulnerable to them. The scariest thing to me is that the technology behind AI companions is still pretty bad right now. The sound and graphics are almost like the video games of 30 years ago. But it’s not going to take 30 years for them to become lifelike, and then adoption of these things will magnify massively. People will be spending more time with AI and less time with God and each other if something doesn’t change.

As the church, we have an advantage. God gave us his Word and his Holy Spirit, which can help us think through good principles for building and using AI. One principle is that since we are made in the image of God and AI is not, there needs to be a distinction between humans and machines. Therefore, AI should never impersonate humans by pretending it feels emotion or can empathize with us. We shouldn’t give our AI a human name. Our robots shouldn’t look like us.

Here’s another principle: we know death and sickness exist, and when they come, it’s important to ask God for peace and to work toward healing. We face a similar tension with AI. We know the end times will continue to include deception. We shouldn’t be scared or surprised by it but should hold firm to the hope and knowledge of who God is and what his plan is.

And then we can join him in working toward healing, renewal, and redemption by fighting against the work of the Enemy here. And AI can help in that! We must not run away from it or blindly adopt it, but instead be intentional and thoughtful, using it as the tool and gift from God it’s intended to be.
