AI: The Distance from Good Tool to Idol (2)
In the previous article, we examined the redemption narrative of the AI age and the reality of inequality. This article pursues a more fundamental question: Is AI really just a neutral tool? And in the face of this overwhelming tide, what kind of analytical framework can faith offer?
A Seemingly Reasonable Claim
Whenever someone expresses concern about AI, you can almost certainly expect this response. It shows up in social media feeds, on tech forums, and during tea time at church: "AI is just a tool. It's neutral. A knife can cut vegetables or hurt someone. It all depends on who's using it."
The analogy sounds reasonable, but it reduces an extraordinarily complex problem to a reassuring slogan.
Even a knife is not a neutral tool. With a knife, we cut, slice, and chop. The knife shapes how we approach ingredients and opens up certain culinary possibilities; cultures that developed knives created cuisines entirely different from those that did not. Every tool shapes the person who uses it. But a knife shapes how you handle food, while AI shapes how you handle thought. One changes your hands; the other changes your mind. These are entirely different levels of influence.
Think about it: a child asks a question, and three seconds later AI gives a complete, well-structured answer. What do you think this child will do next time they encounter a difficulty? Will they choose to spend two hours searching, comparing, thinking, making mistakes, and correcting themselves? Or will they just ask AI? The answer is obvious. And the problem is equally obvious. Those two hours of struggle are precisely where real learning happens.
Neil Postman offered a profoundly important insight in Amusing Ourselves to Death: media are never neutral containers. Every medium embeds an epistemology within itself, presupposing what is important, what can be ignored, and what counts as "knowing." Television is not merely a screen that plays content; it redefined what "serious public discourse" looks like. Since the television age, any idea that cannot be conveyed within thirty seconds has struggled to survive.
AI is doing the same thing. It is not merely helping us get answers faster; it is redefining what "thinking" means. When "thinking" becomes "input a question, wait for output," we are already losing something. Nicholas Carr argued in The Shallows that the internet has altered the very structure of our brains, making deep reading and sustained focus increasingly difficult. AI will push this trend to its extreme.
Postman went further in Technopoly: when a society elevates technology to its highest authority, it loses the ability to judge that technology. Technology ceases to be the servant of culture and becomes its master. People no longer ask "Should we do this?" but only "Can we do this?"
Good Structure, Wrong Direction
But the problem with the claim that "AI is neutral" goes beyond cultural analysis. From the perspective of Christian faith, it has an even more fundamental flaw.
Dutch theologian Abraham Kuyper proposed an analytical framework of vital importance within the evangelical tradition: the world God created has its "structure," and humanity's use of these structures has its "direction." Al Wolters developed this insight with greater clarity in Creation Regained: every dimension of the created order, including language, technology, economics, and art, possesses a good structure given by God. The problem never lies in the structure itself, but in its direction. The Fall did not erase the structures of creation; it distorted their direction, turning things that were meant to point toward God toward idols instead.
AI is an extension of human intellectual capacity. Its structure (the ability to process information, recognize patterns, and assist decision-making) reflects the rationality and the dominion mandate God gave to humanity. This is good. Kuyper called it "common grace": through common grace, God enables even non-believers to advance science, drive civilization forward, and bless humanity. AI is a fruit of common grace.
But common grace does not mean that the direction is automatically correct.
Paul writes in Ephesians 5:16, "Making the best use of the time, because the days are evil." We easily read this verse as a time-management maxim. But what Paul is actually saying is this: this present age carries within it a gravitational pull toward fallenness. It pulls good structures toward wrong directions, distorting freedom into indulgence and convenience into dependence. Common grace preserves the goodness of structures but does not eliminate the distortion of direction. In a fallen age, the more powerful a tool, the more we need to discern which direction it is being pulled.
The convenience of AI is not free. Its cost is hidden. Without realizing it, we outsource our thinking, hand over our judgment, and bypass the very process of struggle that is required to become a whole person. The structure is good, but the direction has gone wrong.
Neither Fleeing Culture nor Bowing to It, but Transforming Culture
Facing a cultural force like AI, Christians tend to fall into two traps.
The first trap is fearful rejection. AI is an instrument of the end times, a prelude to the Antichrist, Babel 2.0. Those who hold this position act out of genuine reverence; they see the danger and feel the threat. But their response essentially surrenders a part of the created order, as if certain domains fall outside the scope of Christ's sovereignty. Kuyper once said something that has been quoted countless times yet remains stunning: "There is not a square inch in the whole domain of our human existence over which Christ, who is Sovereign over all, does not cry, 'Mine!'" If this is true, then the domain of AI also belongs to Christ. To flee from it is to abandon Christ's sovereign claim over that square inch.
The second trap is naive embrace. God will bless His children in using every era's tools; don't worry, we just need to keep up with the times and leverage AI for evangelism and church building. This position sounds confident, but it ignores the question of "direction." It equates "goodness of structure" directly with "correctness of direction," assuming by default that whatever this age produces is good as long as we "use it in the right way." In Timothy Keller's terms, this is a form of hidden idolatry: it turns created things into sources of redemption, elevating efficiency and convenience to a position they cannot bear.
The fundamental problem with both positions is the same: neither does the serious work of discernment. The first skips discernment and draws a line; the second also skips discernment and embraces everything.
But the evangelical tradition offers us a different posture: not fleeing from culture, not bowing to culture, but transforming culture.
Wolters describes this posture: redemption does not aim to destroy the created order but to restore its original direction. The scope of Christ's redemptive work is as wide as the scope of the Fall's distortion. Culture has been distorted, so culture needs to be redeemed: not abandoned, but repaired and redirected toward the heart of God.
Applied to AI, what does this mean?
It means we do not stand opposite AI shouting "Stop," nor do we embrace it without reservation. We step into this domain and engage with discernment. The question we ask is not the binary "Is AI good or bad?" but rather: "In our use of AI, which directions point toward human dignity, toward true wisdom, toward the fear of God? And which directions point toward idols, toward the worship of efficiency, the desire for control, the greed for certainty?"
What this age needs is not just people who understand AI. What this age needs are people who practice discernment before God while also deeply understanding their own era. They do not stand opposite culture and criticize from a distance, nor do they drift along with its currents. They use AI but refuse to let efficiency become an unquestionable supreme value. They understand the language of this age but do not abandon another language, the language of the cross, by which they measure all things. They know that Christ is Lord over all, so they are not afraid to step into any domain. They also know that this age still operates within the gravitational pull of the Fall, so at every step they seek the guidance of the Holy Spirit.
So, concretely, what are the things AI cannot replace? In this age of supreme convenience, what kind of people are we called to become? Continued in the next article.
Further Reading
📖 Al Wolters, Creation Regained: Biblical Basics for a Reformational Worldview (revised edition, 2005). The clearest introduction to the "structure and direction" framework, the core analytical tool for evangelical cultural engagement. A slim volume, but one that changes how you see the entire world.
📖 Abraham Kuyper, Common Grace (1902–1904). Kuyper's magnum opus, arguing that God's grace operates not only within the church but throughout the cultural development of the entire created order. A complete English translation has finally become available in recent years.
📖 Neil Postman, Amusing Ourselves to Death (1985). The most important book for understanding the argument that "media are not neutral." Written in the television age, yet its foresight for the AI age is remarkable.
📖 Neil Postman, Technopoly (1992). The sequel to Amusing Ourselves to Death, offering a more systematic analysis of how technology transforms from servant of culture into its master.
📖 Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (2010). An argument from the perspective of neuroscience about how digital media alter the structure of the brain.
📖 Timothy Keller, Counterfeit Gods (2009). Understanding how idolatry operates in everyday life, including our pursuit of efficiency and control.
🎬 Neil Postman, 1998 lecture: Five Things We Need to Know About Technological Change. A brilliant lecture from a few years before Postman's death, distilling his lifelong core reflections on technology. The full text can be found online.