I recently had lunch with my former pastor, Tony Merida. He'd heard that I work in the world of artificial intelligence (AI), and he was curious. "I get asked about AI all the time," he told me. "My usual response is to joke, 'Are you talking about Allen Iverson? I can answer that one.' Then I move on, because I don't know what I don't know."
As I talked with Tony, we discussed all things AI, from its uses in the military to its use in ministry. It became clear to me that pastors need a technologically grounded and theologically informed framework for thinking about AI, one that understands the technology's core limitation and discerns its usefulness and dangers for discipleship, and especially for preaching.
AI Is a Simulation, Not a Soul
The large language models (LLMs) behind AI tools like ChatGPT don't understand anything. They're machines that simulate reasoning. OpenAI's GPT-4o model is a massive neural network, a "transformer" trained to do one thing: predict the next most probable word (more precisely, the next token, a word fragment) in a sequence.
When you ask it a question, it's not comprehending the semantic meaning. It's performing a complex statistical analysis based on the trillions of words it was trained on. It uses that analysis to generate a sequence of new words that's statistically likely to follow your prompt. Its "reasoning" is a mathematical artifact of pattern matching on a planetary scale, not a function of consciousness or understanding.
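The idea of "predicting the next most probable word" can be illustrated with a deliberately tiny sketch. This toy uses word-level bigram counts over a made-up corpus; real LLMs use deep neural networks over sub-word tokens, but the statistical principle of choosing the most probable continuation is the same.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; a real model is trained on trillions of tokens.
corpus = "in the beginning was the word and the word was with god".split()

# Count which word follows which word.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the statistically most probable next word, given the counts."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "word" follows "the" most often in this corpus
```

Nothing in this process understands what "word" or "god" means; the output is purely a function of observed frequencies.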
LLMs are masterful mimics, not thinking minds.
This is a real distinction. In my work, I see companies like Palantir building what they call an ontological layer for their AI systems, especially those used for high-stakes applications. This ontology is a formal, structured representation of knowledge, a map of concepts and their relationships. Developers create a "knowledge graph" with nodes (e.g., "Pastor Tony") and edges ("is a pastor at") that define relationships to other nodes ("Imago Dei Church") to create a fixed, human-verified reality model. The LLM is then forced to operate within the constraints of this graph, combining the pattern-matching of its neural network with the logical rigor of the map.
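A minimal sketch can show how such a graph acts as a fence. The node and edge names come from the example above; the second fact and the structure itself are my simplified illustration, not Palantir's actual implementation.

```python
# A human-verified knowledge graph: (subject, relation) -> object.
# The second entry is a hypothetical fact added for illustration.
knowledge_graph = {
    ("Pastor Tony", "is a pastor at"): "Imago Dei Church",
    ("Imago Dei Church", "is located in"): "Raleigh",
}

def answer(subject: str, relation: str) -> str:
    """Only return facts present in the verified graph; refuse to
    'hallucinate' anything outside it."""
    fact = knowledge_graph.get((subject, relation))
    return fact if fact is not None else "Unknown: not in the verified ontology"

print(answer("Pastor Tony", "is a pastor at"))  # Imago Dei Church
print(answer("Pastor Tony", "was born in"))     # Unknown: not in the verified ontology
```

The design choice is the point: the system is free to pattern-match inside the map but is forbidden to invent nodes or edges of its own.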
Palantir's belief in the necessity of an ontological map is a crucial admission of their technology's core limitation. The map prevents the AI from "hallucinating" or inventing facts, which it's otherwise prone to do.
For AI to be reliable in critical situations, its creators must build a digital fence around it and give it a strict, human-made map of reality to follow so it doesn't get lost. In this fascinating way, cutting-edge tech companies have discovered the practical necessity of what Christian philosophers like Alvin Plantinga call "properly basic beliefs." For any reasoning system to function coherently, it must be grounded in presuppositions accepted as true without prior proof. This technological necessity for a grounded worldview provides the starkest contrast between LLMs and the human soul, and it brings us to a critical danger for individual users: anthropomorphism.
Because an AI's output feels so human, we're tempted to treat it like a person. This arises from our deeply ingrained human experience. We justifiably associate the typical outputs of an inner life, such as an intelligent argument or an emotive piece of writing, with the presence of that inner life itself. Throughout human history, the communication layer and the inner being have been inextricably linked. Now, for the first time, a machine can flawlessly replicate our communication without possessing any inner life. This creates a unique and subtle threat to human growth and even Christian discipleship, because both depend on genuine connection.
The New Testament describes Christian community as koinōnía, a deep, shared fellowship (e.g., Phil. 1:5). This involves the church bearing one another's burdens (Gal. 6:2), speaking the truth in love (Eph. 4:15), and confessing our sins to one another (James 5:16). Each of these commands presupposes an interaction between two beings who possess an inner life, a soul or spirit.
For the believer, this inner life is indwelled by the Holy Spirit, who guides, convicts, comforts, and sanctifies us through our relationships with other Spirit-indwelled people. An AI has no inner life to engage with and no indwelling Spirit to minister from. Imagine a counselor whose sole purpose is to validate you. Imagine a friend you can bare your soul to who then instantly deletes her memory of the conversation. You wouldn't be in a relationship; you'd be using a tool. That's precisely what ChatGPT is. It can't replace the embodied, soul-on-soul community God has ordained for our sanctification.
Imitation and Inquiry
I've experimented extensively with using an AI for sermon preparation. My first experiments were simplistic: "Write an expositional sermon manuscript on John 1:1–3 in the style of Tim Keller."

Beyond its superficiality, such an approach has raised ethical questions about plagiarism. I've heard many express concerns that an AI is just copying and pasting from an author's work. However, that's not how the technology works.
Recent legal victories for AI companies have begun to solidify the argument that when an LLM writes "in the style of" a known author, its use is transformative, not infringement.
The reason is that the AI doesn't store the books of an author like Keller in a database. Instead, during its training, the LLM analyzes an author's body of work to create a high-dimensional "semantic map" of his style: a complex web of statistical relationships between his word choices, sentence structures, and theological concepts. When prompted, it navigates this map to generate entirely new text that shares the statistical properties of Keller's writing, rather than lifting his actual sentences.
Asking an AI for a "Keller sermon" gets you a statistically generated echo of his style. It's an imitation, not a theft, but it's ultimately a derivative and hollow one that lacks the incisive depth of the real thing.
My next approach to using an AI for sermon prep was more rigorous. It involved a step-by-step "context engineering" process to collaboratively research a passage with the AI before asking it to write.
I used a discipleship framework I call "Ask the Text." I provided the AI with a series of questions designed to mirror the logical order of a disciplined exegetical process: moving from the original authorial and historical context, through grammatical and literary analysis, into the text's place in the biblical metanarrative, and only then moving to theology and personal application. When I teach this structured journey to church members, it prevents the common error of jumping to "what it means to me" before understanding "what it meant."
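The context-engineering sequence above can be sketched as an ordered prompt pipeline. The question wording is my paraphrase of the framework's stages, and `send_to_llm` is a hypothetical stand-in for whatever chat API is used; the essential idea is that each step's answer is carried forward as context for the next.

```python
# Illustrative paraphrase of the "Ask the Text" stages, in exegetical order.
ASK_THE_TEXT = [
    "Who wrote this passage, to whom, and in what historical situation?",
    "What do the grammar, key terms, and literary structure contribute?",
    "Where does this passage sit in the Bible's overall storyline?",
    "What theological truths does it teach?",
    "Only now: how should it shape personal application?",
]

def run_study(passage: str, send_to_llm) -> list[str]:
    """Feed the questions in order, accumulating prior answers as context
    so later steps build on earlier exegesis instead of skipping ahead."""
    context, answers = f"Passage: {passage}", []
    for question in ASK_THE_TEXT:
        reply = send_to_llm(f"{context}\n\nQuestion: {question}")
        answers.append(reply)
        context += f"\n\nPrior finding: {reply}"
    return answers
```

The ordering is the whole point of the design: application is the last prompt, not the first, mirroring the discipline the framework teaches church members.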

The outputs from this method were far superior: more nuanced, logical, and persuasive.

But then I began benchmarking the AI's work against my own. When I performed the same exegetical steps myself, side by side with the machine, the AI's final devotional, which previously seemed so insightful, suddenly struck me as hollow.
This validated a crucial distinction I already knew to be true: The work of teaching God's Word is primarily a spiritual discipline and only secondarily an academic one. It's an act of wrestling with God through his Word in dependence on his Spirit. The goal isn't merely an insightful analysis but a word from the Lord for his people. An AI can assist with parts of the academic task, but it's categorically excluded from the prayerful, worshipful, Spirit-dependent reality of the process.
Even if we bracket the spiritual dimension and judge the AI's work purely on academic grounds, its output today rarely surpasses that of a well-trained human exegete. A master interpreter synthesizes countless layers of context: the flow of argument in the original Greek, the weight of a particular allusion to the Old Testament, the nuances of a debate in Second Temple Judaism, and the implications for a complex body of systematic theology.
While an AI is impressive at processing its given context, a human expert's mind is a far more sophisticated instrument for weighing and integrating these disparate domains. I expect this particular technological gap will narrow quickly, perhaps before the end of 2025. But for now, the difference in output quality is stark.
This leads to my current use of AI platforms, which is far more modest and practical. A few years ago, my daily Bible study was an unhurried, multihour affair of mining insights, writing in notebooks, and slowly working through lexicons. Now that I have a 2-year-old, a 1-year-old, and a business to run, that luxury has evaporated. My devotional time is more focused. I'll read a passage like John 1, meditate on it, and then turn to an AI with a simple prompt: "Explain John 1."

The response is a generic but well-informed summary of the main points: the historical context, key theological themes (like the logos doctrine), and different interpretive nuances. It's a helpful way to get a quick, scholarly lay of the land.
Why do I use AI instead of picking up a concise Bible commentary or study Bible that has the same information? Two reasons: speed and flexibility.
Regarding speed, even if you already know where to find this information in a trusted resource, there's a surprising amount of cognitive overhead in pulling the book off the shelf, flipping to the right section, cross-referencing context, and so on. With AI, I can just ask directly and get an answer in seconds. This easily saves 5 to 10 minutes in an hour-long study, which adds up over time. And that's important now that I'm a parent with a shorter studying window. Using an AI also reduces context switching. Instead of flipping between the start of a commentary for background and then back to my specific passage, I can simply ask the AI for authorial context, themes, and a passage explanation in one flow.
Regarding flexibility, using an AI lets me interact with the text in a way a commentary can't. I can push back, ask clarifying questions, or branch into related topics, such as how different movements or traditions have developed an idea since the commentary was written. In that sense, it's like having a custom commentary I can shape in real time.
Of course, AI shouldn't replace trusted resources. I still lean on my study Bible, commentaries, and reference works when preparing to teach. But just as a concise, devotional commentary summarizes an academic and exegetical one, an AI can be used as another summary layer, as a tool that helps me when I'm studying devotionally to zoom out, move quickly, and make connections. Crucially, I'm leaning on my theological and hermeneutical training. I don't treat the AI's output as authoritative but as a cognitive accelerator. In this season of life, it's a valuable tool to get my bearings as I do the real, prayerful work of interpretation myself.
Word-Wrangling Machines
This brings me to what I believe is the most profound biblical critique of AI's misuse: Paul's frequent warnings in the Pastoral Epistles against "word-wrangling" (1 Tim. 1:6; 6:4, 20; 2 Tim. 2:14, 16, 23). What does Paul mean? Let's carefully analyze the linguistic and contextual evidence. In 2 Timothy 2:14 (NASB95), Paul writes, "Remind them of these things, and solemnly charge them in the presence of God not to wrangle about words (logomachein), which is useless and leads to the ruin of the hearers."
The cluster of related terms Paul uses (logomachéō, logomachía, kenophōnía, mataiología), and especially the unique and probably technical use of the compound word logomachéō, suggests he was addressing a specific and identifiable problem in the churches rather than merely offering general warnings against divisiveness and argumentative behavior.
These terms most likely referred to the hyperliteral and decontextualized interpretation of Scripture. The consistent pairing of these terms with references to "myths" and "genealogies" (e.g., 1 Tim. 1:4) points to an interpretive method that extracted and debated minute details while missing broader meaning. Paul's emphasis on "sound teaching" (hugiainoúsē didaskalía) as the antithesis to word-wrangling suggests the latter represented an unsound, perhaps deliberately obtuse, interpretive approach.
The historical context strengthens this reading. The Pastoral Epistles emerged in an environment where Hellenistic rhetorical techniques were being applied to Jewish and early Christian texts. The sophisticated Greek philosophical and rhetorical education in cities like Ephesus created conditions where religious texts could be subjected to the kind of hair-splitting analysis common in sophistic debates. Paul is pushing back against the use of these techniques in Christian teaching.
His reference to "meaningless talk" (mataiología) is particularly telling. This suggests the debates weren't merely pedantic but that Paul's opponents actively obscured or distorted the biblical text's meaning. Such word-wrangling likely leveraged technical arguments to support elaborate interpretive frameworks ("myths") divorced from what the biblical author intended. This reading aligns with what we know about Hellenistic philosophy and early Gnosticism in the church, both of which tended to build complex metaphysical systems through creative interpretation of Scripture (e.g., Philo's allegorical readings of the Pentateuch in works like De opificio mundi).
Paul's critique focuses on his opponents' methods (fighting about words, empty discussion) rather than their doctrinal conclusions. This suggests his primary concern was with an approach to Scripture that privileged clever verbal manipulation over substantive engagement with the text's meaning. His repeated emphasis on "sound teaching" implies these techniques weren't producing mere academic disagreements but fundamentally distorted understandings of Christian doctrine.
Moreover, the Pastoral Epistles focus on church leadership and teaching authority. Paul's warnings about word-wrangling appear alongside his instructions about selecting and training church leaders. This suggests these debates and interpretive methods weren't merely theoretical but that the opponents' clever and superficial arguments actively undermined the church's established teaching authorities.
Such dynamics still exist today. Too often, contemporary Christian discourse mirrors Paul's concerns precisely. The techniques may differ, but the error is the same.
On one end of the spectrum, celebrity pastors and "discernment ministries" leverage emotional rhetoric and manufacture outrage, twisting Scripture simply to generate followers. On the other end, academic performers employ excessive technical displays and linguistic analysis that obscure the text's meaning behind a smokescreen of expertise. One group appeals to raw pathos and the other to a facade of logos, but both approaches share the fundamental vice Paul condemned: the use of sophisticated verbal manipulation to override rather than serve the text's meaning.
Into this landscape, artificial intelligence now enters as a powerful new accelerant. It's a tool uniquely capable of perfecting both flawed approaches on command.
An AI can be prompted to produce the emotionally charged language of the populist and the dense, technical jargon of the performative academic with equal ease. It can do this because, at their core, both forms of human word-wrangling are exercises in manipulating semantic patterns for rhetorical effect.
This is just the kind of task LLMs have been designed to master. Whether the manipulation is human or machine-driven, the result is the same as what Paul witnessed: communities impressed by clever speech rather than built up by faithful engagement with God's Word, that is, form triumphing over substance in ways Paul would instantly recognize.
What is an LLM if not the ultimate word-wrangling machine? It's an engine built to simulate reasoning by manipulating the semantic relationships between words. It wrangles with words because words are all it has.
Tool, Not Theologian
The key is to use AI for good and avoid both making it into something it's not (anthropomorphism) and using it for evil (word-wrangling). Here are two suggestions for wise use.
1. Use it to accelerate your research.
A pastor can leverage an AI for massive-scale information retrieval and summarization, tasks that would take hours in a library. You can ask it to synthesize the views of five different commentaries on a passage or trace a theological concept through church history. In such research, however, the pastor must always retain the role of hermeneutical agent, evaluating the data through his own theological grid and interpretive skill.
Think of the AI as the world's fastest research assistant. It can run to the library and pull all the books and relevant articles for you, but you're still the one who has to read them and understand them, and ultimately you need to write the sermon.

2. Use it to ideate or to refine ideas.
Using speech-to-text, I speak my raw "brain dump" thoughts on a topic into the AI, and it provides an instant transcript. This process uses the AI as an interactive medium for what psychologists call externalizing cognition.
The act of articulation forces thought-organization, and seeing thoughts in written form bypasses the "blank page problem." The AI is a nonjudgmental partner that facilitates a flow state for getting ideas out of my head and onto the page.

Talking out your sermon ideas to an AI and having it instantly transcribe them is a powerful way to get started. Just ignore its sycophantic feedback ("That's a brilliant insight!") and use the text as your raw material.
Toward Artisanal Content
As a rule, AI raises the floor, but it doesn't raise the ceiling. For a novice, an AI provides a scaffold that can dramatically improve baseline performance. For an expert, whose knowledge is deep and nuanced, the AI's general output offers little marginal value. This is a principle of diminishing returns for expertise: AI can help a bad writer become an average writer, but it won't help a great writer become a better one. It's a tool for achieving competence, not producing mastery.
For this reason, we're entering an age of artisanal content. When AI drives the cost of producing generic content to zero, that content becomes a commodity. In any commoditized market, value shifts to signals of authenticity, provenance, and costly human effort. "Artisanal" will be a signifier for products that aren't merely generated but authored, products from a specific human consciousness with its unique perspective, labor, and spiritual insight.
When anyone can generate a generic sermon in seconds, the sermon a pastor has prayed over, wrestled with, and labored over for a week will be infinitely more precious. Our preaching and teaching aren't scalable products. They're labors of love offered up to God and for the good of his people. In an age of artificial minds, the church's call is to lean ever more deeply into the one thing AI can never replicate: a human heart set aflame by God's Spirit with the truth of God's Word.
Source: https://www.thegospelcoalition.org/article/ai-usefulness-dangers-preachers/