How Does God Feel About AI?
I do some ad hoc mentoring of a few pastors. Engaging with them is a delight. One question that has come up a lot recently—actually, it’s more like a series of questions—concerns pastors’ use of artificial intelligence (“AI”), such as ChatGPT and Copilot.
I am the last person to consult about the technology of AI. Thankfully, the questions posed to me have been mostly to do with the ethics involved.
Most seem to agree that using AI to aid in routine administrative tasks, generating graphics, or researching questions (rather as one might “Google” something) presents few problems (though I would always ask AI to identify its sources).
But what about matters that are at the heart of a pastor’s calling—specifically, should we use AI to write a sermon?
I did some research to identify a dozen or so online articles from a range of sources, including Calvinist, Reformed, Pentecostal, mainstream traditional, and academic perspectives. This blog distils what I felt to be the most helpful insights, on which there seemed to be a broad consensus.
The core question seems not to be whether we could use AI for sermon writing, but whether we should. The concern is not technological but theological: what happens when the sermon is no longer the fruit of a pastor’s own engagement with God, Scripture, and people? How does God feel about that?
In terms of how pastors feel about it, Premier recently reported research findings from the Barna Group showing that only a small minority of pastors (around 1 in 10) are comfortable with the idea of AI writing sermons.
In terms of how congregations would feel about it (if they found out), around 7 out of 10 pastors worried it would damage trust in them personally.
Trust in a pastor is the big one. People expect their pastors to be seeking to “hear from God” and want to believe they’ve followed the guidance of the Holy Spirit. They don’t expect perfection; in fact, they’re usually very generous in that, but they do anticipate authenticity.
It can be humiliating for someone to be praising the pastor’s brilliant sermon—so deep, so rich, such wonderful insights—only then to discover they’ve actually been praising ChatGPT or Copilot. That awareness could spread like wildfire through a congregation.
Some feel that presenting AI’s work as if it were our own raises integrity questions akin to plagiarism. Of course, pastors have always learned from others, through books, commentaries, and other preachers’ ideas. Yet there’s a big difference between being influenced and effectively handing over the writing of the sermon to a machine that chooses and assembles its own influences on our behalf.
Whether an AI-written sermon would be “better” misses the point; if people wanted that, they could simply watch celebrity preachers on YouTube every week. Most of us value authenticity over surface polish. What we don’t want to have to wonder is, “Did this come from my pastor or from a bot?”
In the days before AI, I knew of a pastor who was so worn out that he was known to be getting his talks from sermoncentral.com (a mostly US site where preachers post their work). One or two in the congregation realised this and were quietly using their phones on Sunday mornings to Google notable phrases, competing to be the quickest to find the talk. It was sad that he felt the need to resort to that.
I rarely quote from the Gospel Coalition website, but I think this is interesting:
“Because an AI’s output feels so human, we’re tempted to treat it like a person. This arises from our deeply ingrained human experience. We justifiably associate the typical outputs of an inner life—an intelligent argument or an emotive piece of writing—with the presence of that inner life itself. Throughout human history, the communication layer and the inner being have been inextricably linked. Now, for the first time, a machine can flawlessly replicate our communication without possessing any inner life.”
“For the believer, this inner life is indwelled by the Holy Spirit, who guides, convicts, comforts, and sanctifies us through our relationships with other Spirit-indwelled people. An AI has no inner life to engage with and no indwelling Spirit to minister from.”
A recent Firebrand magazine article titled, “AI, Ministry, and Self-Deception,” is also interesting—and quite blunt.
It said inappropriate uses of AI “allow a pastor to pretend to be more educated, more talented, or serving a bigger church than the pastor actually is. Do these tools really make more disciples of Jesus Christ? Likely not. Instead, they feed the part of our ego that is tied to pastoral performance. By appearing to be better-read, more talented, and pastoring a bigger church, we feed our ego with self-deception.”
“Self-deception in church leaders did not begin with AI. Yet AI positions itself as a tool that can be used to improve pastors, allowing self-deception on a scale not yet seen. Rather than taking a sermon from someone else, we are now using a tool to develop a sermon (which is largely based on someone else’s work). The participation in using AI, as well as its ability to pull from enough sources to make the borrowing of ideas less obvious, enables us to deceive ourselves into thinking we are using these tools for the best.”
“Pastors face the temptation to believe that our value is in the quality of our presentation rather than our faithfulness to God.”
The founder of the Methodist movement, John Wesley, developed a list of 22 questions for personal daily self-examination. The first was this: “Am I consciously or unconsciously creating the impression that I am better than I really am? In other words, am I a hypocrite?” (All 22 are available at https://www.mie.org.uk/dtp-blog/2023/11/14/john-wesleys-self-examination.)
Is the availability of AI, in part, a temptation to resist, as well as, in part, a resource from which to benefit?
According to the research, less than 5% of churches have formal policies concerning the use of AI by their pastors and staff. At one level, this is perhaps unsurprising, but I suspect it will be increasingly on the radar of boards of elders and trustees in the next few years. Perhaps, as we’ve seen with safeguarding and accountability, it will become more of a concern when stories begin to circulate about the potential risks and dangers for churches.
I can imagine many pastors identifying with the cautionary tone of this blog, yet saying, “That’s all very well, but you don’t realise how busy I am—with everything else that’s expected of me—so if I can’t use ChatGPT to write my sermons, I won’t cope.”
That’s perfectly understandable, but the answer to it is not, “OK, then, fair enough.” It points to a broader well-being concern that the pastor should be sharing with their mentor, overseers, or board (how best to respond sympathetically to that concern will depend on the particular circumstances).
What, then, might we offer as some conclusions?
1. AI itself is not the problem; how we use AI can be
If we think of AI as a search engine—a more sophisticated form of Google—it can save a great deal of time and assist our productivity. It’s perfectly fine to use AI as a research assistant.
It’s OK to ask AI things like, “Suggest some ways I can lose 500 words in this talk without affecting my core points.” Or, “This paragraph I’ve written feels clumsy; can you suggest some alternative ways of phrasing what I’m trying to say here?”
It’s fine to ask AI to find something online, or to ask “Who said such-and-such?”
Where things go wrong is when we ask AI (or allow it; we may not have asked!) to produce material for us. Once we realise how AI actually works, copying and pasting its output should concern us, not least because of the next point.
Oh, and if you ask AI to evaluate your sermon, don’t assume from the lengthy positivity you get back that it’s an objective reviewer. AI is inclined towards flattery—telling us things we would like to hear.
2. People can tell if a sermon is AI-generated
Not everyone can, but many can, and more will as familiarity with AI’s style grows. Significant reputational damage may follow when they do realise.

The most obvious giveaway is words or phrases that people who know the speaker well simply cannot imagine them saying; it’s just not their vocabulary or their voice, so it has obviously come from somewhere else. A sermon has to “sound like us,” imperfections and all.
Then there’s how AI constructs its points and its sentences. They’re too perfect, too smooth, too symmetrical, too staccato. There’s none of the occasionally awkward phrasing and idiosyncrasy we see in typical sermons.
AI overuses parallel phrasing (“not this, but that”) and loves cute-sounding phrases. It uses easily identifiable link lines, such as “Think of it like this,” and “What, then, are we saying?”
AI offers a surface-level use of Scripture (lots of individual verses), but it can’t offer the personal insight or real-life application that we get from a preacher who’s written their own materials. An AI-generated sermon could be preached by “anyone, anywhere.”
And AI tends to avoid tensions in the text, or possible theological controversy, to a point of potential blandness. It tends towards Christian truisms and is standardly “evangelical-ish” in its content.
3. Where is the Holy Spirit in all this?
I pose this as a question rather than a statement because it’s not necessarily straightforward. I’m sure some would say, “Why can’t the Holy Spirit be guiding what comes out of AI?” Personally, I question that, but I can definitely see the Holy Spirit guiding us in the questions we pose to AI and giving us wisdom in what we do with the results.
Others might say, “Isn’t it a bit ‘super-spiritual’ to imply that everything we personally produce is generated through intimate engagement with the Holy Spirit?”—and I agree. Hard work and wrestling with the talk—perhaps over and over again—is also an integral part of it.
But rather than the time taken up in that engagement and wrestling being thought of as a burden—to be sidestepped if we can—ought we not to see it as part of our commitment to the seriousness of the task and the responsibility we’ve been given? If we’re struggling, we needn’t be struggling alone: that’s what friends are for—as well as AI!
I leave you with one last question.
What’s worse: a mediocre sermon that’s all our own work but still took us two long days to write (and, we sense, remains mediocre), or a good sermon written for us in less than half an hour by AI (that people are praising)?
How we answer that will be determined by what we think preaching is in the first place. What it’s there to do. How it should be doing it. What role the Holy Spirit plays. And what we think “mediocre” and “good” mean. And maybe … how God feels about it.
AI is being used well when it’s assisting us.
It’s being used badly when it’s substituting for us.