Digital Necromancy: The Dark Side of AI Voice Cloning
By Elizabeth Prata

SYNOPSIS
The NatGeo documentary “Endurance” employed AI voice cloning to recreate the voices of Ernest Shackleton and his crew from historical recordings and written diaries. While reanimating voices or likenesses has potential benefits, like aiding those who have lost their voices, it also poses risks of misuse, including fraud and discrediting individuals, demanding vigilance and regulation. This practice is called ‘digital necromancy,’ and necromancy is biblically forbidden. This article explores the ethical and theological issues of digital necromancy.
In NatGeo’s documentary “Endurance,” about the survival of the men on the failed 1914-1917 polar expedition, I learned that voice cloning was used to reanimate the men’s voices. The documentarians took existing recordings of the expedition’s leader, Ernest Shackleton, and some of his men (mostly from wax cylinders), used them to train AI on tone and inflection, then had actors read the men’s actual words from their diaries, with AI swapping the actors’ voices for the men’s own. This is known as voice cloning.
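For readers who want to picture what that workflow looks like in practice, here is a minimal sketch in Python of the general speech-to-speech idea described above. Every class and function name in it is a hypothetical placeholder I am using for illustration only; it is not the Respeecher software or its actual API.

# A minimal sketch of the general voice-cloning workflow described above.
# Every name here is a hypothetical placeholder for illustration only;
# this is NOT the Respeecher software or its API.

from dataclasses import dataclass

@dataclass
class VoiceModel:
    """Timbre and inflection learned from a speaker's archival recordings."""
    speaker: str
    recordings_used: int

def train_voice_model(speaker: str, archival_recordings: list[str]) -> VoiceModel:
    # Step 1: historical audio (e.g., wax-cylinder transfers) is used to
    # teach a model the target speaker's tone and inflection.
    return VoiceModel(speaker=speaker, recordings_used=len(archival_recordings))

def convert_performance(actor_reading: str, target: VoiceModel) -> str:
    # Step 2: an actor reads the man's actual diary words; the trained model
    # re-voices that performance so it sounds like the original speaker.
    return f"[{target.speaker}'s cloned voice] {actor_reading}"

if __name__ == "__main__":
    model = train_voice_model("Ernest Shackleton",
                              ["wax_cylinder_01.wav", "wax_cylinder_02.wav"])
    print(convert_performance("a diary entry read aloud by an actor", model))

The point of the sketch is simply the division of labor: the archival recordings supply the voice, the actor supplies the words and pacing, and the software marries the two.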
The filmmakers offered plentiful caveats and explanations regarding this use of voice cloning. They seemed concerned to stay on the right side of ethics in their use of AI (artificial intelligence). They used software called Respeecher, and they explained their process here.
But there are unethical, evil uses of voice cloning (and video cloning, too), and they have already become apparent. Unethical uses could involve putting words in someone’s mouth that they never said in order to discredit them or get them fired. Cloned voice or video could be used in court, if someone is nefarious enough, in order to get a conviction. I mean, we have all seen cases where documents are forged and evidence is planted. It happens. Voice or video cloning used to force an unjust conviction can happen too.
Four days after Charlie Kirk, founder of TPUSA, was killed, we read: “Church gives standing ovation to AI audio clip of Charlie Kirk saying words he never said. A church in Texas played an AI audio clip of Charlie Kirk saying words he never said to a congregation who gave it a standing ovation,” this article reports.
The article continued: “On Sunday (14 September), pastor Jack Graham was delivering a service at Prestonwood Baptist Church when he paused to play an AI-generated clip using the voice of the 31-year-old [Charlie Kirk], who was killed on 10 September.”
There is currently a boom in China in creating AI chatbots using a person’s likeness and voice. A 37-year-old executive said he talks to his mother via a digital avatar on a tablet device, rendered from the shoulders up by artificial intelligence to look and sound just like his flesh-and-blood mother, who died in 2018. “I do not treat [the avatar] as a kind of digital person. I truly regard it as a mother,” the article reported.
The article further states, “The rise of AI simulations of the deceased, or ‘deadbots’ as academics have termed them, raises questions without clear answers about the ethics of simulating human beings, dead or alive.” These are also known as ghostbots.
These two examples are nefarious uses of AI video and voice technology.
Two years ago, the Federal Trade Commission announced its concerns about voice cloning. The FTC issued a challenge titled “Preventing the Harms of AI-enabled Voice Cloning.” They wrote-
Speech synthesis has been around for several decades. Perhaps one of the most famous examples is CallText 5010, the robotic-sounding speech synthesizer Stephen Hawking used after he lost his voice in 1985. And now, going beyond digital voices like Hawking’s and Apple’s Siri, it is possible to clone people’s voices thanks to improvements in text-to-speech AI engines. Voice cloning systems are generally built on large training sets composed of people’s real voices. Since many of these systems are commercially available or even open sourced, they can be easy for anyone to access, equipping people with a powerful tool to replicate human voices in a way that is hard to detect by ear.–end of FTC quote.

But the FTC article also mentions the downsides of such advances. Every technological advance has an upside and a downside: the upside results in the betterment of humanity, such as medical advances, and the downside is its potential to be used for evil against man. Because we are sinners, the downsides are potent and prevalent. The FTC article continues-
This progress in voice cloning technology offers promise for Americans in, for example, medical applications—it offers the chance for people who may have lost their voices due to accident or illness to speak as themselves again. It also poses significant risk—families and small businesses can be targeted with fraudulent extortion scams; and creative professionals, such as voice artists, could potentially have their voices appropriated in ways that could jeopardize an artist’s reputation and ability to earn income. –end of FTC quote
The article is actually a challenge: a public call for ideas and papers from people involved in creating technology, seeking everything from “products to policies to procedures—aimed at protecting consumers from AI-enabled voice cloning harms including fraud and the broader misuse of biometric data and creative content. We are asking the public to submit ideas to detect, evaluate, and monitor cloned voices.” The FTC issued a similar call ten years ago to prevent harms from robocalling; that challenge ended up producing call-blocking features in response to the then-new robocalling technology.
In addition to the ethical concerns with AI, we also have a theological concern, one that Pastor Gabe Hughes, the WWUTT guy (@WWUTTcom), raised on Twitter the other day. He wrote-
“Using digital likenesses of the dead to deliver messages as if they were alive today and could speak into current events is an AI form of necromancy. Christians should avoid contributing to or participating in this.”
Pastor Gabe has an excellent point.

Let’s first define what necromancy is, according to the Bible.
The Holman Illustrated Bible Dictionary defines necromancy as: “A form of divination which queries dead souls about the future.”
The International Standard Bible Encyclopedia (ISBE) says to refer to the topic of Divination, which includes “enchantment, sorcery, witchcraft, soothsaying, augury, necromancy, divination in numberless forms, and all kinds of magic art. Nine varieties are mentioned in one single passage in the Pentateuch (Dt 18:10, 11); other varieties in many passages both in the OT and NT, e.g. Lev 19:26, 31; Isa 2:6; 57:3; Jer 27:9; Mic 5:12; Acts 8:9, 11; 13:6, 8; Gal 5:20; Rev 9:21.”
With so many scriptures referring to all forms of divination, you can see the Lord is serious in warning us away from it.
ISBE defines necromancy as “consulting the dead.”
In the technology world, the reanimation of voices and/or bodies is actually called “digital necromancy”! This 2023 paper by Hutson and Ratican, “Life, death, and AI: Exploring digital necromancy in popular culture—Ethical considerations, technological limitations, and the pet cemetery conundrum,” states that ‘digital necromancy’,
“involves using AI to reanimate deceased individuals for various purposes. Reasons for desiring to engage with a disembodied or bodied replica of a person include the preservation of memories, emotional closure, cultural heritage and historical preservation, interacting with idols or influential figures, educational and research purposes, and creative expression and artistic endeavors.” –end of Hutson & Ratican quote
Whatever the reasons one may want to reanimate the dead in voice or body in order to hear from them or communicate with them, it’s vehemently forbidden in scripture.
The article quoted below is from “The Conversation,” an organization that touts itself as an academically rigorous and newsworthy publishing outfit where complex or thorny issues are unpacked for the general public. The article completely dismisses concerns with AI, voice cloning, and Large Language Models (LLMs) like ChatGPT, and it also dismisses issues with image and video generators like DALL·E 2, calling them simply ‘helpful tools that aid in a human’s grieving process.’
They wrote, “[W]e should remember that we do not ordinarily treat our personal messages, photographs or videos of the dead as if those records themselves were our loved ones. Instead, we use them as conduits to their memory, standing in for them as proxies for us to think of or communicate through. To suggest we routinely get confused or delude ourselves about such media is a misconception.”
Not a misconception. I already gave an example above of a rising trend of people doing just that. The Chinese gentleman considers the digitally resurrected, chatting likeness of his dead mother to be real, and he is certainly not the only one. Continuing The Conversation article:
That’s why general worries about digital necromancy are wildly overblown: when we overly concentrate on their strange and sinister aspects…we lose sight of the ways in which these new technologies speak to and resonate with what we are and do as human beings already. –end The Conversation quote
We can admire the help new technology gives us while at the same time warning about its evil uses. It is not an either-or situation, as The Conversation tries to peddle; it is a both-and situation. Stating a concern about AI does not make Christians lose sight of the benefits this technology has afforded us.
Here are some questions to ask yourself as we navigate this thorny digital landscape, in pursuit of holiness and honor to Jesus-
1–Does the show/movie/video/game I’m watching or playing have a digitally resurrected character in it? Is their voice cloned, and is what they are saying or doing accurate to the person? (How would we know, anyway?)
I thought long and hard about the NatGeo documentary I watched. I do not think it is digital necromancy because the filmmakers took the men’s actual voices and the digital reproduction used their own words from diaries or interviews. But it’s close to the ethical line, and it made me uncomfortable. I repented just in case it was an ‘unknown sin’. King David wrote,
Who can discern his errors? Acquit me of hidden faults. (Psalm 19:12).
2–Digitally resurrecting a dead person creates an imperfect likeness of him or her; it is, in effect, creating a ghost.
God adamantly forbids this activity.
“Saul, who had refused to listen to Samuel when the prophet was alive, sinfully sought a word from Samuel after he was dead. And that was part of why Saul was judged (1 Chronicles 10:13–14).” Source: GotQuestions
3–How does the deceased person’s family feel about their dearly departed loved one’s likeness and voice being used?
Even before John MacArthur passed away, the Director of Grace to You Europe had posted the following statement. The issue has only grown worse since his death.
Grace to You Europe Statement: John MacArthur and A.I. Sermons
YouTube and social media are currently overrun with videos purporting to be John MacArthur—but produced by Artificial Intelligence. Superficially, these may appear to be authentic, but they have not been authorized by Dr. MacArthur, nor do they reliably represent his opinions. Many of these video channels are monetized in a deliberate attempt by fraudsters to financially benefit from John’s faithful verse-by-verse teaching of God’s word. Please do not donate to, or “like”, these video channels. The only way to be certain a video or audio recording is legitimately John MacArthur’s is to get it from gty.org
Is the person’s reputation harmed by digitally resurrecting him or her, and putting words in their mouth they did not say?
4–Is someone illegitimately benefiting from this AI version of the deceased person? Note this sentence in the above statement from GTY Europe, “Many of these video channels are monetized in a deliberate attempt by fraudsters to financially benefit”. Digital assets are still assets.
We Christians need to be careful with regard to Artificial Intelligence, ChatGPT, and other technologies that may unwittingly be giving us opportunities to sin. We love our Jesus so much that we do not want to disobey, even unintentionally. We have to remain vigilant now more than ever.
Further Resources
AI and the Gospel– long form podcast by Darrell Harrison and Virgil Walker of the Just Thinking Podcast. “To many people, AI (artificial intelligence) is a boogeyman, whereas for others, it’s merely the next cool technology. But how should Christians view AI? Is it something to be feared or embraced? Listen as Darrell and Virgil discuss those and other questions through the lens of Scripture in this episode of the Just Thinking podcast.”
Answers in Genesis wrote,
There are some hidden consequences to using large language models (LLMs) like Chat GPT. Large language models are trained on datasets to generate text by predicting the most likely response in the context of the parameters they are given. This means that large language models do not think; they can only give the explanation that the majority of its training data favors. More at the link above (it goes to Facebook).
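To make that “most likely response” idea concrete, here is a deliberately tiny toy sketch of next-word prediction over a made-up training sentence. It is nothing like the scale or sophistication of ChatGPT or any real LLM, but it illustrates the point that such a system only echoes whatever its training data favors.

# A toy illustration of next-word prediction: the "model" simply returns
# whichever word most often follows the current word in its training text.
# This is a deliberate simplification, not an implementation of any real LLM.

from collections import Counter, defaultdict

training_text = (
    "the ship was trapped in the ice and the crew waited on the ice "
    "until the ship sank beneath the ice"
)

# Count which word follows which in the training data.
follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the most common word seen after `word` in the training data."""
    if word not in follows:
        return "<unknown>"
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))   # prints "ice" -- the majority answer in the data
print(predict_next("crew"))  # prints "waited"

Notice the sketch never weighs whether “ice” is true or wise; it only reports what its data favors, which is the limitation the Answers in Genesis quote is pointing at.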