AI resurrection can turn your grief into “spectral labor”
By Paulo Vargas | Published January 20, 2026
Generative AI is getting good at making the dead talk. The newest critique isn’t about whether it sounds real. It’s about what happens when a person’s voice, face, and emotional presence get rebuilt into something that can be reused.
In a 2025 paper in New Media & Society, researchers Tom Divon and Christian Pentzold call this “spectral labor.” The concept frames AI resurrection as a form of posthumous production, in which a person can keep “working” through their data after death. That can happen without consent and without any clear guardrails.
As the authors put it, “What we resurrect may not be what we remember, but what technology renders back to us.” That gap is why the output can feel less like closure and more like a copy shaped by the toolmaker.
Divon and Pentzold analyzed 51 AI resurrection cases collected between January 2023 and June 1, 2024, spanning the US, Europe, the Near East, and East Asia, and sorted them into three modes: spectacle, sociopolitical use, and everyday grief.
Spectacle is the glossy version: icons restaged for entertainment. Sociopolitical projects re-invoke the dead for testimony or messaging. The everyday mode is the most intimate: chatbots and synthetic media built to simulate ongoing contact. It’s also the fastest to normalize.
The paper’s sharpest line is its labor claim. The authors write that “the dead become involuntary sources of data, likeness, and affect.” In this framing, a person’s traces become raw material, then a sellable presence that can be extracted, distributed, and monetized.
In a separate essay, the authors argue the unease isn’t only about realism. It’s about agency. These figures can look responsive while still being authored by someone else’s prompts, edits, and platform rules. It can feel personal but isn’t.
The research argues consent, privacy, and end-of-life choices need a rethink as personal traces get folded into generative systems. Governance still lags behind how quickly these tools can be built and shared.
The practical move is to treat your voice, images, and accounts like assets. Decide who can access them, and put those instructions in writing where possible.
If you’re considering an AI “afterlife” service, ask one question first: who gets to decide what your future version says?