Can AI Bring the Dead to Court? One Family’s Groundbreaking Use of Technology in Justice

Photo by Hansjörg Keller / Unsplash

When a grieving family used AI to let their murdered brother speak in court, it sparked a legal and ethical debate. Is this the future of justice, or a dangerous precedent?

Stacey Wales' brother, Christopher Pelkey, was killed in a 2021 road-rage incident. At his killer's sentencing, she did more than read a statement: she resurrected him with AI. The result was a courtroom first that forces us to ask how far technology should go in shaping legal outcomes. Let's dive in.


⚖️ The Problem: When Words Aren't Enough

  • Victim impact statements often struggle to convey the essence of the deceased. Wales spent two years crafting her statement but felt it couldn't capture her brother's forgiving nature.
  • Courts rely on cold evidence: autopsy photos, surveillance footage. Pelkey’s family wanted the judge to see him alive—not just as a victim.
  • AI’s legal role is exploding, but this case marks the first use of AI to recreate a victim for their own statement. No precedent, no rules—just raw emotion meets cutting-edge tech.

✅ The Solution: A Digital Resurrection

  • Stacey and her husband, both tech professionals, used AI software trained on photos and old videos of Pelkey to create a lifelike avatar.
  • The AI Pelkey delivered a scripted message of forgiveness, written by Stacey but voiced in his recreated tone: “In another life, we probably could have been friends.”
  • Result: The judge added 1 extra year to the killer’s sentence (10.5 years vs. the state’s requested 9.5), citing the AI’s emotional impact.
  • ✅ Healing for the family: Stacey’s 14-year-old son said, “I needed to see and hear from Uncle Chris one more time.”

Photo by Jeremy McGilvrey / Unsplash

⚠️ The Pushback: Fairness on Trial

  • ⚠️ “Unfair advantage?” Defense attorney Jason Lamm called the AI video a potential appeal issue, arguing it may have swayed the judge unduly.
  • 🚧 No prior notice: The defense wasn’t warned about the AI statement, raising questions about procedural fairness.
  • ⚠️ Slippery slope: Duke Law professor Paul Grimm warns AI could distort court records by amplifying sympathy or bias. Example: a recent New York case shut down an AI avatar that tried to argue in court while impersonating a human lawyer.

🚀 Final Thoughts: A New Frontier—With Guardrails

This case isn’t just about closure—it’s a legal watershed. For AI in courtrooms to work:

  • 📈 Transparency: Opposing counsel must review AI content pre-trial to flag distortions.
  • 📉 Limits: Grimm suggests restricting AI to post-verdict phases (like sentencing) to avoid jury bias.
  • 🚀 Innovation: As Stacey noted, this wasn’t evidence—it was a “human that’s no longer here for who he was.”

Could AI help victims’ families heal while keeping courts fair? Or does it risk turning justice into a tech spectacle? What do you think?

Let us know on X (formerly Twitter).


Sources: Clare Duffy, “He was killed in a road rage incident. His family used AI to bring him to the courtroom to address his killer,” CNN, May 9, 2025. https://www.cnn.com/2025/05/09/tech/ai-courtroom-victim-impact-statement-arizona
