AI used to create video of man shot to death in road rage incident so he could address his killer in court

CHANDLER, AZ – In 2021, Chris Pelkey was shot and killed in a road rage incident. He was just 37 years old, a devoutly religious man and an Army combat veteran, and for two years his sister, Stacey Wales, kept a running list of everything she planned to say at the sentencing hearing for the man who took her brother from her.

According to NPR, when the time finally came to write her victim impact statement, Wales said she was stuck and struggled to find the right words. All she could do was think of what Pelkey would say. She said, “I couldn’t help but hear his voice in my head of what he would say.”

That’s when she thought of using AI to generate a video in which her brother would address the courtroom, and specifically the man who took his life at a red light in 2021. On Thursday, May 8th, Wales stood before the court and played the video, in what AI experts say is likely the first time the technology has been used in the United States to create a victim impact statement delivered by an AI rendering of the deceased victim.

Wales said she had been thinking about her victim impact statement since the initial trial in 2023. The case was retried in 2025 because of procedural problems with the first trial. The chance to speak in court at the sentencing meant a great deal to Wales. She said, “You’re told that you cannot react, you cannot emote, you cannot cry. We looked forward to [sentencing] because we finally were gonna be able to react.”

Wales’ attorney told her to humanize Pelkey and offer a complete picture of who he was. Wales contacted as many people as she could, from her brother’s elementary school teacher to his high school prom date to soldiers he served alongside in Iraq and Afghanistan.

In total, there were 48 victim impact statements, not counting her own. When it was time to write hers, she was torn between saying how she truly felt and what she thought the judge would want to hear. She said, “I didn’t wanna get up there and say, ‘I forgive you,’ ’cause I don’t, I’m not there yet. And the dichotomy was that I could hear Chris’ voice in my head and he’s like, ‘I forgive him.’”

Wales said that her brother’s mantra had always been to love God and love others. She said he was the kind of man who would give the shirt off his back, and while she struggled to find the right words for herself, writing from his perspective came naturally. She said, “I knew what he stood for and it was just very clear to me what he would say.”

The night before the sentencing hearing, Wales called her victim rights attorney, Jessica Gattuso, to tell her about the video she had created with the help of her husband and their business partner, Scott Yentzer. Gattuso said that she was initially hesitant about the idea because she had never heard of it being done before in an Arizona court.

She said she was also worried that the video might not be received well, but after seeing it for herself, she felt strongly that it should be shown in court. She said, “I knew it would have an impact on everyone including the shooter, because it was a message of forgiveness.”

After the video played in court, neither the defense nor the judge pushed back. Later in the hearing, Judge Todd Lang said, “I loved that AI. Thank you for that.” He added, “It says something about the family because you told me how angry you were and you demanded the maximum sentence. And even though that’s what you wanted, you allowed Chris to speak from his heart, as you saw it. I didn’t hear him asking for the maximum sentence.”

The shooter, Gabriel Horcasitas, was sentenced to 10.5 years for manslaughter. Over the years, a growing number of examples have tested the bounds of AI’s role in the courtroom, but its use for a victim impact statement appears novel, as noted by Maura Grossman, a professor at the University of Waterloo.

Grossman has studied the applications of AI in criminal and civil cases. She said she did not see any major legal or ethical issues with the use of the AI video in Pelkey’s case: “Because this is in front of a judge, not a jury, and because the video wasn’t submitted as evidence per se, its impact is more limited.”

Gary Marchant, a professor of law, ethics and emerging technologies at Arizona State University’s Sandra Day O’Connor College of Law, said, “Victim statements like this that truly try to represent the dead victim’s voice are probably the least objectionable use of AI to create false videos or statements. Many attempts to use AI to create deep fakes will be much more malevolent.”

Wales herself cautions others who may follow in her footsteps to act with integrity and not be driven by selfish motives. She said, “I could have been very selfish with it. But it was important not to give any one person or group closure that could leave somebody else out.”
