AZ - Road rage victim in Arizona resurrected through AI to deliver his own impact statement

Knox



Christopher Pelkey was killed in a road rage incident in Chandler, Arizona in 2021, but last month, artificial intelligence brought him back to life during his killer’s sentencing hearing.

It was the first time in Arizona judicial history — and possibly nationwide — that AI had been used to create a deceased victim’s own impact statement.

Pelkey’s sister and brother-in-law used the technology to recreate his image and voice likeness to “talk” to the courtroom about his life and the day he met Gabriel Paul Horcasitas, who shot him during a confrontation near Gilbert and Germann roads.

“In another life, we probably could have been friends,” the AI creation of the 37-year-old Army veteran said, addressing Horcasitas. “I believe in forgiveness…”

 
I think this is actually legally problematic.

A statement from family about what the victim was like is one thing.

An artificial creation speaking for the victim has the potential to unconsciously sway someone who's responsible for sentencing as it's pleading in the 'first person'.

Except it isn't, because it's a fictional construction.

Nobody, however well they know someone, can know the completeness of another's innermost thoughts and feelings. And nobody can feed an AI with what they don't know, only the impression they have of someone, which will be biased by the fact that the construct is of a person who is deceased.

It's pretending to be the victim, but it's actually the surviving family speaking. And I suspect it could be grounds for an appeal of the sentence.

MOO
 
I was shocked something like this was allowed in a courtroom. Highly prejudicial; my first thought was that it comes very close to hearsay.

But I guess since it was delivered during the sentencing phase, it’s considered to be a victim impact statement?
 
And that's why I think it's woolly.

The victim cannot ever make a victim impact statement, because he's dead.

His loved ones can, but they are not him, and it should not be presented in such a way that there is ambiguity about what is being said and by whom.

The best and fairest of judges could be influenced by something like this. And that is not okay.

MOO
 
From the same news link:

Chief Justice Timmer offered the following response about the use of AI:

"AI has the potential to create great efficiencies in the justice system and may assist those unschooled in the law to better present their positions. For that reason, we are excited about AI’s potential. But AI can also hinder or even upend justice if inappropriately used. A measured approach is best. Along those lines, the court has formed an AI committee to examine AI use and make recommendations for how best to use it. At bottom, those who use AI—including courts—are responsible for its accuracy."
 
Hearsay was one of my first thoughts, but then I thought that I was being ridiculous. So nice to see that I’m not the only one with “unique” thoughts!
 
Agree, and the sentence will probably be appealed on those grounds.
 
