cross-posted from: https://mander.xyz/post/32658309
In November 2021, in the city of Chandler, Arizona, Chris Pelkey was shot and killed by Gabriel Horcasitas in a road rage altercation.
Horcasitas was tried and convicted of reckless manslaughter.
When it was time for Horcasitas to be sentenced by a judge, Pelkey’s family knew they wanted to make a statement – known as a “victim impact statement” – explaining to the judge who Pelkey had been when he was alive.
They found they couldn’t get the words right.
The solution for them turned out to be having Pelkey speak for himself by creating an AI-generated avatar that used his face and voice, allowing him to “talk” directly to the judge.
[…]
The AI avatar of the victim spoke of forgiveness. The family that created it asked for the maximum sentence, more than the prosecutors did, and the judge delivered the maximum, saying he could hear genuine forgiveness in the AI voice.
This sounds like some Black Mirror doublespeak.
Did he prepare the statement before he died?
The article says his sister wrote the words, and she and her husband created the avatar.
It's very confusing why this is considered a good use of AI.
This is what happens when people simply do not know what an LLM actually does.