• @OsrsNeedsF2P@lemmy.ml
    38 · 5 months ago

    He posted online, telling his friends it was time to say goodbye. Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself.

    It doesn’t get more tech bro than that

    • @Zaktor@sopuli.xyz
      21 · 5 months ago

      But in this case it seems like an entirely good thing? The offer was made by an actual friend, the guy himself wanted this, his wife too, and they’re both pretty cognizant about what this is and isn’t.

      • @averyminya@beehaw.org
        6 · 5 months ago

        Yeah, contrary to all the negativity about this in this thread, I think there are a lot of worthwhile reasons for this that aren’t centered on dwelling on the loss of a loved one. Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes. These are all ways of keeping someone with us without making their death the main focus.

        Yes, death and moving on are a part of life, but we also always say to keep people alive in our hearts. I think there are plenty of ways to keep people alive for us without having them present; I don’t think an AI version of someone inherently keeps their spirit from continuing on, nor does it inherently keep your loved one from living in the moment.

        Also, I can’t help but think of the Star Trek computer with this. When I was young I had a close gaming friend whom we lost too soon; he was very much an announcer personality. He would have been perfect as my voice assistant, and would have thought it was hilarious.

        Anyway, I definitely see plenty of downsides, don’t get me wrong. The potential for someone to wallow in this is high. But I also think there are quite a few upsides, as mentioned: these recordings aren’t ephemeral, and I think it’s somewhat fair to pick and choose good memories to pass down and remember. Quite a few old philosophical ideas are coming to fruition with tech these days.

        • frog 🐸
          11 · 5 months ago

          Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.

          An AI isn’t going to magically know these things, because these aren’t AIs based on brain scans preserving the person’s entire mind and memories. They can only learn from the data they’re given. And fortunately, there’s a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write them down, or record a video. No AI needed.

          • trev likes godzilla
            10 · 5 months ago

            We have a box of old recipe cards from my grandmother that my wife cherishes. My parents gifted them to her because, out of all their daughters-in-law, she is the one who loves to cook and explore recipes the most. I just can’t imagine someone wanting something like that in a sterile technological form like an “AI-powered” app.

            “But Trev, what if you used an LLM to generate summaries-” no, fuck off (he said to the hypothetical techbro in his ear).

            • frog 🐸
              8 · 5 months ago

              I also suspect, based on the accuracy of AIs we have seen so far, that their interpretation of the deceased’s personality would not be very accurate, and would likely hallucinate memories or facts about the person, or make them “say” things they never would have said when they were alive. At best it would be very Uncanny Valley, and at worst would be very, very upsetting for the bereaved person.

              • @Zaktor@sopuli.xyz
                3 · 5 months ago

                This is a very patronizing view of people who all seem to be well informed about what this is and isn’t, and who have already acknowledged that they will put it aside if it scares them. No one is foisting this on the bereaved wife, and the husband has preemptively said it’s ok if she or her children never use it.

                This might fail in all the ways you think it will. That’s a very small dataset of information, so it’s likely either to be an overcomplicated recording or to need training data beyond what he personally said, but it’s not your place to tell her what’s best for her personal grieving process.

                • frog 🐸
                  3 · 5 months ago (edited)

                  Given the husband is likely going to die in a few weeks, and the wife is likely already grieving for the man she is shortly going to lose, I think that still places both of them into the “vulnerable” category, and the owner of this technology approached them while they were in this vulnerable state. So yes, I have concerns, and the fact that the owner is allegedly a friend of the family (which just means they were the first vulnerable couple he had easy access to, in order to experiment on) doesn’t change the fact that there are valid concerns about the exploitation of grief.

                  With the way AI techbros have been behaving so far, I’m not willing to give any of them the benefit of the doubt about claims of wanting to help rather than make money - such as using a vulnerable couple to experiment on while making a “proof of concept” that can be used to sell this to other vulnerable people.

                  • @Zaktor@sopuli.xyz
                    1 · 5 months ago (edited)

                    So just more patronizing. It’s their life, you don’t know better than them how to live it, grief or no.

              • trev likes godzilla
                2 · 5 months ago

                I have no doubts about that either, myself. Though even if such an abomination of a doppelganger were to exist, and it seems that these companies are hellbent on making it so, it would be worse for the reasons you described previously: prolonging and distorting the grieving process that human beings have evolved to go through. All in the name of a dollar. I apologize for being so bitter about this (this bitterness is not directed at you, frog), but this entire “AI” phenomenon fucking disgusts and repulses me so much I want to scream.

                • frog 🐸
                  1 · 5 months ago

                  I absolutely, 100% agree with you. Nothing I have seen about the development of AI so far has suggested otherwise: the vast majority of its uses are grotesque. The few edge cases where it is useful and helpful don’t outweigh the massive harm it’s doing.

              • @intensely_human@lemm.ee
                1 · 5 months ago

                I think it would be the opposite of upsetting, but in an unhealthy way. I think it would snap them out of their grief into a place of strangeness, and they’d stop feeling their feelings.

                There is no cell of my gut that likes this idea.

                • frog 🐸
                  1 · 5 months ago

                  Yeah, I think you could be right there, actually. My instinct from the start has been that it would prevent the grieving process from completing properly. There’s a thing called the gestalt cycle of experience: a normal, natural mechanism a person goes through with every new experience, whether good or bad, and a lot of unhealthy behaviour patterns stem from a part of that cycle being interrupted. You need to go through the cycle for everything that happens in your life, reaching closure so that you’re ready for the next experience to begin (that’s the most basic explanation), and when that doesn’t happen properly, it creates unhealthy patterns that influence everything that happens afterwards.

                  Now I suppose, theoretically, there’s a possibility that being able to talk to an AI replication of a loved one might give someone a chance to say things they couldn’t say before the person died, which could aid in gaining closure… but we already have methods for doing that, like talking to a photo of them or to their grave, or writing them a letter, etc. Because the AI still creates the sense of the person still being “there”, it seems more likely to prevent closure - because that concrete ending is blurred.

                  Also, your username seems really fitting for this conversation. :)

            • @averyminya@beehaw.org
              2 · 5 months ago

              I more meant in the case of someone whose life was cut short and didn’t have the time to put something like this together. I agree that ideally this is information you’d get to pass down, but life doesn’t always work out like that.

              Also like you said about the AI powered app, it’s only a matter of time before Adobe Historical Life comes out and we’re paying $90 a month for gramma’s recipes (stories are an additional subscription).

              • @intensely_human@lemm.ee
                3 · 5 months ago

                I went back and read old emails from my mother who died in 2009. I had unread emails from her.

                One of them contained my grandmother’s peanut butter cookie recipe, which I thought was lost when she passed in 2003.

                It might have been nice if an LLM had found that instead of me, but it felt amazing to discover it myself.

        • @intensely_human@lemm.ee
          4 · 5 months ago

          Think of how many family recipes could be preserved

          We solved this problem long before we invented writing.

          LLMs do not enable the keeping of family memories. That’s been going on a long time.

        • @Zaktor@sopuli.xyz
          2 · 5 months ago

          This is a weirdly “you should only do things the natural way” comment section for a Tech-based community.

          Humans also weren’t “meant” to be on social media, or recording videos of themselves, or even building shrines or gravesites for their loved ones. They’re just practices that have sprung up as technology and culture change. This very well could be an impediment to her moving on without him, but that’s her choice to make, and all this appeal to tradition is patronizing and doesn’t actually mean tradition is the right path for any given individual. The only right way to process death is:

          • Burn their body and possessions so that no trace remains
          • Pump their body full of chemicals so they won’t be decomposing when people ceremonially visit their corpse weeks later
          • Entomb them with their cats, slaves, and riches
          • Plant a tree nourished by their decomposing corpse
          • Turn their ashes into a piece of jewelry to be carried with you always
          • Make a shrine to the dead in your home to be prayed at regularly
          • Cast a death mask to more accurately sculpt their bust
          • Freeze their head so they may be resurrected later
      • @intensely_human@lemm.ee
        1 · 5 months ago

        and they’re both pretty cognizant about what this is and isn’t

        This will be communicating with a dead person. Nobody has any idea what this is and what it isn’t.

        It’s like planning to go to Morocco and thinking you know in advance what it’s gonna be like.

        This is new technology. People who think they know the outcomes here are deluding themselves.