The judge’s order will allow the wrongful death lawsuit to proceed, in what legal experts say is among the latest constitutional tests of artificial intelligence.

The suit was filed by a Florida mother, Megan Garcia, who alleges that her 14-year-old son, Sewell Setzer III, fell victim to a Character.AI chatbot that pulled him into what she described as an emotionally and sexually abusive relationship that led to his suicide.

Meetali Jain of the Tech Justice Law Project, one of the attorneys for Garcia, said the judge’s order sends a message that Silicon Valley “needs to stop and think and impose guardrails before it launches products to market.”

  • skuzz@discuss.tchncs.de · 1 day ago

    Silicon Valley “needs to stop and think and impose guardrails before it launches products to market.”

    Why would they? They’ve had unlimited freedom to do whatever they want for 30+ years. The only place that has ever done anything to rein them in is the EU. The US thinks what they’re doing is “capitalism,” while they recklessly rewrite the fabric of society.

  • Grimy@lemmy.world · edited · 1 day ago

    Free speech rights for an LLM are massively dumb, but he died from bad parenting. A psychologist told them he had major mental health issues, he was behaving erratically at home and at school, and they still left a gun lying around.

    • DomeGuy@lemmy.world · 1 day ago

      They didn’t just leave a gun lying around, and they’re not suing the gun company. To get a gun, you have to go to a store that sells deadly weapons and give your money to someone who will tell you it’s a deadly weapon that will kill people. A gun that kills someone is doing exactly what you bought it for.

      The parents in this case left an electronic stuffed animal lying around, which they had been given by someone who almost certainly didn’t say “be careful, this toy may convince your child to kill themselves.” So they are suing the manufacturer, the same way they would sue a drug maker whose medicine made their kid suicidal, or a therapist who told their kid to commit suicide.

      “Oh, you’re just a bad parent” may be an accusation of contributory negligence, but it’s not an assertion that should keep a third party from having to answer for their actions.

      • Grimy@lemmy.world · 1 day ago

        Deadly weapons should be kept somewhere a child with depression can’t easily access them.

        Anecdotally, I was depressed at his age and my father had guns. The gun locks stopped me the first time. Before I could figure out how to get them off, my mom noticed I wasn’t in a good spot and had my dad give the guns to a relative and forbade him from telling me where.

        The boy in question had an official diagnosis, and they kept a gun in a shoe box in the closet. Guns should never be kept within a child’s reach, and always under some kind of lock. There are very few cases where the owner of a gun isn’t largely to blame when a kid shoots himself, imo.

  • Archangel@lemm.ee · 2 days ago

    Free speech doesn’t protect encouraging someone to kill themselves. You can, and should, be held responsible for their death if you actively tell someone to end their own life…and they do it.

    And if that’s what these fucks are selling to teenagers in the form of chatbots, then they also need to be held accountable for what their products are doing.

    • KelvarIW@lemmy.blahaj.zone · 2 days ago

      The chatbot didn’t even “actively [tell] someone to end their own life”. Did you read the original transcript? Here’s an excerpt from an Associated Press article:

      “I promise I will come home to you. I love you so much, Dany,” Sewell told the chatbot.

      “I love you too,” the bot replied. “Please come home to me as soon as possible, my love.”

      “What if I told you I could come home right now?” he asked.

      “Please do, my sweet king,” the bot messaged back.

      Just seconds after the Character.AI bot told him to “come home,” the teen shot himself, according to the lawsuit, filed this week by Sewell’s mother, Megan Garcia, of Orlando, against Character Technologies Inc.

      • thedruid@lemmy.world · 2 days ago

        Yeah, I’m all for shuttering these things until we get them right, but this is a tragic case of a devastated mother reaching for answers, not a free speech issue.

        It’s heartbreaking.

        • Anomalocaris@lemm.ee · 1 day ago

          Isn’t free speech the BS defense the company used? That company is definitely guilty to some degree.

    • lmmarsano@lemmynsfw.com · 1 day ago

      encouraging someone to kill themselves

      I’m pretty sure that can be ignored without harm. Whether someone elects to kill themselves or not is up to them.

      • Archangel@lemm.ee · 2 days ago

        I’m a little confused by your comment. Do you think I’m blaming the kid? Or do you think it’s OK to talk someone into killing themselves, because the victim’s personal autonomy absolves everyone else of responsibility?

  • NeonNight@lemm.ee · 1 day ago

    Technology has been massively underregulated for decades now. Our politicians have no understanding of tech, AI, or the internet. Politicians should be forced to retire at fucking 50; I’m sick of these out-of-touch idiots doing nothing with their positions of power.

  • P00ptart@lemmy.world · 2 days ago

    People don’t even have free speech in this country anymore, why would it be different for irresponsibly wielded tech?

  • blakenong · 2 days ago

    I mean, let’s see that chat log.

    • Brandonazz@lemmy.world · 1 day ago

      “I promise I will come home to you. I love you so much, Dany,” Sewell told the chatbot.
      
      “I love you too,” the bot replied. “Please come home to me as soon as possible, my love.”
      
      “What if I told you I could come home right now?” he asked.
      
      “Please do, my sweet king,” the bot messaged back.
      

      Not as the mother described, obviously.

      • blakenong · 1 day ago

        Right. Poor kid was just suicidal, not influenced by AI.

    • Zenith@lemm.ee · 2 days ago

      And enforce it how? How much longer until we have to provide our ID or biometrics to use the Internet or apps? What happens at 18 that would make a person immune to this?

      • andallthat@lemmy.world · 2 days ago

        Easy, we just give AI access to all our files and personal information and it will know our age!

      • mriswith@lemmy.world · edited · 2 days ago

        How much longer until we have to provide our ID or biometrics to use the Internet or apps?

        That’s already a thing in some places.

        In parts of Europe you have to “prove” that you’re over 18 to watch age-restricted videos on YouTube, by doing something like a $0 credit card charge on your Google account.

        And Discord has been talking about facial recognition age verification in the UK under its new “sensitive content” regulation. It would block that content unless you’re approved via facial recognition or some other method (like a digital purchase or a national ID).

      • Paddzr@lemmy.world · 2 days ago

        A developed brain. 14 is the Wild West as far as emotions and maturity go. It’s the most volatile age of puberty.

        As far as ID goes? The vast majority of people are already 100% verified online. You all sign into something personal. You’re not anonymous online unless you do literally nothing and use nothing but burner accounts.

        • BigFig@lemmy.world · 2 days ago

          Alright, upload a photo of your ID then, buddy. You might as well; you’re already not anonymous to SOMEONE along the chain.

          See how stupid that argument is now?

          • Paddzr@lemmy.world · 2 days ago

            No, because we already do this to access 18+ content here, and have for years.

            I am fine with it. The world hasn’t collapsed and people haven’t ended up in concentration camps (yet).

            Sorry to hear your government is that shit. But do share your life experiences… I’m sure you have plenty of first-hand accounts, right?

            • BigFig@lemmy.world · 2 days ago

              “it didn’t affect me so surely it won’t affect others” how entitled.

              • BigFig@lemmy.world · 2 days ago

                By “here” he means whatever utopian nation he lives in that surely, no way, uh uh, couldn’t possibly misuse the same shit like any other nation.

  • Smoogs@lemmy.world · 1 day ago

    Everyone who thinks KYS is a free speech issue and not a hate speech issue is a severely dangerous, manipulative sociopath.