• @whereisk@lemmy.world
    11 • 8 months ago

    Can’t wait for the wave of lawsuits after the AI hallucinates lethal advice and then insists it’s right.

    • @HerrBeter@lemmy.world
      13 • 8 months ago

      They did a trial test in Sweden, but the LLM told a patient to take an ibuprofen and a chill pill. The patient had difficulty breathing, pressure over the chest, and some other symptoms I can’t remember.

      A nurse overseeing the convo stepped in and told the patient to immediately call the equivalent of 911.

    • @ArtVandelay@lemmy.world
      7 • edited • 8 months ago

      Reminds me of an AI that was programmed to play Tetris and survive for as long as possible. So the machine simply paused the game. Except in this case, it might decide the easiest way to end your suffering is to kill you, so slightly different stakes.

      • @extant@lemmy.world
        3 • 8 months ago

        My favorite was a rudimentary military scenario where they asked the AI to destroy a target, so it just bombed it. Then the operator said it couldn’t bomb the target because there were civilians, so it opted to kill the operator who imposed the limitation and then bombed the target again.

      • @RvTV95XBeo@sh.itjust.works
        3 • 8 months ago

        Patient: AIbot3000, will drinking bleach make my pain go away?

        AIbot3000: Yes, bleach is a powerful disinfectant, and patients who drink bleach have been shown to experience less pain after it has disinfected their system.