Google’s AI-driven Search Generative Experience has been generating results that are downright weird and evil, e.g., listing the “positives” of slavery.

    • ninjakitty7
      1 year ago

      Honestly, AI doesn’t think much at all. It’s scarily clever in some ways, but it also literally doesn’t know what anything is or means.

      • @aesthelete@lemmy.world
        1 year ago

        They don’t think. They think 0% of the time.

        It’s algorithms, randomness, probability, and statistics through and through. They don’t think any more than a calculator thinks.
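
        A rough sketch of that core loop, for anyone curious (the vocabulary and the probabilities below are made up purely for illustration; a real model derives them from billions of learned weights):

        ```python
        import random

        # Hypothetical next-token probabilities for the prompt "The sky is"
        # (invented numbers; a real LLM computes these from learned weights).
        next_token_probs = {
            "blue": 0.62,
            "clear": 0.21,
            "falling": 0.09,
            "sentient": 0.08,
        }

        def sample_next_token(probs: dict[str, float]) -> str:
            """Pick one token at random, weighted by its probability.

            This weighted draw is the whole 'decision': no understanding,
            just statistics accumulated from training text.
            """
            tokens = list(probs)
            weights = list(probs.values())
            return random.choices(tokens, weights=weights, k=1)[0]

        print(sample_next_token(next_token_probs))
        ```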

    • Bluskale
      1 year ago

      LLMs aren’t AI… they’re essentially a glorified autocorrect system that’s stuck at the surface level.

      • @aesthelete@lemmy.world
        1 year ago

        > We should always fact check things we believe we know and seek additional information on topics we are researching.

        Yay, yet another person saying that primary information sources should be verified using secondary information sources. Yes, you’re right, it’s great actually that in your vision of the future everyone will have to be a part-time research assistant to have any chance of knowing anything about anything, because all of their sources will be rubbish.

        And that’s definitely a thing people will do, instead of just leaning into occultism, conspiratorial thinking, and groupthink in alternating shifts.

        All I have to say is thank fuck Wikipedia exists.

      • @somethingsnappy@lemmy.world
        1 year ago

        Nobody said we were relying on that. We’ll all keep searching. We’ll all keep hoping it will bring abundance, as opposed to every other tech revolution since farming. I can only think at the surface level though. I definitely have not been in the science field for 25 years.