Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results from its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • @Buttons@programming.dev
    10 months ago

    It’s putting human biases on full display at a grand scale.

    The skin color of people in images doesn’t matter that much.

    The problem is that these AI systems have subtler biases, ones that aren’t easily revealed with simple prompts and amusing images, and these AIs are being put to work making decisions who knows where.

    • @intensely_human@lemm.ee
      10 months ago

      In India they’ve been used to determine whether people should be kept on or kicked off of programs like food assistance.

      • @rottingleaf@lemmy.zip
        10 months ago

        Well, humans are similar to pigs in the sense that they’ll always find the stinkiest pile of junk in the area and taste it before trying any alternative.

        EDIT: That’s about popularity of “AI” today, and not some semantic expert systems like what they’d do with Lisp machines.