Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • @intensely_human@lemm.ee · 10 months ago

    In India they’ve been used to determine whether people should be kept on or kicked off of programs like food assistance.

    • @rottingleaf@lemmy.zip · 10 months ago (edited)

      Well, humans are similar to pigs in the sense that they’ll always find the stinkiest pile of junk in the area and taste it before trying any alternative.

      EDIT: That’s about the popularity of “AI” today, not the semantic expert systems of the kind people built on Lisp machines.