@dvtt to News@lemmy.world • English • 10 months ago
Study Reveals Gender Bias in ChatGPT Translations (researchinenglish.com)
@AndOfTheSevenSeas@lemmy.world • 10 months ago: Computers do not have the sentience required to be sexist.

knightly the Sneptaur • 10 months ago: They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.

@AndOfTheSevenSeas@lemmy.world • 10 months ago: Interesting then that you chose to describe the LLM as sexist and not the programmers, regardless of the fact that you know nothing about them.

@lolcatnip@reddthat.com • 10 months ago (edited): Programmers don’t program sexism into machine learning models. What happens is that people, who may or may not be programmers, provide them with biased training data, because getting unbiased data is really, really hard.
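The mechanism that comment describes can be shown with a toy sketch: a "translator" that just picks the most frequent target phrase seen in training. The corpus below is invented for illustration (Turkish "o" really is a gender-neutral pronoun, but the pair counts here are made up); the point is that the model has no opinions, it only reproduces the majority of its data.

```python
from collections import Counter

# Hypothetical toy "parallel corpus". Turkish "o" is gender-neutral,
# but the English side reflects an occupational skew in the data.
training_pairs = [
    ("o bir doktor", "he is a doctor"),
    ("o bir doktor", "he is a doctor"),
    ("o bir doktor", "she is a doctor"),
    ("o bir hemşire", "she is a nurse"),
    ("o bir hemşire", "she is a nurse"),
    ("o bir hemşire", "he is a nurse"),
]

def translate(source: str) -> str:
    """Return the most frequent target seen for this source in training."""
    counts = Counter(tgt for src, tgt in training_pairs if src == source)
    return counts.most_common(1)[0][0]

# The skew in the data becomes a deterministic choice in the output:
print(translate("o bir doktor"))   # -> "he is a doctor"
print(translate("o bir hemşire"))  # -> "she is a nurse"
```

Real neural translators are vastly more complex, but the failure mode is the same shape: a gender-neutral source sentence forces a gendered choice in English, and the model resolves it toward whatever its training data made most probable.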
[deleted by creator]
This is a nothing argument.
They’re nuts. Easy block, IMO.