dvtt@lemmings.world to News@lemmy.world · English · 1 year ago
Study Reveals Gender Bias in ChatGPT Translations (researchinenglish.com)
AndOfTheSevenSeas@lemmy.world · 1 year ago:
Computers do not have the sentience required to be sexist.

knightly the Sneptaur@pawb.social · 1 year ago:
They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.

AndOfTheSevenSeas@lemmy.world · 1 year ago (edited):
deleted by creator

AndOfTheSevenSeas@lemmy.world · 1 year ago:
Interesting, then, that you chose to describe the LLM as sexist and not the programmers, even though you know nothing about them.

lolcatnip@reddthat.com · English · 1 year ago (edited):
Programmers don’t program sexism into machine learning models. What happens is that people who may or may not be programmers provide them with biased training data, because getting unbiased data is really, really hard.
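The point in that last comment can be sketched with a toy example (hypothetical corpus and function names, not taken from the study): a purely statistical model reproduces whatever imbalance exists in its training data, even though no one wrote any sexist rule into the code.

```python
from collections import Counter

# Toy "training corpus" with a deliberate imbalance -- this skew is the
# "biased training data" the comment describes, not anything the
# programmer coded as a rule.
corpus = [
    "he is a doctor", "he is a doctor", "he is a doctor",
    "she is a doctor",
    "she is a nurse", "she is a nurse", "she is a nurse",
    "he is a nurse",
]

def pronoun_counts(profession):
    """Count which pronoun each profession co-occurs with in the corpus."""
    counts = Counter()
    for sentence in corpus:
        if profession in sentence:
            counts[sentence.split()[0]] += 1
    return counts

def pick_pronoun(profession):
    """A frequency-based model simply outputs the majority pronoun."""
    return pronoun_counts(profession).most_common(1)[0][0]

print(pick_pronoun("doctor"))  # -> "he"  (learned from the skewed corpus)
print(pick_pronoun("nurse"))   # -> "she" (no sexist rule anywhere in the code)
```

The same mechanism, scaled up, is how a translation model picks gendered pronouns when translating from languages with gender-neutral ones: it falls back on the statistics of its training text.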
This is a nothing argument.
They’re nuts. Easy block, IMO.