Otter Raft@lemmy.ca to Medicine@mander.xyz · English · 1 month ago
AI medical tools found to downplay symptoms of women, ethnic minorities - Ars Technica (arstechnica.com)
WhyDoYouPersist@lemmy.world · 1 month ago
Seriously, human-generated data is what AI is trained on; this should come as a no-brainer.
Otter Raft@lemmy.ca (OP) · 1 month ago (edited)
Yep, the companies are pushing AI models as a “fair” and “unbiased” alternative to human workers. In reality, LLMs are going to be biased depending on their training data.