• Tb0n3@sh.itjust.works
      It’s funny that anybody would expect models trained on data from current doctors not to have the same blind spots.

  • LostWanderer@fedia.io
    Imagine: a hallucination engine developed mostly by white men, trained on data gathered by white men, failing to take seriously symptoms experienced by women and ethnic minorities. Who could’ve guessed this outcome?!