- cross-posted to:
- medicine@mander.xyz
Just like real doctors 👀
It’s funny that anyone would expect models trained on information from current doctors not to share the same blind spots.
Biased training data = biased LLM. Who knew?
Imagine, a hallucination engine mostly developed by white men and trained on data gathered by white men failing to treat symptoms experienced by women and ethnic minorities seriously. Who would’ve guessed this outcome?!
Garbage in, garbage out.
Especially when you shove it into a garbage maker.