Geographic Bias Present in Medical AI Tools

October 9, 2020

Historically, diversity was not a serious consideration when researchers studied new medications: most clinical trials enrolled white males living near urban research centers. Federal guidelines implemented in the 1990s have helped address these inequities, but the same mistakes are now being repeated with new AI technologies. Too many algorithms rely on datasets drawn from the same few states; in fact, most AI diagnostic tools are trained on patient data from just three states.

“AI algorithms should mirror the community,” says Amit Kaushal, an attending physician at VA Palo Alto Hospital and Stanford adjunct professor of bioengineering. “If we’re building AI-based tools for patients across the United States, as a field, we can’t have the data to train these tools all coming from the same handful of places.”

(Source: Shana Lynch, HAI, September 21, 2020)