Ethnic bias in medical devices must be tackled urgently to avoid “substantial” harm to patients, a review has found.
The independent review found that pulse oximeters can be less accurate for people with darker skin, meaning falling blood oxygen levels may be missed.
It also warned that devices which rely on artificial intelligence (AI) could underestimate skin cancer in people with darker skin tones.
Researchers say the issue is made worse by the fact that medical devices are most often tested and calibrated on people with fairer skin.
Professor Dame Margaret Whitehead, from the University of Liverpool, chaired the review, which was commissioned in 2022 in light of growing concerns that ethnic minorities faced greater risks from COVID-19.
She said: “The advance of AI in medical devices could bring great benefits, but it could also bring harm through inherent bias against certain groups in the population, notably women, people from ethnic minorities and disadvantaged socio-economic groups.
“Our review reveals how existing biases and injustices in society can unwittingly be incorporated at every stage of the lifecycle of AI-enabled medical devices, and then magnified in algorithm development and machine learning.”
The review looked at three areas:
- Optical medical devices including pulse oximeters, which use light waves through the skin to gauge oxygen levels in the blood. Skin tone can affect how the light behaves.
- AI in healthcare
- Polygenic risk scores, which are used mostly for research and pool the findings from a number of genetic tests to help predict someone’s risk of disease.
The review also highlighted concerns about using AI to interpret chest X-rays, as these systems tend to be trained on images of men, who usually have larger lung capacity. This creates a risk that heart disease in women will be underdiagnosed.
Prof Habib Naqvi, chief executive of the NHS Race and Health Observatory, said the lack of diverse representation in health research had “led to racial bias in medical devices, clinical assessments and in other healthcare interventions”.
Andrew Stephenson, Minister of State at the Department of Health and Social Care, said: “Making sure the healthcare system works for everyone, regardless of ethnicity, is paramount to our values as a nation. It supports our wider work to create a fairer and simpler NHS.”