
UK investigation into racial and gender bias in medical devices

The UK Health Secretary has ordered a review into whether medical devices — such as oximeters, spirometers and respirators — are equally effective regardless of the patient's ethnicity.

Writing in the Sunday Times, Sajid Javid said: “It is easy to look at a machine and assume everyone’s getting the same experience. But technologies are created and developed by people, and so bias, however inadvertent, can be an issue here too.”

In response to concerns, he has called for a review into systemic racism and gender bias in certain medical devices.

Oximeters

Oximeters estimate the amount of oxygen in a person’s blood, and are a crucial tool in determining which COVID patients may need hospital care – not least because some patients can have dangerously low oxygen levels without realising it.

Concerns have been raised, however, that the devices work less well for patients with darker skin. NHS England and the Medicines and Healthcare products Regulatory Agency (MHRA) say pulse oximeters can overestimate the amount of oxygen in the blood of people with darker skin.

Javid told The Guardian last month that the devices were designed for Caucasians. “As a result, you were less likely to end up on oxygen if you were black or brown, because the reading was just wrong,” he said.
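
To see how a skewed reading can translate into a different treatment decision, consider the minimal sketch below. The 92% threshold and the three-percentage-point overestimation are assumptions chosen purely for illustration, not clinical figures.

```python
# Illustrative sketch: a fixed overestimation bias in a pulse oximeter's
# reported SpO2 can push a patient above a treatment threshold.
# The threshold and bias values below are assumptions, not clinical figures.

OXYGEN_THRESHOLD = 92.0  # hypothetical SpO2 (%) below which oxygen is considered

def reported_reading(true_spo2: float, overestimation_bias: float) -> float:
    """Model a device that reports a value higher than the true saturation."""
    return min(true_spo2 + overestimation_bias, 100.0)

def needs_oxygen(reported_spo2: float) -> bool:
    """Decision made solely from the device's reported reading."""
    return reported_spo2 < OXYGEN_THRESHOLD

true_spo2 = 90.0  # the same dangerously low true saturation in both cases

for bias in (0.0, 3.0):  # 0 = unbiased device; 3 = hypothetical overestimation
    reading = reported_reading(true_spo2, bias)
    print(f"bias {bias:.0f} pts: reads {reading:.0f}% -> needs oxygen: {needs_oxygen(reading)}")
```

With no bias, the 90% reading triggers treatment; with a three-point overestimate, the same patient reads 93% and is missed.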

Experts believe the inaccuracies could be one of the reasons why death rates have been higher among minority ethnic people, although other factors may also play a role, such as working in jobs that have greater exposure to others.

Respirator masks

Medical-grade respirators are crucial to keeping healthcare workers safe from COVID because they offer protection against both the large and small particles that others exhale. To offer the greatest protection, however, filtering facepiece (FFP) masks must fit properly, and research has shown they do not fit as well on people from some ethnic backgrounds.

“Adequate viral protection can only be provided by respirators that properly fit the wearer’s facial characteristics. Initial fit pass rates [the rate at which they pass a test on how well they fit] vary between 40% and 90% and are especially low in female and in Asian healthcare workers,” one review published in 2020 notes.

Another review found that studies on the fit of such PPE largely focused on Caucasian or single-ethnicity populations, noting: “BAME people remain under-represented, limiting comparisons between ethnic groups.”

Spirometers

Spirometers measure lung capacity, but experts suggest there are racial biases in how the data gathered from these devices are interpreted.

Writing in the journal Science, Dr Achuta Kadambi, an electrical engineer and computer scientist at the University of California, Los Angeles, said black or Asian people are assumed to have lower lung capacity than white people – a belief he noted may be based on inaccuracies in earlier studies. As a result, “correction” factors are applied to the interpretation of spirometer data, a situation that can affect the order in which patients are treated.

“For example, before ‘correction’ a black person’s lung capacity might be measured to be lower than the lung capacity of a white person,” Kadambi writes. “After ‘correction’ to a smaller baseline lung capacity, treatment plans would prioritise the white person, because it is expected that a black person should have lower lung capacity, and so their capacity must be much lower than that of a white person before their reduction is considered a priority.”
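
To make the arithmetic of Kadambi’s example concrete, here is a hypothetical sketch of how a race-based “correction” factor changes the interpretation of an identical measurement. The 0.85 scaling factor and the five-litre reference capacity are illustrative assumptions, not real clinical coefficients.

```python
# Hypothetical sketch of race-"corrected" spirometry interpretation.
# The reference capacity and the 0.85 factor are illustrative assumptions only.

REFERENCE_CAPACITY_L = 5.0  # assumed predicted lung capacity for the reference group

def percent_of_predicted(measured_l: float, correction: float = 1.0) -> float:
    """Compare a measured capacity with a (possibly 'corrected') predicted value."""
    predicted = REFERENCE_CAPACITY_L * correction
    return 100.0 * measured_l / predicted

measured = 4.0  # both patients record the same capacity, in litres

print(f"white patient: {percent_of_predicted(measured):.0f}% of predicted")
print(f"black patient: {percent_of_predicted(measured, correction=0.85):.0f}% of predicted")
# 80% vs 94%: once the lower baseline is applied, the identical measurement
# looks closer to "normal" for the black patient, so they are less likely
# to be prioritised for treatment.
```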

Another area Kadambi said may be affected by racial bias is remote plethysmography, in which pulse rates are measured by looking at changes in skin colour captured by video. Kadambi said such visual cues might be biased by subsurface melanin content – in other words, skin colour.
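
In principle, remote plethysmography works by tracking the average colour of a skin region frame by frame and reading the pulse off the dominant frequency. The sketch below demonstrates this on a synthetic signal; the amplitude parameter, which stands in for how strongly the pulse modulates reflected light, is an assumption used to mimic the melanin effect Kadambi describes.

```python
# Sketch of remote plethysmography on a synthetic signal: estimate pulse rate
# from the mean green-channel intensity of a face region over time.
# The amplitude parameter is an assumption standing in for how strongly the
# pulse modulates reflected light (weaker modulation lowers the signal-to-noise
# ratio, one mechanism by which skin tone could bias the estimate).

import numpy as np

FPS = 30                       # assumed video frame rate
t = np.arange(0, 10, 1 / FPS)  # 10 seconds of frames
HEART_RATE_HZ = 72 / 60        # a 72 bpm pulse

def green_channel_trace(amplitude: float) -> np.ndarray:
    """Synthetic per-frame mean green intensity: pulse component plus noise."""
    pulse = amplitude * np.sin(2 * np.pi * HEART_RATE_HZ * t)
    noise = 0.5 * np.random.default_rng(0).standard_normal(t.size)
    return 120.0 + pulse + noise

def estimate_bpm(trace: np.ndarray) -> float:
    """Pick the dominant frequency in a plausible heart-rate band (0.7-4 Hz)."""
    spectrum = np.abs(np.fft.rfft(trace - trace.mean()))
    freqs = np.fft.rfftfreq(trace.size, d=1 / FPS)
    band = (freqs > 0.7) & (freqs < 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

for amplitude in (2.0, 0.3):  # strong vs weak pulse modulation
    bpm = estimate_bpm(green_channel_trace(amplitude))
    print(f"amplitude {amplitude}: estimated {bpm:.0f} bpm")
```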

Artificial intelligence systems

AI is increasingly being developed for applications in healthcare, including to aid professionals in diagnosing conditions. There are concerns, however, that biases in the data used to develop such systems mean they risk being less accurate for people of colour.

Such concerns were recently raised in relation to AI systems for diagnosing skin cancers. Researchers revealed that few freely available image databases that could be used to develop such AI are labelled with ethnicity or skin type. Of those that did record such information, only a handful of images were of people with dark brown or black skin.
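
As a rough illustration of the kind of audit the researchers performed, the sketch below counts how many records in a small, fabricated dermatology dataset carry a skin-type label at all, and how those labels are distributed. The records and field names, such as fitzpatrick_type, are hypothetical.

```python
# Illustrative audit of skin-type labelling in a dermatology image dataset.
# The records and the field name "fitzpatrick_type" are fabricated examples;
# a real audit would read the dataset's actual metadata.

from collections import Counter

records = [
    {"image": "img001.jpg", "diagnosis": "melanoma", "fitzpatrick_type": "II"},
    {"image": "img002.jpg", "diagnosis": "nevus", "fitzpatrick_type": None},
    {"image": "img003.jpg", "diagnosis": "melanoma", "fitzpatrick_type": "I"},
    {"image": "img004.jpg", "diagnosis": "bcc", "fitzpatrick_type": "VI"},
    {"image": "img005.jpg", "diagnosis": "nevus", "fitzpatrick_type": None},
]

labelled = [r for r in records if r["fitzpatrick_type"] is not None]
print(f"{len(labelled)}/{len(records)} images carry a skin-type label")
print("distribution:", Counter(r["fitzpatrick_type"] for r in labelled))
# A distribution skewed away from types V-VI warns that a model trained on
# this data may be less accurate for patients with darker skin.
```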

It is an issue Javid has acknowledged, The Guardian report adds. Announcing new funding last month for AI projects to tackle racial inequalities in healthcare, such as the detection of diabetic retinopathy, he said one area of focus would be the development of standards to ensure datasets used in developing AI systems were “diverse and inclusive”.

“If we only train our AI using mostly data from white patients it cannot help our population as a whole. We need to make sure the data we collect is representative of our nation,” he said.

 

The Guardian article – From oximeters to AI, where bias in medical devices may lurk (Open access)

 

See more from MedicalBrief archives:

 

GPs fail to spot two out of every three cases of pneumonia

 

Surgical masks match respirators for flu and respiratory virus protection

 

CDC's health sector masking guidelines were a deadly mistake
