
UK - Department of Health and Social Care – Equity in Medical Devices: Recommendations to Promote Equity in Medical Devices
Year 2024

This document analyzes potential inequalities arising from the use of certain types of medical devices, including those incorporating artificial intelligence systems, and offers recommendations to eliminate or mitigate them, with the aim of ensuring equity within the English National Health Service (NHS).

The report presents the findings of a study on the biases found in certain types of medical devices and their consequences. The investigation was carried out by an independent panel of experts appointed by the Secretary of State for Health and Social Care. Specifically, the panel explored the extent and impact of various factors, primarily racial and ethnic, on the design and use of three categories of medical devices, one of which consists of devices that use artificial intelligence. After highlighting that biases affecting vulnerable population groups can exacerbate existing inequalities within the NHS, the report sets out several recommendations to reduce or avoid scenarios in which the use of medical devices results in discriminatory outcomes in terms of quality and access to care.

The goal is to implement the principle of equity, whereby medical devices used by the NHS should:

1.    Be available to all in proportion to need;

2.    Ensure that patients are selected for treatment based on criteria of need and risk;

3.    Maintain high and equal standards of performance for all population groups.

With regard to medical devices using artificial intelligence, biases may arise from:

1.    The way health issues are selected and prioritized for AI system development;

2.    The data used to develop and validate the device, which may be under-representative or incomplete;

3.    How health systems define and prioritize desired outcomes;

4.    How AI systems are developed and validated;

5.    How they are used; hence the importance of monitoring their use, clinical impact, and performance.

Moreover, biases embedded in AI systems can not only exacerbate but also multiply health inequalities.
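The monitoring concern raised in point 5 can be made concrete: a deployed model's performance should be checked per population group, not only in aggregate, since a model can look accurate overall while underperforming for a minority group. The following is a minimal illustrative sketch, not anything prescribed by the report: the data is synthetic, and the 0.05 disparity threshold and the choice of sensitivity as the metric are assumptions made for the example.

```python
# Illustrative per-group performance audit for a binary classifier.
# All records are synthetic; the 0.05 gap threshold is an assumption
# chosen for this example, not a regulatory figure.

def sensitivity(y_true, y_pred):
    """True positive rate: fraction of actual positives correctly flagged."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    positives = sum(y_true)
    return tp / positives if positives else float("nan")

def audit_by_group(records, max_gap=0.05):
    """Compute sensitivity per demographic group and flag large gaps."""
    groups = {}
    for group, y_true, y_pred in records:
        yt, yp = groups.setdefault(group, ([], []))
        yt.append(y_true)
        yp.append(y_pred)
    rates = {g: sensitivity(yt, yp) for g, (yt, yp) in groups.items()}
    gap = max(rates.values()) - min(rates.values())
    return rates, gap, gap > max_gap

# Synthetic example: (group, actual condition, model prediction).
records = [
    ("A", 1, 1), ("A", 1, 1), ("A", 1, 1), ("A", 0, 0),
    ("B", 1, 1), ("B", 1, 0), ("B", 1, 0), ("B", 0, 0),
]
rates, gap, flagged = audit_by_group(records)
```

Here group B's sensitivity is far below group A's, so the disparity is flagged even though aggregate accuracy looks reasonable — the kind of gap that aggregate reporting hides.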

To develop and use AI-powered medical devices that respect the principle of equity, eight recommendations are set forth, which call for:

· Developers and stakeholders to engage diverse patient groups, representative organizations, and the public in a co-design process for AI systems, thereby incorporating their perspectives and needs (Recommendation 8);

· The Government to establish an academy (online and in-person) to produce materials for both technical audiences (e.g., healthcare professionals) and non-experts, aimed at educating them on equity-related issues in AI-powered medical devices. The academy must ensure that pre- and post-graduate training for healthcare professionals includes awareness of AI-related equity risks and instructions on how to identify and mitigate such bias. Materials for developers and computer scientists should delve into the structural and social determinants of racism and discrimination in healthcare (Recommendation 9);

· Researchers, developers, and distributors of AI systems to ensure transparency regarding the diversity, completeness, and accuracy of data used at all stages of research and development. The government should fund the Medicines and Healthcare products Regulatory Agency (MHRA) to develop guidelines on how to assess equity-impacting bias and on the level of data detail required to ensure proper performance across all population groups. Dataset owners should be encouraged to build trust with minority communities and to report demographic data as fully and accurately as possible, while respecting privacy. Regulators should require manufacturers to disclose data diversity used for training algorithms and to document and publish information on device limitations. Simultaneously, they should provide manufacturers with guidance to identify biases, mitigate adverse effects on performance, and make such documentation and publication mandatory (Recommendation 10);

· All stakeholders to engage in coordinated actions to reduce bias. In particular, MHRA is encouraged to revise the risk classification of AI-powered medical devices so that, as a general rule, they are assigned to higher risk categories. Manufacturers and stakeholders should also adopt MHRA's Guiding Principles for Good Machine Learning Practice for Medical Device Development and its Change Programme Roadmap (Recommendation 11);

· Long-term resource allocation so that regulatory bodies can develop agile and adaptable guidelines and assist innovators and businesses in implementing bias-reduction processes throughout the device lifecycle (Recommendation 12);

· The NHS to leverage its equity principle, influence, and purchasing power to promote the use of equitable medical devices within the healthcare system. This includes, for instance, requiring a minimum equity standard in the prequalification stage of national framework agreements for digital technology, and incorporating equity as one of the criteria that healthcare and social care teams must assess when procuring digital technologies. The NHS should also collaborate with manufacturers and regulators to establish shared responsibility for algorithm safety monitoring and audits to ensure equitable outcomes of AI-powered devices (Recommendation 13);

· Research funders to prioritize diversity and inclusion, encourage participation from underrepresented groups, and ensure their access to funding. AI research projects should be required to integrate equity considerations at every stage, and an independent research ethics committee should assess the social, economic, and healthcare equity impacts of such research (Recommendation 14);

· The Government to allocate resources to address challenges posed by generative AI and foundation models in healthcare, and to appoint a multidisciplinary expert panel to evaluate and monitor the potential impact of such models on health equity (Recommendation 15).
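The data-transparency reporting that Recommendation 10 asks of dataset owners — stating how completely and accurately demographic attributes are recorded — can be sketched as a simple dataset summary. This is an illustrative assumption of what such reporting might compute: the field name, records, and output format are invented for the example, and real reporting would follow the applicable data standards and privacy constraints.

```python
# Illustrative demographic-completeness summary for a dataset.
# Field names and records are synthetic examples only.
from collections import Counter

def demographic_summary(records, field="ethnicity"):
    """Return (share of records with the field recorded, counts per group)."""
    recorded = [r[field] for r in records if r.get(field) is not None]
    completeness = len(recorded) / len(records) if records else 0.0
    return completeness, Counter(recorded)

# Synthetic records; None marks a missing demographic entry.
records = [
    {"ethnicity": "group_a"}, {"ethnicity": "group_a"},
    {"ethnicity": "group_b"}, {"ethnicity": None},
]
completeness, counts = demographic_summary(records)
```

A summary of this kind makes both gaps visible at once: how often the attribute is missing, and how skewed the recorded distribution is across groups.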

The document is available at the following link and in the download box.

Sergio Sulmicelli
Published: Monday, 11 March 2024 – Last modified: Friday, 20 June 2025