Implications of the EU AI Act on medtech companies


The European Union has introduced comprehensive regulations for artificial intelligence (AI) systems, encapsulated in the EU AI Act (AI Act). The objective of the AI Act is to ensure that AI systems in the EU are safe, trustworthy, transparent, traceable, and non-discriminatory.

The AI Act is a horizontal legislation that governs the development, deployment, and use of AI systems within the EU, aiming to create a structured and consistent regulatory environment for AI systems. Notably, the AI Act is the first global regulation specifically targeting AI, setting a precedent for international AI governance.

In this article, we analyse the implications of the AI Act for the Medtech sector and highlight a number of key challenges for medical device manufacturers and their partners in aligning conformity routes for their products under the AI Act.

Background to the AI Act 

Proposed by the European Commission in April 2021, the AI Act was approved by the European Parliament on 13 March 2024 and by the Council of the European Union on 21 May 2024, following multiple rounds of intense interinstitutional negotiations. On 12 July 2024, the final text of the AI Act was published in the Official Journal of the EU (OJEU). The AI Act will enter into force 20 days after publication in the OJEU, i.e., on 1 August 2024, and will become fully applicable two years later, on 2 August 2026, save for certain requirements that are subject to longer transitional periods.

Scope of the AI Act

The AI Act is designed to be industry-agnostic, applying across a wide range of sectors including healthcare, medical technology, financial services, and consumer products. It not only applies to EU-based entities, but also has an extraterritorial reach, impacting non-EU entities that market, deploy, or utilize AI systems or products incorporating AI in the EU. It also applies to a broad array of economic operators active in the AI supply chain, including providers, importers, distributors, and deployers of AI systems as well as AI product manufacturers.


Each of these economic operators in the supply chain will have responsibilities under the AI Act. Manufacturers of medical devices may become providers of AI systems under the AI Act. Should a medical device be subject to the AI Act, all partners within its supply chain will have to comply with new AI Act requirements.

Risk-based approach and timeline

The AI Act adopts a risk-based approach to AI regulation, categorizing systems as posing an unacceptable, high, limited, or minimal risk. Systems posing an unacceptable risk are banned, while those in the other categories are subject to obligations of varying stringency, proportionate to the risk they present. The obligations of economic operators differ according to the AI system’s risk level, aiming to strike a balance between the need for innovation and the imperative of protecting users from potential harms associated with AI.

[Figure: risk-based classification pyramid under the AI Act]

Like many other products, medical technologies (including medical device software) may come under the scope of the AI Act.

AI-enabled medical devices could fall within the definition of a high-risk AI system if the AI system is used as a safety component of a product1 or is itself a product covered by Union harmonisation legislation listed in Annex I to the AI Act (such as the Medical Devices Regulation (MDR) and the In Vitro Diagnostic Medical Devices Regulation (IVDR)), and if that product is subject to a third-party conformity assessment under that legislation.

In that case, the AI-enabled medical device will fall within the scope of both the AI Act and the MDR or IVDR.
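
For illustration only, the classification test described above can be sketched as a short decision helper. This is a minimal sketch, assuming a simplified reading of that test; the class, field and function names are hypothetical and no substitute for a case-by-case legal assessment.

```python
from dataclasses import dataclass


@dataclass
class AIMedicalDeviceProfile:
    """Hypothetical profile of an AI system embedded in, or constituting, a medical device."""
    is_safety_component: bool              # AI system is used as a safety component of a product
    is_product_itself: bool                # AI system is itself the product
    covered_by_annex_i_legislation: bool   # product covered by Annex I harmonisation law (e.g., MDR/IVDR)
    third_party_assessment_required: bool  # product subject to third-party conformity assessment under that law


def is_high_risk(profile: AIMedicalDeviceProfile) -> bool:
    """Simplified reading of the high-risk test described in the text above."""
    linked_to_regulated_product = (
        (profile.is_safety_component or profile.is_product_itself)
        and profile.covered_by_annex_i_legislation
    )
    return linked_to_regulated_product and profile.third_party_assessment_required


# Example: AI-based diagnostic software reviewed by a Notified Body under the MDR.
device = AIMedicalDeviceProfile(
    is_safety_component=False,
    is_product_itself=True,
    covered_by_annex_i_legislation=True,
    third_party_assessment_required=True,
)
print(is_high_risk(device))  # True -> both the AI Act and the MDR/IVDR would apply
```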


Timeline for the application of the AI Act to medical devices: 2 August 2027

As noted above, the AI Act was published in the OJEU on 12 July 2024, will enter into force on 1 August 2024, and will become fully applicable as of 2 August 2026. Prohibited AI systems and practices outlined in Article 5 must be removed from the market by 2 February 2025. The AI Office is required to develop codes of practice by 2 May 2025 at the latest to enable providers of AI systems to demonstrate compliance with the applicable requirements laid down in the AI Act.

Medical devices that qualify as high-risk AI systems will have an extra year (until 2 August 2027) to comply with the applicable requirements for high-risk systems. For a more detailed timeline, please see below:

[Timeline figure: key AI Act milestones, from entry into force in August 2024 to the application of the high-risk requirements to medical devices on 2 August 2027]

Manufacturers’ preparedness will be key given these timelines. Manufacturers of medical devices should assess today whether the AI Act will apply to them tomorrow. While 2 August 2027 may seem far away, experience with the MDR and IVDR demonstrates that preparing for new legislation takes time and resources (e.g., changes to the quality management system and the risk management system, training, internal audits, recruitment of new personnel, and generation of new data). As an illustration, Article 10 of the AI Act requires high-risk AI systems to be developed using high-quality data sets for training, validation and testing. Such a requirement must be addressed today to avoid a major non-conformity being raised by the manufacturer’s Notified Body in about three years’ time.
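
As a purely illustrative aid, the sketch below shows how a manufacturer might record data-set governance evidence internally ahead of a Notified Body review. It assumes a simple Python-based inventory; the field names are hypothetical and do not exhaustively map the requirements of Article 10.

```python
from dataclasses import dataclass, field


@dataclass
class DataSetRecord:
    """Hypothetical internal record kept per data set (training, validation or testing)."""
    role: str                    # "training", "validation" or "testing"
    source: str                  # how and where the data was collected
    relevance: str               # rationale for relevance to the intended purpose
    representativeness: str      # coverage of the intended patient population
    bias_examination: str        # summary of the examination for possible biases
    known_gaps: list[str] = field(default_factory=list)  # identified data gaps or shortcomings


def missing_evidence(records: list[DataSetRecord]) -> list[str]:
    """Flag obviously incomplete documentation (illustrative check only)."""
    issues = []
    expected_roles = {"training", "validation", "testing"}
    for role in sorted(expected_roles - {r.role for r in records}):
        issues.append(f"No governance record for the {role} data set")
    for r in records:
        if not r.bias_examination:
            issues.append(f"{r.role} data set: bias examination not documented")
    return issues
```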

Conformity assessment

To avoid duplication, the AI Act allows a single conformity assessment covering both the MDR or IVDR and the AI Act. This is good news: it means that Medtech companies will be able to have a single review of their technical documentation and quality management system by their Notified Body under both the AI Act and the MDR or IVDR.

In practice, however, such combined conformity assessment may not always be possible:

  • Designation under the AI Act: To be able to provide this combined conformity assessment, MDR/IVDR Notified Bodies will also have to be designated under the AI Act (after an evaluation by their designating authority in the relevant EU Member State). Not all Notified Bodies may be willing to seek designation under the AI Act, and even if they do, the process may not proceed as quickly as anticipated. In practice, this could mean that some manufacturers have to work with one Notified Body under the MDR/IVDR and another under the AI Act. That would mean two conformity assessments with two different Notified Bodies, and a possible multiplication of audits for manufacturers.
  • AI personnel: Article 43(3) of the AI Act permits MDR/IVDR Notified Bodies to oversee AI conformity assessments if they meet specific requirements, such as independence and professional integrity. These requirements were already assessed for their MDR/IVDR designation and should therefore be straightforward to satisfy. However, Notified Bodies will also need administrative, technical, legal and scientific personnel with experience and knowledge of the relevant types of AI systems in order to conduct conformity assessments. Given that competent authorities, Notified Bodies and manufacturers will all be looking for individuals with AI expertise at the same time, recruitment may prove complicated. Again, if Notified Bodies are unable to recruit the right personnel quickly, their designation under the AI Act could be delayed.
  • Duration of conformity assessments: Today, the conformity assessment of medical devices under the MDR or IVDR can be a lengthy (and costly) process, taking 18 months or more on average. While the industry is looking to EU regulators to make the current CE marking process more efficient, there are fears that the application of the AI Act may add a further burden to the review by Notified Bodies and negatively affect the time needed to affix the CE mark to medical devices in the EU. Considering the current workload of certain MDR/IVDR Notified Bodies, this risk cannot be dismissed.

Our recommendations

To prepare for the AI Act, we would recommend that medical device manufacturers consider the following essential steps:

  • Determine AI Act applicability: Medtech companies should determine whether the AI Act applies to their medical devices (e.g., whether a product falls under the definition of high-risk AI systems or under the prohibited AI practices). Companies should also assess the regulatory role that they will play under the AI Act (provider, deployer, importer, or distributor) and the related obligations that apply to them.

  • Perform a gap assessment: A number of obligations under the AI Act already apply to Medtech companies (e.g., the need to have a quality management system, a risk management system, and technical documentation). Some requirements under the AI Act are, however, completely new for the Medtech sector (data governance, human oversight, accessibility requirements). A detailed comparison of the requirements in the AI Act versus those in the MDR/IVDR is necessary to identify the new requirements and potential gaps to address (see the illustrative sketch after this list).

  • Update internal procedures and technical documentation: Where required, Medtech companies should revise and update their quality management system, technical documentation and post-market surveillance procedures to comply with the AI Act requirements.

  • Ensure that you have the right personnel within your organisation: If not, consider targeted recruitment of individuals with AI expertise. Training of existing personnel will also be important.

  • Access and use reliable datasets in compliance with the AI Act: Notified Bodies could require access to these datasets during the conformity assessment of AI-enabled medical devices. It is therefore essential that Medtech companies planning to train, validate and test AI-enabled medical devices today do so with the data governance requirements of the AI Act in mind.

  • Monitor new developments: Medtech companies should monitor developments from the European Commission and the new EU AI Office for guidance on aligning conformity routes between the MDR/IVDR and the AI Act.
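
To make the gap assessment step more concrete, below is a minimal, hypothetical sketch of how AI Act themes might be mapped against existing MDR/IVDR controls. The entries and statuses are illustrative examples only, not an authoritative or exhaustive mapping.

```python
# Hypothetical gap-assessment inventory: AI Act themes mapped to existing MDR/IVDR controls.
gap_assessment = {
    "Quality management system":  {"existing_control": "MDR/IVDR QMS (ISO 13485)",  "status": "extend"},
    "Risk management system":     {"existing_control": "ISO 14971 risk management", "status": "extend"},
    "Technical documentation":    {"existing_control": "MDR/IVDR technical file",   "status": "extend"},
    "Data governance (Art. 10)":  {"existing_control": None,                        "status": "new"},
    "Human oversight":            {"existing_control": None,                        "status": "new"},
    "Accessibility requirements": {"existing_control": None,                        "status": "new"},
}

# List the requirements that are entirely new for the organisation and need a dedicated work stream.
new_requirements = [req for req, info in gap_assessment.items() if info["status"] == "new"]
print(new_requirements)  # ['Data governance (Art. 10)', 'Human oversight', 'Accessibility requirements']
```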

Our team closely monitors the developments in relation to the AI Act and the MDR/IVDR. We can help you with the steps identified above. Please do not hesitate to reach out for more information if you have any questions.

 

Authored by Hélène Boland, Anastasia Vernikou, and Fabien Roy.

References
1 A safety component is defined as any part of a product or system that ensures safety, where its failure or malfunctioning could endanger health or safety.
