The European Union has introduced comprehensive regulations for artificial intelligence (AI) systems, encapsulated in the EU AI Act (AI Act). The objective of the AI Act is to ensure that AI systems in the EU are safe, trustworthy, transparent, traceable, and non-discriminatory.
The AI Act is a horizontal legislation that governs the development, deployment, and use of AI systems within the EU, aiming to create a structured and consistent regulatory environment for AI systems. Notably, the AI Act is the first global regulation specifically targeting AI, setting a precedent for international AI governance.
In this article, we analyse the implications of the AI Act for the Medtech sector and highlight a number of key challenges for medical device manufacturers and their partners in aligning conformity routes for their products under the AI Act.
Proposed by the European Commission in April 2021, the AI Act was approved by the European Parliament on 13 March 2024 and by the Council of the European Union on 21 May 2024, following multiple rounds of intense interinstitutional negotiations. On 12 July 2024, the final text of the AI Act was published in the Official Journal of the EU (OJEU). The AI Act entered into force on 1 August 2024, 20 days after its publication in the OJEU. It will become fully applicable two years thereafter, on 2 August 2026, save for certain requirements that are subject to longer transitional periods.
The AI Act is designed to be industry-agnostic, applying across a wide range of sectors including healthcare, medical technology, financial services, and consumer products. It not only applies to EU-based entities, but also has an extraterritorial reach, impacting non-EU entities that market, deploy, or utilize AI systems or products incorporating AI in the EU. It also applies to a broad array of economic operators active in the AI supply chain, including providers, importers, distributors, and deployers of AI systems as well as AI product manufacturers.
Each of these economic operators in the supply chain will have responsibilities under the AI Act. Manufacturers of medical devices may become providers of AI systems under the AI Act. Should a medical device be subject to the AI Act, all partners within its supply chain will have to comply with new AI Act requirements.
The AI Act adopts a risk-based approach to AI regulation, categorizing AI systems into four levels: unacceptable, high, limited, and minimal risk. Systems posing an unacceptable risk are banned, while high-risk systems are subject to stringent requirements and limited- and minimal-risk systems face lighter obligations, proportional to the risk they present. The obligations for economic operators differ according to the AI system's risk level, aiming to strike a balance between the need for innovation and the imperative of protecting users from potential harms associated with AI.
Like many other products, medical technologies (including medical device software) may come under the scope of the AI Act.
AI-enabled medical devices may fall within the definition of a high-risk AI system where the AI system is intended to be used as a safety component of a product, or is itself a product, covered by the Union harmonisation legislation listed in Annex I to the AI Act (such as the Medical Devices Regulation (MDR) and the In Vitro Diagnostic Medical Devices Regulation (IVDR)), and where that product or AI system is required to undergo a third-party conformity assessment under that legislation.
In that case, the AI medical device will fall within the scope of the AI Act as well as the MDR or IVDR.
As noted above, the AI Act was published in the OJEU on 12 July 2024, entered into force on 1 August 2024, and will become fully applicable as of 2 August 2026. AI systems and practices prohibited under Article 5 must be withdrawn from the market by 2 February 2025. The AI Office is required to make codes of practice available by 2 May 2025 at the latest, to enable providers of AI systems to demonstrate compliance with the applicable requirements laid down in the AI Act.
Medical devices that qualify as high-risk AI systems will have an extra year (until 2 August 2027) to comply with the applicable requirements for high-risk systems. For a more detailed timeline, please see below:
Manufacturers’ preparedness will be key given these timelines. Manufacturers of medical devices should assess today whether the AI Act will apply to them tomorrow. While 2 August 2027 may seem far away, experience with the MDR and IVDR demonstrates that preparing for new legislation takes time and resources (e.g., changes to the quality management system, changes to the risk management system, training, internal audits, recruitment of new personnel, generation of new data). As an illustration, Article 10 of the AI Act requires high-risk AI systems to be developed using high-quality data sets for training, validation, and testing. This requirement must be considered today to avoid a major non-conformity being raised by the manufacturer’s Notified Body in about three years’ time.
To avoid duplication, the AI Act allows a single conformity assessment covering both the MDR or IVDR and the AI Act. This is good news: it means that Medtech companies will be able to have a single review of their technical documentation and quality management system by their Notified Body under both the AI Act and the MDR or IVDR.
In practice, however, such a combined conformity assessment may not always be possible.
To prepare for the AI Act, we would recommend that medical device manufacturers consider the following essential steps:
Determine AI Act applicability: Medtech companies should determine whether the AI Act applies to their medical devices (e.g., whether a product falls under the definition of high-risk AI systems or of prohibited AI systems). Companies should also assess the regulatory role they will play under the AI Act (provider, deployer, importer, or distributor) and the related obligations that apply to them.
Perform a gap assessment: A number of obligations under the AI Act already apply to Medtech companies (e.g., the need for a quality management system, a risk management system, and technical documentation). Some requirements under the AI Act are, however, completely new for the Medtech sector (e.g., data governance, human oversight, and accessibility requirements). A detailed comparison of the requirements in the AI Act against those in the MDR/IVDR is necessary to identify the new requirements and potential gaps to address.
Update internal procedures and technical documentation: Where required, Medtech companies should revise and update their quality management system, technical documentation, and post-market surveillance procedures to comply with the AI Act requirements.
Ensure that you have the right personnel within your organisation: Organisations lacking in-house AI expertise should consider targeted recruitment of individuals with such expertise. Training of existing personnel will also be important.
Access and use reliable datasets in compliance with the AI Act: Access to these datasets may be required by Notified Bodies during the conformity assessment of AI-enabled medical devices. It is therefore essential that Medtech companies planning to train, validate, and test AI-enabled medical devices today do so in line with the data governance requirements of the AI Act.
Monitor new developments: Medtech companies should monitor developments from the European Commission and the new EU AI Office for guidance on aligning conformity routes between the MDR/IVDR and the AI Act.
Our team closely monitors the developments in relation to the AI Act and the MDR/IVDR. We can help you with the steps identified above. Please do not hesitate to reach out for more information if you have any questions.
Authored by Hélène Boland, Anastasia Vernikou, and Fabien Roy.