News

FDA M-CERSI AI workshop highlights future FDA comment period on AI in drug development

The U.S. Food and Drug Administration (FDA) – in collaboration with the University of Maryland Center of Excellence in Regulatory Science and Innovation (M-CERSI) – recently hosted a one-day virtual public workshop entitled “Application of Artificial Intelligence and Machine Learning for Precision Medicine.” The workshop aimed to review current methodologies, opportunities, challenges, and best practices to address the rapidly changing landscape of artificial intelligence (AI) and machine learning (ML) in drug development and precision medicine.

Presenters represented AI and technology companies, the pharmaceutical industry, research journals, and FDA, and included Matthew Diamond, Chief Medical Officer for FDA’s Digital Health Center of Excellence. The discussions allowed stakeholders to address some of the applications and benefits of AI across the drug development spectrum, including:

  • precision medicine, where tools like neural networks are being used to enhance and accelerate the process and success of drug modeling;

  • drug discovery and development, including drug target identification, selection, and prioritization;

  • clinical investigations, including dose optimization, recruitment and selection, adherence, retention, site selection, data collection, data management, data analysis, and clinical endpoint assessment;

  • pre-clinical research, including pharmacokinetic, pharmacodynamic, and toxicologic studies; and

  • manufacturing and post-market safety monitoring, where AI is being used for advanced pharmaceutical manufacturing and post-market surveillance or pharmacovigilance.

The presenters also discussed various challenges implicated when regulating and using AI:

  • the need to identify and mitigate bias caused by a lack of high-quality, large-scale, and fit-for-purpose datasets;

  • the validity and accuracy of results derived from AI tools;

  • the lack of transparency on how data is processed and evaluated when using black box algorithms;

  • poor generalizability;

  • the need to develop privacy and cybersecurity guidelines that safeguard sensitive health information, especially in the context of wearables and other digital health tools used in clinical investigations;

  • the lack of interoperability, which can lead to missing data and a lack of data standardization; and

  • the lack of an existing U.S. regulatory scheme for the use of AI in drug development.

The conference highlighted the importance of the AI Risk Management Framework established by the National Institute of Standards and Technology (NIST) at Congress’s direction: a voluntary tool, developed with input from the public and private sectors, for integrating AI risk management into organizational processes. The framework aims to help designers, developers, users, and evaluators of AI systems better identify, assess, respond to, and communicate risks across the AI lifecycle. It suggests that valid and reliable AI is secure, resilient, explainable, interpretable, privacy-enhanced, and fair, with bias managed. The framework’s four core functions are governing, mapping, measuring, and managing. NIST has also developed a companion Playbook with examples of how to implement the framework and plans to update it twice a year. NIST further encouraged entities to submit industry-specific recommendations to inform future, industry-specific AI frameworks.

FDA officials noted that the agency has previously issued various guidance documents on AI in the medical device space. They also identified challenges of regulating AI systems, such as: evaluating algorithmic robustness; mitigating bias introduced by the training data or the variables selected for the algorithm; developing change protocols that would allow an algorithm to be transparent about how it learns and adapts to individual communities, health institutions, and patients; and assessing uses of AI in clinical investigations.[1] FDA noted that it would soon release guidance on Change Control Plans.

In the drug development space, agency speakers reported an uptick in marketing applications whose clinical research sections reference AI, from one in 2016 to 118 in 2021. FDA leadership also suggested that applications of AI in clinical research – such as the use of AI to simulate the placebo arm of a trial – are an area where FDA regulation or guidance may be necessary. They also stated that the “Good Machine Learning Practice for Medical Device Development” guiding principles may have some applicability to FDA oversight of AI for drug development. Importantly, Tala Fakhouri, Associate Director for Policy Analysis at the Center for Drug Evaluation and Research (CDER), suggested that FDA’s AI regulatory regime will soon extend to CDER-regulated entities. Specifically, she stated that FDA will soon release a discussion paper on AI, ML, and drug development, and will request stakeholder feedback on related questions with the goal of providing regulatory clarity to industry. Separately, FDA has published a discussion paper on the use of AI in pharmaceutical and biologic manufacturing and is soliciting comments on that topic until May 1, 2023.

FDA and the Duke-Robert J. Margolis, MD, Center for Health Policy will host another virtual public workshop, “Understanding Priorities for the Development of Digital Health Technologies to Support Clinical Trials for Drug Development and Review,” on March 28-29. The purpose of that public workshop is to understand priorities for the development of Digital Health Technologies (DHTs) to support clinical trials, including accessibility, diversity, remote data acquisition, and clinical outcome measures using DHTs. FDA defines DHTs as systems that use computing platforms, connectivity, software, and/or sensors for healthcare and related uses. These include devices that FDA leadership discussed extensively at the FDA and M-CERSI AI workshop, such as wearables and continuous glucose monitors. The upcoming workshop relates to Sections 3606 and 3607 of the Food and Drug Omnibus Reform Act, which require FDA to make recommendations and issue guidance related to the use of digital health technologies in clinical trials. It will feature a session in which CDER leadership will discuss the role of DHTs in clinical trials for drug development. Registration for the workshop is available online here.

Hogan Lovells has been involved in a wide variety of matters helping clients to leverage developments in artificial intelligence and digital health technologies in the life sciences industry. Additionally, Hogan Lovells has been at the forefront of medical device artificial intelligence regulation. We have advised our clients on the clearance and approval of numerous medical devices that incorporate AI algorithms. As AI begins to play a larger role in FDA’s regulatory agenda, Hogan Lovells will continue to monitor updates to FDA guidance and comment periods. If you have any questions on AI-related products, AI in clinical investigations, AI in precision medicine, or AI in drug development, please contact any of the authors of this alert or the Hogan Lovells lawyer with whom you regularly work.

 

Authored by Robert Church, Cybil Roehrenbeck, Blake Wilson, Yetunde Fadahunsi, and Ashley Grey

 


[1] FDA, Artificial Intelligence and Machine Learning (AI/ML) Software as a Medical Device Action Plan (Sep. 22, 2021), https://www.fda.gov/medical-devices/software-medical-device-samd/artificial-intelligence-and-machine-learning-software-medical-device. FDA, Guidance for Industry: Clinical Decision Support Software (Sep. 28, 2022), https://www.fda.gov/media/109618/download. FDA, Guidance for Industry: Policy for Device Software Functions and Mobile Medical Applications (Sep. 2022), https://www.fda.gov/regulatory-information/search-fda-guidance-documents/policy-device-software-functions-and-mobile-medical-applications. FDA, Guidance for Industry: Medical Device Data Systems, Medical Image Storage Devices, and Medical Image Communications Devices (Sep. 2022), https://www.fda.gov/regulatory-information/search-fda-guidance-documents/medical-device-data-systems-medical-image-storage-devices-and-medical-image-communications-devices. FDA, Guidance for Industry: Cybersecurity in Medical Devices: Quality System Considerations and Content of Premarket Submissions (Apr. 2022), https://www.fda.gov/regulatory-information/search-fda-guidance-documents/cybersecurity-medical-devices-quality-system-considerations-and-content-premarket-submissions. FDA, Guidance for Industry: Multiple Function Device Products: Policy and Considerations (Jul. 2020), https://www.fda.gov/regulatory-information/search-fda-guidance-documents/multiple-function-device-products-policy-and-considerations. FDA, Guidance for Industry: Content of Premarket Submissions for Device Software Functions (Nov. 2021), https://www.fda.gov/regulatory-information/search-fda-guidance-documents/content-premarket-submissions-device-software-functions. FDA, Good Machine Learning Practice for Medical Device Development: Guiding Principles (Oct. 2021), https://www.fda.gov/media/153486/download.
