
Technical documentation obligations for AI providers: drafting manuals


The European Parliament's recent approval of the AI Act marks a significant step in regulating artificial intelligence systems in the European Union. Among its pivotal provisions, providers of high-risk artificial intelligence (AI) systems must prepare comprehensive technical documentation before placing such systems on the market. This documentation is crucial for demonstrating compliance with the AI Act's requirements, for facilitating assessment by regulatory bodies and for providing comfort to deployers. This post provides an overview of its main content and offers practical tips to help providers prepare.

Context

In a landmark move, the European Parliament recently approved the AI Act, a comprehensive regulatory framework aimed at governing the development, deployment, and use of AI systems within the European Union. Final approval of the AI Act is expected in the coming weeks. For a general overview of the principles under the AI Act, see our impact analysis of the AI Act (part 1 and part 2).

Among its many provisions, Article 11 stands out as a key requirement for providers of high-risk AI systems: the drafting of technical documentation.

General-purpose AI models are also subject to a technical documentation obligation, but we will address that in a separate publication.

Understanding the obligation to prepare technical documentation

Article 11 of the AI Act requires the technical documentation of a high-risk AI system to be prepared before the system is placed on the market or put into service. This documentation must not only be comprehensive but also kept up to date throughout the lifecycle of the AI system. Its primary purpose is to demonstrate compliance with the requirements outlined in the applicable sections of the AI Act and to provide national competent authorities and deployers with clear and comprehensive information. The technical documentation is therefore a key component to be examined as part of the conformity assessment procedures required for high-risk AI systems. In addition, it is one of the factors to be considered when assessing the “intended purpose” of the AI system (a relevant aspect in scoping the obligations of a high-risk system falling under Annex III of the AI Act).

Content of technical documentation

The technical documentation, as outlined in Annex IV of the AI Act, is structured to cover the various aspects crucial for assessing the AI system's compliance and for providing deployers with the information they need to properly understand and manage the AI system. The documentation must include a description of (among other elements) the following; an illustrative sketch of how these elements might be tracked follows the list:

  • The AI system in general: This section should provide an overview of the AI system, including its intended purpose, interactions with hardware and software, hardware descriptions, user interface details, instructions for use for the deployer, etc.
  • The elements and development process: In this section, AI providers must detail the development process, including methodologies, design specifications (i.e., the general logic of the AI system), key design choices, assumptions, system architecture, a description of the data sets used for training, human oversight measures, validation and testing procedures, and cybersecurity measures.
  • Monitoring, functioning, and control: This section should delve into the system's capabilities and limitations, foreseeable unintended outcomes, human oversight measures, specifications on input data, and the appropriateness of the performance metrics.
  • The risk management system and the lifecycle: This section covers how risks to (i) health and safety and (ii) fundamental rights, as well as (iii) risks of discrimination, are identified and mitigated, together with any changes made to the AI system during its lifecycle.
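By way of illustration only, the short Python sketch below shows one possible way for a provider to track its Annex IV coverage internally as a structured checklist. The class, field names, owners and file names are hypothetical and are not prescribed by the AI Act; the sketch simply mirrors the four themes listed above.

  from dataclasses import dataclass, field

  # Hypothetical internal tracker mapping the Annex IV themes summarised above
  # to the artefacts a provider keeps as evidence. Nothing here is mandated by
  # the AI Act; it only illustrates one possible way to organise the work.

  @dataclass
  class DocumentationItem:
      annex_iv_theme: str                           # e.g. "General description of the AI system"
      evidence: list = field(default_factory=list)  # internal documents kept as proof
      owner: str = "unassigned"                     # team responsible for keeping it up to date
      complete: bool = False                        # sign-off status

  def outstanding(items):
      """Return the Annex IV themes that still lack evidence or sign-off."""
      return [item.annex_iv_theme for item in items if not item.complete]

  checklist = [
      DocumentationItem("General description of the AI system",
                        ["intended_purpose.md", "instructions_for_use.pdf"],
                        owner="product", complete=True),
      DocumentationItem("Elements and development process",
                        ["design_spec.md", "training_data_description.md"],
                        owner="engineering"),
      DocumentationItem("Monitoring, functioning and control", owner="quality"),
      DocumentationItem("Risk management system and lifecycle changes",
                        ["risk_register.xlsx"], owner="compliance"),
  ]

  print("Themes still to be documented:", outstanding(checklist))

Such a tracker does not replace the documentation itself, but it can help identify which themes still lack evidence before the conformity assessment.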

The technical documentation shall be kept for a period of ten years after the high-risk AI system has been placed on the market or put into service.

Parallels with CE marking and AI governance obligations

The technical documentation requirements set forth in Article 11 of the AI Act echo the principles of the EU declaration of conformity and CE marking for products, emphasizing the need for comprehensive documentation to demonstrate compliance with regulatory standards and to enable efficient oversight of operations and post-market monitoring. Similarly, the technical documentation serves as a means for AI providers to show adherence to the AI Act's rigorous requirements. Furthermore, the focus on describing the development process, risk management systems, and post-market monitoring also aligns with broader trends in AI governance and with the GDPR.

Therefore, the technical documentation should be prepared in conjunction (or at least in a consistent manner) with the AI governance policy and the documentation relating to the CE mark. For high-risk AI systems related to a product covered by the EU harmonisation legislation listed in Section A of Annex I, such as medical devices, toys or machinery, a single set of technical documentation can be drawn up combining the information required under the AI Act and under the respective EU legislation. This reflects the AI Act's aim of ensuring consistency, avoiding duplication and minimising additional burdens for providers that have to undergo the conformity assessment and comply both with the AI Act's obligations and with the existing EU legislation setting out the requirements for their products.

Next steps

Compliance with the above requires careful planning and execution by AI providers. Here are some key steps to consider:

  1. Assessment of current practices: AI providers should conduct a comprehensive assessment of their current AI systems, prepare an AI mapping and inventory, and carry out a related applicability, compliance-gap and risk analysis.
  2. Establishment of documentation processes: As part of the overall AI strategy & governance program, robust processes should be established to ensure the timely preparation and maintenance of technical documentation for all high-risk AI systems.
  3. Interplay with governance and CE mark: Technical documentation should also be prepared taking into account other AI Act obligations, such as the obligation to have an AI governance system in place, and must be closely aligned with any applicable EU conformity assessment and CE marking obligations under other EU harmonisation legislation relating to the product.

 

Authored by Martin Pflueger, Juan Ramón Robles, and Cristina Baron.
