This document is under active development and has not been finalised.

High-Risk AI – Requirements Art. 8–15

Overview

High-risk AI systems are subject to the full set of obligations under the AI Act. Art. 8 stipulates that these systems must fulfil the requirements of Art. 9–15, taking into account their intended purpose and the generally acknowledged state of the art.

BAUER GROUP Position

High-risk AI systems shall only be placed on the EU market if the Go/No-Go assessment yields a positive outcome. The following pages document the complete requirements in the event of a Go decision.
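The Go/No-Go gate described above can be sketched as a simple all-or-nothing check: the system is cleared for the EU market only if every requirement of Art. 9–15 is assessed as fulfilled. This is a minimal illustrative sketch; the criterion names and data model are assumptions, not the BAUER GROUP's actual assessment process.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """One Art. 9-15 requirement in the Go/No-Go assessment (illustrative)."""
    article: str
    description: str
    fulfilled: bool

def go_no_go(criteria: list[Criterion]) -> bool:
    # "Go" only if every single requirement is assessed as fulfilled;
    # one open item is enough for a No-Go decision.
    return all(c.fulfilled for c in criteria)

assessment = [
    Criterion("Art. 9", "Risk management system", True),
    Criterion("Art. 10", "Data and data governance", True),
    Criterion("Art. 11", "Technical documentation", False),
]
print("Go" if go_no_go(assessment) else "No-Go")  # -> No-Go
```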

Catalogue of Requirements

| Article | Requirement | Effort |
| --- | --- | --- |
| Art. 9 | Risk management system | Medium – iterative process over the lifecycle |
| Art. 10 | Data and data governance | High – training data documentation |
| Art. 11 | Technical documentation | High – Annex IV full format |
| Art. 12 | Record-keeping (logging) | Low – automated logs |
| Art. 13 | Transparency and information for deployers | Medium – instructions for use |
| Art. 14 | Human oversight | Medium – oversight mechanisms |
| Art. 15 | Accuracy, robustness, cybersecurity | Medium – testing + CRA synergies |
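The "automated logs" noted for Art. 12 amount to structured, timestamped event records written without manual intervention. The following is a minimal sketch using Python's standard `logging` module; the logger name and record fields are illustrative assumptions, not prescribed by the AI Act.

```python
import json
import logging
import time

class JsonFormatter(logging.Formatter):
    """Render each log record as a timestamped JSON object (illustrative fields)."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ",
                                       time.gmtime(record.created)),
            "level": record.levelname,
            "event": record.getMessage(),
        })

# Hypothetical audit logger for a high-risk AI system.
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("ai_system_audit")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("inference_request_processed")
```

Because the records are machine-readable JSON, they can be retained and queried later to reconstruct system behaviour, which is the point of the record-keeping obligation.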

Integration with CRA Compliance

Art. 8(2) explicitly permits integrating these requirements into existing compliance processes:

"in order to ensure coherence, avoid duplication and minimise additional burden, providers may integrate the necessary testing, reporting and documentation into already existing procedures"

The BAUER GROUP uses this option to incorporate AI Act documentation into the existing CRA QMS and CRA technical documentation wherever possible.

Documentation licensed under CC BY-NC 4.0 · Code licensed under MIT