High-Risk AI – Requirements Art. 8–15
Overview
High-risk AI systems are subject to the full set of obligations under the AI Act. Art. 8 stipulates that these systems must fulfil the requirements of Art. 9–15, taking into account their intended purpose and the generally acknowledged state of the art.
BAUER GROUP Position
High-risk AI systems shall only be placed on the EU market if the Go/No-Go assessment yields a positive outcome. The following pages document the complete requirements in the event of a Go decision.
Catalogue of Requirements
| Article | Requirement | Effort |
|---|---|---|
| Art. 9 | Risk management system | Medium – iterative process over the lifecycle |
| Art. 10 | Data and data governance | High – training data documentation |
| Art. 11 | Technical documentation | High – Annex IV full format |
| Art. 12 | Record-keeping (logging) | Low – automated logs |
| Art. 13 | Transparency and information for deployers | Medium – instructions for use |
| Art. 14 | Human oversight | Medium – oversight mechanisms |
| Art. 15 | Accuracy, robustness, cybersecurity | Medium – testing + CRA synergies |
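Of the requirements above, the Art. 12 record-keeping obligation is the most readily automated, which is why its effort is rated low. As a minimal sketch only (the `HighRiskAILogger` class and its field names are our own assumptions; the AI Act requires traceability of the system's operation but prescribes no specific log schema), automated event logging could look like:

```python
import json
from datetime import datetime, timezone

class HighRiskAILogger:
    """Illustrative event logger for Art. 12 record-keeping.

    Field names are assumptions for this sketch; the AI Act requires
    that logs allow tracing of the system's operation over its
    lifetime, not any particular schema.
    """

    def __init__(self, system_id: str):
        self.system_id = system_id
        self.records: list[dict] = []

    def log_event(self, event_type: str, detail: dict) -> dict:
        record = {
            "system_id": self.system_id,
            # UTC timestamps keep records from distributed components comparable
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "event_type": event_type,
            "detail": detail,
        }
        self.records.append(record)
        return record

    def export(self) -> str:
        # One JSON object per line, suitable for append-only retention storage
        return "\n".join(json.dumps(r) for r in self.records)

logger = HighRiskAILogger("ai-system-001")
logger.log_event("inference", {"input_ref": "batch-42", "model_version": "1.3.0"})
```

Append-only, timestamped records of this kind can be retained for the period required by Art. 12 and handed to market surveillance authorities on request.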
Integration with CRA Compliance
Art. 8(2) explicitly permits integrating AI Act compliance work into existing compliance processes:
"in order to ensure coherence, avoid duplication and minimise additional burden, providers may integrate the necessary testing, reporting and documentation into already existing procedures"
The BAUER GROUP makes use of this option and incorporates AI Act documentation as far as possible into the existing CRA QMS and CRA technical documentation.
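One way to keep this integration auditable is an explicit cross-reference from each AI Act documentation duty to the CRA artefact that absorbs it. The mapping below is purely illustrative (the artefact names are assumptions for this sketch, not the actual BAUER GROUP file plan):

```python
# Hypothetical cross-reference: which existing CRA compliance artefact
# each AI Act documentation duty is folded into per Art. 8(2).
# Artefact names are illustrative assumptions, not a legal mapping.
AI_ACT_TO_CRA_ARTEFACT = {
    "Art. 9 risk management": "CRA QMS risk management procedure",
    "Art. 11 technical documentation": "CRA technical documentation file",
    "Art. 12 record-keeping": "CRA QMS logging and monitoring procedure",
    "Art. 15 cybersecurity": "CRA cybersecurity requirements evidence",
}

def integration_target(ai_act_duty: str) -> str:
    # Duties without a suitable CRA artefact get standalone AI Act documents.
    return AI_ACT_TO_CRA_ARTEFACT.get(ai_act_duty, "standalone AI Act documentation")
```

A table of this shape makes duplication visible early: any duty that falls through to a standalone document marks a gap where the CRA processes offer no synergy.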