EU-AIA-26-04
Monitoring
Article 26 — Obligations of deployers of high-risk AI systems
Monitor AI system operation
Related Obligations
EU-AIA-26-01
Requirement
Use high-risk AI systems according to instructions
EU-AIA-26-02
Human Oversight
Assign qualified human oversight
EU-AIA-26-03
Data Governance
Ensure input data quality where the deployer controls input data
EU-AIA-26-05
Reporting
Inform providers of relevant issues
EU-AIA-26-06
Reporting
Report risks and suspend system use
EU-AIA-26-07
Reporting
Report serious incidents immediately
EU-AIA-26-08
Documentation
Keep system logs for a minimum of 6 months
EU-AIA-26-09
Transparency
Inform workers about AI system use
EU-AIA-26-10
Registration
Registration compliance for public-authority deployers
EU-AIA-26-11
Requirement
Verify system registration before use
Map this obligation to your AI systems
ReguLume automatically maps regulatory obligations to your system inventory, identifies compliance gaps, and generates remediation plans.
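The mapping described above can be sketched as a simple gap check: each AI system records which obligation IDs it has evidence for, and any catalog entry without evidence is a compliance gap. This is a minimal illustration only; the class names, fields, and sample data are hypothetical and do not reflect ReguLume's actual API or data model.

```python
from dataclasses import dataclass, field

@dataclass
class Obligation:
    obligation_id: str   # e.g. "EU-AIA-26-04"
    category: str        # e.g. "Monitoring"
    title: str

@dataclass
class AISystem:
    name: str
    # Obligation IDs for which this system holds compliance evidence
    satisfied: set = field(default_factory=set)

# Two entries from the Article 26 catalog above, as sample data
OBLIGATIONS = [
    Obligation("EU-AIA-26-04", "Monitoring", "Monitor AI system operation"),
    Obligation("EU-AIA-26-08", "Documentation", "Keep system logs for a minimum of 6 months"),
]

def compliance_gaps(system: AISystem, obligations: list) -> list:
    """Return the obligations the system has no evidence for yet."""
    return [o for o in obligations if o.obligation_id not in system.satisfied]

# A system with monitoring evidence but no log-retention evidence
gaps = compliance_gaps(AISystem("credit-scoring", {"EU-AIA-26-04"}), OBLIGATIONS)
```

Here `gaps` would contain only the log-retention obligation, which a remediation plan could then target.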