
EU-AI-Act

Regulation (EU) 2024/1689 — Artificial Intelligence Act

EU · Version 1.0 · 334 obligations
Showing 176–200 of 334 obligations

Chapter I — General Provisions

Chapter II — Prohibited AI Practices

Chapter III — High-Risk AI Systems

Section 1 — Classification of AI Systems as High-Risk

Section 2 — Requirements for High-Risk AI Systems

Section 3 — Obligations of Providers and Deployers of High-Risk AI Systems and Other Parties

Article 26. Obligations of deployers of high-risk AI systems

16 obligations

EU-AIA-26-02 Human Oversight

Assign qualified human oversight

Deployers must assign human oversight to natural persons who have the necessary competence, training and authority, as well as the necessary support.

EU-AIA-26-03 Data Governance

Ensure input data quality when controlling input data

To the extent the deployer exercises control over the input data, that deployer must ensure that input data is relevant and sufficiently representative in view of the intended purpose of the high-risk AI system.

EU-AIA-26-04 Monitoring

Monitor AI system operation

Deployers must monitor the operation of the high-risk AI system on the basis of the instructions for use.

EU-AIA-26-05 Reporting

Inform providers of relevant issues

Where relevant, deployers must inform providers in accordance with Article 72 about issues arising from monitoring the AI system's operation on the basis of the instructions for use.

EU-AIA-26-06 Reporting

Report risks and suspend system use

Where deployers have reason to consider that the use of the high-risk AI system in accordance with the instructions for use may result in the system presenting a risk within the meaning of Article 79(1), they must, without undue delay, inform the provider or distributor and the relevant market surveillance authority, and must suspend the use of that system.

EU-AIA-26-07 Reporting

Report serious incidents immediately

Where deployers have identified a serious incident, they must immediately inform first the provider, and then the importer or distributor and the relevant market surveillance authorities of that incident.

EU-AIA-26-08 Documentation

Keep system logs for minimum 6 months

Deployers must keep the logs automatically generated by the high-risk AI system, to the extent such logs are under their control, for a period appropriate to the intended purpose of the system, of at least six months, unless provided otherwise in applicable Union or national law.

EU-AIA-26-09 Transparency

Inform workers about AI system use

Before putting into service or using a high-risk AI system at the workplace, deployers who are employers must inform workers' representatives and the affected workers that they will be subject to the use of the high-risk AI system.

EU-AIA-26-10 Registration

Public authority registration compliance

Deployers of high-risk AI systems that are public authorities, or Union institutions, bodies, offices or agencies, must comply with the registration obligations referred to in Article 49.

EU-AIA-26-11 Requirement

Verify system registration before use

When public authority deployers find that the high-risk AI system that they envisage using has not been registered in the EU database referred to in Article 71, they must not use that system and must inform the provider or the distributor.

EU-AIA-26-12 Data Governance

Use provider information for DPIA compliance

Where applicable, deployers must use the information provided under Article 13 to comply with their obligation to carry out a data protection impact assessment under Article 35 of Regulation (EU) 2016/679 or Article 27 of Directive (EU) 2016/680.

EU-AIA-26-13 Requirement

Request authorization for post-remote biometric identification

In criminal investigations, deployers of high-risk AI systems for post-remote biometric identification must request authorization for the use of that system, ex ante or without undue delay and no later than 48 hours, from a judicial authority or an administrative authority whose decision is binding and subject to judicial review.

EU-AIA-26-14 Requirement

Limit biometric system use to necessary scope

Each use of post-remote biometric identification systems must be limited to what is strictly necessary for the investigation of a specific criminal offence.

EU-AIA-26-15 Requirement

Stop unauthorized biometric system use and delete data

If authorization for post-remote biometric identification is rejected, deployers must stop the use of the system linked to that requested authorization with immediate effect and delete the personal data connected to that use.

EU-AIA-26-16 Transparency

Inform natural persons of AI system use

Deployers of high-risk AI systems referred to in Annex III that make decisions or assist in making decisions related to natural persons must inform the natural persons that they are subject to the use of the high-risk AI system.

EU-AIA-26-17 Requirement

Cooperate with national competent authorities

Deployers must cooperate with the relevant national competent authorities in any action those authorities take in relation to the high-risk AI system in order to implement this Regulation.

Article 27. Fundamental rights impact assessment for high-risk AI systems

9 obligations

EU-AIA-27-01 Risk Management

Perform fundamental rights impact assessment before deployment

Prior to deploying a high-risk AI system, specified deployers must perform an assessment of the impact on fundamental rights that the use of the system may produce.

EU-AIA-27-02 Documentation

Describe deployer processes using AI system

As part of the fundamental rights impact assessment, deployers must provide a description of the deployer's processes in which the high-risk AI system will be used in line with its intended purpose.

EU-AIA-27-03 Documentation

Describe time period and frequency of AI system use

As part of the fundamental rights impact assessment, deployers must provide a description of the period of time within which, and the frequency with which, each high-risk AI system is intended to be used.

EU-AIA-27-04 Risk Management

Identify categories of persons affected by AI system use

As part of the fundamental rights impact assessment, deployers must identify the categories of natural persons and groups likely to be affected by the system's use in the specific context.

EU-AIA-27-05 Risk Management

Assess specific risks of harm to affected persons

As part of the fundamental rights impact assessment, deployers must assess the specific risks of harm likely to impact the categories of natural persons or groups of persons identified as affected.

EU-AIA-27-06 Human Oversight

Describe human oversight implementation measures

As part of the fundamental rights impact assessment, deployers must provide a description of the implementation of human oversight measures, according to the instructions for use.

EU-AIA-27-07 Risk Management

Define measures for when risks materialize

As part of the fundamental rights impact assessment, deployers must define the measures to be taken when identified risks materialise, including arrangements for internal governance and complaint mechanisms.

EU-AIA-27-08 Monitoring

Update fundamental rights impact assessment when elements change

If during the use of the AI system, the deployer considers that any assessment elements have changed or are no longer up to date, the deployer must take the necessary steps to update the information.

EU-AIA-27-09 Reporting

Notify market surveillance authority of assessment results

Once the fundamental rights impact assessment has been performed, the deployer must notify the market surveillance authority of its results, submitting the filled-out template as part of the notification.
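The assessment elements listed above (processes, period and frequency of use, affected categories, risks of harm, human oversight, and mitigation measures) can be captured as a simple record before notification. An illustrative Python sketch; the class and field names are our own, not mandated by the Act:

```python
from dataclasses import dataclass, field

# Illustrative record mirroring the elements a fundamental rights impact
# assessment must cover under Article 27(1); field names are not from the Act.
@dataclass
class FRIARecord:
    processes: str                  # deployer processes using the system
    period_and_frequency: str       # intended period and frequency of use
    affected_categories: list[str]  # persons/groups likely to be affected
    risks_of_harm: list[str]        # specific risks to those categories
    human_oversight: str            # oversight measures per instructions for use
    mitigation_measures: list[str] = field(default_factory=list)

    def is_complete(self) -> bool:
        """All mandatory elements filled in before notifying the authority."""
        return all([self.processes, self.period_and_frequency,
                    self.affected_categories, self.risks_of_harm,
                    self.human_oversight, self.mitigation_measures])
```

A record like this maps one-to-one onto obligations EU-AIA-27-02 through EU-AIA-27-07, which makes gap checks before the EU-AIA-27-09 notification straightforward.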
