ISO-42001
ISO/IEC 42001:2023 — AI Management Systems
- I. ISO/IEC 42001:2023 AI Management System Requirements
- Ch. I — Context, Leadership, and Planning (Clauses 4-6)
- Art. 4.1. Understanding the organization and its context (8)
- Art. 4.2. Understanding the needs and expectations of interested parties (4)
- Art. 4.3. Determining the scope of the AI management system (9)
- Art. 4.4. AI management system (12)
- Art. 5.1. Leadership and commitment (10)
- Art. 5.2. AI policy (8)
- Art. 5.3. Roles, responsibilities and authorities (10)
- Art. 6.1.1. General (actions to address risks and opportunities) (7)
- Art. 6.1.2. AI risk assessment (13)
- Art. 6.1.3. AI risk treatment (6)
- Art. 6.1.4. AI system impact assessment (5)
- Art. 6.2. AI objectives and planning to achieve them (12)
- Art. 6.3. Planning of changes (7)
- Ch. II — Support and Operation (Clauses 7-8)
- Art. 7.1. Resources (9)
- Art. 7.2. Competence (5)
- Art. 7.3. Awareness (6)
- Art. 7.4. Communication (3)
- Art. 7.5. Documented information (9)
- Art. 8.1. Operational planning and control (10)
- Art. 8.2. AI risk assessment (operational) (6)
- Art. 8.3. AI risk treatment (operational) (6)
- Art. 8.4. AI system impact assessment (operational) (13)
- Ch. III — Performance Evaluation and Improvement (Clauses 9-10)
- Art. 9.1. Monitoring, measurement, analysis and evaluation (4)
- Art. 9.2. Internal audit (10)
- Art. 9.3. Management review (10)
- Art. 10.1. Continual improvement (9)
- Art. 10.2. Nonconformity and corrective action (10)
- Ch. IV — Annex A Controls — Policies and Organization (A.2-A.3)
- Art. A.2.2. AI Policy (9)
- Art. A.2.3. Responsible AI Topics in AI Policy (4)
- Art. A.3.2. Roles and Responsibilities for AI (6)
- Art. A.3.3. Reporting of AI Concerns (9)
- Art. A.3.4. Impact of Organizational Changes (6)
- Ch. V — Annex A Controls — Resources and Impact Assessment (A.4-A.5)
- Art. A.4.2. Resources Related to AI Systems (5)
- Art. A.4.3. Competencies Related to AI Systems (4)
- Art. A.4.4. Awareness of Responsible Use of AI Systems (4)
- Art. A.4.5. Consultation (6)
- Art. A.4.6. Communication About the AI System (6)
- Art. A.5.2. AI System Risk Assessment (5)
- Art. A.5.3. AI System Impact Assessment (8)
- Art. A.5.4. Impact of AI System Documentation (4)
- Ch. VI — Annex A Controls — AI System Life Cycle (A.6)
- Art. A.6.2.2. Design and Development of AI System (5)
- Art. A.6.2.3. Training and Testing AI Model (14)
- Art. A.6.2.4. Verification and Validation of AI System (7)
- Art. A.6.2.5. Deployment of AI System (10)
- Art. A.6.2.6. Operation and Monitoring of AI System (10)
- Art. A.6.2.7. Retirement of AI System (10)
- Art. A.6.2.8. Responsible AI System Integration (9)
- Art. A.6.2.9. AI System Documentation (7)
- Art. A.6.2.10. Defined Use and Misuse of AI System (5)
- Art. A.6.2.11. Management of Third-Party AI System Components (6)
- Ch. VII — Annex A Controls — Data, Information, and Relationships (A.7-A.10)
- Art. A.7.2. Data for Development and Enhancement of AI System (11)
- Art. A.7.3. Data Quality for ML and Data for AI System (11)
- Art. A.7.4. Data Preparation (11)
- Art. A.7.5. Data Acquisition and Collection (6)
- Art. A.7.6. Data Provenance (7)
- Art. A.8.2. Informing Interested Parties About AI System Interaction (6)
- Art. A.8.3. Informing Interested Parties About AI Outcomes (4)
- Art. A.8.4. Access to Information About AI System Interaction (5)
- Art. A.8.5. Enabling Appropriate Human Actions in Response to AI Outputs (7)
- Art. A.9.2. Objectives for Responsible Use of AI System (6)
- Art. A.9.3. Intended Use of AI System (4)
- Art. A.9.4. Processes for Responsible Use of AI System (7)
- Art. A.9.5. Human Oversight Aspects (11)
- Art. A.10.2. Suppliers of AI System Components (8)
- Art. A.10.3. Shared ML Models (14)
- Art. A.10.4. Provision of AI System to Third Parties (5)
Monitoring Obligations
31 obligations
Title I — ISO/IEC 42001:2023 AI Management System Requirements
Chapter I — Context, Leadership, and Planning (Clauses 4-6)
Article 4.1. Understanding the organization and its context
2 obligations
ISO42001-4.1-07
Monitoring
Monitor issues at planned intervals
The organization must monitor the identified external and internal issues at planned intervals to ensure the AI management system remains effective.
ISO42001-4.1-08
Monitoring
Review issues at planned intervals
The organization must review the identified external and internal issues at planned intervals to ensure the AI management system remains effective.
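The two obligations above both hinge on acting "at planned intervals". A minimal sketch of how an organization might track that, using a hypothetical issue register (the record fields and review intervals here are illustrative assumptions, not prescribed by the standard):

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ContextIssue:
    """An external or internal issue identified under Clause 4.1 (hypothetical record)."""
    description: str
    category: str              # "external" or "internal"
    last_reviewed: date
    review_interval_days: int  # the planned review interval for this issue

    def review_due(self, today: date) -> bool:
        # An issue is due once its planned interval has elapsed since the last review.
        return today >= self.last_reviewed + timedelta(days=self.review_interval_days)

issues = [
    ContextIssue("New AI regulation in a key market", "external", date(2024, 1, 15), 90),
    ContextIssue("Shift to third-party foundation models", "internal", date(2024, 3, 1), 180),
]
due_for_review = [i.description for i in issues if i.review_due(date(2024, 6, 1))]
```

A register like this makes the "planned intervals" auditable: each issue carries its own cadence, and overdue items can be surfaced automatically.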
Article 4.2. Understanding the needs and expectations of interested parties
1 obligation
Article 6.1.1. General (actions to address risks and opportunities)
1 obligation
Article 6.1.4. AI system impact assessment
1 obligation
Article 6.2. AI objectives and planning to achieve them
1 obligation
Article 6.3. Planning of changes
1 obligation
Chapter II — Support and Operation (Clauses 7-8)
Article 7.2. Competence
1 obligation
Article 8.1. Operational planning and control
1 obligation
Article 8.4. AI system impact assessment (operational)
1 obligation
Chapter III — Performance Evaluation and Improvement (Clauses 9-10)
Article 9.1. Monitoring, measurement, analysis and evaluation
1 obligation
Article 9.2. Internal audit
1 obligation
Chapter IV — Annex A Controls — Policies and Organization (A.2-A.3)
Article A.2.3. Responsible AI Topics in AI Policy
1 obligation
Article A.3.3. Reporting of AI Concerns
1 obligation
Chapter V — Annex A Controls — Resources and Impact Assessment (A.4-A.5)
Article A.4.3. Competencies Related to AI Systems
1 obligation
Chapter VI — Annex A Controls — AI System Life Cycle (A.6)
Article A.6.2.5. Deployment of AI System
1 obligation
Article A.6.2.6. Operation and Monitoring of AI System
4 obligations
ISO42001-A.6.2.6-02
Monitoring
Monitor System Performance Metrics
Organizations must implement monitoring that includes tracking of system performance metrics as part of their ongoing AI system operation and monitoring.
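One minimal way to satisfy ongoing metric tracking is a rolling window with an alert floor. This sketch is illustrative only; the class name, window size, and 0.90 threshold are assumptions, not values taken from the standard:

```python
from collections import deque

class MetricWindow:
    """Rolling window over one performance metric, with a simple alert floor."""

    def __init__(self, size: int, alert_below: float) -> None:
        self.values = deque(maxlen=size)  # keeps only the most recent `size` readings
        self.alert_below = alert_below

    def record(self, value: float) -> None:
        self.values.append(value)

    def mean(self) -> float:
        return sum(self.values) / len(self.values)

    def alert(self) -> bool:
        # Alert once the rolling mean drops below the configured floor.
        return bool(self.values) and self.mean() < self.alert_below

accuracy = MetricWindow(size=3, alert_below=0.90)
for v in (0.95, 0.93, 0.80):  # e.g. periodic offline accuracy estimates
    accuracy.record(v)
```

After the third reading the rolling mean falls below the floor, so the tracker would flag the system for review.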
ISO42001-A.6.2.6-03
Monitoring
Detect Data Drift and Model Degradation
Organizations must implement monitoring capabilities to detect data drift and model degradation in their AI systems.
ISO42001-A.6.2.6-05
Monitoring
Observe System Behavior in Production
Organizations must implement monitoring to observe AI system behavior while the system is operating in production environments.
ISO42001-A.6.2.6-06
Monitoring
Collect Feedback from Users and Affected Parties
Organizations must implement processes to collect feedback from users and affected parties as part of their AI system monitoring.
Article A.6.2.10. Defined Use and Misuse of AI System
1 obligation
Article A.6.2.11. Management of Third-Party AI System Components
1 obligation
Chapter VII — Annex A Controls — Data, Information, and Relationships (A.7-A.10)
Article A.7.3. Data Quality for ML and Data for AI System
2 obligations
ISO42001-A.7.3-03
Monitoring
Implement Data Quality Measurement Processes
The organization must implement processes to measure data quality for ML and AI systems.
ISO42001-A.7.3-04
Monitoring
Implement Data Quality Monitoring Processes
The organization must implement processes to monitor data quality for ML and AI systems on an ongoing basis.
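The pair of obligations above asks for data quality to be both measured and monitored on an ongoing basis. A minimal sketch of two common measurements, completeness and validity, computed per batch; the function name, field list, and validator are illustrative assumptions:

```python
def data_quality_metrics(rows, required_fields, validators):
    """Completeness and validity rates for one batch of training or input records."""
    total = len(rows)
    # Completeness: every required field is present and non-null.
    complete = sum(
        all(r.get(f) is not None for f in required_fields) for r in rows
    )
    # Validity: every present value passes its field-level check.
    valid = sum(
        all(check(r[f]) for f, check in validators.items() if r.get(f) is not None)
        for r in rows
    )
    return {"completeness": complete / total, "validity": valid / total}

batch = [
    {"age": 34, "label": "approve"},
    {"age": None, "label": "deny"},   # incomplete: missing age
    {"age": -5, "label": "approve"},  # invalid: negative age
    {"age": 41, "label": "deny"},
]
metrics = data_quality_metrics(
    batch,
    required_fields=["age", "label"],
    validators={"age": lambda a: 0 <= a <= 120},
)
```

To turn measurement into ongoing monitoring, a function like this would run per batch and the resulting rates would be compared against thresholds over time, much like the performance metrics in A.6.2.6.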
Article A.7.6. Data Provenance
2 obligations
ISO42001-A.7.6-06
Monitoring
Use provenance information for incident investigation
The organization must utilize provenance information to support incident investigation activities related to AI systems.
ISO42001-A.7.6-07
Monitoring
Use provenance information for audit activities
The organization must utilize provenance information to support audit activities related to AI systems.
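Both provenance obligations assume lineage records that can be replayed in order during an incident investigation or audit. A minimal sketch with a hypothetical record schema (field names, dataset IDs, and sources below are invented for illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProvenanceRecord:
    """One step in a dataset's lineage (hypothetical schema)."""
    dataset_id: str
    source: str          # upstream system or dataset the step read from
    transformation: str  # what was done at this step
    actor: str           # person or service that performed the step
    timestamp: str       # ISO 8601, so lexical order is chronological

def trace(records, dataset_id):
    """Reconstruct one dataset's lineage in order, for an audit or incident review."""
    steps = [r for r in records if r.dataset_id == dataset_id]
    return sorted(steps, key=lambda r: r.timestamp)

log = [
    ProvenanceRecord("ds-7", "s3://raw/events", "ingest", "etl-service", "2024-05-01T08:00:00Z"),
    ProvenanceRecord("ds-9", "crm-export", "ingest", "etl-service", "2024-05-01T09:00:00Z"),
    ProvenanceRecord("ds-7", "ds-7", "deduplicate", "etl-service", "2024-05-02T08:00:00Z"),
    ProvenanceRecord("ds-7", "ds-7", "label", "annotation-team", "2024-05-03T08:00:00Z"),
]
lineage = trace(log, "ds-7")
```

Recording the actor alongside each step is what lets the same log serve both obligations: investigators can see what changed a dataset, and auditors can see who was responsible.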
Article A.9.2. Objectives for Responsible Use of AI System
1 obligation
Article A.9.3. Intended Use of AI System
1 obligation
Article A.9.4. Processes for Responsible Use of AI System
1 obligation
Article A.9.5. Human Oversight Aspects
1 obligation
Article A.10.2. Suppliers of AI System Components
1 obligation