AI Governance and Compliance

Preparation for ISO/IEC 42001 + AI Act, using the PECB IMS2 method

Methodology – PECB IMS2

(Integrated Implementation Methodology for Management Systems and Standards)

Objective

Implement an AI Management System (AIMS) compliant with ISO/IEC 42001, produce auditable evidence, and develop a roadmap for AI Act compliance (classification + requirements + compliance dossier) in order to be ready for the certification audit.

1) Expected results (deliverable commitment)
At the end of the engagement, you will have:
  • An operational AIMS (governance, roles, processes, metrics) covering the defined scope.
  • A registry of AI systems and a mapping of use cases.
  • An AI risk register including risk treatments, acceptance criteria, and controls.
  • The AI policy and documentation (procedures) required by ISO 42001.
  • An evidence pack to demonstrate compliance during an audit.
  • An internal audit and a management review conducted (with reports).
  • A certification readiness assessment (pre-audit) and a corrective action plan.
  • An AI Act roadmap: system classification, mapping of obligations, list of evidence to be compiled, and compliance initiatives.
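Two of the deliverables above, the AI systems register and the AI risk register, are essentially structured records. A minimal sketch of a register entry is shown below; the field names are illustrative assumptions, not fields prescribed by ISO/IEC 42001, which only requires that AI systems and their context be documented.

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    """Illustrative AI systems register entry (field set is an assumption)."""
    system_id: str
    name: str
    use_case: str
    owner: str                  # accountable role (the "A" in the RACI)
    lifecycle_stage: str        # e.g., design | validation | production | retired
    data_categories: list = field(default_factory=list)
    third_party: bool = False   # supplied or operated by a vendor

def production_systems(registry):
    """Filter the register for systems currently in production."""
    return [r for r in registry if r.lifecycle_stage == "production"]

registry = [
    AISystemRecord("AI-001", "Invoice classifier", "Finance automation",
                   "Head of Finance", "production",
                   data_categories=["invoices"]),
    AISystemRecord("AI-002", "Support chatbot", "Customer support",
                   "Head of CX", "validation",
                   data_categories=["tickets", "PII"], third_party=True),
]
```

In practice the register typically lives in a GRC tool or a shared spreadsheet; the point of the sketch is that each entry names an owner, a lifecycle stage, and the data involved, so the register can drive the risk register and the AI Act classification later on.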
2) Scope (what the offer covers)
2.1 Scope of “AIMS ISO/IEC 42001”
  • Governance: leadership, responsibilities, committees, objectives, and KPIs.
  • AI risk management: identification, analysis, mitigation, and monitoring.
  • AI lifecycle: design/selection, training/evaluation (if applicable), deployment, monitoring, change management, decommissioning.
  • AI supplier/third-party management: requirements, evaluation, contract terms, monitoring.
  • Documented information management: policies, procedures, records.
  • Competencies & awareness.
  • Monitoring, internal audit, management review, continuous improvement.
2.2 Scope of the “AI Act” (compliance path)
  • Classification of AI systems (by use case / product / module).
  • Mapping of applicable requirements (transparency, documentation, governance, human oversight, robustness/cyber, data, monitoring, etc.).
  • Compliance dossier: structure, responsibilities, expected evidence, remediation plan.

 

This offering is designed to provide structured preparation for the AI Act. Full compliance depends on the type of system (e.g., “high-risk,” GPAI), the context, and technical choices. The engagement provides the roadmap, the compliance dossier, and the compliance plan.

3) Implementation process (PECB IMS2 aligned with ISO/IEC 42001)
Phase 1 — Define & Establish

Objective: Provide a framework, secure buy-in, and define the scope and documentation requirements.


Activities :


1. Leadership & Approval: Sponsor, project governance, objectives, milestones.

2. Roles & Responsibilities: RACI (AIMS Owner, AI Managers, Risk, DPO, Security, Operations).

3. Context & Stakeholders: customer expectations, regulators, suppliers, risks.

4. AIMS Scope: entities, products, use cases, data, environments.

5. Analysis of Current State: AI practices, SDLC/ML lifecycle, security, privacy, GRC.

6. AI Policy: principles, commitments, acceptance criteria, escalation.

7. AI Risk Management: methodology, matrices, thresholds, workflow, registers.

8. Statement of Applicability (SoA 42001): selected controls and justification.
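The risk methodology in activity 7 (matrices, thresholds, workflow) can be sketched as a simple likelihood-by-impact scoring scheme. The 1–5 scales and the thresholds below are illustrative assumptions; each organization fixes its own scales and acceptance criteria in its AI risk methodology.

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Score a risk on an assumed 5x5 likelihood x impact matrix."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must be on a 1-5 scale")
    return likelihood * impact

def treatment_decision(score: int) -> str:
    """Map a score to a treatment, using illustrative thresholds."""
    if score <= 4:
        return "accept"      # below the acceptance criterion
    if score <= 12:
        return "mitigate"    # controls required, owner assigned
    return "escalate"        # above tolerance: AI committee decision
```

Whatever the exact scales, the decisive artifacts for the audit are the documented thresholds and the workflow that routes "escalate" risks to the governance committee, both of which end up in the risk register.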


Phase 1 Deliverables

 

  • Project charter + governance + RACI
  • AIMS scope
  • Initial AI systems register (v1)
  • AI policy (v1)
  • AI risk methodology + risk register (v1)
  • 42001 Statement of Applicability (v1)
  • Project plan + evidence plan (structure)
Phase 2 — Implement and Operate

Objective: Implement measures, develop procedures, and launch operations.


Activities :


1. Selection & design of measures: AIMS controls, supplier requirements, validation criteria, monitoring, logs.

2. Implementation: Procedures + basic tools (templates, workflows, tickets).

3. Management of documented information: document repository, versions, evidence.

4. Communication: AI usage rules, committees, decision-making processes.

5. Skills & awareness: targeted training (executives, product, data/AI).

6. AI operations management: operations, monitoring, AI incidents, changes.


Phase 2 Deliverables

 

  • AI Registry (v2) + use case sheets
  • Key procedures: AI lifecycle, validation/release, monitoring, AI incidents, changes, decommissioning, AI vendor management, data management for AI (governance)
  • Evidence catalog + record templates (logs, reviews, validations)
  • AIMS KPI/OKR matrix + dashboard (structure)
  • Training plan + materials + certificates
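The KPI/OKR matrix in the deliverables above reduces to a handful of computable metrics. As one hedged example (the metric and the 30-day target are assumptions, not requirements of the standard), the share of AI incidents closed within the target resolution time could be computed as:

```python
def pct_incidents_closed_on_time(incidents, target_days=30):
    """Percentage of closed AI incidents resolved within target_days.

    Each incident is a dict with a 'days_to_close' key; None means
    still open. Returns None when no incident has been closed yet.
    The 30-day default target is an illustrative assumption.
    """
    closed = [i for i in incidents if i["days_to_close"] is not None]
    if not closed:
        return None
    on_time = [i for i in closed if i["days_to_close"] <= target_days]
    return round(100 * len(on_time) / len(closed), 1)
```

A dashboard built on a few such functions, fed from the incident log, is usually enough evidence that "monitoring, measurement, analysis, and evaluation" (Phase 3) is actually operating.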
Phase 3 — Monitor and Review

Objective: Demonstrate that the system is operational and effectively managed.


Activities :


1. Monitoring, measurement, analysis, and evaluation: KPIs, controls, periodic reviews.

2. Internal audit (ISO/IEC 42001): plan, checklists, interviews, findings.

3. Management review: decisions, trade-offs, improvement plan, resources.


Phase 3 Deliverables

 

  • Internal audit plan + internal audit report
  • Management review minutes + decisions + action plan
  • Update of SoA, AI risks, logs, evidence
Phase 4 — Maintain and Improve

Objective: Correct, stabilize, and prepare for the certification process.


Activities :


1. Non-conformity handling: corrective actions, effectiveness verification.

2. Continuous improvement: optimization of controls, maturity, automation.


Phase 4 Deliverables

 

  • Non-conformity log + corrective actions + closure evidence
  • “Certification Readiness” package: final evidence file + audit transition plan + checklist
4) AI Act component (integrated as work progresses)
4.1 AI Act Workshops
  • Inventory of relevant AI systems (aligned with the AI registry).
  • Classification (relevant categories based on use cases).
  • Mapping of obligations (by system) and responsibilities.
  • Evidence plan (documentation, traceability, transparency, oversight, robustness, etc.).
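The classification and obligation-mapping steps above can be sketched as a lookup from risk category to obligations. The four tiers follow the AI Act's structure (prohibited practices, high-risk systems, transparency obligations, minimal risk); the obligation lists below are abridged illustrations, not an exhaustive legal mapping, and the real deliverable maps obligations per system with responsibilities attached.

```python
# Abridged, illustrative obligations per AI Act risk tier (assumption:
# this is a sketch for the classification matrix, not legal advice).
OBLIGATIONS = {
    "prohibited":   ["discontinue or redesign the use case"],
    "high-risk":    ["risk management system", "technical documentation",
                     "data governance", "human oversight", "logging",
                     "robustness and cybersecurity",
                     "post-market monitoring"],
    "transparency": ["disclose AI interaction", "label synthetic content"],
    "minimal":      ["voluntary codes of conduct"],
}

def obligations_for(category: str) -> list:
    """Return the illustrative obligation list for a risk category."""
    if category not in OBLIGATIONS:
        raise ValueError(f"unknown category: {category}")
    return OBLIGATIONS[category]
```

Joining this mapping against the AI systems register (one classification per use case) yields the classification matrix and the per-system evidence list named in the deliverables below.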
4.2 AI Act Deliverables
  • Classification Matrix & Requirements
  • Compliance Roadmap (quick wins / structural initiatives)
  • Structure of the compliance dossier + list of supporting documentation to be compiled
  • Contractual recommendations (AI providers / subcontractors) as needed
5) Project organization
Workshops (typical)
  • Kick-off + Scope Definition
  • AI Registry & AI Risk Workshops
  • AI Lifecycle & Operations Workshops
  • AI Suppliers/Third-Party Workshop
  • AI Act Workshop: Classification & Requirements
  • Internal Audit + Management Review + Readiness
Client-side roles (minimum)
  • Sponsor (Management)
  • AIMS Lead (Owner)
  • Product/AI Lead (CTO/Head of AI or equivalent)
  • Risk/Compliance Lead and/or DPO
  • Security Lead (CSO) if available
6) Duration (to be adjusted based on the scope)
  • Standard: 8 to 12 weeks for a limited “product/platform” scope and 3–10 use cases.
  • Extended: 12 to 16 weeks for multi-team, multi-product projects or those involving numerous or critical use cases.
7) What drives value (certification + compliance)
  • Accelerated sales to key accounts: fewer back-and-forth exchanges regarding “trust and assurance,” and improved responses to RFPs.
  • Risk reduction: structured AI risks, traceable decisions, and implemented controls.
  • Single, auditable framework: centralized evidence, internal audit, and management review.
  • AI Act-ready: classification + requirements + compliance plan and evidence to be provided.
8) Options
  • “AI Vendors” Package: due diligence + contract clauses + third-party monitoring
  • “AI Architecture & Security” Package: security review, hardening, logging, operational requirements
  • “Mock Certification Audit” Package: certification audit simulation
  • “Scale” Package: extension of the AIMS to other products/use cases