ISO 27001:2022 certification audits now regularly surface AI tool usage as a finding. Auditors have updated their interview questions and evidence requests to reflect the reality that most certified organizations are using AI tools — and most ISMS documentation hasn't caught up.
The gap is predictable: ISO 27001 was designed for documented, controlled information systems. AI tools entered most organizations informally, often without IT or compliance involvement. The result is an ISMS that documents policy around email, cloud storage, and internal applications while treating ChatGPT, GitHub Copilot, and Notion AI as outside scope.
That approach no longer passes audit.
The ISO 27001:2022 Controls That AI Tools Affect
ISO 27001:2022 reorganized Annex A into four themes: Organizational, People, Physical, and Technological. AI tools affect controls across all four, but the most frequently cited ones in AI-related findings are:
Organizational Controls
A.5.9 — Inventory of information and other associated assets
The asset register must include all information assets in scope. An AI tool that processes business data is an information asset. If it is missing from the register, that is a finding.
What auditors check: Is there an asset register entry for approved AI tools? Does it include vendor, version/tier, data classification permitted, and owner?
What you need: An AI tool register that feeds into the ISMS asset register. One entry per tool, including the data classification the tool is permitted to process.
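One way to keep register entries auditable is to define the required fields explicitly and check each entry for the gaps an auditor would flag. This is an illustrative sketch only; the field names and classification levels are assumptions, not prescribed by ISO 27001.

```python
from dataclasses import dataclass

# Classification levels are illustrative; use your organization's scheme.
ALLOWED_CLASSIFICATIONS = {"public", "internal", "confidential"}

@dataclass
class AIToolEntry:
    tool: str
    vendor: str
    tier: str                 # e.g. "Enterprise", "Team", "Free"
    owner: str                # named individual or role
    max_classification: str   # highest data class permitted in the tool
    dpa_signed: bool

    def validate(self) -> list[str]:
        """Return a list of gaps an auditor would likely flag."""
        gaps = []
        if self.max_classification not in ALLOWED_CLASSIFICATIONS:
            gaps.append(f"{self.tool}: unknown classification "
                        f"'{self.max_classification}'")
        if not self.owner:
            gaps.append(f"{self.tool}: no named owner (A.5.9 expects one)")
        if self.max_classification != "public" and not self.dpa_signed:
            gaps.append(f"{self.tool}: permits non-public data with no DPA")
        return gaps

entry = AIToolEntry("ChatGPT", "OpenAI", "Team", "Head of IT",
                    "internal", True)
print(entry.validate())  # [] — no gaps for this entry
```

The same validation can run as a scheduled check over the full register, producing a gap list before each surveillance audit rather than during it.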
A.5.19 — Information security in supplier relationships
A.5.20 — Addressing information security within supplier agreements
AI vendors are suppliers. The requirement to manage supplier information security risk applies to them. This includes:
- Assessing the AI vendor's security posture before use
- Having a data processing agreement where the vendor processes personal data
- Monitoring for changes to vendor security commitments (such as changes to training data policies or terms of service)
What auditors check: Is there a supplier register that includes AI vendors? Is there a DPA with each AI vendor that processes personal data? Is there a process to review AI vendor security annually?
What you need: AI vendors added to the supplier register. DPAs signed (or documented as unavailable with risk accepted). Annual review process for AI vendor security practices.
A.5.10 — Acceptable use of information and other associated assets
The acceptable use policy must cover AI tools — specifically, what data may and may not be entered into each tool. Most existing AUPs were written before AI tools existed and don't mention them.
What auditors check: Does the AUP mention AI tools? Does it specify data classification rules for AI tool inputs? Have staff signed or acknowledged the updated policy?
What you need: AUP updated to explicitly address AI tools. Data classification rules per tool tier. Documented staff acknowledgment.
A.5.36 — Compliance with policies, rules, and standards for information security
The policy exists — is it followed? Auditors look for evidence that AI tool governance is operationalized, not just documented.
What auditors check: Staff interview responses about AI tool usage. Training records showing AI governance training. Audit trails of AI tool access.
What you need: Training records showing AI acceptable use training. Evidence of monitoring or spot-checks of AI tool usage.
Technological Controls
A.8.20 — Networks security
A.8.23 — Web filtering
If AI tools are accessed via the corporate network or managed devices, network controls should address them. This includes whether AI tool domains are allowed by default or require explicit approval.
What you need: Documented decision on whether AI tool access is controlled at the network layer. For high-risk tools, evidence that access is restricted to approved users.
A.8.32 — Change management
When AI tools are integrated into production processes (automated summarization, content generation, decision support), changes to those integrations should go through change management. Updating a model version or changing a prompt template is a change that affects output quality and risk.
What you need: Change management procedure updated to cover AI tool integrations. Version control for prompts used in production AI workflows.
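A lightweight way to make prompt changes auditable is to fingerprint each template and record old and new versions in a change entry. The record fields and file layout below are illustrative assumptions, not a prescribed format.

```python
import hashlib
import json
import datetime

def prompt_fingerprint(template: str) -> str:
    """Stable short hash of a prompt template, used to detect changes."""
    return hashlib.sha256(template.encode("utf-8")).hexdigest()[:12]

def change_record(tool: str, old: str, new: str, approver: str) -> dict:
    """Build a change-management record for an updated prompt (A.8.32)."""
    return {
        "date": datetime.date.today().isoformat(),
        "tool": tool,
        "old_version": prompt_fingerprint(old),
        "new_version": prompt_fingerprint(new),
        "approved_by": approver,
    }

old_prompt = "Summarize the ticket in three bullet points."
new_prompt = "Summarize the ticket in three bullet points. Flag any PII."
record = change_record("Support summarizer", old_prompt, new_prompt,
                       "IT Security")
print(json.dumps(record, indent=2))
```

Storing the prompt files themselves in version control gives the same evidence with less tooling; the hash approach suits prompts that live inside a vendor dashboard rather than a repository.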
Evidence Matrix for ISO 27001 Audits
| Control | Evidence auditors want | Common gap |
|---|---|---|
| 5.9 Asset register | AI tools listed with owner, classification, version | AI tools missing entirely |
| 5.19 Supplier assessment | Due diligence record per AI vendor, dated | No records for free or trial tier tools |
| 5.20 Supplier agreements | DPA or equivalent for each vendor processing personal data | Missing DPA for consumer-tier AI tools |
| 5.10 Acceptable use policy | AUP with AI-specific data classification rules, signed by staff | AUP doesn't mention AI |
| 5.36 Policy compliance | Training records, monitoring evidence | No AI governance training documented |
| 8.24 Use of cryptography | Evidence that AI output retention meets encryption requirements | AI outputs stored unencrypted in notes tools |
| 8.32 Change management | Change records for AI integration updates | No change management applied to AI prompt/model updates |
The Three Most Common AI-Related ISO 27001 Findings
Finding 1: Shadow AI tools not in the asset register
The most common finding. Auditors interview staff and discover AI tools being used regularly that are not in the ISMS documentation. Often these are consumer-tier ChatGPT, Grammarly, or browser extensions with AI features.
Fix: Run an AI tool discovery survey (3 questions: what tools, what data, personal or company account). Add results to asset register before the next audit. Update quarterly.
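Survey responses can be rolled up into register candidates with a few lines of scripting. This sketch assumes a simple export format (tool, data type, account type); the column names are illustrative.

```python
from collections import defaultdict

# Sample discovery survey export; real data would come from a CSV or form.
responses = [
    {"tool": "ChatGPT", "data": "code snippets", "account": "personal"},
    {"tool": "ChatGPT", "data": "meeting notes", "account": "company"},
    {"tool": "Grammarly", "data": "client emails", "account": "personal"},
]

candidates = defaultdict(lambda: {"data_types": set(), "personal_use": False})
for r in responses:
    entry = candidates[r["tool"]]
    entry["data_types"].add(r["data"])
    if r["account"] == "personal":
        # Personal accounts get review priority: no org controls apply.
        entry["personal_use"] = True

for tool, info in sorted(candidates.items()):
    flag = "REVIEW: personal account" if info["personal_use"] else "ok"
    print(f"{tool}: {sorted(info['data_types'])} [{flag}]")
```

The output is a deduplicated list per tool with the data types staff reported, ready to triage into the asset register.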
Finding 2: AI vendors not in the supplier register
AI vendors go unrecognized as suppliers because they weren't procured through the standard vendor onboarding process: the IT security team never assessed them, and no DPA exists.
Fix: Define AI vendors as suppliers in the supplier management procedure. Retroactively assess current AI vendors using a risk-tiered approach (high-risk first). Get DPAs where required.
Finding 3: AUP doesn't mention AI tools
The acceptable use policy exists and is enforced for standard systems, but has no language about AI. Staff who use AI for work have no policy guidance — and auditors see a control gap.
Fix: Add an AI section to the AUP. Minimum content: approved tools list, prohibited data types, what to do with AI output (human review requirements), how to report AI-related incidents.
Practical Documentation: What an ISMS AI Section Needs
For a small team seeking or maintaining ISO 27001 certification, the minimum documentation set for AI governance is:
| Document | Content | Linked ISMS policy |
|---|---|---|
| AI tool register | List of approved AI tools with owner, data classification, DPA status | Asset register (5.9) |
| AI vendor risk assessments | One per vendor with SOC 2, DPA, training opt-out status | Supplier register (5.19) |
| AI acceptable use policy (or AUP addendum) | Approved tools, data rules, prohibited uses | Acceptable use policy (5.10) |
| AI governance training record | Date, attendees, content | Awareness training (6.3) |
| AI incident log | AI-related security events and near-misses | Incident management |
| Change records for AI integrations | Version updates, prompt changes, new tool onboarding | Change management (8.32) |
This documentation set is sufficient for most small team audits. Larger organizations may need more — but getting the register, AUP update, DPAs, and training records in order addresses the most frequent findings.
Preparing for the Auditor Interview
Auditors increasingly interview staff directly about AI tool usage. Typical questions include:
- "Do you use any AI tools in your work? Which ones?"
- "What kind of information do you put into those tools?"
- "Is there a policy that tells you what information you can and can't put in?"
- "If you weren't sure whether it was okay to use an AI tool with a specific type of data, what would you do?"
The last question is the most revealing. If staff know the process (check the approved tools list, ask IT security, don't use if unsure), the control is operationalized. If they shrug, the control is documentation only.
Starting your AI governance documentation from scratch? The AI Tool Register Template is copy-paste ready and maps to the asset register requirements above. For vendor assessment records, the AI Vendor Due Diligence Checklist covers the supplier assessment questions ISO 27001 auditors look for — including SOC 2 status, DPA availability, and training data opt-out.
