ISO 27001:2022 has 93 controls across four domains. When organizations say they've "implemented" those controls, what they usually mean is they've written policies, run a gap assessment, and have some monitoring in place. What they rarely have is a systematic way to test whether those controls are actually functioning on an ongoing basis.

Control testing is the step between "we have a policy" and "we can prove it's working to a certification auditor." It's also the step most organizations either skip or handle manually at year-end, which is why so many ISO 27001 certification audits produce nonconformities that should have been caught months earlier.

Here's a breakdown of how we mapped ISO 27001 controls to automated testing — and where the 27% that can't be automated actually sits.

The Starting Framework: What Counts as "Automated"

Our definition of automated control testing: the control status is assessed by a system-generated check, without a human manually pulling data or running a test. The result is captured, timestamped, and mapped to the specific control with no manual intervention required.

Partial automation — where a human initiates a script that generates a report — counts as manual for our purposes. The goal is zero-touch evidence collection for as many controls as possible.
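In practice, zero-touch evidence collection means every check emits a typed, timestamped record mapped to a specific control. A minimal sketch of what such a record might look like; the `ControlTestResult` shape and field names are illustrative, not taken from any particular tool:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class ControlTestResult:
    """One system-generated check result, mapped to an Annex A control."""
    control_id: str    # e.g. "8.7" (Protection Against Malware)
    check_name: str    # machine name of the automated check
    passed: bool
    evidence: dict     # raw figures the pass/fail decision was based on
    collected_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

# Hypothetical example: an EDR coverage check that found unprotected endpoints.
result = ControlTestResult(
    control_id="8.7",
    check_name="edr-agent-coverage",
    passed=False,
    evidence={"endpoints_total": 412, "endpoints_protected": 409},
)
```

The key properties are all in the record itself: no human pulled the numbers, the timestamp is applied at capture, and the mapping to the control travels with the evidence.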

Domain 1: Organizational Controls (37 Controls)

This is the hardest domain to automate, and our coverage here is lowest. Controls like 5.1 (Information Security Policies), 5.4 (Management Responsibilities), and 5.37 (Documented Operating Procedures) require human judgment and organizational processes that don't generate machine-readable evidence.

What we can automate in this domain:

  • 5.9 Inventory of Information and Other Assets: API pull from asset management systems, auto-reconciliation against IdP and CMDB
  • 5.10 Acceptable Use: Policy acknowledgment tracking via HR system integration — automated completion reporting
  • 5.23 Information Security for Use of Cloud Services: Cloud configuration assessment against defined baselines, automated daily
  • 5.36 Compliance with Policies: Cross-reference policy documentation versions against sign-off records, automated monthly

Automation coverage in organizational controls: approximately 35%.

Domain 2: People Controls (8 Controls)

This domain is almost entirely human-process-dependent. Background checks, security training, disciplinary processes — these require human initiation and human judgment. What we can capture automatically: training completion rates (via LMS integration), onboarding checklist completion (via HRIS integration), and policy acknowledgment timestamps.

Automation coverage in people controls: approximately 62%, but mostly on the evidence capture side rather than the testing side.
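The training-completion capture is a rate calculation over an LMS export. A sketch with hypothetical records and an assumed 95% policy threshold, mapped here to people control 6.3 (awareness, education and training):

```python
# Hypothetical LMS export: employee -> completion date (None = not completed).
lms_records = {
    "a.nguyen": "2024-03-02",
    "b.okafor": None,
    "c.silva": "2024-03-05",
    "d.moreau": "2024-02-28",
}

completed = sum(1 for ts in lms_records.values() if ts is not None)
completion_rate = completed / len(lms_records)

# Evidence capture, not judgment: flag when the rate falls below policy.
THRESHOLD = 0.95  # assumed policy value
finding = {
    "control_id": "6.3",
    "rate": completion_rate,
    "passed": completion_rate >= THRESHOLD,
}
```

Note that this automates the evidence, not the control: deciding whether the training content itself is adequate remains a human call.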

Domain 3: Physical Controls (14 Controls)

For cloud-native companies, most physical controls apply to data center providers, not to internal facilities. In those cases, the "control test" is verifying that your data center provider's certifications are current and their controls cover your requirements. This is fully automatable: pull the vendor's SOC 2 or ISO 27001 certification status via API or scheduled document check, flag when certs are within 90 days of expiration.
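The vendor certification check described above is a date-window comparison. A sketch in which the registry entries and vendor names are hypothetical stand-ins for an API or scheduled document pull:

```python
from datetime import date, timedelta

# Hypothetical vendor certification registry.
vendor_certs = [
    {"vendor": "dc-provider-east", "cert": "ISO 27001", "expires": date(2025, 1, 15)},
    {"vendor": "dc-provider-west", "cert": "SOC 2", "expires": date(2026, 6, 1)},
]

def expiring_soon(certs, today, window_days=90):
    """Return certifications that are expired or within the renewal window."""
    cutoff = today + timedelta(days=window_days)
    return [c for c in certs if c["expires"] <= cutoff]

flags = expiring_soon(vendor_certs, today=date(2024, 11, 20))
```

Run daily, this turns "is our provider's certification current?" from an annual scramble into a standing alert.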

For companies with physical offices, access badge logs, visitor logs, and CCTV uptime monitoring are all automatable given the right integrations. Coverage here depends heavily on infrastructure, but for cloud-native organizations it approaches 80%.

Domain 4: Technological Controls (34 Controls)

This is where automation coverage peaks, and it's what drives the overall 73% figure. Almost every control in this domain generates machine-readable evidence by design:

  • 8.2 Privileged Access Rights: Automated quarterly report from IdP showing all accounts with elevated permissions, compared against approved list
  • 8.7 Protection Against Malware: EDR agent coverage report, automated daily — flag any endpoint without active protection
  • 8.8 Management of Technical Vulnerabilities: Vulnerability scanner results, automated weekly cadence, mapped to CVE severity thresholds
  • 8.15 Logging: Log completeness check against defined source list, automated daily alert if sources go silent
  • 8.16 Monitoring Activities: SIEM alert volume trend, automated weekly report
  • 8.20 Networks Security: Firewall rule audit against approved baseline, automated monthly
  • 8.24 Use of Cryptography: Certificate expiration monitoring across all domains and services, automated daily
  • 8.28 Secure Coding: SAST scan coverage rate and critical finding trend, automated per-commit

Automation coverage in technological controls: approximately 88%.

The 27% That Requires Human Judgment

The controls that remain manual share a common characteristic: they require assessing whether something is appropriate, not just whether it exists. Risk assessments (clause 6.1.2), supplier relationship reviews (5.19–5.22), incident response effectiveness reviews (5.26), and business continuity exercise outcomes — these are judgment calls that no system can replace.

The practical implication is that your compliance program should be designed around this split. Automate everything in the technological domain first — it's highest coverage, fastest ROI, and it's the evidence auditors scrutinize most carefully. Then build structured processes for the human-judgment controls with clear documentation requirements and calendar-driven reminders.

What organizations that achieve ISO 27001 certification efficiently have in common: they don't treat the 93 controls as a uniform list. They triage by automation potential, execute the automated controls on a defined cadence, and reserve human effort for the controls where it actually matters.

Implementation Timeline

Getting to 73%+ automation coverage from a standing start typically takes 8–12 weeks. Weeks 1–2: map your existing toolstack to controls and identify integration points. Weeks 3–6: build integrations and define collection schedules. Weeks 7–10: validate output against auditor requirements and remediate gaps. Weeks 11–12: run a full cycle and verify completeness before engaging your certification body.

The payoff at certification time is substantial. Instead of spending 3–4 weeks manually gathering evidence, your audit package is largely pre-built. Certification auditors spend less time chasing artifacts and more time on substantive control questions — which is where a well-run program should be spending auditor time anyway.

RegaLoop automates ISO 27001 control testing across your entire tech stack.

See how we map your existing tools to Annex A controls and generate audit-ready evidence automatically.

Book a Demo