EU AI Act Audit Pack

A 6-document evidence binder generated from your GitHub repository. Covers every major EU AI Act obligation enterprise buyers ask about — risk classification, technical documentation, data governance, transparency, and human oversight.

What's included

The Audit Pack is produced by scanning your repository and mapping findings to specific EU AI Act Articles and Annexes. Each document is evidence-sourced — not a blank template. The six documents together answer the compliance questionnaire your enterprise buyers send before they will sign.

1. AI System Risk Classification Report (Article 6 / Annex III)

Maps your AI system to the EU AI Act risk tiers — minimal, limited, high, or unacceptable — using evidence drawn from your repository, with a written rationale your legal team can submit to enterprise buyers.

2. Technical Documentation Summary (Article 11)

A structured description of your AI system's intended purpose, architectural design, training methodology, and performance characteristics, meeting the documentation requirements auditors and enterprise procurement teams check for.

3. Conformity Assessment Checklist (Annex VI)

A completed checklist mapping your system against each Annex VI conformity criterion, with a pass or gap status for each item and evidence references pointing back to your codebase.

4. Data Governance Statement (Article 10)

Documents the governance controls applied to your training, validation, and testing datasets — covering data quality, provenance, bias considerations, and access controls.

5. Transparency Disclosure Template (Article 13)

A ready-to-publish user-facing disclosure covering your AI system's capabilities, limitations, and user rights — formatted for placement in your product UI, help documentation, or terms of service.

6. Human Oversight Protocol (Article 14)

Describes how your system is designed to support effective human intervention, override, and monitoring — the specific mechanisms enterprise buyers ask about most when evaluating AI vendor risk.

How the Audit Pack is generated

Connect your GitHub account (read-only scope), enter your repository URL, and Regulatory Signals scans your code, configuration, and documentation. The AI identifies AI model usage, risk indicators, data handling patterns, and transparency signals, then maps each finding to the relevant Article or Annex. All six documents are generated from that evidence, not typed by hand.

Private repositories are supported. Your source code is never stored beyond the scan session.

Who it's for

  • SaaS founders whose AI features are blocking enterprise deals because procurement teams require EU AI Act documentation before signing.
  • Compliance teams at mid-market software companies who need auditor-ready evidence without a six-figure consulting engagement.
  • Legal counsel who need a structured starting point — evidence already mapped to Articles 6, 10, 11, 13, and 14 — rather than a blank page.

Pricing

The Audit Pack is $39 one-time. No subscription. No recurring charge.

Alternatively, the Audit Pack is included in the Professional and Enterprise subscription plans alongside continuous monitoring, all policy document types, and daily regulatory feeds.

EU AI Act enforcement for Annex III high-risk systems begins August 2, 2026. Enterprise buyers are already issuing compliance questionnaires. The Audit Pack gives you answers.

Audit Pack documents are compliance documentation drafts, not legal advice. All output should be reviewed by qualified legal counsel before submission to customers or regulators. See our full disclaimer.