Your AI Feature Just Blocked a $200k Enterprise Deal.
Enterprise buyers are asking for EU AI Act documentation before signing. Credo AI costs €50K+. Here's how to unblock your deal in 48 hours.
The questionnaire that kills deals
Your enterprise prospect loves the product. The champion is ready to sign. Then the legal or procurement team opens a security review and drops a 20–40 question AI compliance questionnaire in your lap.
The questions aren't abstract. They are specific obligations from the EU AI Act:
- Article 6 / Annex III — What risk classification does your AI system fall under? Provide a written classification rationale.
- Article 11 — Do you maintain technical documentation describing your AI system's intended purpose, performance characteristics, and design logic?
- Article 10 — What is your data governance policy for training and validation datasets?
- Article 13 — Do you publish a transparency disclosure to users interacting with your AI system?
- Article 14 — How does your system support human oversight and override?
Most founders have shipped the AI feature. Almost none have the documentation. The deal stalls. Sometimes it dies.
What the Audit Pack delivers
The Audit Pack is a 6-document evidence binder sourced from your actual repository — not a generic template you fill in manually.
- AI System Risk Classification Report — Article 6 / Annex III mapping. Documents which risk tier your system lands in and the evidence trail that supports the classification.
- Technical Documentation Summary — Article 11 requirements. A structured description of your AI system's intended purpose, architecture, training approach, and performance boundaries.
- Conformity Assessment Checklist — Annex VI checklist. A completed checklist mapping your system to each conformity criterion, with pass/gap status and evidence references.
- Data Governance Statement — Article 10 data requirements. Documents the governance controls applied to your training, validation, and testing datasets.
- Transparency Disclosure Template — Article 13 user-facing obligations. A ready-to-publish disclosure for your product UI or help docs, covering AI system capabilities, limitations, and user rights.
- Human Oversight Protocol — Article 14 human oversight requirements. Describes how your system is designed to allow effective human intervention, override, and monitoring.
Every document is generated from evidence found in your repository. When a buyer's legal team asks where a claim comes from, you have a source — not a template assertion.
How it works — three steps, about five minutes
- Paste your GitHub URL. Connect your GitHub account (read-only scope) and enter the repository that contains your AI feature. Private repos are supported.
- AI scans your repo and maps to EU AI Act obligations. The scanner reads your code, configuration, and documentation. It identifies AI model usage, risk indicators, data handling patterns, and transparency signals. Each finding is mapped to the relevant Article or Annex.
- Download your 6-document binder. The Audit Pack generates in minutes. Download it, forward the documents to your legal counsel for review, and send them on to the buyer's procurement team.
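Conceptually, the mapping step works like a lookup from detected repository signals to the obligations they evidence. The sketch below is purely illustrative: the scanner's real logic, signal names, and output format are internal, and every identifier here is a hypothetical stand-in.

```python
# Hypothetical sketch of the scan-and-map step. Signal names and the
# mapping itself are illustrative, not the scanner's actual internals.
OBLIGATION_MAP = {
    "model_usage": "Article 6 / Annex III (risk classification)",
    "training_data_config": "Article 10 (data governance)",
    "architecture_docs": "Article 11 (technical documentation)",
    "user_facing_ai_disclosure": "Article 13 (transparency)",
    "manual_override_path": "Article 14 (human oversight)",
}

def map_findings(findings):
    """Pair each recognized repo signal with the obligation it supports."""
    return [(f, OBLIGATION_MAP[f]) for f in findings if f in OBLIGATION_MAP]

# Example: a repo where the scan surfaced model usage and a manual override path
for finding, obligation in map_findings(["model_usage", "manual_override_path"]):
    print(f"{finding} -> {obligation}")
```

The point of the structure, not the code: each claim in the generated binder traces back to a concrete finding, which is what lets a buyer's legal team verify it.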
Pricing
The Audit Pack is a one-time purchase at $39. No subscription required.
Credo AI starts at €50,000 per year. A compliance consultant engagement for a single AI Act questionnaire typically runs $5,000–$15,000. The Audit Pack gets you auditor-ready documentation from your actual codebase for the cost of a dinner.
Frequently asked questions
Do I need to be classified as high-risk to need this documentation?
No. Enterprise procurement teams send AI Act questionnaires regardless of your risk tier. Even limited-risk AI systems face transparency obligations (Article 13), and buyers want documented evidence that you have reviewed your obligations. The Audit Pack covers both the classification result and the documentation trail — so you can answer the questionnaire whether you land at minimal, limited, or high risk.
What if my repository is private?
When you connect your GitHub account, Regulatory Signals requests read-only repo scope. Your code is scanned server-side and is never stored beyond the scan session. The resulting Audit Pack documents are yours — they contain only findings and policy text, not your source code.
Is the Audit Pack legal advice?
No. The Audit Pack is compliance documentation tooling, not legal advice. The documents give your legal counsel a structured starting point — evidence mapped to Articles 6, 10, 11, 13, 14, and Annex III — rather than a blank page. You should have a qualified lawyer review and sign off before submitting documentation to a customer or regulator.
Get your Audit Pack in under 5 minutes
Paste your GitHub URL, scan your repository, and download a 6-document EU AI Act evidence binder. $39 one-time — no subscription required.
This page is informational only and does not constitute legal advice. All Audit Pack output should be reviewed by qualified legal counsel before submission to customers or regulators. Read our methodology and full disclaimer.