
EU AI Act August 2026: Compliance Checklist for German Businesses

Short answer

The EU AI Act's main obligations take full effect on August 2, 2026. German companies using customer-facing AI, automated decision systems, or general-purpose AI must complete their compliance review before this date or face fines of up to €15 million or 3% of global annual turnover.

  • Transparency disclosures for customer-facing AI chatbots are mandatory from August 2, 2026.
  • High-risk AI systems used in HR, credit scoring, or critical infrastructure require conformity assessments.
  • All AI vendor contracts must reflect Articles 25 and 26 deployer obligations under the AI Act.
  • The Bundesnetzagentur (BNetzA) is Germany's designated AI Act enforcement authority.

This deadline is not new — but many German companies have underestimated the preparation time required. With the deadline approaching, compliance teams, legal departments, and CTOs need a structured action plan. This guide provides a practical checklist and explains which obligations apply from August 2, 2026.

What Happens on August 2, 2026?

August 2, 2026 is not the AI Act’s first enforcement date — it is the most significant one for most businesses. The regulation rolled out in phases:

  • August 1, 2024: AI Act entered into force
  • February 2, 2025: Prohibited AI systems banned (Art. 5)
  • August 2, 2025: GPAI model obligations (Art. 51–56), governance rules
  • August 2, 2026: High-risk AI obligations, transparency rules (Art. 50), full fine regime
  • August 2, 2027: Extended transition for high-risk AI that is a safety component of products already regulated under EU product safety law

From August 2, 2026, the following obligations become mandatory:

  • Article 50: Transparency disclosures for customer-facing AI (chatbots, emotion recognition, deepfakes)
  • Articles 8–15: Technical documentation, conformity assessment, human oversight, and logging for high-risk AI systems
  • Articles 25–26: Deployer and operator obligations when using third-party AI systems
  • Article 99: Full fine regime operative for all violation categories

If your company already addressed GPAI obligations after August 2, 2025, the next compliance layer is high-risk AI and transparency disclosures.

Who Does the EU AI Act Apply To?

The EU AI Act applies to any company that places AI systems on the EU market or puts them into service — regardless of where the company is headquartered. This covers:

  • German companies using AI in their products or business operations
  • Non-EU companies with EU customers or deploying AI within EU member states
  • AI providers that develop and supply AI systems, which are subject to provider obligations
  • Deployers that operate AI systems developed by third parties, which are subject to deployer obligations

The provider/deployer distinction matters practically. If you purchase a CRM platform with embedded AI recommendation features and use it in your customer operations, you are a deployer — and Articles 25–26 apply to you directly.

SME provisions: Micro and small enterprises benefit from some reduced obligations. National competent authorities must provide simplified guidance, and certain documentation requirements are lighter. However, core obligations — prohibited AI classification, transparency disclosures, and high-risk conformity assessments — still apply.

The August 2026 Action Checklist

Use this structured checklist to complete your compliance preparations before August 2, 2026.

Step 1: Inventory All AI Systems in Use

  • Map every AI system used internally or embedded in customer-facing products
  • Include third-party AI embedded in software your company uses (CRM AI, HR tools, chatbots, scoring engines, recommendation systems)
  • Document what each system does, who supplies it, and what personal data it processes
  • Identify whether your company acts as provider, deployer, or both for each system
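The Step 1 inventory works best as structured records rather than free-form notes. A minimal sketch in Python follows; the field names and the sample vendor are illustrative, since the AI Act prescribes no particular inventory format:

```python
from dataclasses import dataclass

# Illustrative inventory record. The AI Act does not mandate a schema;
# it only requires that you know what you operate and in which role.
@dataclass
class AISystemRecord:
    name: str                 # e.g. "Support chatbot"
    supplier: str             # vendor name, or "in-house"
    purpose: str              # what the system does
    personal_data: list[str]  # categories of personal data processed
    role: str                 # "provider", "deployer", or "both"

inventory = [
    AISystemRecord(
        name="CRM recommendation engine",
        supplier="Example CRM GmbH",  # hypothetical vendor
        purpose="Suggests next-best actions to sales staff",
        personal_data=["contact data", "purchase history"],
        role="deployer",
    ),
]

# Systems where deployer obligations (Art. 25-26) apply to you directly.
deployed = [r.name for r in inventory if r.role in ("deployer", "both")]
```

Keeping the role per system explicit makes the later steps (vendor contract review, Art. 50 disclosures) a filter over this list rather than a fresh analysis.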

Step 2: Classify Each AI System

  • Prohibited AI (Art. 5): social scoring, real-time public biometric surveillance, manipulative AI — must have been deactivated since February 2, 2025
  • High-risk AI (Annex III): requires conformity assessment, technical documentation, human oversight, registration
  • Limited-risk AI: transparency obligations apply (chatbots, emotion recognition systems, deepfake generators)
  • Minimal-risk AI: no mandatory obligations — good governance still recommended
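The four tiers in Step 2 each carry a distinct obligation set, which can be captured as a simple lookup. The tier names and obligation summaries below paraphrase this checklist, not the Regulation's exact wording:

```python
# Risk tier -> headline obligations, paraphrasing the checklist above.
OBLIGATIONS = {
    "prohibited": "must not be used (Art. 5; banned since Feb 2, 2025)",
    "high": "conformity assessment, technical documentation, "
            "human oversight, registration (Annex III)",
    "limited": "transparency disclosures (Art. 50)",
    "minimal": "no mandatory obligations; governance still recommended",
}

def obligations_for(tier: str) -> str:
    """Return the headline obligations for a given risk tier."""
    try:
        return OBLIGATIONS[tier]
    except KeyError:
        raise ValueError(f"unknown risk tier: {tier!r}")
```

Forcing every inventoried system through this mapping, and failing loudly on anything unclassified, prevents systems from silently skipping Steps 3 and 4.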

Step 3: For High-Risk AI Systems

  • Conduct or obtain a conformity assessment as required under Art. 9–15
  • Prepare and maintain technical documentation as specified in Annex IV
  • Implement human oversight procedures — ensure consequential decisions can be reviewed by a human
  • Register the system in the EU AI Act public database where required
  • Establish post-market monitoring procedures and incident reporting mechanisms

Step 4: For Customer-Facing AI (Art. 50)

  • Add clear disclosure when customers interact with an AI chatbot or automated assistant
  • Label AI-generated content: synthetic images, audio, video, and text must be marked as AI-generated
  • Review whether any AI system infers customer emotions — additional disclosure requirements apply
  • Ensure disclosure is given at the beginning of the interaction, not buried in terms of service
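For chatbot deployments, the Step 4 disclosure can be enforced in code by prepending a notice before the first exchange. A minimal sketch, in which the message wording, role label, and function name are all illustrative rather than prescribed by the Act:

```python
# Illustrative Art. 50 disclosure wrapper for a chat session.
# The wording is an example; the AI Act requires clarity, not specific text.
AI_DISCLOSURE = (
    "Hinweis: Sie chatten mit einem KI-Assistenten. / "
    "Note: You are chatting with an AI assistant."
)

def start_session(history: list[dict]) -> list[dict]:
    """Ensure the disclosure is the first message users see."""
    if not history or history[0].get("role") != "disclosure":
        return [{"role": "disclosure", "content": AI_DISCLOSURE}] + history
    return history

session = start_session(
    [{"role": "user", "content": "Wo ist meine Bestellung?"}]
)
```

Wiring the disclosure into session creation, rather than into page copy, makes it hard for a later UI change to accidentally drop it mid-interaction.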

Step 5: Review AI Vendor Contracts

  • Confirm vendors provide required technical documentation under Art. 25
  • Ensure deployer obligations under Art. 26 are reflected in existing contracts
  • Request AI Act compliance documentation from vendors — treat this like a GDPR data processing agreement review
  • Update standard contract templates to include AI Act compliance clauses

Step 6: Appoint an AI Compliance Contact

  • Designate an internal contact for AI Act compliance (does not need to be a lawyer)
  • Ensure the contact coordinates between legal, IT, and business operations
  • Brief relevant teams on their obligations under the AI Act

Step 7: Document Your AI Governance Process

  • Implement a risk management process for AI systems in scope (Art. 9)
  • Record decisions made by or with AI systems that affect individuals
  • Establish internal escalation procedures for AI-related incidents

Step 8: Review GDPR Intersections

  • Identify any AI systems making automated decisions with legal or significant effects (Art. 22 GDPR)
  • Verify that DPAs with AI vendors are updated to cover AI Act roles and obligations
  • Confirm data minimization compliance for AI training data where applicable
  • Consider whether existing DPIAs need to incorporate AI Act risk assessments

For a detailed breakdown of AI vendor assessment obligations, see our guides on AI chatbot compliance under the EU AI Act and Claude Enterprise legal compliance.

High-Risk AI: Which Systems Qualify Under Annex III?

The EU AI Act’s Annex III lists the categories where AI systems are treated as high-risk by default. German businesses most frequently encounter these categories:

  • Employment and HR (no. 4): AI recruitment screening, performance evaluation tools, workforce analytics, shift planning AI
  • Credit scoring and financial services (no. 5b): automated loan approval, credit risk assessment, insurance risk scoring
  • Biometric identification (no. 1): facial recognition for access control, attendance biometrics
  • Education and training (no. 3): AI that assesses student performance, allocates training, or filters applications
  • Critical infrastructure (no. 2): AI in energy grid management, water treatment, logistics networks
  • Public services (no. 5a): AI assessing eligibility for benefits, public procurement AI

For companies using AI in HR and recruitment: Obligations are significant. AI used to filter CVs, score candidates, evaluate performance, or influence promotion or termination decisions falls under Annex III no. 4. This requires a conformity assessment, human oversight, transparency to affected persons, and detailed technical documentation.

See our dedicated guides on AI hiring tools and EU AI Act compliance and the Recruitment and HR industry AI Act overview for sector-specific requirements.

Transparency Obligations for Customer-Facing AI (Art. 50)

Article 50 is the provision most immediately relevant to companies that use chatbots, virtual assistants, or any AI that interacts directly with customers or users.

From August 2, 2026, operators must:

1. Inform users they are interacting with AI. If your customer support uses a chatbot, users must be clearly notified before or at the start of the interaction. A small disclaimer buried in terms of service is not sufficient.

2. Label AI-generated content. Synthetic images, audio recordings, video, and written content generated by AI must be marked as AI-generated. This applies to marketing materials, product imagery, voice-over content, and customer-facing documents.

3. Disclose emotion recognition. If your AI system analyses a customer’s emotional state during a support call, video interaction, or any other touchpoint, this must be disclosed. Users must be informed before the analysis begins.

What counts as sufficient disclosure? The AI Act does not specify exact wording. The standard is that a reasonable person must understand they are interacting with an AI system. A clear statement at the beginning of the interaction is the safer approach.

The transparency obligation applies regardless of the AI system’s risk level. A minimal-risk customer service chatbot still requires Art. 50 disclosure. For companies using AI in customer operations, review our guide on AI customer service compliance.

GDPR and the AI Act: Where They Overlap

The AI Act and GDPR operate in parallel. For German companies already subject to GDPR, AI Act compliance creates a second regulatory layer. The key overlap areas are:

Automated decision-making: GDPR Article 22 gives individuals the right not to be subject to decisions made solely by automated processing with legal or similarly significant effects. The AI Act does not replace Art. 22 — it adds further obligations on the AI systems used to reach those decisions.

DPA/AVV implications: Under GDPR, using AI from a third-party vendor typically requires a data processing agreement (Auftragsverarbeitungsvertrag, AVV). The AI Act’s deployer obligations under Articles 25–26 must now also be reflected in vendor contracts. Standard AVV templates may need updating to cover AI Act compliance roles.

Data protection impact assessments: A DPIA under GDPR may already be required where AI processes personal data at high risk to individuals. Where AI Act high-risk classification also applies, the two assessments can often be combined — but both must be addressed.

Data minimization for AI training: If you train or fine-tune AI models using personal data, Art. 10 of the AI Act sets data governance requirements that align with GDPR’s data minimization principle but add specificity around training data quality and bias evaluation.

For AI employee monitoring — which sits at the intersection of AI Act high-risk classification, GDPR, and German labor law (BetrVG) — see our detailed compliance guide on AI employee monitoring in Germany.

Consequences of Non-Compliance

The AI Act’s full fine regime becomes operative from August 2, 2026. The structure is:

  • Using prohibited AI (Art. 5): up to €35 million or 7% of global annual turnover
  • Violating high-risk AI obligations: up to €15 million or 3% of global annual turnover
  • Supplying incorrect information to authorities: up to €7.5 million or 1% of global annual turnover

For SMEs and start-ups, whichever of the two amounts is lower applies as the cap; for all other companies, whichever is higher applies (Art. 99(6)).
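The dual-cap structure can be made concrete with a worked example. The function below is an illustration of the cap arithmetic only, not legal advice:

```python
def max_fine(absolute_cap_eur: int, pct_cap_percent: int,
             global_turnover_eur: int, is_sme: bool) -> float:
    """Upper bound of a fine under the AI Act's dual-cap structure.

    For SMEs and start-ups the lower of the two amounts applies;
    for other companies the higher applies. Illustration only.
    """
    pct_amount = global_turnover_eur * pct_cap_percent / 100
    if is_sme:
        return min(absolute_cap_eur, pct_amount)
    return max(absolute_cap_eur, pct_amount)

# High-risk violation caps: €15 million or 3% of global annual turnover.
# SME with €10m turnover: 3% is €300,000, below €15m, so €300,000 caps.
sme_cap = max_fine(15_000_000, 3, 10_000_000, is_sme=True)

# Large enterprise with €2bn turnover: 3% is €60m, above €15m.
big_cap = max_fine(15_000_000, 3, 2_000_000_000, is_sme=False)
```

The example shows why the same violation category can mean a six-figure exposure for a small company and an eight-figure exposure for a large one.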

Enforcement in Germany: The Bundesnetzagentur (BNetzA) is designated as Germany’s primary national market surveillance authority under the AI Act for most sectors. The European AI Office provides coordinating oversight at EU level. German data protection authorities may also act where AI Act violations intersect with GDPR.

As of early 2026, the BNetzA has signaled a graduated enforcement approach — prioritising the most serious violations (prohibited AI, high-risk AI without conformity assessment) before moving to transparency infractions. However, building compliance around enforcement tolerance is not a sustainable strategy. The August 2, 2026 deadline is a firm legal date.

Sector-Specific AI Act Guidance

AI Act obligations vary by industry due to the sector-specific scope of Annex III. See the sector guides referenced throughout this article for industry-specific requirements.

What to Do Before August 2, 2026

For most German companies, the practical priority order is:

  1. Inventory first — you cannot comply with obligations for systems you have not mapped
  2. Classify by risk — prohibited AI should have been addressed since February 2025
  3. Address high-risk AI — conformity assessments take time; start as early as possible
  4. Fix customer-facing AI disclosures — Article 50 is relatively straightforward to implement and provides a visible compliance signal
  5. Update vendor contracts — combine AVV updates with AI Act deployer obligation reviews
  6. Document governance — risk management documentation is both a requirement and a defense in enforcement proceedings

Compound Law supports businesses navigating EU AI Act compliance across all phases — from initial AI inventory and risk classification to vendor contract review and ongoing compliance documentation. The above is general legal information; specific situations require tailored legal advice.

Related Compliance Guides

EU AI Act & GDPR Legal Advisory for Companies in Germany
Compound Law advises businesses in Germany on EU AI Act compliance and GDPR. Legal counsel for AI regulatory requirements across the DACH region.

AI APIs for Law Firms: BRAO Compliance Guide Germany
Using AI APIs as a German law firm: what §43a BRAO, §43e BRAO, and GDPR require for ChatGPT, Claude, and other AI tools in legal practice.

Enterprise Search and GDPR: AI Document Search Compliance
How German companies can deploy AI enterprise search (Microsoft 365 Copilot, Google Workspace AI) in a GDPR-compliant way, with DPA, BetrVG, and SCCs explained.

Frequently asked questions

When does the EU AI Act fully apply?

The EU AI Act's main obligations — including transparency rules, general-purpose AI requirements, and high-risk system obligations — take full effect on August 2, 2026.

Does the AI Act apply to German SMEs?

Yes, but SMEs benefit from lighter obligations. Micro and small enterprises may face reduced documentation requirements, but core obligations including prohibited AI, transparency, and high-risk conformity still apply.

What is the fine for AI Act non-compliance in Germany?

Fines vary by violation: up to €35 million or 7% of global annual turnover for prohibited AI; up to €15 million or 3% of turnover for high-risk AI violations; up to €7.5 million for misleading authorities.

Do I need to disclose that my chatbot is AI?

Yes. Under Article 50 of the EU AI Act, operators of AI chatbots must clearly inform users they are interacting with an AI system — unless the context makes it completely obvious.

Which AI systems are banned under the AI Act?

The AI Act bans social scoring by public authorities, real-time remote biometric surveillance in public spaces (with narrow exceptions), manipulative AI targeting vulnerabilities, and emotion recognition in workplaces or schools.

What is a high-risk AI system under the AI Act?

High-risk AI systems are listed in Annex III of the AI Act. They include AI used in recruitment, credit scoring, biometric identification, education, critical infrastructure management, and certain law enforcement applications.

Who enforces the AI Act in Germany?

The Bundesnetzagentur (BNetzA) is Germany's designated national market surveillance authority for the AI Act. The European AI Office provides coordinating oversight at EU level.

Do AI vendor contracts need to be updated for the AI Act?

Yes. Deployers must ensure vendor contracts address Articles 25 and 26 obligations, covering technical documentation, human oversight procedures, and data governance requirements.
