AI Tools for Law Firms in Germany: BRAO & GDPR Guide
Can German Lawyers Use AI Tools?
Yes — under strict conditions. German law firms can use AI tools such as Claude Enterprise or Azure OpenAI if a GDPR Art. 28 Data Processing Agreement is in place, no client data is used for model training, and a §43e BRAO professional-secrecy clause supplements the DPA. Provider and configuration choices are decisive.
- Sign a GDPR Art. 28 DPA and add a §43e BRAO professional-secrecy clause
- Choose providers that contractually exclude client data from model training
- Verify EU data residency or valid Standard Contractual Clauses (SCCs)
Yes — German lawyers can use AI tools in their practice. Under German professional law, AI APIs and tools are permissible for legal work provided that §43e BRAO requirements are met: a written confidentiality commitment from the provider, a §203 StGB briefing clause, a training data opt-out for client matters, and a GDPR-compliant Data Processing Agreement (DPA). Compound Law is itself a law firm that uses AI tools in practice — and advises other firms on the same compliance architecture.
BRAO §43e and Attorney-Client Privilege When Using AI
§43a Abs. 2 BRAO imposes a core professional duty: everything entrusted to a lawyer in their professional capacity is subject to attorney-client privilege (Mandatsgeheimnis). This duty does not end where digital processing begins.
§43e BRAO, introduced in 2017 alongside the reform of §203 StGB on the involvement of third parties, creates the professional-law basis for engaging external service providers, including AI API vendors, to process client matter data. The provision permits engaging such providers but simultaneously defines the contractual obligations that must be satisfied beforehand.
What Falls Under Attorney-Client Privilege?
Attorney-client privilege covers all information a lawyer receives or produces in the course of a mandate:
- Facts disclosed by clients
- Pleadings, contracts, and documents prepared for the matter
- Strategic considerations and legal assessments
- Correspondence with clients, courts, and authorities
- Client identity and identities of other parties involved
As soon as any of this information is transmitted to an external AI API, §43e BRAO applies. What matters is not the provider’s reputation or marketing language — it is that client matter data has left the firm’s sphere.
When Is the Use of AI APIs Permissible?
AI API use is permissible only when all of the following conditions are met:
- The AI provider has been bound to confidentiality in writing and briefed on §203 StGB
- A no-training clause exists: client data is not used for model training or improvement
- Purpose limitation is agreed — no processing beyond the specific mandate purpose
- A GDPR Art. 28 DPA has been signed
- Sub-processors of the AI provider are known and bound correspondingly
- Deletion and exit provisions are contractually anchored
A DPA alone does not satisfy the professional-law requirements. §43e BRAO sets stricter standards than the GDPR baseline — both frameworks must be implemented together. For a full statutory analysis, see our §43e BRAO deep-dive guide.
GDPR Requirements for AI in Law Firms
In addition to professional law, the GDPR applies to every processing operation involving personal data — and client matter data almost always includes personal data.
Data Processing Agreement with AI Providers
As soon as an AI provider processes personal client data, a GDPR Art. 28 DPA is mandatory. The DPA must contractually define the provider’s obligations as a data processor: purpose limitation, technical and organizational measures (TOMs), deletion obligations, assistance with data subject requests, and audit rights.
Without a DPA, the processing is unlawful under EU data protection law. For AI providers outside the EU or EEA, an additional mechanism for international transfers under GDPR Art. 44 ff. is required — typically Standard Contractual Clauses (SCCs).
Important: the DPA governs the data protection framework; it does not satisfy the professional-law obligations under §43e BRAO. Both instruments must exist side by side.
Data Minimization for Client Matter Data
The data minimization principle under GDPR Art. 5(1)(c) is particularly relevant in AI practice. Before every API call, firms should assess:
- Which data is actually necessary for the AI task?
- Can names, case numbers, and direct identifiers be replaced with placeholders?
- Is an anonymized or pseudonymized version sufficient for the AI analysis?
Default rule: only transfer to AI APIs what is strictly necessary for the specific task. Checking a brief for formal legal arguments does not require transmitting client identity or case numbers.
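The minimization step described above can be implemented as a pre-processing pass that runs before any API call. The following is a minimal sketch, not a production anonymizer: the regex patterns, placeholder names, and the `pseudonymize` helper are hypothetical illustrations, and a real matter requires a reviewed, firm-specific redaction pipeline.

```python
import re

# Hypothetical example patterns; a real pipeline needs a reviewed,
# matter-specific list (names, addresses, bank details, ...).
PATTERNS = {
    "CASE_NO": re.compile(r"\b\d{1,3}\s?[A-Z]{1,3}\s?\d{1,5}/\d{2}\b"),  # e.g. "4 O 123/24"
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def pseudonymize(text: str, client_names: list[str]) -> tuple[str, dict[str, str]]:
    """Replace direct identifiers with placeholders before any API call.

    Returns the redacted text plus a mapping kept locally at the firm,
    so identifiers can be restored in the AI output after the call.
    """
    mapping: dict[str, str] = {}
    # Known party names first (exact-match replacement).
    for i, name in enumerate(client_names, start=1):
        if name in text:
            placeholder = f"[PARTY_{i}]"
            mapping[placeholder] = name
            text = text.replace(name, placeholder)
    # Then pattern-based identifiers (case numbers, e-mail addresses, ...).
    for label, pattern in PATTERNS.items():
        for j, match in enumerate(dict.fromkeys(pattern.findall(text)), start=1):
            placeholder = f"[{label}_{j}]"
            mapping[placeholder] = match
            text = text.replace(match, placeholder)
    return text, mapping

redacted, mapping = pseudonymize(
    "Klage von Erika Mustermann, Az. 4 O 123/24, erika@example.com",
    client_names=["Erika Mustermann"],
)
```

The mapping never leaves the firm; only the redacted text is transmitted, which directly implements the default rule above.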
Purpose Limitation and Retention Periods
Purpose limitation means that client data transmitted for a specific AI purpose (e.g., document analysis) may not be used for other purposes (e.g., model improvement). This obligation follows from both GDPR Art. 5(1)(b) and §43e BRAO.
Retention periods must be contractually defined for all AI-processed client data:
- Temporary inference storage at the provider
- Chat logs and API call logs
- Data held at the provider upon contract termination (exit provision)
Without documented retention and deletion rules, AI use with client data is neither GDPR-compliant nor BRAO-compliant.
AI Provider Comparison: BRAO Suitability for Law Firms
The following table reflects publicly available information as of publication. BRAO suitability always depends on the specific contract terms and configuration.
| Provider | EU Data Residency | DPA Available | No Training on Client Data | SCCs |
|---|---|---|---|---|
| Claude Enterprise (Anthropic) | Limited (US-AWS; EU config possible) | ✓ | ✓ (Enterprise) | ✓ |
| Anthropic API | US default; EU region available | ✓ | ✓ (API terms) | ✓ |
| OpenAI API / Enterprise | US default | ✓ (Enterprise) | ✓ (API default) | ✓ |
| Azure OpenAI Service | ✓ (EU regions available) | ✓ (Microsoft) | ✓ | ✓ |
| Perplexity | Limited | Limited | Not clearly contractual | Limited |
Note: Contract terms change, and this table does not replace individual legal review of the concrete contract and configuration.
For detailed provider analyses, see: Anthropic API for Law Firms, OpenAI API Compliance, and Azure OpenAI Data Privacy.
Practical Checklist: Making AI Use BRAO-Compliant
Before client matter data enters any AI API, every firm should have the following in place:
- §43e BRAO clause executed — written confidentiality commitment with §203 StGB briefing
- No-training clause agreed — explicit prohibition on using client data for model training
- GDPR Art. 28 DPA signed — in addition to the §43e BRAO clause, not instead of it
- Purpose limitation documented — AI use restricted to the specific mandate purpose
- Sub-processor list obtained and reviewed — all provider third parties correspondingly bound
- Retention and exit provisions contractually defined — including data deletion at termination
- EU data residency or SCCs confirmed
- Data minimization implemented — anonymization before API calls where feasible
- Internal AI usage policy in place — approved tools, prohibited data categories, approval process
- §43e Abs. 5 BRAO client consent assessed — particularly for sensitive matters
If you are uncertain whether your current AI vendor contracts satisfy these requirements, a structured contract review can identify gaps quickly. Our SaaS agreement service covers AI API contract analysis under §43e BRAO.
Common Mistakes and How to Avoid Them
Mistake 1: Signing only a DPA and overlooking §43e BRAO
A GDPR DPA is necessary but not sufficient. §43e BRAO additionally requires a written confidentiality commitment with a §203 StGB briefing and a no-training clause. Firms that only satisfy data protection requirements have not fulfilled professional law obligations.
Mistake 2: Using standard ChatGPT for client matter analysis
The public version of ChatGPT provides no DPA and no contractual guarantee against training data use. Any input containing client matter information is potentially incompatible with BRAO. Only enterprise configurations with a complete contract package are appropriate for legal practice.
Mistake 3: Not assessing client consent under §43e Abs. 5 BRAO
§43e Abs. 5 BRAO requires client consent when data is processed beyond the specific mandate purpose. For sensitive matters — M&A, criminal defense, insolvency — this question should be explicitly assessed and documented before deploying AI.
Mistake 4: Ignoring data minimization
Submitting complete pleadings with full client identifiers to AI APIs when an anonymized version would suffice violates the GDPR data minimization principle. It creates regulatory exposure without operational benefit.
Mistake 5: Not reviewing the provider’s sub-processor chain
AI providers regularly use third parties for infrastructure, monitoring, and security. §43e BRAO requires that this entire chain is correspondingly bound to confidentiality. Requesting the sub-processor list is a necessary step before production deployment.
Conclusion: Building a Defensible AI Governance Structure
Compliant AI use in German law firms is achievable — but it requires structured preparation. The foundation is combining professional-law compliance under §43e BRAO with data protection compliance under the GDPR, supported by organizational measures: an internal AI usage policy, staff training, and documented vendor due diligence.
For most firms, the first step is reviewing existing AI vendor contracts against BRAO requirements and retrofitting missing clauses. Compound Law accompanies firms and in-house legal teams through this process — from contract analysis to a complete AI governance structure.
Further reading:
- §43e BRAO explained: contractual obligations for AI service providers
- AI APIs for Law Firms: BRAO Compliance Guide
- AI APIs for Law Firms in Germany: When and How
This article is for general information purposes only and does not constitute legal advice. For your specific situation, we recommend consulting a qualified lawyer.