AI Facial Recognition in Germany: Compliance Guide 2026
Short answer
Most commercial facial recognition in Germany is heavily restricted. Real-time biometric identification in public spaces is banned under Art. 5 EU AI Act. The high-risk compliance deadline (2 August 2026) requires conformity assessments and EU database registration.
- Art. 5 AI Act: prohibitions (real-time public identification, database scraping, untargeted surveillance) have applied since 2 February 2025.
- High-risk requirements (conformity assessment, bias testing, EU database registration) become binding on 2 August 2026; the Act's governance and penalty provisions have applied since 2 August 2025.
- Biometric data is special-category data under GDPR Article 9. Legitimate interest is not a sufficient legal basis.
- Workplace facial recognition triggers mandatory works council co-determination under Section 87(1) no. 6 BetrVG.
Facial recognition in Germany is lawful only in limited, well-defined scenarios. Real-time biometric identification in publicly accessible spaces is prohibited under the EU AI Act (Regulation (EU) 2024/1689). Scraping faces from the internet to build recognition databases is banned. Commercial uses, from access control to identity verification, are classified as high-risk AI under Annex III. The high-risk requirements become binding on 2 August 2026; businesses without an active conformity assessment need to act immediately.
What is Prohibited Under the EU AI Act
Article 5 of the EU AI Act sets out absolute prohibitions that have applied since 2 February 2025. For facial recognition, three bans are directly relevant to German businesses:
1. Real-time biometric identification in publicly accessible spaces
AI systems used by or on behalf of law enforcement to conduct real-time remote biometric identification of individuals in publicly accessible spaces are prohibited, subject to extremely narrow exceptions such as targeted searches for missing children or imminent terrorist threats. Commercial operators fare no better: live identification of the public at large has no valid legal basis under the GDPR, so a shopping centre, railway station, airport, or street operator cannot lawfully deploy a live facial recognition system to identify individuals.
2. Biometric database scraping
The AI Act prohibits creating or expanding facial recognition databases by scraping images from the internet or CCTV footage without a targeted collection process. This prohibition directly targets the Clearview AI model — building a massive biometric database from publicly available online photos — and bans that practice entirely under EU law.
3. Untargeted facial recognition surveillance
AI systems used for untargeted surveillance, including real-time tracking of individuals across locations or retrospective analysis of large populations, run into the combined effect of the Act's biometric prohibitions and the GDPR's default ban on processing special-category data. Where a facial recognition system is not targeted at a specific identified individual for a specific permitted purpose, it is likely unlawful.
Fines for prohibited practices can reach €35 million or 7% of global annual turnover, whichever is higher.
Recent Developments 2025–2026: What Has Changed
The EU AI Act is no longer a future framework — it is binding law with active deadlines. For businesses operating facial recognition systems in Germany, these developments are immediately relevant:
2 August 2025: Governance and penalty provisions entered application
Since 2 August 2025, the EU AI Act's governance framework, notified-body provisions, and penalty regime have applied, alongside obligations for general-purpose AI models. For facial recognition systems falling under Annex III (access control, identity verification, biometric categorisation), the high-risk requirements themselves become binding on 2 August 2026. Operators and providers already deploying these systems are in the compliance window: conformity assessments, risk management systems, and technical documentation must be completed by that date.
EU AI Act database is operational
The public EU database for high-risk AI systems became operational in the second half of 2025. Providers placing Annex III facial recognition systems on the EU market must register them, an obligation that becomes binding with the high-risk rules on 2 August 2026. Businesses working with unregistered vendors carry increased liability exposure.
Intensified DPA enforcement
German and European data protection authorities have stepped up scrutiny of biometric AI. The European Data Protection Board (EDPB) has published guidance clarifying that deploying a facial recognition system — even in private premises — without a valid legal basis constitutes a GDPR violation. The DSK (German Data Protection Conference) has announced coordinated inspection activities on AI-enabled biometric systems running in parallel with AI Act market surveillance.
Market surveillance infrastructure is live
National market surveillance authorities across EU member states — in Germany, the Bundesnetzagentur for certain AI areas — are building operational oversight of high-risk AI systems. Incident reporting obligations for serious malfunctions are now active, not hypothetical.
Deadline urgency: approximately four months to August 2026
Businesses without an active conformity assessment as of April 2026 are seriously behind schedule. Conformity assessments for high-risk facial recognition systems can take several months — particularly where a notified body must be involved. Time is running short.
High-Risk Facial Recognition: What It Means for German Businesses Now
Not all facial recognition is prohibited. Systems used for access control, identity verification, payment authentication, and age verification are classified as high-risk under Annex III of the EU AI Act rather than banned. The obligations that follow become legally binding on 2 August 2026, and preparing for them takes months.
A high-risk facial recognition system must:
- Implement a documented risk management system under Art. 9 of the AI Act, covering foreseeable risks and mitigation measures
- Use high-quality training, validation, and testing datasets, with bias monitoring across demographic groups — particularly across gender, skin tone, and age categories
- Maintain technical documentation per Annex IV and logs per Art. 12 sufficient to enable retrospective review
- Enable human oversight — outputs must be capable of being overridden or corrected by a qualified person, operationally embedded
- Meet transparency requirements toward individuals subject to identification
- Undergo a conformity assessment before being placed on the EU market
- Be registered in the EU AI Act database before deployment if covered by the relevant Annex III category
- Implement post-market surveillance: serious incidents and malfunctions must be reported to national market surveillance authorities
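The bias-monitoring item above can be made concrete with a small evaluation script. The following is a minimal sketch, not a prescribed AI Act methodology: it computes the standard biometric error rates (false match rate and false non-match rate) separately for each demographic group in a labelled trial set. The group labels and record layout are illustrative assumptions.

```python
# Sketch: per-group error rates for a face verification system.
# Group labels, field names, and the trial data are illustrative
# assumptions, not requirements taken from the AI Act itself.
from collections import defaultdict

def per_group_error_rates(results):
    """results: list of dicts with keys 'group', 'genuine' (same person?),
    'accepted' (did the system declare a match?). Returns per-group
    false match rate (FMR) and false non-match rate (FNMR)."""
    counts = defaultdict(lambda: {"fm": 0, "imp": 0, "fnm": 0, "gen": 0})
    for r in results:
        c = counts[r["group"]]
        if r["genuine"]:
            c["gen"] += 1
            if not r["accepted"]:
                c["fnm"] += 1       # genuine pair rejected
        else:
            c["imp"] += 1
            if r["accepted"]:
                c["fm"] += 1        # impostor pair accepted
    return {
        g: {
            "FMR": c["fm"] / c["imp"] if c["imp"] else None,
            "FNMR": c["fnm"] / c["gen"] if c["gen"] else None,
        }
        for g, c in counts.items()
    }

# Tiny illustrative trial set (real testing needs far larger samples).
trials = [
    {"group": "A", "genuine": True,  "accepted": True},
    {"group": "A", "genuine": True,  "accepted": False},
    {"group": "A", "genuine": False, "accepted": False},
    {"group": "B", "genuine": True,  "accepted": True},
    {"group": "B", "genuine": False, "accepted": True},
    {"group": "B", "genuine": False, "accepted": False},
]
print(per_group_error_rates(trials))
```

Documenting results in this per-group form makes disparities between demographic groups directly visible, which is the evidence a conformity assessment file needs.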
For providers (the vendors who place systems on the market), the full high-risk requirements apply. For deployers, the businesses that configure or use high-risk systems, the duties centre on proper implementation, staff training, use monitoring, and cooperation with post-market surveillance.
The full high-risk compliance deadline is 2 August 2026.
GDPR Rules for Biometric Data in Germany
Facial recognition by definition processes biometric data — data derived from specific technical processing of physical characteristics that allows the unique identification of natural persons. Under GDPR Article 9, this is special-category data, and its processing is prohibited by default.
To process biometric facial data lawfully, a German company needs:
- A legal basis under Article 6 GDPR (most commonly contract, legitimate interest, or legal obligation)
- Plus a specific exception under Article 9(2) — most commonly explicit consent under Article 9(2)(a), or, for employment-related processing, the relevant provision of national law
Legitimate interest alone is not enough. The EU data protection framework does not permit balancing-test reasoning to justify processing special-category data. Consent under Article 9(2)(a) must be specific, informed, and freely given — and free in the GDPR sense means the individual can refuse without detriment.
Additional requirements for biometric data processing include:
- A Data Protection Impact Assessment (DPIA) under GDPR Article 35, which is typically mandatory for systematic biometric processing
- Appointment of a Data Protection Officer (DPO) if not already required
- Entries in the record of processing activities under Article 30
- Clear data subject information under Articles 13 and 14
- Defined retention periods and robust deletion procedures
German-Specific Rules: BDSG and DPA Enforcement
Germany layers additional protections on top of the GDPR. For employment contexts, Section 26 BDSG governs employee data processing and is more restrictive than pure GDPR. For biometric data processed in the employment context, Section 26(3) BDSG requires explicit consent or a collective agreement (works agreement), and proportionality is assessed strictly.
German data protection authorities have demonstrated willingness to pursue facial recognition cases:
- The Hamburg Data Protection Authority (HmbBfDI) investigated Clearview AI, found GDPR violations, and ordered the deletion of a German complainant's biometric profile
- The BfDI (Federal Commissioner for Data Protection and Freedom of Information) has taken positions restricting biometric identification in commercial contexts
- State data protection authorities (Landesdatenschutzbeauftragte) have the authority to audit, investigate, and fine operators of facial recognition systems without prior complaints
- The DSK has announced coordinated inspection activities on AI-enabled biometric systems, running in parallel with AI Act market surveillance measures
German enforcement is not theoretical. The proportionality principle is central to German data protection practice: even where a legal basis technically exists, authorities can challenge processing where less intrusive alternatives are available. This should be factored in from the outset of any compliance planning.
Workplace Facial Recognition in Germany
Using facial recognition in a German workplace is legally possible in limited scenarios, but employers should approach it as high-friction by default.
Works council co-determination is the first barrier. Under Section 87(1) no. 6 BetrVG, the works council has mandatory co-determination rights for the introduction and use of technical devices capable of monitoring employee behavior or performance. Facial recognition — even for access control — clearly qualifies. The employer cannot deploy without negotiating a works agreement (Betriebsvereinbarung).
A compliant workplace facial recognition works agreement should typically address:
- The specific purpose (e.g. building access only, not attendance scoring)
- Which employee groups are covered
- Enrollment process and alternatives for employees who decline
- Data minimisation — biometric templates should not be stored longer than needed for authentication
- Who has access to logs and under what conditions
- Explicit prohibition on using recognition data for performance, disciplinary, or promotion decisions
- Audit rights and review cycle
Consent to biometric data processing in employment is complex under both the GDPR and the BDSG. Because the power imbalance between employer and employee can make consent involuntary, providing a meaningful alternative (such as a PIN or key card) is often essential for consent to be considered freely given.
Facial recognition for employee performance monitoring or behavioral scoring is a distinctly harder case. Even if the employer has consent and a works agreement, using recognition data to track attendance granularity, identify when employees are away from workstations, or correlate presence with output metrics is likely to attract DPA challenge.
Lawful Use Cases for German Companies
Despite the restrictions, some commercial facial recognition deployments are legally viable in Germany:
| Use case | Legal status | Key requirements |
|---|---|---|
| Voluntary access control with alternative options | Possible | Explicit consent, DPIA, works agreement, no performance linkage |
| Identity verification for regulated financial services | Possible | High-risk AI Act compliance, GDPR consent, AML context |
| Age verification in regulated industries | Possible | Proportionality, high-risk compliance, data minimisation |
| Fraud detection in limited targeted contexts | Possible with controls | Strong legal basis, narrow scope, bias testing |
| Real-time identification in public spaces | Prohibited | No commercial exception exists |
| Scraping to build biometric databases | Prohibited | Banned under AI Act and GDPR |
| Untargeted employee behavioral monitoring | Effectively prohibited | Combination of AI Act, GDPR, and BetrVG barriers |
The common thread among lawful use cases is voluntariness, narrow purpose, meaningful alternatives, and no linkage to consequential decisions beyond the stated purpose.
Compliance Checklist: What to Complete Before August 2026
For German businesses deploying or procuring facial recognition systems, the window to 2 August 2026 is short. The following checklist structures the essential steps:
Phase 1: Inventory and Classification (immediate)
- Inventory all facial recognition systems — internally built and purchased
- Classify each use case as prohibited, high-risk, or limited risk
- Ask vendors for AI Act classification, conformity documentation, and EU database registration status
- Review existing GDPR documents (DPIA, records of processing activities) for currency
Phase 2: Secure GDPR Legal Bases
- Document Article 6 legal basis and Article 9(2) exception for each system
- Review consent mechanisms for GDPR compliance — particularly voluntariness in employment contexts
- Update or create DPIA (mandatory for systematic biometric processing)
- Engage Data Protection Officer and update records of processing activities
Phase 3: AI Act High-Risk Compliance
- Establish and document risk management system per Art. 9 EU AI Act
- Prepare technical documentation per Annex IV
- Conduct and document bias testing — results across gender, skin tone, and age groups
- Implement logging and record-keeping per Art. 12
- Operationalise human oversight — who overrides system outputs, how, and under what conditions?
- Initiate conformity assessment (determine whether self-assessment or notified body applies)
- Trigger EU database registration (providers) or request registration proof from vendor (deployers)
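Two of the Phase 3 items, logging and human oversight, can be sketched together. The snippet below is an illustrative assumption about how an Art. 12-style audit trail might be structured, not wording from the regulation: each recognition event becomes an append-only JSON record, and a qualified operator can record an override of the system's output without erasing the original decision.

```python
# Sketch: Art. 12-style event logging with a human-oversight hook.
# All field names and the pseudonymous-ID convention are assumptions
# made for illustration, not the regulation's own schema.
import json
import datetime

def log_event(subject_ref, match_score, decision, operator=None):
    """Build one append-only log record for a recognition event.
    subject_ref should be a pseudonymous ID, never a raw image."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "subject_ref": subject_ref,      # pseudonymised reference
        "match_score": match_score,
        "system_decision": decision,     # e.g. "match" / "no_match"
        "human_override": None,          # populated if an operator intervenes
        "operator": operator,
    }
    return json.dumps(record)

def apply_override(record_json, operator_id, new_decision, reason):
    """Human oversight: a qualified operator corrects the output.
    The original system decision is preserved for retrospective review."""
    record = json.loads(record_json)
    record["human_override"] = {
        "operator": operator_id,
        "decision": new_decision,
        "reason": reason,
    }
    return json.dumps(record)

# Example: the system matches, the operator overrules it.
rec = log_event("emp-0042", 0.91, "match")
rec = apply_override(rec, "op-7", "no_match", "poor image quality")
```

Keeping both the machine decision and the human correction in the same record is what makes the log "sufficient to enable retrospective review", and it also feeds the post-market surveillance metrics in Phase 4.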
Phase 4: Operational and Post-Market Governance
- Establish post-market surveillance processes: real-world performance, bias incidents, near misses
- Set up incident reporting pathways to national market surveillance authorities
- Review vendor contracts for AI Act obligations, liability allocation, and data processing terms
- Conduct and document staff training for personnel operating high-risk AI systems
Phase 5: Works Council and Employee Rights (where applicable)
- Engage works council early — before procurement, not after the contract is signed
- Negotiate works agreement or update existing agreement to reflect AI Act requirements
- Provide alternative enrollment option for employees who decline biometric registration
What Vendors Need to Know
If your business builds or supplies facial recognition technology to German or EU customers, you face a distinct set of obligations:
Conformity assessment: High-risk AI systems typically require a conformity assessment before market placement. Depending on the specific use case, this may be a self-assessment supported by technical documentation, or it may require a third-party notified body review.
EU database registration: High-risk systems covered by Annex III must be registered in the EU AI Act public database maintained by the Commission before they are placed on the market; the obligation applies from 2 August 2026. Operators relying on your system will ask for proof of registration.
CE marking: High-risk AI systems require an EU declaration of conformity and CE marking before market placement, as in other EU product legislation.
Post-market monitoring: Providers must implement post-market surveillance covering real-world performance, bias incidents, near misses, and feedback from deployers. Serious incidents and malfunctions must be reported to national market surveillance authorities.
Instructions for use: Deployers need sufficient documentation to use the system within its intended purpose. Providers cannot shield themselves from liability by arguing the deployer misconfigured the system if the instructions were inadequate.
Practical Compliance Steps for German Businesses
Before deploying any facial recognition system, German businesses should work through the following:
- Classify the use case — prohibited, high-risk, or limited risk? Most commercial uses are high-risk at minimum.
- Map the legal bases — Article 6 GDPR legal basis plus Article 9(2) exception for biometric data. Document both.
- Run a DPIA — mandatory for virtually all facial recognition systems involving natural persons.
- Engage the works council early — before procurement, not after the contract is signed.
- Build an alternative — ensure that employees and customers who decline biometric enrollment can still access the service.
- Audit your vendor — request the AI Act classification, conformity documentation, EU database registration, bias testing results, and data processing agreements.
- Plan for deletion — biometric templates and logs should have defined retention periods and enforceable deletion schedules.
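The "plan for deletion" step is the one most easily automated. The sketch below shows one possible retention sweep over enrolled biometric templates; the 90-day period and the record layout are illustrative assumptions, and the appropriate retention period must come from the DPIA, not from code.

```python
# Sketch: retention sweep for biometric templates.
# The 90-day period and record layout are illustrative assumptions;
# the lawful retention period is a DPIA outcome, not a code default.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # assumed policy, set per DPIA

def expired(records, now=None):
    """Return the IDs of templates whose retention period has lapsed."""
    now = now or datetime.now(timezone.utc)
    return [r["id"] for r in records if now - r["enrolled_at"] > RETENTION]

templates = [
    {"id": "t1", "enrolled_at": datetime(2026, 1, 1, tzinfo=timezone.utc)},
    {"id": "t2", "enrolled_at": datetime(2026, 4, 1, tzinfo=timezone.utc)},
]
print(expired(templates, now=datetime(2026, 4, 15, tzinfo=timezone.utc)))  # ['t1']
```

A scheduled job that runs such a sweep, deletes the expired templates, and writes a deletion record gives the "enforceable deletion schedule" the checklist calls for, rather than a policy that exists only on paper.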
Compound Law advises businesses on facial recognition compliance in Germany, covering AI Act classification, GDPR biometric data frameworks, DPIAs, works council negotiations, and vendor due diligence. For related topics, see our guides on AI biometric identification, AI employee monitoring, and AI recruitment screening.
This page provides general information only and is not a substitute for legal advice on a specific deployment.
Frequently Asked Questions
Is facial recognition illegal in Germany?
Not entirely, but the most invasive uses are prohibited. Real-time biometric identification in publicly accessible spaces is banned, as is scraping facial images to build databases. Commercial deployments such as access control and identity verification are legal in principle, but classified as high-risk — meaning comprehensive compliance is required before deployment.
When is the high-risk compliance deadline for facial recognition under the AI Act?
Full conformity obligations for high-risk facial recognition systems must be met by 2 August 2026, when the Act's high-risk rules enter application; the prohibitions have applied since 2 February 2025. Businesses without an active conformity assessment as of April 2026 are behind schedule: assessments can take several months, particularly where a notified body is required.
Can employers use facial recognition in Germany?
Only in narrow, well-justified scenarios. Works council co-determination applies under Section 87(1) no. 6 BetrVG, explicit consent or another GDPR Article 9(2) basis is required, and a DPIA is typically mandatory. For access control, employers should provide a non-biometric alternative for employees who decline.
What is the GDPR rule on biometric data in Germany?
Biometric data is special-category data under GDPR Article 9. Processing it is prohibited by default. An Article 9(2) exception — most commonly explicit consent — is required in addition to a standard Article 6 legal basis. Legitimate interest alone is not sufficient. In employment contexts, Section 26(3) BDSG adds further requirements.
Is Clearview AI banned in Germany?
In practice, yes. The Hamburg DPA found GDPR violations and ordered Clearview AI to delete a German complainant's biometric profile. The EU AI Act now explicitly bans scraping internet images to build biometric databases, making that practice unlawful under both frameworks.
What happens if you use facial recognition without GDPR compliance?
Fines under GDPR Article 83(5) can reach €20 million or 4% of global annual turnover, whichever is higher. Under the AI Act, prohibited practices carry fines up to €35 million or 7% of global annual turnover. German DPAs can also order deletion of unlawfully processed data, suspend systems, and impose injunctions. Enforcement risk is real: German authorities have pursued facial recognition cases.
Do I need a notified body for facial recognition compliance?
It depends on the specific use case. Many high-risk facial recognition systems can use a self-assessment supported by technical documentation. However, certain use cases — particularly those involving law enforcement or critical infrastructure — may require a notified body. Providers should consult the AI Act Annex and, where needed, obtain specialist legal advice on which conformity assessment route applies.
Which authority enforces the AI Act for facial recognition in Germany?
The Bundesnetzagentur (Federal Network Agency) is designated as the market surveillance authority for certain AI areas. Data protection aspects remain with the DPAs (BfDI and state-level authorities). For sector-specific applications — such as financial services or critical infrastructure — BaFin or the BSI may have additional jurisdiction. In practice, cross-authority coordination is increasingly likely.