AI Image Generation: What German Companies Need to Know
AI image generation is transforming content creation: DALL-E, Midjourney, and Stable Diffusion are everywhere. Under the EU AI Act, these tools are generally not classified as high-risk, but they do carry specific transparency requirements.
The Transparency Requirement
AI-generated images that could be mistaken for real content must be marked as artificially generated. This applies to deepfakes, synthetic media, and photorealistic AI content.
The requirement targets deception potential. If your AI creates images that people might think are photographs, those images need clear disclosure.
What This Means Practically
If you’re using AI to generate marketing images, product visuals, or creative content, consider whether the output could be mistaken for authentic photographs. Stylized illustrations typically don’t need marking. Photorealistic faces or scenes do.
The disclosure must be machine-readable where technically feasible. In practice this means embedded metadata labeling, not just visible watermarks, though visible disclosure is also advisable where the risk of deception is high.
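As a minimal sketch of what machine-readable labeling can look like, the following example writes a disclosure tag into a PNG image's embedded metadata using the Pillow library. The key names are illustrative only, not a prescribed format; production systems should follow an established provenance standard such as C2PA rather than ad-hoc keys.

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def label_as_ai_generated(src_path: str, dst_path: str, tool: str) -> None:
    """Copy an image while embedding a machine-readable AI-disclosure label.

    The metadata keys below are hypothetical examples; check the relevant
    provenance standard (e.g. C2PA) for interoperable field names.
    """
    img = Image.open(src_path)
    meta = PngInfo()
    meta.add_text("AIGeneratedContent", "true")   # disclosure flag
    meta.add_text("GenerationTool", tool)          # which system produced it
    img.save(dst_path, pnginfo=meta)

def read_disclosure(path: str) -> dict:
    """Return the text metadata embedded in a PNG, including any AI labels."""
    return dict(Image.open(path).text)
```

A downstream consumer (or an auditor) can then check `read_disclosure("image.png")` for the flag without relying on a visible watermark.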
Provider vs. Deployer Obligations
AI image generation platforms (providers) must build in disclosure capabilities and inform users about obligations. Businesses using these tools (deployers) must actually implement appropriate disclosure for their specific use cases.
If you’re using an image generation API, you’re responsible for ensuring proper disclosure of outputs you publish.
Copyright Considerations
Beyond AI Act requirements, AI-generated images raise copyright questions under German law. Training data issues, originality requirements, and ownership of AI outputs remain legally unsettled. Document your use carefully.
How Compound Law Helps
- Assessment of disclosure requirements for your use cases
- Implementation guidance for transparency obligations
- Policy development for AI-generated content
- Copyright risk assessment for generative AI
- Ongoing compliance monitoring
Frequently Asked Questions
Do AI illustrations need disclosure? Stylized illustrations typically don’t. Photorealistic images that could be mistaken for photographs do.
What kind of disclosure is required? Machine-readable metadata where feasible, plus clear disclosure to end users when content could be mistaken for real.
Who is responsible—the AI provider or us? Both have obligations. Providers must enable disclosure; deployers must implement it appropriately.