
AWS Bedrock DPA and GDPR in Germany: What Legal Should Review

Short answer

AWS Bedrock can be a strong GDPR option for German companies, but legal teams still need to review the AWS DPA, regional setup, transfer logic, logging, and separate third-party model terms before treating it as a lower-risk procurement route.

  • Check the AWS processor layer and the model-provider layer separately.
  • Treat EU hosting as a configuration question, not a blanket guarantee.
  • Verify prompt handling, logs, and transfer mechanics before rollout.

AWS Bedrock can be lawful under the GDPR for German companies, and it is often easier to procure than a direct model-provider contract, but only if your team separates the AWS platform commitments from the third-party model terms and verifies where processing actually happens. In practice, the legal review should focus on the AWS DPA, Article 28 GDPR processor terms, Chapter V transfer logic, logging and retention settings, and the extra contractual layer that applies when you use Anthropic, Meta, Mistral, or other third-party models through Bedrock.

That makes Bedrock less of a simple “yes or no” compliance question and more of a procurement architecture question. If legal, security, and procurement review the stack together, Bedrock can be a sensible route for Germany-based businesses that want enterprise controls without negotiating separately with every model vendor. For a broader view of how this compares with other enterprise AI solutions, see the AI tools assessed by Compound Law.

Can German companies use AWS Bedrock lawfully?

Yes, in many cases. But the answer depends on how you configure Bedrock, which model family you use, and what data you send through it.

For most internal productivity, software development, document drafting, support enablement, and retrieval-based use cases, Bedrock can fit into a defensible GDPR setup — including use cases that require AI chatbot compliance under GDPR or AI data analytics compliance — if you:

  1. map the processing purpose and data categories up front,
  2. put the AWS contractual layer in place,
  3. validate the region and transfer design,
  4. review any provider-specific model terms, and
  5. document the rollout in your AI governance process.

The key legal touchpoints are familiar:

  • Article 28 GDPR for processor terms and instructions,
  • Articles 5 and 32 GDPR for data minimization and security,
  • Chapter V GDPR for international transfers,
  • Article 35 GDPR if the implementation creates a need for a DPIA,
  • § 87(1) no. 6 BetrVG if employee-facing deployment creates co-determination issues,
  • and the EU AI Act if your Bedrock-enabled workflow falls into transparency or high-risk categories.

If you are still comparing vendors, our guides on Claude Enterprise, Azure OpenAI, OpenAI API, and Perplexity show where the contracting and hosting logic differs.

Does AWS provide a DPA and how does Bedrock fit into it?

For most enterprise customers, Bedrock is not contracted as a standalone legal universe. It sits inside the AWS customer and data processing framework. That is useful, because many companies already have AWS procurement, security review, and vendor-management processes in place.

What legal should check is not just “is there a DPA?” but what exactly the DPA covers in the Bedrock workflow:

  • whether your AWS agreement and data processing terms cover the services you will use,
  • whether Bedrock usage is documented in the relevant internal vendor record,
  • whether subprocessors and support structures are acceptable,
  • whether incident, deletion, and audit clauses align with your internal standards,
  • and whether the instructions you give users match the contract assumptions.

For German procurement teams, the practical point is this: Bedrock usually reduces friction because AWS is the main processor relationship, but it does not eliminate legal review. You still need to confirm that Bedrock is being used within the agreed AWS framework and that no unsupported assumption is being made about training, retention, or geographic scope.

How to access the AWS Bedrock Data Processing Agreement

AWS offers a Data Processing Addendum (DPA) for GDPR compliance; it is incorporated into the AWS Customer Agreement and covers Amazon Bedrock as part of the AWS service scope. Unlike some SaaS vendors that require a separate DPA signature, AWS embeds the DPA into its standard terms, which means it is already in place for customers with an active AWS account.

To access and review the AWS DPA for a Bedrock rollout:

  1. Log in to the AWS Console and navigate to Account Settings > Privacy and Terms (or visit the AWS Data Privacy page directly at aws.amazon.com/compliance/data-privacy/).
  2. Download the current AWS Data Processing Addendum and the associated Service Terms that cover Amazon Bedrock.
  3. Review the AWS Subprocessor List, which AWS publishes and updates when subprocessors change — legal teams should confirm that the listed subprocessors are acceptable under your data privacy policy.
  4. For enterprise customers with custom commercial terms, contact your AWS account team to negotiate data processing specifics, confirm coverage scope, or obtain a countersigned DPA if required by your procurement or legal policy.
  5. If Anthropic, Meta, Mistral, or other third-party models are in scope, check whether any provider-specific model terms in your Bedrock setup create an additional processing layer that requires separate review.

The AWS DPA covers:

  • Article 28 GDPR processor obligations: AWS commits to processing personal data only as instructed, implementing appropriate technical and organizational security measures, and assisting with data subject requests.
  • Subprocessors: AWS publishes its subprocessor list, including infrastructure and support entities. Legal teams should review whether any subprocessors involve non-EU data handling relevant to their specific use case.
  • Standard Contractual Clauses (SCCs): AWS uses the 2021 EU SCCs for personal data transferred outside the EEA. Typically Module 2 (Controller to Processor) applies.
  • Data deletion and retention: The DPA addresses how AWS handles data on service termination, including deletion timelines and portability obligations.
  • Amazon Bedrock scope: Confirm that your AWS agreement explicitly covers Bedrock and that your approved use cases and data types are consistent with the AWS Acceptable Use Policy for AI/ML services.

Companies comparing procurement terms across enterprise AI providers can also review the Azure OpenAI data privacy review and the Compound Law data processing agreement compliance guide for context on how Article 28 obligations differ across vendors.

AWS Bedrock certifications and GDPR transfer safeguards for Germany

AWS holds a broad certification portfolio relevant to German enterprise procurement. The most commonly requested in DACH procurement processes are:

  • ISO 27001 — AWS’s information security management system (ISMS) is ISO 27001-certified across its global infrastructure, including the Frankfurt (eu-central-1) region.
  • SOC 2 Type II — AWS publishes SOC 2 Type II reports covering its control environment. These reports are accessible via AWS Artifact in the AWS Console and are routinely required by German security and procurement teams.
  • C5 (Cloud Computing Compliance Criteria Catalogue) — AWS holds the BSI C5 attestation, which is the German Federal Office for Information Security’s (BSI) cloud security standard. For public-sector and heavily regulated German buyers, C5 is often a prerequisite.
  • ISO 27701 — AWS holds this privacy information management certification, which specifically addresses GDPR-related privacy controls.

For German companies, the key GDPR transfer and compliance safeguards are:

  • Frankfurt region (eu-central-1): Selecting the Frankfurt region keeps standard inference and data storage within Germany and the EU. This is the most important configuration decision for GDPR-sensitive workloads. For the Anthropic-on-Bedrock workflow, see also the Claude EU hosting guide for context on how regional choices affect the overall compliance posture.
  • Standard Contractual Clauses (SCCs): AWS uses the 2021 EU Commission SCCs for transfers of personal data from the EEA to third countries. Confirm that Module 2 (Controller to Processor) applies to your Bedrock configuration and review whether a Transfer Impact Assessment (TIA) is required under your internal GDPR compliance framework.
  • Cross-Region inference: AWS offers a cross-Region inference feature that can route requests across multiple AWS regions for capacity and latency reasons. If this feature is enabled, data may leave the selected EU region — legal teams should explicitly disable it or account for it in their Chapter V transfer analysis.
  • BDSG: The Bundesdatenschutzgesetz (BDSG) applies alongside the GDPR in Germany. For employee data, special category data, or behavioral data processed through Bedrock-enabled applications, the BDSG’s stricter consent and necessity conditions apply independently of the AWS DPA.
  • BetrVG: If Bedrock is used in workflows that affect employees — for example, HR document automation, performance analysis, or internal support tools — § 87(1) no. 6 BetrVG may require works council co-determination before deployment, regardless of whether the DPA is in order.
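
The cross-Region inference check above can be partly automated. AWS identifies cross-Region inference profiles with geography-prefixed model IDs (for example `eu.` or `us.` in front of the model name), so a deployment pipeline can flag them before a model is approved. A minimal Python sketch, assuming the `us.`/`eu.`/`apac.` prefix convention matches the profiles in your account (verify against the console); the function names are illustrative:

```python
# Guard for Bedrock model IDs: flag cross-Region inference profiles, which
# route requests across several AWS Regions. The prefix list below follows
# AWS's geography-prefixed naming convention for inference profiles, but it
# is an assumption to verify against your account's actual profiles.

CROSS_REGION_PREFIXES = ("us.", "eu.", "apac.")

def uses_cross_region_inference(model_id: str) -> bool:
    """Return True if the model ID looks like a cross-Region inference profile."""
    return model_id.startswith(CROSS_REGION_PREFIXES)

def approve_model_id(model_id: str, allow_cross_region: bool = False) -> str:
    """Reject cross-Region profiles unless the transfer analysis has cleared them."""
    if uses_cross_region_inference(model_id) and not allow_cross_region:
        raise ValueError(
            f"{model_id} is a cross-Region inference profile; "
            "complete the Chapter V transfer analysis before enabling it."
        )
    return model_id

# A plain single-Region model ID passes and stays in the selected Region:
approve_model_id("anthropic.claude-3-haiku-20240307-v1:0")
```

Running this guard in CI or in the request path keeps the "EU-hosted" assumption honest: a developer cannot silently switch to a cross-Region profile without the legal sign-off flag.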

For a broader view of how AWS Bedrock’s GDPR posture compares with other enterprise AI platforms, the Compound Law tools overview covers the main procurement and compliance differences across vendors.

Regional hosting, transfers, and what “EU-hosted” actually covers

This is where many legal reviews go wrong. Teams hear “EU region” and infer “EU-only processing.” That inference is too broad.

Bedrock supports European regions, including Europe (Frankfurt) and Europe (Ireland). AWS also states in its Bedrock FAQ that customer content is encrypted and stored at rest in the region where you are using the service. That is a helpful starting point for GDPR analysis, especially if your business wants to keep production inference inside the EU.

But legal should still verify at least four separate questions:

| Issue | AWS layer | Model-provider layer | What legal should verify |
| --- | --- | --- | --- |
| Contract scope | AWS contract and DPA govern the service relationship | Provider terms may still apply for third-party models | Whether both layers are acceptable for the intended use case |
| Inference location | Region selection can keep standard usage in a chosen AWS region | Some model features or profiles can rely on broader routing | Whether cross-Region inference or other routing is enabled |
| Data at rest | AWS says customer content is encrypted and stored in the region in use | Provider-specific terms may describe additional handling limits | Whether logs, caches, and connected services stay in scope |
| International transfers | AWS offers GDPR transfer mechanisms such as SCC-based structures where needed | Provider layer may create extra transfer and subprocessor questions | Whether Chapter V analysis is complete, not assumed away |

The most important procurement lesson is that regional deployment is a technical control, not a legal shortcut. If a team enables a feature that uses cross-Region inference, or connects Bedrock to other AWS services that replicate data elsewhere, the original “EU-hosted” assumption may no longer describe the real setup.

Third-party models on Bedrock: who processes what?

Bedrock is attractive because it puts third-party models behind an AWS interface. That can simplify security review, IAM, billing, and centralized access control. But it does not collapse the legal stack into a single layer.

You should separate:

  • AWS as the infrastructure and service operator, and
  • the model provider as the provider of the underlying model or model-specific terms.

That matters because AWS publishes third-party model terms for Bedrock, and those terms can differ by provider. Procurement should therefore ask:

  1. Which model families will be approved for use?
  2. Do those model families introduce provider-specific acceptable-use, IP, or indemnity terms?
  3. Does the business want one default model only, or a multi-model setup with different legal profiles?
  4. Is the team using AWS-native models, third-party foundation models, fine-tuned deployments, or agents with external tools?

From a governance perspective, Bedrock often makes sense when a company wants to approve one AWS channel and then manage model-level permissions internally. But that only works if the approved-model list is controlled and legal has reviewed the provider layer in advance.

Prompt retention, training, logging, and security controls

AWS states in its Bedrock FAQ that model invocation requests and responses are not shared with model providers, and that AWS does not use Bedrock input or output to train AWS models. That is an important difference from some older direct-API assumptions in the market.

Still, “not used for training” is not the same as “no data ever exists anywhere.” Legal and security should review:

  • application logs and observability pipelines,
  • CloudTrail and administrative event visibility,
  • prompt content copied into tickets, chat systems, or debugging tools,
  • retrieval sources connected to Bedrock,
  • data classification rules for uploads,
  • and deletion or retention settings in surrounding systems.
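
Part of this log review can be scripted. AWS ARNs carry the Region in their fourth colon-separated field (empty for global resources such as S3 buckets), so a first-pass check can flag logging or observability destinations outside the approved EU Regions. A sketch with an illustrative allowlist and example destinations (adjust both to your approved architecture):

```python
# First-pass review helper: flag AWS resource ARNs whose Region is not on the
# approved EU list. ARN format: arn:partition:service:region:account:resource.
# The allowlist and destination ARNs below are examples, not a recommendation.

APPROVED_REGIONS = {"eu-central-1", "eu-west-1", ""}  # "" = Region-less ARN (e.g. S3)

def arn_region(arn: str) -> str:
    """Extract the Region field from an AWS ARN."""
    parts = arn.split(":")
    if len(parts) < 6 or parts[0] != "arn":
        raise ValueError(f"not an ARN: {arn}")
    return parts[3]

def out_of_scope(arns: list[str]) -> list[str]:
    """Return the destinations whose Region is not on the approved list."""
    return [a for a in arns if arn_region(a) not in APPROVED_REGIONS]

destinations = [
    "arn:aws:logs:eu-central-1:111122223333:log-group:/bedrock/invocations",
    "arn:aws:logs:us-east-1:111122223333:log-group:/debug/prompts",
    "arn:aws:s3:::my-eu-audit-bucket",
]
out_of_scope(destinations)  # flags the us-east-1 log group
```

A script like this does not replace the Chapter V analysis, but it catches the common drift case where a debugging log group quietly lands outside the EU.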

If the rollout involves personal data, trade secrets, employee information, or regulated documentation, the safe approach is to document a use-case matrix that states what may be sent to Bedrock, what must be pseudonymized first, and which workflows require additional approval.
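
Such a use-case matrix is easiest to enforce if it is encoded in the application layer, so every prompt is gated before it reaches Bedrock. A minimal sketch; the data classes and rules are illustrative placeholders, not legal advice:

```python
# Use-case matrix as code: which data classes may be sent to Bedrock as-is,
# which must be pseudonymized first, and which are blocked entirely.
# Category names and rules are examples — define your own in the governance
# process and keep them in sync with the documented matrix.

RULES = {
    "public": "allow",
    "internal": "allow",
    "personal_data": "pseudonymize",
    "employee_data": "pseudonymize",
    "trade_secret": "block",
    "special_category": "block",  # Art. 9 GDPR data stays out entirely
}

def gate_prompt(data_class: str) -> str:
    """Return the action for a prompt of the given data class.

    One of 'allow', 'pseudonymize', or 'block'; unknown classes are
    blocked by default so the gate fails closed.
    """
    return RULES.get(data_class, "block")
```

The fail-closed default matters: a new data category that nobody has classified yet should trigger an approval workflow, not slip through.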

When Bedrock is preferable to direct vendor contracts

For some Germany-based businesses, Bedrock is not just technically convenient. It is legally and operationally cleaner than going direct to multiple model vendors.

Bedrock is often the better route when:

  • your company already has mature AWS procurement and security review,
  • legal prefers a centralized vendor relationship rather than many direct model contracts,
  • IAM, logging, and network controls need to sit inside an existing AWS environment,
  • procurement wants one spend channel and one approval path,
  • or the business wants to test multiple models without onboarding each vendor separately.

This centralized approach is particularly valued in German financial services and manufacturing, where centralized vendor governance and strict data-handling rules are standard requirements; see our guides on financial services AI regulation in Germany and manufacturing sector AI adoption.

Direct vendor contracting can still be preferable when a provider offers stronger dedicated EU commitments, when key features are only available outside Bedrock, or when customer contracts require direct vendor obligations. In other words, Bedrock changes the procurement posture, not the need for legal review.

Before approving Bedrock, ask the joint legal, privacy, security, and procurement team to complete this checklist:

  1. Identify the approved Bedrock use cases and the categories of data involved.
  2. Confirm which AWS entities, contracts, and processor terms apply.
  3. Check which Bedrock region will be used in production and test environments.
  4. Verify whether any model profile, feature, or integration uses cross-Region routing.
  5. Approve the specific model providers that employees or product teams may access.
  6. Review provider-specific Bedrock terms for IP, use restrictions, and liability allocation.
  7. Document whether a DPIA is required under Article 35 GDPR.
  8. Assess whether employee-facing deployment triggers works council involvement under § 87 BetrVG.
  9. Define rules for prompts, uploads, logs, and connected knowledge bases.
  10. Record the decision in the company’s AI governance and vendor-management process.

If you are building a broader procurement framework for generative AI in Germany, our expertise page shows how data privacy, commercial contracts, employment law, and AI Act compliance need to fit together.

What Compound Law helps with

Compound Law supports German and DACH businesses with Bedrock DPA review, provider-term red-flag analysis, GDPR transfer and DPIA scoping, AI governance rules, and works council strategy for internal AI deployments.

FAQ

What should legal teams review before approving AWS Bedrock?

The main question is whether your actual Bedrock setup matches the contractual and technical assumptions behind the AWS processor relationship. Legal should not stop at “AWS has a DPA”; it should check service scope, region design, provider terms, logging, and transfer mechanics.

How does AWS Bedrock GDPR compliance differ from a direct Anthropic or Meta contract?

Bedrock can centralize the infrastructure and procurement layer through AWS, which often simplifies access control and vendor management. But if you use third-party models, you still need to review the model-provider terms and confirm whether the legal risk profile is really lower for your use case.

Do I need SCCs if I use Bedrock in Europe?

Maybe, depending on the full architecture. EU-region deployment improves the position, but Chapter V analysis still depends on support access, subprocessors, connected services, and any cross-Region or non-EU handling introduced by the selected setup.

Does Bedrock change the risk profile of Anthropic, Meta, or other third-party model access?

Yes, often in a positive way for procurement, because AWS becomes the main operational layer. But it does not erase provider-specific restrictions or all international-transfer questions, so the risk profile changes rather than disappears.

Do we still need a case-by-case legal review before rollout?

Usually yes if the use case involves personal data, employee data, regulated workflows, or customer-facing automation. A tool guide can explain the framework, but deployment decisions still need review against the company’s exact use case and contract stack.

How do I access the AWS Bedrock Data Processing Agreement?

The AWS Data Processing Addendum (DPA) is incorporated into the AWS Customer Agreement and applies automatically to Amazon Bedrock. You can download it from the AWS Data Privacy page (aws.amazon.com/compliance/data-privacy/) or access it through AWS Artifact in the AWS Console. Enterprise customers should also confirm with their AWS account team that their specific use case and data types fall within the agreed contractual scope — particularly if using third-party foundation models through Bedrock.

Does AWS Bedrock use Standard Contractual Clauses for Germany?

Yes. AWS uses the 2021 EU Standard Contractual Clauses (SCCs) as its primary GDPR transfer mechanism for personal data processed outside the EEA. For most Bedrock enterprise use cases, Module 2 (Controller to Processor) applies. Legal teams should confirm that the Frankfurt region (eu-central-1) is selected for production workloads and that the cross-Region inference feature is not enabled without a corresponding update to their Chapter V transfer analysis. AWS Artifact provides the current SCC documentation for download during procurement review.

Related Tool Guides


Claude Enterprise in Germany: GDPR Compliance, DPA, SCCs & EU Hosting Guide

Can German companies use Claude Enterprise under GDPR? Covers DPA/AVV, SCCs, EU hosting options, data residency, and a compliance checklist before rollout.


GitHub Copilot GDPR: DPA, IP & German Compliance Guide

GitHub Copilot is GDPR-compliant only on Business or Enterprise plans with a signed DPA. German companies: IP, Betriebsrat, and data residency checklist.


Notion DPA and GDPR: Can German Companies Use Notion Compliantly?

Notion DPA, GDPR compliance, EU data hosting, and AVV requirements for German companies. Practical guide for legal, privacy, and IT teams.


ChatGPT Enterprise GDPR & DPA: Compliance Guide for German Companies 2026

Is ChatGPT Enterprise GDPR compliant? OpenAI DPA, EU data residency, SOC 2, AI Act obligations, and works council requirements for German companies.


AI APIs for Law Firms in Germany: BRAO, GDPR & Secrecy Guide

Can lawyers in Germany use AI tools like Claude or ChatGPT? BRAO §43a, GDPR Art. 28, and BRAK guidance explained — with a 7-point compliance checklist.


Make.com DPA: Does Make Have a Data Processing Agreement? (GDPR Guide)

Make.com offers a DPA for paid plan customers. What German companies must verify for GDPR compliance — EU data residency, sub-processors, and BetrVG.


Frequently asked questions

Does AWS provide a DPA for Amazon Bedrock?

Yes. Bedrock sits within AWS's broader GDPR and data processing framework, but legal should still verify whether the relevant AWS contract set, service scope, and technical setup fully cover the intended Bedrock use case.

Does Bedrock keep prompts and outputs in the EU?

It can, but that depends on the region and features you enable. Standard regional deployment is different from cross-Region inference, logging, or integrations that may change where data is handled.

Do third-party Bedrock models create separate legal review work?

Yes. AWS is the platform layer, but model-provider terms can add product-specific restrictions, acceptable-use rules, and commercial conditions that procurement and legal should review separately.

How do I access the AWS Bedrock Data Processing Agreement (DPA)?

The AWS Data Processing Addendum (DPA) is incorporated into the AWS Customer Agreement and can be downloaded from the AWS Data Privacy page. Enterprise customers can also negotiate custom DPA terms with their AWS account team. The DPA covers Amazon Bedrock as part of the broader AWS service scope.

Does AWS Bedrock use Standard Contractual Clauses (SCCs) for Germany?

Yes. AWS uses the 2021 EU Standard Contractual Clauses as its transfer mechanism for personal data transferred outside the EEA. Legal teams should confirm which SCC module applies — typically Module 2 (Controller to Processor) — and verify that the Frankfurt region (eu-central-1) is selected for production workloads.

Does Bedrock reduce transfer risk compared with direct vendor APIs?

Often yes, because procurement can centralize security and contracting through AWS. But Bedrock does not remove all transfer, subprocessor, or provider-term questions, so the risk profile still needs case-by-case review.

Do German companies need a DPIA or works council review for Bedrock?

Sometimes. A DPIA may be required if the rollout creates high privacy risk, and works council co-determination can be triggered if Bedrock affects employee monitoring, evaluation, or workplace organization.
