What Microsoft Copilot Actually Processes
Microsoft Copilot for Microsoft 365 is not a standalone AI assistant — it is deeply integrated with your organization's M365 data estate. When an employee uses Copilot, the system has access to a broad corpus of business content:
Email & Calendar
Exchange mailboxes, calendar events, and contact lists — including customer correspondence and meeting notes containing PII
Files & Documents
SharePoint sites and OneDrive — contracts, HR records, financial reports, patient files if stored in M365
Teams & Chat
Teams messages, channel posts, and meeting transcripts — often containing informal data sharing that bypasses formal DLP controls
Copilot uses retrieval-augmented generation (RAG) to index this data and surface relevant content when responding to prompts. This means personal data about customers, employees, patients, and business partners becomes part of the AI's working context — even if it was never explicitly shared with Copilot.
In addition to RAG-indexed data, Copilot processes whatever employees type directly into prompts: customer names, case details, financial figures, medical information. This prompt-level data is the hardest to govern because it is entirely user-driven and not subject to M365 data classification policies.
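Prompt-level exposure can at least be made visible before it becomes a governance gap. A minimal sketch of prompt scanning, using deliberately simplified regular expressions for e-mail addresses, phone numbers, and IBANs (illustrative patterns only, not production-grade recognizers):

```python
import re

# Simplified, illustrative patterns -- real PII detection needs far more
# robust recognizers (checksums, context, national ID formats).
PII_PATTERNS = {
    "EMAIL_ADDRESS": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE_NUMBER": re.compile(r"\+\d[\d /-]{7,}\d"),
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def scan_prompt(prompt: str) -> dict:
    """Return every suspected PII match in a prompt, grouped by entity type."""
    hits = {}
    for entity, pattern in PII_PATTERNS.items():
        found = pattern.findall(prompt)
        if found:
            hits[entity] = found
    return hits

prompt = ("Summarize the complaint from maria.mueller@example.com, "
          "phone +49 30 123456, IBAN DE89370400440532013000.")
print(scan_prompt(prompt))
```

A scanner like this can flag a prompt for review before it is sent, but it cannot replace a proper anonymization layer; pattern matching misses context-dependent PII such as names.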
Microsoft's GDPR Position: What the DPA Actually Covers
Microsoft offers a Data Processing Agreement (DPA) for enterprise M365 customers, which makes Microsoft a data processor under GDPR Article 28. This is a necessary baseline — without a DPA, using M365 for personal data would itself be non-compliant. Microsoft's DPA includes Standard Contractual Clauses for international transfers.
Microsoft also offers EU data residency configuration: enterprise customers can pin their M365 tenant to European data centers, keeping stored data within the EU. This is a genuine compliance improvement over the default multi-region configuration.
However, there are important caveats EU IT teams need to understand before declaring Copilot compliant:
- Telemetry data — Copilot diagnostic and performance telemetry may be processed outside the EU even with EU residency configured. Microsoft's documentation distinguishes "Customer Data" (EU residency honored) from "Service Data" (global processing).
- Connected Experiences — Microsoft's "Connected Experiences" feature can share content with Microsoft for product improvement. EU customers must explicitly opt out via the Microsoft 365 admin center to prevent this data flow.
- Prompt data residency — The EU data boundary commitment covers stored Customer Data. Real-time Copilot prompt processing may route through different infrastructure than stored data residency settings apply to.
Minimum configuration for EU compliance
EU customers should: (1) confirm EU data residency is active in the M365 admin center, (2) disable "Optional Connected Experiences" in the Office privacy settings policy, (3) review the Microsoft Products and Services DPA and ensure it is signed as part of the enterprise agreement, and (4) conduct a DPIA that accounts for residual non-EU data flows in telemetry and real-time processing.
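The four steps above can be tracked as a simple rollout gate. The checklist values below are recorded manually or fed from your own tenant-audit tooling; this is an illustrative sketch, not a Microsoft API:

```python
# Illustrative compliance gate for the four baseline steps.
# Values are maintained by hand (or by your own audit tooling);
# this is not a Microsoft admin API call.
BASELINE_CHECKS = {
    "eu_data_residency_active": True,            # M365 admin center
    "optional_connected_experiences_off": True,  # Office privacy policy
    "dpa_signed_in_enterprise_agreement": True,
    "dpia_covers_residual_flows": False,         # still outstanding here
}

def copilot_rollout_approved(checks: dict) -> bool:
    """Only approve a Copilot rollout when every baseline item is done."""
    return all(checks.values())

missing = [name for name, done in BASELINE_CHECKS.items() if not done]
print("approved:", copilot_rollout_approved(BASELINE_CHECKS),
      "missing:", missing)
```

Gating the rollout on all four items keeps the DPIA from being treated as optional paperwork after the tool is already live.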
The Remaining GDPR Risk: Unmanaged Prompt Data
Even after configuring EU data residency and disabling Connected Experiences, one risk category remains almost entirely unaddressed by Microsoft's configuration options: employees pasting or typing personal data directly into Copilot prompts.
This is the shadow AI problem applied to an officially sanctioned tool. When Copilot is deployed, employees have a trusted, convenient interface for interacting with AI. The natural behavior is to paste in the document they are working on — including all customer names, email addresses, phone numbers, and account details it contains — and ask Copilot to summarize, rewrite, or analyze it.
What EU data residency covers
- Documents stored in SharePoint
- Emails at rest in Exchange
- Teams messages at rest
- Calendar and contact data at rest
What EU data residency does NOT cover
- PII typed directly into Copilot prompts
- Customer data pasted from external systems
- Conversation history stored in Copilot logs
- Screenshot and image analysis (US processing)
The GDPR legal basis for this prompt-level processing is also unclear. If an employee pastes customer PII into Copilot and the conversation history is retained, the customer has not consented to their data being processed by a third-party AI system — and the organization likely lacks a legitimate interest basis that would survive a challenge from a European data protection authority.
Copilot Data Flows: GDPR Risk Assessment
| Data Flow | Data Residency | DPA Covers | GDPR Risk |
|---|---|---|---|
| Document analysis via M365 index | Configurable EU | Available | LOW |
| Copilot chat prompts (typed text) | Not always EU | Partial | MEDIUM |
| Employee pastes customer data | Unknown | No | HIGH |
| Image/screenshot analysis | US processing | No | HIGH |
Based on Microsoft EU Data Boundary documentation and GDPR Art. 28/44 requirements. March 2026.
EU AI Act 2026: Additional Pressure on Copilot Deployments
The EU AI Act adds a new compliance layer on top of GDPR for organizations deploying Copilot. Copilot is built on general-purpose AI models as defined by the AI Act, which triggered transparency and documentation obligations for model providers as of August 2025.
More significantly, organizations that deploy Copilot for use cases touching employment decisions, customer creditworthiness scoring, or benefits assessment may thereby be operating a "high-risk AI system" under AI Act Annex III. High-risk deployments trigger Article 10 data governance requirements:
- Data minimization documentation — Personal data feeding into high-risk AI must be documented and minimized. Undocumented RAG indexing of HR or customer files creates Art. 10 exposure.
- Training data relevance — Art. 10(3) requires that training, validation, and testing data be relevant, sufficiently representative, and, to the best extent possible, free of errors. Organizations cannot simply pass raw M365 data through Copilot without assessing data quality.
- Human oversight — High-risk AI systems must allow meaningful human review of outputs before consequential decisions. Copilot deployments used for HR or credit decisions need documented human oversight processes.
The practical implication: EU organizations deploying Copilot for any use case touching regulated data categories (HR data, financial data, health information) should conduct an AI Act risk classification assessment before August 2026 — and build a PII minimization layer into the Copilot workflow architecture before enforcement begins.
The Solution: PII Anonymization Before Copilot Processes Your Data
The architectural fix for Copilot's GDPR gap is a PII anonymization layer that intercepts personal data before it reaches the Copilot inference engine. Two deployment approaches work for different organizational contexts:
Proactive: Office Add-in
The cloak.business Office Add-in adds an "Anonymize" button directly inside Word, Excel, and Outlook. Before an employee asks Copilot to process a document, they click Anonymize — replacing all PII with reversible encrypted tokens.
- Works with existing M365 workflow
- Deployed via M365 Admin Center
- Reversible — original PII is restored after Copilot responds
Reactive: API Proxy
For developer teams, the cloak.business API can intercept M365 Graph API calls and anonymize content on-the-fly before it is passed to Copilot as context — requiring no user action.
- Transparent to end users
- 317 entity recognizers in 48 languages
- German server processing — no SCCs required
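The proxy pattern can be sketched as middleware: content fetched from the Microsoft Graph API passes through an anonymization step before Copilot ever sees it. Everything below is a hypothetical sketch; the `anonymize` stand-in replaces a call to an anonymization service whose contract is assumed, not documented:

```python
from typing import Callable, Dict

# Hypothetical proxy sketch: Graph API content is anonymized before it
# becomes Copilot context. anonymize() is a toy stand-in for a real
# anonymization service (its contract is an assumption).

def anonymize(text: str, vault: Dict[str, str]) -> str:
    """Toy stand-in: swap known PII strings for stable tokens."""
    known_pii = {"Maria Müller": "PERSON_001", "Müller GmbH": "ORG_001"}
    for original, token in known_pii.items():
        if original in text:
            text = text.replace(original, token)
            vault[token] = original
    return text

def copilot_context(fetch_document: Callable[[str], str], doc_id: str,
                    vault: Dict[str, str]) -> str:
    """Middleware: fetch a document, anonymize it, return safe context."""
    return anonymize(fetch_document(doc_id), vault)

vault: Dict[str, str] = {}
fake_graph_fetch = lambda doc_id: "Contract review for Maria Müller at Müller GmbH."
safe = copilot_context(fake_graph_fetch, "doc-1", vault)
print(safe)  # tokens only; the vault enables later deanonymization
```

Because the interception happens in the fetch path, end users see no change to their workflow, which is the point of the reactive approach.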
The before/after workflow for the Office Add-in approach illustrates the compliance improvement:
Before (original document)
Dear Maria Müller,
Re: Contract #2026-DE-4491
Please review the terms for Müller GmbH.
Tax ID: DE293847162
After anonymization
Dear PERSON_001,
Re: Contract ID_001
Please review the terms for ORG_001.
Tax ID: TAX_001
Copilot output → deanonymized
Summary: Maria Müller at Müller GmbH needs contract review. Reference: #2026-DE-4491.
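The round-trip shown above can be sketched as a two-pass substitution over a token vault. Entity detection is hard-coded here for illustration; a real pipeline detects entities rather than listing them:

```python
from typing import Dict, Tuple

def anonymize(text: str, entities: Dict[str, str]) -> Tuple[str, Dict[str, str]]:
    """Replace each entity with its token; return safe text plus the vault."""
    vault = {}
    for original, token in entities.items():
        text = text.replace(original, token)
        vault[token] = original
    return text, vault

def deanonymize(text: str, vault: Dict[str, str]) -> str:
    """Restore the original PII in Copilot's output."""
    for token, original in vault.items():
        text = text.replace(token, original)
    return text

# Hard-coded for the sketch; real recognition happens upstream.
entities = {
    "Maria Müller": "PERSON_001",
    "#2026-DE-4491": "ID_001",
    "Müller GmbH": "ORG_001",
    "DE293847162": "TAX_001",
}
letter = ("Dear Maria Müller,\nRe: Contract #2026-DE-4491\n"
          "Please review the terms for Müller GmbH.\nTax ID: DE293847162")
safe, vault = anonymize(letter, entities)
copilot_output = "Summary: PERSON_001 at ORG_001 needs contract review. Reference: ID_001."
restored = deanonymize(copilot_output, vault)
print(restored)
```

The AI only ever processes tokens; the vault that maps tokens back to real values never leaves the organization's control.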
Step-by-Step: Deploying the Office Add-in for Copilot Compliance
The Office Add-in can be centrally deployed to all M365 users via the Microsoft 365 Admin Center in four steps:
Centralized Deployment via M365 Admin Center
Go to Microsoft 365 Admin Center → Settings → Integrated Apps → Deploy Add-in. Select "cloak.business Office Add-in" from the Office Store or upload the manifest. Assign to relevant user groups (e.g., legal, finance, HR teams that use Copilot for document analysis).
Configure Entity Types to Redact
In the cloak.business admin console, configure which entity types should be anonymized before AI processing. For most EU organizations: PERSON (names), EMAIL_ADDRESS, PHONE_NUMBER, IBAN, TAX_ID, and relevant national ID formats (e.g., German Personalausweis, French NIR). Save as a shared preset that all users access from the add-in.
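A shared preset of this kind might look as follows. The field names are illustrative assumptions, not the actual cloak.business configuration schema:

```python
import json

# Illustrative preset -- field names are assumptions, not the real schema.
EU_DEFAULT_PRESET = {
    "name": "eu-default",
    "entities": [
        "PERSON", "EMAIL_ADDRESS", "PHONE_NUMBER",
        "IBAN", "TAX_ID",
        "DE_PERSONALAUSWEIS",  # German national ID (illustrative key)
        "FR_NIR",              # French social security number (illustrative key)
    ],
    "mode": "reversible",
    "shared": True,
}
print(json.dumps(EU_DEFAULT_PRESET, indent=2))
```

Keeping one shared preset per jurisdiction avoids every user maintaining their own entity list and keeps redaction behavior auditable.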
Train Users: "Anonymize Before Copilot"
Add a one-slide procedure to your Copilot onboarding materials: before asking Copilot to analyze a document containing customer or employee data, click the "Anonymize" button in the add-in ribbon. This becomes a 3-second habit that closes the GDPR prompt-data gap without adding friction to the workflow.
Optional: Enable Reversible Encryption
Enable the reversible anonymization mode with AES-256-GCM encryption. Copilot sees anonymized tokens and produces a summary using placeholder names. After Copilot finishes, click "Deanonymize" to restore original PII in the output — getting the full value of Copilot's analysis without ever exposing real names to the AI inference engine.
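Reversible tokens of this kind can be built on standard AES-256-GCM. The sketch below uses the widely used `cryptography` package (a tooling assumption; the actual cloak.business implementation is not documented here):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each PII value is encrypted into the token vault; only the key holder
# can map PERSON_001 back to the original name.
key = AESGCM.generate_key(bit_length=256)   # 32-byte AES-256 key
aead = AESGCM(key)

def encrypt_value(value: str) -> bytes:
    nonce = os.urandom(12)                  # 96-bit nonce, unique per value
    return nonce + aead.encrypt(nonce, value.encode(), None)

def decrypt_value(blob: bytes) -> str:
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, None).decode()

vault = {"PERSON_001": encrypt_value("Maria Müller")}
# Copilot only ever sees "PERSON_001"; deanonymization decrypts the vault.
print(decrypt_value(vault["PERSON_001"]))
```

GCM provides integrity as well as confidentiality, so a tampered vault entry fails to decrypt instead of silently restoring wrong PII.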
Compliance Documentation for Copilot Deployments
EU organizations deploying Copilot should maintain the following documentation as part of their GDPR compliance program:
Records of Processing Activities (Art. 30)
Create a ROPA entry for "AI-assisted document processing via Microsoft Copilot." Include: categories of personal data processed, legal basis, retention period for Copilot conversation history, and the technical measures in place (anonymization layer).
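A ROPA entry of this shape can be kept as structured data so it stays machine-checkable. The values below are example placeholders to adapt, not recommendations:

```python
# Example ROPA record -- all values are placeholders to adapt.
ROPA_COPILOT_ENTRY = {
    "activity": "AI-assisted document processing via Microsoft Copilot",
    "data_categories": ["customer contact data", "contract metadata",
                        "employee HR data"],
    "legal_basis": "Art. 6(1)(f) legitimate interest (to be assessed)",
    "retention": "Copilot conversation history: 30 days (example value)",
    "technical_measures": ["PII anonymization layer before prompts",
                           "EU data residency configuration"],
}

# A minimal completeness check before the entry goes into the register.
REQUIRED_FIELDS = {"activity", "data_categories", "legal_basis",
                   "retention", "technical_measures"}
assert REQUIRED_FIELDS <= ROPA_COPILOT_ENTRY.keys()
print("ROPA entry complete")
```

Structured entries make it trivial to verify that every AI processing activity in the register documents its technical measures.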
Data Protection Impact Assessment (Art. 35)
A DPIA is required when processing is likely to result in a high risk to data subjects, a threshold that deploying Copilot at scale for HR or customer data will typically meet. Document the risk (residual non-EU data flows), the mitigation (anonymization layer plus EU residency configuration), and the residual risk assessment.
Data Processing Agreement (Art. 28)
Confirm the Microsoft Products and Services DPA is active under your enterprise agreement. Verify it explicitly covers Copilot for Microsoft 365 and includes SCCs for any residual non-EU processing. Check renewal dates — DPA terms can update with product changes.
Internal "Anonymize Before AI" Policy
Document a formal internal policy requiring anonymization of personal data before Copilot processing. Include in employee AI usage guidelines, Copilot onboarding training, and acceptable use policy. This policy provides the documented technical measure that supports the DPIA risk mitigation.
Conclusion: Copilot Can Be GDPR-Compliant with the Right Controls
Microsoft Copilot for Microsoft 365 is not inherently GDPR non-compliant — but it is not automatically compliant either. The compliance gap is specific and addressable: EU data residency configuration covers stored data, but prompt-level PII and unmanaged document uploads create risk that Microsoft's infrastructure settings alone cannot close.
EU organizations that want to deploy Copilot safely need two things beyond Microsoft's baseline: (1) deliberate configuration of EU data residency and opt-out from Connected Experiences, and (2) a PII anonymization layer that intercepts personal data before it reaches Copilot's inference engine.
With the Office Add-in deployed via M365 Admin Center and a simple "Anonymize Before Copilot" workflow, EU IT teams can give their users the full productivity benefit of Copilot while maintaining a documented, auditable GDPR compliance posture — and getting ahead of EU AI Act Art. 10 data governance requirements before August 2026 enforcement.