Microsoft 365 is the operating system of global institutional life. Over 400 million paid seats. Word, Excel, PowerPoint, Outlook, Teams, SharePoint, OneDrive — the applications that governments, universities, hospitals, and corporations use to operate. And now, embedded across every one of these applications: Copilot, Microsoft’s AI assistant powered by OpenAI’s GPT models.
This is not a tool you choose to adopt. For most institutions, Microsoft 365 is already adopted. Copilot is arriving whether you evaluated it or not. And that is precisely the sovereignty problem.
Sovereignty Test Matrix™: Microsoft 365 + Copilot
| Domain | Score | Assessment |
|---|---|---|
| Strategic Alignment | 5/5 | Deeply integrated into existing workflows. Copilot augments tools institutions already use daily. Reduces friction for AI adoption because the platform is already there. |
| Technical Performance | 4/5 | Solid performance across productivity tasks: Excel analysis, document drafting, email summarisation, meeting transcription. Not best-in-class for complex reasoning (compared to Claude or standalone GPT-4), but excellent for embedded productivity AI. |
| Ethical Compliance | 3/5 | Microsoft has published responsible AI principles and deployed content filters. AI access governance controls exist in the admin console. However: Copilot has access to everything in your Microsoft Graph — emails, documents, chats, calendar — and the permission model is complex enough that most organisations cannot fully audit what Copilot can see. |
| Sovereignty Impact | 1/5 | Critical sovereignty failure on multiple dimensions. Lock-in is near-total — switching from Microsoft 365 is a multi-year, multi-million-dollar enterprise project. Data residency is nominally configurable, but Copilot processing routes through Microsoft’s AI infrastructure regardless. US jurisdiction applies via the CLOUD Act. No self-hosting option. Dependency scoring: 19/20 (critical). |
| Cultural Alignment | 3/5 | Better localisation than most AI tools (Microsoft has decades of language support). Copilot supports multiple languages. But AI features are English-first in capability, and the underlying models inherit Western training biases. |
TOTAL: 16/25 — PROCEED WITH CONDITIONS
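The verdict falls out of the matrix mechanically: sum the five domain scores and map the total to a verdict band. A minimal sketch in Python — the verdict thresholds are illustrative assumptions, not the published TEE Method™ rubric, which only states that 16/25 yields PROCEED WITH CONDITIONS:

```python
# Domain scores copied from the matrix above (each out of 5).
scores = {
    "Strategic Alignment": 5,
    "Technical Performance": 4,
    "Ethical Compliance": 3,
    "Sovereignty Impact": 1,
    "Cultural Alignment": 3,
}

total = sum(scores.values())           # 16 out of 25
weakest = min(scores, key=scores.get)  # the domain dragging the score down

# Band cut-offs are assumed for illustration only.
if total >= 20:
    verdict = "ADOPT"
elif total >= 13:
    verdict = "PROCEED WITH CONDITIONS"
else:
    verdict = "AVOID"
```

Note what the flat sum hides: `weakest` comes back as Sovereignty Impact at 1/5, yet the total still lands in the middle band — which is exactly the masking effect discussed below.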
The strategic alignment score (5/5) masks the severity of the sovereignty problem (1/5). Microsoft 365 + Copilot is the textbook case of a system that scores well on utility and catastrophically on independence. Most institutions have no alternative. Most have no exit plan. And most are now feeding every internal document, email, and conversation through an AI system they cannot audit, hosted in a jurisdiction they cannot control.
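What “cannot audit” means can be made concrete. A first step any tenant admin can take is simply walking the Microsoft Graph surfaces a Copilot-class assistant reads from. A minimal sketch, assuming a delegated-access token is obtained separately (the surface list and permission pairings here are an illustrative subset, not Microsoft’s documented Copilot scope inventory; the endpoints are standard Graph v1.0 routes):

```python
# Enumerate Graph v1.0 surfaces an AI assistant with broad delegated
# access can read. This builds the audit URLs only; actually issuing
# the requests requires a real OAuth bearer token for the tenant.
GRAPH = "https://graph.microsoft.com/v1.0"

# Each entry: (data category, Graph endpoint, delegated permission it needs)
COPILOT_SURFACES = [
    ("Mail",       f"{GRAPH}/me/messages",            "Mail.Read"),
    ("Calendar",   f"{GRAPH}/me/events",              "Calendars.Read"),
    ("Files",      f"{GRAPH}/me/drive/root/children", "Files.Read"),
    ("Teams chat", f"{GRAPH}/me/chats",               "Chat.Read"),
    ("People",     f"{GRAPH}/me/people",              "People.Read"),
]

def audit_urls():
    """Return the endpoints an auditor would walk to gauge AI reach."""
    return [url for _category, url, _perm in COPILOT_SURFACES]
```

In practice each URL would be fetched with an `Authorization: Bearer <token>` header and the results diffed against what the organisation believes is exposed. The point of the exercise is the gap: five lines of configuration describe more of your institution’s internal life than most data-governance policies do.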
The Lock-In Multiplier
What makes Microsoft 365 uniquely concerning from a sovereignty perspective is the lock-in multiplier. It is not just that you depend on Microsoft for AI. You depend on Microsoft for email, documents, collaboration, storage, identity management, and now AI — simultaneously. The exit cost for any individual service is high. The exit cost for all of them together is, for most institutions, prohibitive.
When your operating system, your documents, your email, your calendar, your collaboration platform, and your AI assistant are all controlled by one company, the word for that is not “integration.” The word for that is “captivity.”
Part of the TEE Method™ Sovereignty Score series from SOVEREIGN.