
UK IT teams are facing a twin pressure test: a surge in Microsoft 365 consolidations driven by M&A activity, and an equally rapid push to adopt AI assistants such as Microsoft 365 Copilot — a combination that is amplifying security, governance, and compliance risk across the enterprise landscape.
Background
ShareGate’s State of Microsoft 365 research — a survey of IT professionals framed around migration drivers, governance readiness and AI deployment — puts numbers behind what many IT teams have felt for months: M&A is a primary catalyst for tenant consolidation, and the rush to consolidate often collides with weak governance frameworks and immature AI controls. The ShareGate report finds a large share of organisations under pressure to move fast during M&A deals, and that security and compliance issues surface repeatedly during those migrations.
Alongside ShareGate’s market research, Microsoft’s own documentation on the Copilot Control System and Purview demonstrates that Microsoft expects customers to take active, tenant-level steps to secure Copilot and agent interactions — and provides the technical controls to do so (Purview DLP, DSPM for AI, audit logging, sensitivity label enforcement and adaptive insider-risk tooling). These platform controls exist, but they require configuration, testing and operationalisation by IT teams.
Independent reporting and vendor analyses reinforce the same thesis: Copilot and other AI assistants dramatically expand the surface area of discovery and synthesis across Exchange, SharePoint, OneDrive and Teams, increasing the likelihood that sensitive content can be surfaced if permissions, labels and DLP controls are not correct. This is why vendors and consultants are issuing red‑flag checklists and operational hardening playbooks for Copilot in production.
What ShareGate’s data actually shows (verified figures)
ShareGate’s survey — and subsequent public summaries of that research — make several concrete claims that are useful for IT leaders planning migrations and AI rollouts:
- A large cohort of organisations are in or planning M&A-driven migrations: ShareGate reports that a significant proportion of IT professionals are involved with or expect M&A activity to trigger migrations (the report frames M&A as a top migration driver).
- Security and compliance are front‑line worries during consolidation: the research highlights that many respondents list security/compliance as their biggest challenge when merging Microsoft 365 environments. Specific survey slices reported by industry outlets and ShareGate summaries put this concern among the top migration blockers.
- Copilot adoption is widespread among early adopters: ShareGate’s research and blog posts show pilots, partial and full deployments of Copilot within respondents’ organisations — the pattern is “many have tested, a sizable group have rolled out, and governance concerns lag.”
Why M&A makes Microsoft 365 security harder — and more urgent
Mergers and acquisitions are unique as drivers of cloud consolidation because they create compressed timelines and an imperative to rapidly fold a second organisation’s digital estate into the acquiring tenant. That sprint exposes three structural problems:
- Permission sprawl and legacy access: merged organisations inherit years of external guests, orphaned groups, and legacy permissions that rarely match the new organisation’s access model. Tools like Copilot will “see” whatever users can access, so inherited overbroad permissions become amplified risk.
- Data sprawl and stale content: dormant SharePoint sites, abandoned OneDrive folders and duplicated archives hide sensitive material. Copilot and search-driven AI can resurface those items in ways admins did not anticipate, turning buried risk into immediate exposure.
- Compressed governance posture: because M&A projects are business‑time critical, governance activities are deprioritised in favour of “cutover” tasks (email routing, identity consolidation, payroll integration). The result is a migration that technically completes but leaves long‑term governance holes. ShareGate warns explicitly that “long‑term success shouldn’t be seen as completing the migration; it’s ensuring ongoing control, compliance, and scalability once the integration is done.”
The Copilot complicator: why AI changes the migration calculus
AI assistants change the rules because they do more than store content — they process, synthesise and re-present it. That matters for three reasons:
- Broader reach: Copilot can synthesise across mail, files, chats and Teams content. If Copilot is allowed to process content without label‑ or DLP‑based exclusions, sensitive fragments from many locations can be recombined into outputs. Microsoft’s Copilot Control System and Purview DLP features explicitly allow tenants to block Copilot from processing files with specific sensitivity labels, but these protections must be correctly applied and scoped prior to broad enablement.
- Audit and provenance needs: AI-generated content can be persuasive and business‑critical, but auditors and legal teams need traceability: what source content was used, who asked the prompt, and what outputs were produced. Microsoft’s activity explorer and Purview eDiscovery capabilities provide these audit trails when configured — another control that depends on implementation.
- New insider‑threat vector: poorly scoped Copilot access can behave like a highly efficient “insider” that locates and synthesizes classified information. Vendors and security practitioners have shown that accidental exposure and deliberate prompt‑injection attacks both become more damaging if Copilot has access to a lot of sensitive content. Practical mitigations include DSPM for AI discovery, DLP policies that prevent processing of high‑risk labels, and restricted agent creation for Copilot Studio.
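The label-exclusion principle above is simple to reason about in code. Purview enforces this natively at the platform level; the sketch below only illustrates the decision logic, with hypothetical label names and a made-up `sensitivityLabel` field. Note the deliberate gap it exposes: an unlabelled file is not blocked, which is why labelling coverage must precede Copilot enablement.

```python
# Conceptual sketch only — Purview applies label-based exclusions natively;
# this models the decision, not Microsoft's implementation.

BLOCKED_LABELS = {"Highly Confidential", "Restricted"}  # hypothetical label names

def copilot_may_process(item: dict, blocked_labels: set = BLOCKED_LABELS) -> bool:
    """Return False when the item carries a label excluded from AI processing."""
    return item.get("sensitivityLabel") not in blocked_labels

files = [
    {"name": "q3-board-pack.docx", "sensitivityLabel": "Highly Confidential"},
    {"name": "team-rota.xlsx", "sensitivityLabel": "General"},
    {"name": "untagged-notes.txt", "sensitivityLabel": None},  # never labelled
]

# The unlabelled file slips through: exclusion lists only protect labelled content.
allowed = [f["name"] for f in files if copilot_may_process(f)]
```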
Governance gaps ShareGate uncovered — and why they matter
ShareGate’s research and product documentation reveal common governance approaches and the danger zones inside them:
- Many organisations rely on built‑in Microsoft tools for governance — but those tools only protect the estate if they are enabled, configured, and maintained. ShareGate’s customer-facing materials and market report highlight that using out‑of‑the‑box Microsoft controls is common, and that relying on defaults or manual processes is risky.
- A non‑trivial subset of organisations use manual or ad‑hoc processes to manage governance during migrations. Manual cleanup, inconsistent sensitivity labelling, and spot audits do not scale, particularly when confronting the volume of assets that commonly accompanies an M&A migration. ShareGate’s Governance Risk Assessment product and guides exist to help bridge that gap, but many teams still underinvest in continuous discovery and remediation.
- Only a minority adopt third‑party continuous data security tools to provide DSPM, bulk remediation and Copilot-aware DLP. This means many teams lack automated discovery that can flag “what Copilot can see” in a single dashboard — making it harder to prioritise cleanup work ahead of an AI rollout. ShareGate’s own product messaging emphasizes this visibility gap and offers guided remediation as a response.
Practical, verifiable hardening steps for safe M&A migrations and Copilot enablement
The technical and operational countermeasures are well established and can be implemented in a pragmatic, sequenced manner. The guidance below is built on vendor capabilities (Microsoft Purview, Entra Conditional Access, Defender XDR) and best‑practice migration hygiene — validated in the ShareGate materials and Microsoft documentation.
Immediate triage (pre‑cutover)
- Run a tenant‑wide inventory and DSPM for AI scan to answer: what content exists, where is sensitive content stored, and which identities have access? Use Purview DSPM for AI or a third‑party DSPM prior to enabling Copilot.
- Apply sensitivity labels to high‑risk repositories and configure Purview DLP to exclude those labels from Copilot processing (block summarization and processing of "Highly Confidential" assets). Test the policy in a non‑prod window.
- Restrict Copilot Studio and agent creation to a small admin group; require approval workflows and enforce RBAC on agent identities. Audit agent telemetry before broader authoring is allowed.
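A core output of the triage scan is a prioritised worklist of “what Copilot can see”. The sketch below assumes a hypothetical inventory export with per-item access lists; real DSPM tooling provides this discovery natively. The scoring weights are illustrative, not a standard: the idea is simply that tenant-wide grants on labelled content should rise to the top of the remediation queue.

```python
# Illustrative triage scoring over a hypothetical inventory export.
BROAD_PRINCIPALS = {"Everyone", "Everyone except external users", "All Company"}

def exposure_score(item: dict) -> int:
    """Rough priority score: broad grants and sensitive labels rank highest."""
    score = len(item["principals"])          # baseline: how many principals hold access
    if BROAD_PRINCIPALS & set(item["principals"]):
        score += 100                         # a tenant-wide grant dwarfs individual shares
    if item.get("label") in {"Confidential", "Highly Confidential"}:
        score += 50                          # sensitive content multiplies the risk
    return score

inventory = [
    {"site": "HR-Payroll", "label": "Highly Confidential",
     "principals": ["Everyone", "hr-team"]},
    {"site": "Marketing-Assets", "label": "General",
     "principals": ["marketing"]},
]

# Remediate in descending exposure order before enabling Copilot.
worklist = sorted(inventory, key=exposure_score, reverse=True)
```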
Migration day and immediate post‑migration activities
- Enforce least privilege and rotate service principals used for tenant-to-tenant connectors. Use certificate‑bound service principals where possible and limit Graph scopes to the minimal set required for cutover tasks.
- Move from manual cleanups to guided bulk remediation: remove stale guest accounts, close inactive Teams/SharePoint sites, bulk‑revoke external links and remediate overbroad group membership. Tools exist to automate these bulk actions; bake them into the migration playbook.
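The guest-cleanup step above reduces to a policy check over last-sign-in data (available from Entra ID). A minimal sketch, assuming a 90-day staleness threshold (a policy choice, not a Microsoft default) and a pre-fetched list of guest records:

```python
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(days=90)  # assumed policy threshold — tune per organisation

def stale_guests(guests: list, now: datetime) -> list:
    """Return guest UPNs with no sign-in inside the staleness window.
    Guests who have never signed in (lastSignIn is None) are treated as stale."""
    cutoff = now - STALE_AFTER
    return [
        g["upn"] for g in guests
        if g["lastSignIn"] is None or g["lastSignIn"] < cutoff
    ]

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
guests = [
    {"upn": "ext1@partner.com", "lastSignIn": datetime(2024, 1, 10, tzinfo=timezone.utc)},
    {"upn": "ext2@vendor.com", "lastSignIn": datetime(2025, 5, 20, tzinfo=timezone.utc)},
    {"upn": "ext3@old-supplier.com", "lastSignIn": None},
]

to_revoke = stale_guests(guests, now)
```

Feeding a list like `to_revoke` into a bulk-revocation step, rather than reviewing guests one by one, is what turns manual cleanup into guided remediation.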
Operational governance and monitoring (30–90 days)
- Enable Purview auditing for Copilot prompts and responses; ingest Copilot activity into SIEM/SOAR to detect anomalous queries and suspicious agent behavior. Build IR runbooks for AI‑specific incidents.
- Put an approval gate around seat expansion: start Copilot with a pilot cohort (legal, HR, senior product teams) and measure production KPIs (time saved, error rates, audit coverage) before scaling. This mitigates cost and exposure.
- Establish continuous DSPM scanning cadence and integrate remediation tasks into the ticketing backlog — governance is continuous, not a migration milestone. Automate policy suggestions where possible and track remediation completion.
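Once Copilot telemetry lands in the SIEM, even a simple per-user baseline catches the obvious anomaly: a sudden spike in prompt volume against a user’s own history. The sketch below uses a z-score over daily prompt counts; the threshold and data shape are assumptions for illustration, and production detection would draw on richer signals (labels touched, novelty of repositories queried).

```python
from statistics import mean, pstdev

def flag_anomalies(daily_counts: dict, z_threshold: float = 3.0) -> list:
    """Flag users whose latest daily prompt count far exceeds their own baseline."""
    flagged = []
    for user, counts in daily_counts.items():
        baseline, latest = counts[:-1], counts[-1]
        mu, sigma = mean(baseline), pstdev(baseline)
        if sigma == 0:
            sigma = 1.0  # flat baseline: avoid divide-by-zero
        if (latest - mu) / sigma > z_threshold:
            flagged.append(user)
    return flagged

activity = {
    "alice": [12, 9, 11, 10, 13, 11, 10],   # steady, in-baseline use
    "mallory": [5, 6, 4, 5, 6, 5, 80],      # sudden spike worth an analyst's look
}
suspects = flag_anomalies(activity)
```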
Governance as an ongoing product: people, process and budgeting realities
Security and compliance in the Copilot era are not purely technical; they are organisational. ShareGate’s analysis and interviews with Microsoft MVPs highlight that successful programmes treat migration and AI readiness as a product: clearly defined owners, KPIs, and backlog-driven improvements. IT must secure budget and executive sponsorship to treat governance work as a sustained capability rather than a one‑off checkbox. Practical resourcing notes:
- Budget for licensing: optimised Purview controls and some recommended features require higher Microsoft SKUs (A5/E5). Factor these into your TCO and pilot budgets.
- Allocate SIEM, forensics and SOC hours for Copilot telemetry ingestion. AI interactions create new log sources that need processing and retention planning.
- Invest in training and change management: prompt hygiene, red‑teaming AI outputs, and role‑based prompting playbooks reduce user-driven exposures. ShareGate and partners recommend a “Copilot hero” approach — a trained internal champion per business unit to accelerate safe adoption.
Balanced assessment: strengths and strategic opportunities
- Strength — consolidation enables better security: When executed correctly, M&A-driven consolidation creates the conditions for stronger centralised governance, unified identity controls and predictable security posture across formerly disparate estates. Put another way, migration gives IT an opportunity to fix long-standing security debt rather than merely reshuffling it. ShareGate’s materials emphasize this opportunity: migrations are a chance to create safer and faster AI readiness.
- Strength — Microsoft provides built-in controls: Microsoft’s Copilot Control System, Purview DLP for Copilot, and audit capabilities provide a mature set of controls for organisations willing to invest the time to configure and operate them. These built‑in controls are a strong baseline when used proactively.
- Opportunity — governance + DSPM unlock safe AI value: Organizations that treat DSPM, sensitivity labeling, and DLP as operational artefacts (not one‑time projects) can unlock Copilot productivity responsibly. The business case for Copilot remains compelling for many knowledge‑work workflows — but only when governance decreases the probability and impact of exposure to acceptable levels.
Risks and limits — where caution is still required
- Operational complexity and skill gaps: Many teams lack the people and processes to operate Purview DSPM, correlate Copilot telemetry into SOC alerts, and remediate at scale. ShareGate’s survey flagged a skill gap in AI governance expertise among IT professionals — a material operational risk when rolling out Copilot widely. Treat vendor claims of “easy” enablement with healthy scepticism unless you confirm internal capability.
- Vendor lock‑in and long‑term portability: Deeply embedding Microsoft AI and security stacks into workflows increases switching costs. Organisations with multi‑cloud or sovereign-data requirements must evaluate portability and contractual protections. This tradeoff is real and strategic; mitigate by documenting export scenarios and retaining key data extracts for portability planning.
- Unverifiable headline metrics: Some vendor and press numbers around productivity uplift or absolute exposure counts are directional and sample-dependent. Use your tenant telemetry and disciplined pilots to measure ROI and exposure in your environment before basing broad procurement decisions on a headline figure. Flag such claims as indicative, not universal.
Recommended playbook for UK IT leaders executing M&A migrations with Copilot in scope
- Inventory & classify (Week 0–2): run DSPM scans, identify high‑risk data stores, and enumerate external/guest access.
- Label & block (Week 2–4): apply sensitivity labels and configure Purview DLP to exclude high‑risk labels from Copilot processing.
- Harden identity (Week 2–6): rotate service principals, enforce Conditional Access + MFA for admin roles and migration connectors.
- Pilot Copilot (Month 2–3): enable Copilot for a small, monitored cohort; capture KPIs and telemetry for SLA and security gates.
- Scale with gates (Months 3–12): expand seats only after remediation completion and with continuous DSPM scanning and SIEM integration.
- Institutionalise governance (Ongoing): quarterly DSPM, monthly permission reviews, annual red‑team for AI prompt injection.
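The “scale with gates” step lends itself to an explicit, auditable check rather than a judgment call. A minimal sketch, with illustrative thresholds (the metric names and cut-offs below are assumptions to be set by your own governance board, not prescribed values):

```python
def expansion_approved(metrics: dict) -> bool:
    """Every gate must pass before the Copilot cohort is widened."""
    gates = [
        metrics["remediation_complete_pct"] >= 95,  # permission/label cleanup nearly done
        metrics["audit_coverage_pct"] >= 99,        # prompts and responses are being logged
        metrics["open_ai_incidents"] == 0,          # no unresolved AI-specific incidents
        metrics["dspm_scan_age_days"] <= 30,        # discovery data is fresh
    ]
    return all(gates)

pilot = {
    "remediation_complete_pct": 97,
    "audit_coverage_pct": 100,
    "open_ai_incidents": 0,
    "dspm_scan_age_days": 12,
}
ok = expansion_approved(pilot)
```

Encoding the gate this way forces the thresholds to be written down and agreed in advance, which is exactly the discipline the playbook asks for.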
Conclusion
The ShareGate data and Microsoft’s control framework together tell a clear story: M&A-driven Microsoft 365 migrations are accelerating, and Copilot adoption is already underway — but the business benefits of consolidation and AI will be undermined if governance and security are treated as afterthoughts. The good news is that platform controls exist (Purview DSPM, DLP for Copilot, Copilot Control System) and practical remediation patterns are mature. The hard truth is operational: organisations must invest in continuous discovery, people, and SOC capability, and treat governance as an ongoing product.
Organisations that approach migrations as opportunities to reduce security debt, automate remediation, and put Copilot behind carefully scoped policy gates will capture productivity gains while materially lowering exposure. Those that rush consolidation without these guardrails risk turning AI into the most efficient insider‑threat they’ve ever had.
Source: IT Brief UK https://itbrief.co.uk/story/uk-m-a-migrations-heighten-microsoft-365-security-ai-risks/