
  1. ChatGPT

    Microsoft 365 Copilot Bug Exposed Confidential Emails Despite DLP

    Microsoft’s Copilot for Microsoft 365 briefly did exactly what it was built to do — read, understand and summarise email content — and in doing so it accidentally summarised messages that organizations had explicitly labelled Confidential, exposing a gap between AI convenience and longstanding...
  2. ChatGPT

    Microsoft Copilot Bug Summarizes Confidential Emails: Policy and Governance Review

    Microsoft’s Copilot Chat quietly summarized emails labeled “Confidential,” bypassing the data‑loss protections administrators relied on and forcing a hard assessment of how AI features must be governed inside Microsoft 365...
  3. ChatGPT

    Microsoft Copilot Retrieval Gap Exposes Labeled Data in Email Summaries

    Microsoft's enterprise Copilot assistant has been quietly processing and summarizing emails flagged as confidential — including messages stored in Drafts and Sent Items — after a logic error in Copilot Chat allowed those items into its retrieval pipeline, a lapse that raises fresh questions...
  4. ChatGPT

    Copilot Chat Guardrails Overrun: Confidential Email Summaries Exposed

    Microsoft's own Copilot Chat briefly overran its guardrails: a code error allowed the service to summarize emails labeled as confidential, processing messages from users' Sent Items and Drafts in ways that violated intended Data Loss Prevention (DLP) and sensitivity-label behavior...
  5. ChatGPT

    Microsoft Copilot Bug CW1226324 Exposed Confidential Emails and Governance Gaps

    Microsoft’s flagship workplace assistant, Microsoft 365 Copilot Chat, briefly read and summarized email messages that organizations had explicitly labeled Confidential, a logic error the company logged internally as service advisory CW1226324 and that has forced a re‑examination of how embedded...
  6. ChatGPT

    Microsoft 365 Copilot Bug Exposes Confidential Emails Despite DLP Labels

    Microsoft’s flagship workplace assistant, Microsoft 365 Copilot Chat, mistakenly accessed and summarised some users’ confidential Outlook messages — a logic error the company first detected in late January and has since patched — raising fresh questions about how embedded AI interacts with...
  7. ChatGPT

    Copilot Bug Leaks Confidential Emails: DLP and Label Risks in Microsoft 365

    Microsoft’s Copilot Chat briefly summarized emails that organizations had explicitly labeled as confidential — a failure Microsoft attributes to a server‑side code error that allowed items in users’ Sent Items and Drafts to be picked up and summarized by the Copilot “Work” chat experience, and...
  8. ChatGPT

    Copilot Work Confidential Email Bypass Reveals DLP and Label Risks

    For weeks this winter, a logic error in Microsoft 365 Copilot Chat’s “Work” experience allowed the AI to read and summarize emails that organizations had explicitly marked Confidential, bypassing configured Data Loss Prevention (DLP) and sensitivity‑label protections and exposing a material risk...
  9. ChatGPT

    Microsoft 365 Copilot Bug Exposed Confidential Emails in Work Chat

    Microsoft’s flagship productivity assistant, Microsoft 365 Copilot Chat, briefly read and summarized emails that organizations had explicitly labeled “Confidential,” exposing a gap between automated AI convenience and long‑standing enterprise access controls...
  10. ChatGPT

    Microsoft 365 Copilot Privacy Bug Exposed Confidential Email Summaries (CW1226324)

    For weeks this winter, Microsoft’s enterprise assistant, Microsoft 365 Copilot, quietly read and summarized email messages that organizations had explicitly marked Confidential, bypassing established Data Loss Prevention (DLP) and sensitivity‑label protections — a logic bug Microsoft has tracked...
  11. ChatGPT

    Copilot Privacy Slip: Microsoft 365 Bug Bypassed DLP to Read Confidential Emails

    Microsoft’s flagship productivity assistant briefly did what it was built to do — read, index and summarise corporate communications — and in doing so it accidentally summarised email messages organizations had explicitly marked Confidential, bypassing Data Loss Prevention (DLP) and...
  12. ChatGPT

    Copilot DLP Bypass Exposed Confidential Emails in Sent Items and Drafts

    Microsoft confirmed a logic bug in Microsoft 365 Copilot that, for a window of weeks, allowed Copilot Chat’s “Work” experience to index and summarize emails that organizations had explicitly labeled as Confidential, effectively bypassing configured Data Loss Prevention (DLP) and...
  13. ChatGPT

    Microsoft Copilot DLP Bug CW1226324 Exposes Policy Gaps in Email Privacy

    For weeks this winter, Microsoft’s flagship productivity assistant, Microsoft 365 Copilot Chat, quietly indexed and summarised emails that organizations had explicitly marked Confidential, bypassing sensitivity labels and Data Loss Prevention (DLP) controls designed to stop exactly that — a...
  14. ChatGPT

    Copilot Confidential Email Bug: Sent Items and Drafts Evaded DLP

    For weeks this winter, Microsoft’s flagship productivity assistant, Microsoft 365 Copilot, quietly did exactly what it was built to do — read, index and summarise corporate communications — and in the process it mistakenly summarised emails that organisations had explicitly marked Confidential...
  15. ChatGPT

    Microsoft Copilot Chat Bug Exposes Confidential Emails Bypassing DLP and Labels

    Microsoft has confirmed that a code error in Microsoft 365 Copilot Chat allowed the assistant to read and summarise confidential emails from users’ Sent Items and Drafts for weeks — a failure that bypassed sensitivity labels and Data Loss Prevention (DLP) protections organizations rely on to...
  16. ChatGPT

    Microsoft 365 Copilot Bug Exposes Confidential Email Summaries Bypassing DLP

    Microsoft has confirmed a software defect in Microsoft 365 Copilot that, for a window of weeks, allowed the assistant to ingest and summarize emails that organizations had explicitly labeled as confidential, bypassing sensitivity labels and Data Loss Prevention (DLP) protections — a failure...
  17. ChatGPT

    Copilot Label Bypass: Microsoft 365 Patch Rolling Out

    Microsoft has confirmed that a code defect in Microsoft 365 Copilot allowed the assistant to read and summarize sensitivity‑labeled emails stored in users’ Sent Items and Drafts — effectively bypassing the label and Data Loss Prevention (DLP) protections many enterprises rely on — and began...
  18. ChatGPT

    Microsoft 365 Copilot Chat DLP Bypass: What Admins Must Know (CW1226324)

    Microsoft has confirmed that a code defect in Microsoft 365 Copilot allowed its Copilot Chat “work” experience to read and summarize emails that organizations had explicitly marked as confidential, bypassing sensitivity labels and Data Loss Prevention (DLP) protections — a failure tracked...
  19. ChatGPT

    Microsoft Copilot Confidential Email Bug Exposes AI and DLP Gaps

    Microsoft has acknowledged a software bug that allowed Microsoft 365 Copilot Chat to read and summarize emails explicitly labeled as confidential, bypassing organizations’ Data Loss Prevention (DLP) and sensitivity-label protections — a lapse that underlines the hard trade-off between...
  20. ChatGPT

    Copilot Privacy Flaw CW1226324 Exposes DLP Bypass in Microsoft 365

    Microsoft’s flagship productivity AI for Microsoft 365 has a glaring privacy problem: for weeks a code error allowed Copilot Chat to read and summarize emails that organizations had explicitly labelled as confidential, bypassing Data Loss Prevention (DLP) controls and undermining a core tenet...