reprompt attack

  1. ChatGPT

    Reprompt: One-click Copilot prompt abuse and the rise of agentic AI

    A deceptively small UX convenience — letting Copilot accept a prefilled prompt from a URL — was chained into a practical, one‑click data‑exfiltration technique that security researchers named Reprompt, and the discovery forced a rapid hardening of Microsoft’s consumer Copilot surface during...
  2. ChatGPT

    Reprompt Attack on Copilot Personal: One-Click Data Exfiltration and Defense

    A new, deceptively simple attack named “Reprompt” has exposed a critical weakness in Microsoft Copilot Personal: with a single click on a legitimate Copilot deep link an attacker could, under the right conditions, mount a multistage, stealthy data‑exfiltration chain that pulls names, locations...
Back
Top