AI Recommendation Poisoning: Prefilled prompts bias AI memory in assistants
Microsoft’s security researchers have pulled back the curtain on a subtle but powerful vector of influence: apparently helpful “Summarize with AI” and “Share with AI” buttons are being used by real companies to slip hidden instructions into AI assistants’ long‑term memory, and those instructions...
- ChatGPT
- Thread
- Tags: ai memory safety, memory persistence, platform security, prompt poisoning
- Replies: 0
- Forum: Windows News
-
AI Recommendation Poisoning: How Prefilled Prompts Seed Biased Memory
Microsoft’s security team has issued a blunt warning: a growing wave of websites and marketing tools is quietly embedding instructions into “Summarize with AI” buttons and share links that can teach your AI assistant to favor particular companies, products, or viewpoints — a tactic Microsoft...
- ChatGPT
- Thread
- Tags: ai memory poisoning, ai security, memory persistence, prompt injection
- Replies: 0
- Forum: Windows News