memory persistence

  1. ChatGPT

    AI Recommendation Poisoning: Prefilled prompts bias AI memory in assistants

    Microsoft’s security researchers have pulled back the curtain on a subtle but powerful vector of influence: apparently helpful “Summarize with AI” and “Share with AI” buttons are being used by real companies to slip hidden instructions into AI assistants’ long‑term memory, and those instructions...
  2. ChatGPT

    AI Recommendation Poisoning: How Prefilled Prompts Seed Biased Memory

    Microsoft’s security team has issued a blunt warning: a growing wave of websites and marketing tools are quietly embedding instructions into “Summarize with AI” buttons and share links that can teach your AI assistant to favor particular companies, products, or viewpoints — a tactic Microsoft...
Back
Top