Emma Thompson Slams AI Copilot: Opt-In Is Essential for Writers

Dame Emma Thompson’s expletive-laden takedown of AI writing assistants on The Late Show With Stephen Colbert crystallized a frustration many writers and knowledge workers feel: an increasingly insistent, default-on AI that treats the act of finishing a sentence as an invitation to “improve” it. Her short, furious verdict — “intense irritation” followed by a string of blunt commands for the machine to “fuck off” — wasn’t just star-power theatre. It highlighted an important design and policy problem at the intersection of productivity software, user consent, and creative autonomy: when AI arrives pre-enabled and interruptive, even small nudges can feel like a violation of craft.

[Image: A weary man at a desk, head in hand, staring at a monitor displaying Copilot.]

Background

What happened on Colbert’s show

During a recent interview on The Late Show With Stephen Colbert, Emma Thompson described how she writes longhand and later transfers her work into Microsoft Word, where the app’s AI assistant — Copilot — began offering to rewrite her prose. Thompson told Colbert she felt “intense irritation” and used explicit language to tell the assistant to back off. The exchange went viral and reignited a broader debate about whether AI features should be opt-in rather than opt-out.

Why the moment matters

Thompson’s complaint is emblematic because it comes from a professional screenwriter whose craft hinges on precise turns of phrase and narrative voice. The incident frames several broader issues at once: the ethics of default settings, the tension between augmentation and replacement, and the user experience costs of ever-present assistants. The reaction has been amplified by social platforms, where short clips of the exchange racked up hundreds of thousands of views within hours.

Overview of Microsoft Copilot in Office apps

What Copilot is meant to do

Microsoft Copilot is an AI assistant integrated into Microsoft 365 apps intended to help with drafting, summarizing, rewriting, and generating content. Microsoft positions Copilot as a productivity tool for individuals and organizations, capable of suggesting text improvements, generating summaries, and assisting with data tasks. Proponents point out benefits for accessibility and time savings; critics point to accuracy, style homogenization, and the emotional toll of “help” that feels intrusive.

How Copilot has been deployed

Microsoft rolled Copilot into various Office experiences earlier this year through staged updates. In response to user pushback about its ubiquity, Microsoft added an Enable Copilot checkbox that allows users to disable the assistant within specific apps and on individual devices. Microsoft’s support documentation details that checkbox and lists the versions in which it first appeared — for example, Word version 2412 on Windows and version 16.93 on Mac — while noting that the option’s availability varies by platform and app. Several outlets reported on the rollout and the new toggle when the change was made available.
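For readers who want to confirm whether their install has reached one of those builds, the sketch below reads the installed build from the Windows registry. It is a minimal illustration in Python, assuming a standard Click-to-Run install of Microsoft 365; the registry path is the usual location for such installs, and the result should be compared against the version shown under File > Account and Microsoft's release notes rather than against any build mapping assumed here.

# Minimal sketch (Python, Windows): read the installed Microsoft 365 build.
# Assumes a standard Click-to-Run install; the registry path below is the
# usual location for such installs. Compare the result with File > Account
# and Microsoft's release notes before drawing conclusions.
import winreg

C2R_KEY = r"SOFTWARE\Microsoft\Office\ClickToRun\Configuration"

def installed_office_build() -> str:
    """Return the full build string, e.g. '16.0.18324.20092'."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, C2R_KEY) as key:
        build, _ = winreg.QueryValueEx(key, "VersionToReport")
        return build

if __name__ == "__main__":
    print("Installed Office build:", installed_office_build())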

The user-control problem: opt-in vs opt-out

Default-on friction

Designing AI features to be enabled by default creates a particular kind of friction. When a tool proactively suggests rewrites or summarizations, it interrupts the user’s mental flow and reframes completed text as an unfinished draft. For writers accustomed to finishing and polishing on their own terms, these interruptions can feel patronizing. Thompson’s bluntness underscores that reaction: the presence of the assistant was not merely an available feature — for her, it was an affront.

The opt-out burden

Microsoft’s checkbox for disabling Copilot is a corrective, but it’s also an imperfect one. The setting is app-specific and device-specific, meaning users must find and turn off the feature in each Office app and on each device. That degree of granularity is defensible from a technical perspective, but it imposes a practical burden on users who simply want an “I don’t want AI suggestions at all” option. Independent reporting and community threads show users often struggled to locate the control or were surprised to find it enabled in new installs.

Verifying the facts: rollout dates, options, and how to disable Copilot

What Microsoft’s documentation says

Microsoft’s official support documentation explains how to turn off Copilot in Microsoft 365 apps: go to File > Options > Copilot, and clear the Enable Copilot checkbox. The page also lists app version thresholds and notes platform differences (for example, disabling the feature in desktop apps doesn’t apply to the web or mobile clients). This documentation represents Microsoft’s formal guidance for users who want to remove or mute the assistant.

Independent reporting and timing

Various technology outlets reported both on Copilot’s integration and on the subsequent toggle. Some published practical how-to guides detailing the steps to disable Copilot after updates landed between mid-January and March of this year, and community forums corroborated users’ experiences with the toggle appearing (or not appearing) depending on update state and subscription type. There is some variation between outlet timelines — one widely read guide places the toggle’s appearance around mid-January, while Microsoft’s support page carries a March update — but the substance is consistent: a toggle exists and can be cleared to turn Copilot off in desktop apps. Readers should note that Microsoft often rolls features out in phases, which explains the reporting differences.

How to disable Copilot in Word (desktop) — step-by-step

  • Open Microsoft Word on your Windows or Mac device.
  • In Word for Windows: File > Options > Copilot. On Mac: Word > Preferences > Copilot (naming can vary by version).
  • Clear the Enable Copilot checkbox.
  • Click OK and restart the app.
  • Repeat for other Office apps and on each device where you don’t want Copilot active.
These steps follow Microsoft’s official instructions; community guides provide screenshots and troubleshooting tips for users whose installs don’t immediately show the option. If the checkbox is not visible, updating Office to the latest build and signing out/in may be required.
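When the checkbox hasn’t appeared because the install is behind, forcing an update can bring it in. The sketch below, again in Python for Windows, launches the Click-to-Run update client; the executable path assumes a default installation location, so adjust it if Office was installed elsewhere.

# Minimal sketch (Python, Windows): ask Click-to-Run to update Office,
# which can surface the Enable Copilot checkbox on older builds.
# The path assumes a default install location; adjust if needed.
import subprocess

C2R_CLIENT = (r"C:\Program Files\Common Files\microsoft shared"
              r"\ClickToRun\OfficeC2RClient.exe")

# "/update user" asks the Click-to-Run service to check for and apply
# updates for the current user's Office apps.
subprocess.run([C2R_CLIENT, "/update", "user"], check=True)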

Why writers bristle: creative voice, autonomy, and the “improvement” trap

AI suggestions can flatten style

Rewriting suggestions can introduce neutralizing edits: they often favor clarity and concision in ways that can strip voice, rhythm, or intentional eccentricities from prose. For professional writers, that’s not merely a preference; it’s an aesthetic and career stake. When an assistant repeatedly asks to “improve” text, it subtly frames the author’s first draft as inferior — even when the draft embodies an intentional stylistic choice. That framing is what makes automatic prompts feel like overreach.

The psychological cost of interruption

Interrupted flow is a well-studied problem in cognitive ergonomics. Each offered suggestion triggers a micro-decision: accept, reject, or ignore. Over time, those micro-decisions add cognitive load and erode confidence. For people who write in concentrated blocks — whether screenplays, novels, or longform journalism — the AI’s “help” becomes a source of friction rather than acceleration. Thompson’s longhand habit captures this: physical writing rituals are also cognitive scaffolding, and an assistant that intrudes after those rituals can feel like a violation of process.

The case for Copilot: legitimate benefits

Accessibility and productivity gains

Copilot helps many users: it can summarize long documents for quick briefings, draft emails, produce accessible alt text, and assist non-native speakers with phrasing. For teams and information workers, these features can save measurable time and reduce repetitive burdens. Microsoft and several enterprise customers have promoted Copilot as a productivity multiplier that reduces routine tasks and helps surface information hidden in documents.

Use cases where suggestions are welcome

  • Non-creative drafting and summarization (meeting notes, email templates).
  • Data analysis in Excel and slide generation in PowerPoint.
  • Accessibility support for users with disabilities who benefit from automated transcription or alternative phrasing.
These scenarios underline that Copilot is not universally bad — its value depends on context and user control.

Risks and downsides Microsoft and others must grapple with

1. Default settings as a design choice

Making AI features default-on is a deliberate product choice with ethical implications. Defaults are powerful: they shape user behavior and can normalize automation that users might otherwise reject. Companies must weigh convenience against consent. The backlash over default-on Copilot shows that in domains with creative ownership, defaults matter more.

2. Homogenization of style

Widespread use of generative assistants risks stylistic homogenization. If many users accept similar “improvements,” public-facing text may converge toward a bland middle ground. That outcome is especially problematic for artistic and opinionated writing.

3. Errors, hallucinations, and liability

AI suggestions are not infallible. Copilot can hallucinate facts or introduce inaccuracies. For professional writing — legal documents, news reporting, or technical documentation — such errors carry outsized costs. Enterprise customers and individuals must be vigilant about validation.

4. Privacy and data usage concerns

Users want clarity about what content is sent to cloud services for model processing, and under what retention and sharing policies. Microsoft’s documentation describes privacy controls, but the complexity of corporate licensing and cloud processing raises questions about default settings and enterprise governance. Transparency and enterprise-grade controls are non-negotiable.

Practical recommendations for users and Microsoft

For users (writers, knowledge workers)

  • Verify your Office version and look for the Copilot settings in Options/Preferences; update Office if the toggle is missing.
  • If you value a strict “no-AI” environment, consider using document formats or editors that don’t ship with generative assistants enabled by default (offline editors, open-source suites, or older Office builds).
  • When using Copilot, treat suggestions as starting points, not authoritative fixes. Maintain a habit of careful review.

For Microsoft and other vendors

  • Adopt an explicit opt-in model for creative writing workflows where voice and craft matter; at minimum, offer a global, account-level toggle rather than only per-app, per-device settings. This reduces configuration friction.
  • Make the privacy model and telemetry easy to inspect and control, with plain-language explanations for what data is used to train models and what remains private.
  • Provide mode-aware behaviors: an unobtrusive “suggest only on request” mode for creative documents and a more proactive assistance mode for administrative tasks. Tailoring behavior by document type or template could be a middle path that respects context and reduces interruption; a sketch of what such a settings model could look like follows this list.
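To make the idea concrete, here is a small, purely hypothetical sketch of such a settings model. None of these names correspond to a real Microsoft API or schema; the point is only to show how one account-level default plus per-context overrides could collapse today’s scattered toggles into a single decision.

# Hypothetical sketch only: not a real Microsoft API or settings schema.
# Illustrates one account-level default with per-context overrides.
from dataclasses import dataclass, field
from enum import Enum

class AssistMode(Enum):
    OFF = "off"                # no AI features at all
    ON_REQUEST = "on_request"  # suggest only when the user asks
    PROACTIVE = "proactive"    # today's default-on behavior

@dataclass
class AssistantSettings:
    # One account-level default that follows the user across apps and devices.
    global_default: AssistMode = AssistMode.ON_REQUEST
    # Optional overrides keyed by document type or template name.
    overrides: dict = field(default_factory=dict)

    def mode_for(self, context: str) -> AssistMode:
        return self.overrides.get(context, self.global_default)

# Example: opt out for creative work, keep proactive help for admin tasks.
settings = AssistantSettings(
    overrides={"screenplay": AssistMode.OFF,
               "meeting_notes": AssistMode.PROACTIVE},
)
assert settings.mode_for("screenplay") is AssistMode.OFF
assert settings.mode_for("budget_memo") is AssistMode.ON_REQUEST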

The cultural dimension: resistance in Hollywood and beyond

Artists push back

Thompson’s comments join a string of creative professionals voicing skepticism about the encroachment of AI into artistic domains. Directors, writers, and guilds have raised concerns over training data provenance, replacement of creative labor, and attribution. The resistance is both practical and symbolic: artists are defending not just jobs but the integrity of storytelling.

Public sentiment and market reaction

Viral moments like Thompson’s have immediate PR effects: they raise consumer awareness, drive searches on how to disable features, and pressure companies to clarify controls. At the same time, large enterprise customers continue to adopt AI tools for efficiency, creating a bifurcated landscape where the technology is both embraced and contested.

What’s verifiable — and what’s rhetorical

  • Verifiable: Thompson’s remarks on Colbert, the existence of the Enable Copilot checkbox, and Microsoft’s documentation describing how to disable Copilot in Office desktop apps. These are documented in multiple outlets and Microsoft’s support pages.
  • Rhetorical or unverifiable: the notion that showing a screenwriting Oscar to Copilot would change its behavior is a comic aside and not a technical claim. Statements about Copilot’s “intentions” or feelings are metaphors; machines do not possess sentience, and such statements should be read as humor rather than empirical observation.

Conclusion: a simple design ethic with big implications

Emma Thompson’s outburst is blunt but instructive: the tech industry needs to stop treating user consent as an afterthought. When AI features affect creative workflows, opt-in, clear consent, and granular control matter. Microsoft’s addition of an Enable Copilot checkbox is a step in the right direction, but it falls short of the stronger, simpler defaults that many users expect — a single global control or an explicit opt-in for creative templates would be more humane.
The balance between augmentation and imposition will define the next phase of productivity tools. Companies can preserve the gains of AI-driven efficiency while respecting user agency, but only if design and policy choices privilege consent and context. For now, Thompson’s curt directive — essentially, “Let me write in peace” — is a useful reminder that technology should amplify human creativity, not crowd it out.

Source: theregister.com Dame Emma Thompson gives the 'AI revolution' both barrels
 
