Microsoft’s Copilot+ PC pitch promised a new class of Windows machines where on-device intelligence — powered by dedicated NPUs — would deliver privacy-friendly, instant AI features that change how we use a laptop every day. After a year of hands-on testing and watching Microsoft’s rollout, the reality looks far more mundane: some genuinely useful on-device capabilities, a raft of cloud-dependent features, and a branding/packaging mismatch that leaves many buyers underwhelmed.

Background / Overview

Microsoft first framed Copilot+ PCs as hardware that pairs modern silicon (NPUs included) with a Windows layer built to run local AI models and tight Copilot integration. The idea: offload certain AI tasks to a device NPU for faster responses and better privacy, while still using the cloud when heavier models are needed. Early marketing leaned heavily into that local/on-device promise, and OEMs shipped machines — including ARM-based Surface models and x86 Copilot+ variants — that emphasized battery life and efficiency as much as AI capability.
Yet Microsoft's execution and messaging created a gap between expectation and experience. Some flagship on-device features were delayed, privacy concerns popped up, and many of the features users expect from “AI Windows” still run in the cloud or behind a subscription paywall. This article parses what’s shipping now, where the technology genuinely adds value, and where Microsoft must course-correct to make Copilot+ PCs compelling beyond performance and battery life.

What Microsoft promised — and what actually shipped

The promise: instant, private, local AI

Microsoft’s public narrative framed Copilot+ PCs as machines that run meaningful AI locally — small LLMs and vision models that could, for example, let you search your recent activity, summarize on-device content, and automate tasks without sending sensitive data to the cloud.

The reality: a mixed bag

  • Some features do run locally and feel genuinely useful — notably the Settings agent powered by a small on-device model called Mu, which can translate natural-language queries into concrete settings changes or one-click actions. Microsoft documents the agent and Mu’s role in mapping natural-language input to Settings operations. (learn.microsoft.com, blogs.windows.com)
  • Other headline features either arrived late (or not at all initially), were scaled back, or depend heavily on cloud models and Microsoft accounts — e.g., generative tools in Photos/Designer, many image edits, and some Copilot capabilities that work best when connected to Microsoft’s cloud services. (support.microsoft.com, create.microsoft.com)
  • Recall — which captures periodic snapshots of the desktop so you can “search your past” — provoked privacy alarms, was delayed for security hardening, and remains controversial despite Microsoft’s later technical mitigations. Coverage of the initial backlash and Microsoft’s subsequent security and architectural changes is extensive. (theverge.com, blogs.windows.com)

Deep dive: the features that matter (and why they underdeliver)

Windows Recall — powerful, polarizing

Recall is a clear example of the tension between capability and acceptability. Conceptually, it’s compelling: local screenshots and an index let you search for what you saw earlier — a browser snippet, an image, a paragraph. In practice, the initial design stored snapshots unencrypted, prompting researchers to raise alarms and app makers to block or restrict Recall’s behavior. Microsoft paused the rollout, added encryption, hardened access with Windows Hello, and made Recall opt-in — but the reputational damage lingers. Brave, Signal, AdGuard and others implemented workarounds to prevent or restrict Recall from capturing sensitive content. Those privacy concerns are not hypothetical; they shaped the feature’s public reception.
Why this matters: a feature that by design records your screen needs ironclad privacy controls and transparent UI affordances. Microsoft ultimately implemented reasonable mitigations — encryption, filtering, opt-in defaults — but rebuilding trust is slower than shipping a patch. (blogs.windows.com, arstechnica.com)
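The mitigations described above — opt-in defaults, app-level opt-outs, and content filtering — can be pictured as a gate in front of every snapshot. The sketch below is a hypothetical illustration of that pattern only; the function, keyword list, and app names are invented for this example and do not reflect Microsoft's actual implementation.

```python
# Hypothetical sketch of Recall-style capture filtering: before a snapshot
# is stored, the foreground app and window title are checked against
# user-controlled exclusion rules. All names and rules here are illustrative.

SENSITIVE_KEYWORDS = {"private browsing", "incognito", "password"}
EXCLUDED_APPS = {"signal.exe", "brave.exe"}  # apps that opt out of capture

def should_capture(app_name: str, window_title: str, opted_in: bool) -> bool:
    """Return True only if a snapshot of this window may be stored."""
    if not opted_in:                        # capture is opt-in by default
        return False
    if app_name.lower() in EXCLUDED_APPS:   # app-level opt-out
        return False
    title = window_title.lower()
    if any(kw in title for kw in SENSITIVE_KEYWORDS):  # content filtering
        return False
    return True

print(should_capture("notepad.exe", "notes.txt - Notepad", opted_in=True))  # True
print(should_capture("brave.exe", "News - Brave", opted_in=True))           # False
```

The point of the pattern is that every exclusion happens before anything touches disk — which is why encryption of the store alone was never enough to settle the debate.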

Click to Do — clever UI, modest impact

Click to Do overlays the screen and offers action suggestions based on selected content: web searches, email composition, summarization, or handoff to another app. It’s a tidy productivity tool and can be useful for quick tasks. Importantly, Microsoft documents that Click to Do performs analysis locally on Copilot+ devices and shares content only when the user explicitly chooses an action. That local analysis is exactly the kind of low-latency, private interaction that justifies on-device compute — but in practice Click to Do often calls cloud services (e.g., Microsoft 365 Copilot) for richer capabilities, which undercuts the on-device-only value proposition. (support.microsoft.com, learn.microsoft.com)

Settings agent + Mu — the most persuasive on-device win

The most convincing example of local-only usefulness is the agent in Settings. It runs a small, optimized LLM (Mu) entirely on the NPU to translate casual queries like “My mouse pointer is too small” into a direct suggested action with a one-click fix. Microsoft’s blog and docs explain Mu’s architecture, quantization, and how the agent tightly maps natural language to concrete Windows APIs. The results are fast and private, and this is the clearest case where local compute delivers a frictionless UX that cloud latency wouldn’t improve. (blogs.windows.com, learn.microsoft.com)
Why it’s not enough: Mu is task-specific and narrow. It can make Settings easier, but it is not a catch-all assistant that can replace cloud models for heavyweight work like complex summarization or large-scale image generation.
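To make the agent's job concrete (not its actual architecture), here is a toy mapper that resolves a casual query to a one-click action. The real agent runs the Mu model on the NPU; this sketch merely scores keyword overlap against a small, hypothetical action catalog.

```python
# Illustrative sketch of mapping a natural-language query to a concrete,
# one-click Settings action, in the spirit of the Mu-backed agent described
# above. The catalog entries and matching logic are invented for this example.

ACTIONS = {
    "make the mouse pointer bigger": "Accessibility > Mouse pointer > Size",
    "turn on dark mode": "Personalization > Colors > Dark",
    "turn off notifications": "System > Notifications > Off",
}

def suggest_action(query: str) -> str:
    """Return the catalog action whose description best overlaps the query."""
    q_words = set(query.lower().split())
    def score(desc: str) -> int:
        return len(q_words & set(desc.split()))
    best = max(ACTIONS, key=score)
    return ACTIONS[best]

print(suggest_action("my mouse pointer is too small"))
# → Accessibility > Mouse pointer > Size
```

Even this crude matcher shows why the task suits a small local model: the output space is a closed set of Windows settings, so a compact, quantized model is enough — no cloud round trip required.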

Photos / Designer / Image Creator — cloud-first creativity

Microsoft has integrated image generation and generative editing into Photos via Designer and Image Creator tooling. These can generate or edit images from prompts, but they require a Microsoft account and often run on cloud models (e.g., DALL·E integrations via Designer and cloud-backed editing pipelines). This makes them less attractive as a unique on-device selling point: the same or better generation is available through cloud services or dedicated tools. (support.microsoft.com, create.microsoft.com)

Local NPUs vs GPUs vs the cloud: who wins?

What NPUs are good at

  • NPUs shine at efficient, low-latency inference for small models and specialized workloads.
  • They run lightweight LLMs and vision models with far lower power cost than using a CPU or full GPU, which is ideal for always-on, privacy-sensitive agents (e.g., settings agent or quick on-device visual analysis). Microsoft designed Mu and Phi Silica for that niche.
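One concrete way applications target this NPU-first, GPU-fallback split today is through ONNX Runtime execution providers. The provider names below are real ONNX Runtime identifiers (QNN for Qualcomm NPUs, DirectML for GPUs), but the selection helper and the reported list are illustrative, not a prescribed API pattern.

```python
# Sketch of how a Windows app might pick an inference backend with ONNX
# Runtime: prefer the NPU (QNNExecutionProvider), fall back to the GPU via
# DirectML, then the CPU. The policy function itself is invented for this
# example; the provider names are genuine ONNX Runtime identifiers.

PREFERENCE = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]

def choose_providers(available: list[str]) -> list[str]:
    """Order the available execution providers from most to least preferred."""
    return [p for p in PREFERENCE if p in available]

# On a Copilot+ PC, onnxruntime.get_available_providers() might report:
reported = ["CPUExecutionProvider", "QNNExecutionProvider"]
print(choose_providers(reported))  # NPU first, CPU as fallback
```

The resulting list would be passed as the `providers` argument when creating an `onnxruntime.InferenceSession`, which is exactly the kind of plumbing Microsoft needs third-party developers to adopt for NPUs to matter.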

Where GPUs still dominate

  • For large generative models (image generators, big LLMs, sustained high-throughput pipelines), GPUs remain the workhorse. High-end GPUs (NVIDIA with Tensor cores, AMD compute accelerators) provide the compute and memory bandwidth that NPUs — optimized for efficiency, not raw throughput — don’t match.
  • Many popular creative and pro apps rely on GPUs or cloud servers: DaVinci Resolve’s Neural Engine and other pro tools are tuned for GPU acceleration; NVIDIA Broadcast runs exclusively on NVIDIA RTX GPUs and uses Tensor cores; Adobe’s Generative Fill and Lightroom generative features leverage Firefly models hosted in the cloud. That ecosystem reality explains why so many AI workloads still prefer GPUs or the cloud. (linkedin.com, nvidia.cn, news.adobe.com)

The practical takeaway

NPUs are a valuable addition, but their usefulness depends on compelling NPU-only experiences. Without a broader set of features that meaningfully require on-device NPUs, most buyers will view Copilot+ machines as lavishly marketed laptops that happen to have a silicon feature they rarely use.

Ecosystem and third-party app support: the weak link

Third-party apps drive real-world value; hardware is only as useful as the software that leverages it.
  • Some third-party developers are moving to support on-device NPUs: Blackmagic Design demoed Snapdragon X Elite acceleration for DaVinci Resolve — a notable exception — but most widely used creative tools remain GPU-first or cloud-first. (linkedin.com, pcguide.com)
  • Topaz Labs explicitly noted that its applications didn’t use NPUs at the time of community discussion and relied on GPU/CPU acceleration instead, confirming that mainstream developer adoption of NPU acceleration is uneven.
  • NVIDIA’s Broadcast remains GPU-bound; for streamers and creators who depend on real-time virtual backgrounds, noise removal, and camera effects, an NPU-equipped laptop is not a substitute for an RTX GPU.
The bottom line: many of the apps people care about either can’t use a laptop NPU today or simply get better results from a GPU or cloud model. That reduces the perceived return on investment for a buyer focused on AI features.

Branding and pricing issues: Copilot vs Copilot+

Microsoft’s naming choices made the problem worse. “Copilot” now denotes a wide family of features — Copilot the cloud assistant, Copilot+ PCs as hardware, Copilot Pro as a subscription tier — and the overlap confuses consumers.
  • Copilot Pro is a distinct subscription at $20/month with perks like priority access and expanded usage. Crucially, it unlocks broader Copilot functionality across Microsoft 365 apps and raises usage allowances for Copilot features. The subscription model means that owning a Copilot+ PC does not automatically give you unlimited access to the best Copilot features.
  • The implication is clear: hardware purchasers are being asked to pay separately for premium cloud AI usage while being told local NPUs are the big differentiator — a mismatch between marketing and practical economics.
A smarter OEM bundle — for example, including a trial or a year of Copilot Pro with high-end Copilot+ purchases (similar to how phone vendors sometimes include cloud storage or premium subscriptions) — would reduce friction and demonstrate value while Microsoft fills out more on-device features.

Strengths: What Copilot+ PCs already do well

  • Battery life and efficiency: Arm-based Snapdragon X Elite chips and modern AI-focused Intel and AMD silicon deliver class-leading battery life and snappy day-to-day performance. Reviews of Copilot+ hardware often praise battery longevity and quiet operation — real, tangible benefits beyond AI marketing. (rtings.com, tech.yahoo.com)
  • Privacy-first engineering where applied: When Microsoft committed to local models and VBS enclaves for Recall and local model execution, it showed a design pattern that can protect sensitive data better than cloud alternatives — if users choose to enable those features and trust the implementation.
  • Meaningful on-device UX wins: The Settings agent (Mu) demonstrates how local models can remove friction from OS navigation. These micro-improvements matter to everyday productivity and accessibility, and they point toward a sensible use of on-device AI. (learn.microsoft.com, blogs.windows.com)

Risks and unanswered questions

  • Brand confusion and expectation mismatch: the Copilot/Copilot+/Copilot Pro naming sets up expectations of a single, seamless AI experience across hardware and cloud. Instead, users face fragmented access, subscription gates, and inconsistent rollouts.
  • Security surface area: Features that capture or analyze personal content (Recall, Click to Do) expand the attack surface. Microsoft hardened Recall, but the debate shows the risk: any system that stores or indexes user content — even locally — must be designed and audited to the highest standards. Several browsers and apps have already taken steps to block or mitigate Recall. (theverge.com, windowscentral.com)
  • Developer adoption: If third-party devs do not optimize for NPUs, local compute remains underutilized. Most pro creative apps and many AI workflows still expect GPUs or cloud compute; Microsoft must make it easier for developers to target NPUs or broaden Copilot+ compatibility to GPU-accelerated desktops. (community.topazlabs.com, linkedin.com)
  • Economic model misalignment: Charging separately for Copilot Pro while marketing Copilot+ hardware as the AI differentiator is a tough sell unless on-device features materially reduce the need for cloud credits or subscriptions. Microsoft should consider including Copilot Pro trials or bundling credits to demonstrate value.

What Microsoft should do next (concrete recommendations)

  • Recenter messaging around tangible benefits: sell Copilot+ hardware primarily on battery life, efficiency, and specific on-device features (the Settings agent, instant local accessibility tools), not vague promises of an AI revolution.
  • Expand and accelerate developer tooling: provide robust SDKs, optimized runtimes, and validation tools so independent developers can compile and deploy models to NPUs and GPUs with minimal friction.
  • Launch a temporary Copilot Pro bundling program: offer a one-year Copilot Pro subscription (or a large block of Microsoft 365 AI credits) with higher-end Copilot+ purchases to lower the adoption hurdle and prove the platform’s value.
  • Prioritize trust-building measures: continue independent security audits, publish privacy-preserving design docs, and create visible UI affordances that make it obvious when on-device capture and indexing are active.
  • Broaden hardware compatibility: avoid exclusive or semi-exclusive rollouts that limit feature parity across x86 and Arm platforms. If NPUs are key, enable equivalent GPU paths so desktops and gaming laptops can participate in the Copilot+ ecosystem.

The bottom line

Copilot+ PCs are not a dud — the hardware improvements, energy efficiency, and focused on-device experiences (notably the Settings agent powered by Mu) are real and valuable. But the “AI revolution” Microsoft promised is only partial: many of the sexiest AI features still live in the cloud, power-hungry workloads remain GPU- or server-bound, and marketing has set expectations the initial feature set couldn’t meet.
For buyers: a Copilot+ PC can be an excellent laptop — especially if you prioritize battery life and portability. But buy it for the machine, not for an immediate, transformative on-device AI experience. Microsoft can still deliver the vision, but it will take clearer messaging, developer momentum behind NPUs, improved third-party support, and smarter bundling of cloud services to turn Copilot+ from a promising category into a genuinely disruptive one. (blogs.windows.com, microsoft.com)

Microsoft’s Copilot+ story is now a work in progress: the building blocks are here — NPUs, compact models like Mu and Phi Silica, and a Windows layer that can invoke local intelligence where it makes sense — but the OS, partner ecosystem, and pricing strategy must evolve quickly before the promise of on-device AI becomes the kind of competitive advantage that drives mainstream PC upgrades.

Source: PCMag, “My Year With a Copilot+ PC: Where’s the AI Revolution Microsoft Promised?”
 
