Copilot Comes to Windows 11 OOBE: Try Now During Setup Update

Microsoft has quietly added an interactive Copilot trial into the Windows 11 Out‑of‑Box Experience (OOBE) that can appear while the installer downloads and applies updates — letting new users “Try now” and open the full Copilot chat interface during setup without signing in to a Microsoft account.

Laptop displaying Windows 11 welcome screen alongside a Copilot “Try now” panel.

Background / Overview​

The Windows Out‑of‑Box Experience (OOBE) is the guided setup flow users see the first time a PC boots after installation. It handles language selection, region, Wi‑Fi, account configuration and, increasingly, promotional panels and onboarding nudges. When an internet connection is available during OOBE, Windows will often pause the visible flow to check for and apply installer‑time updates — these can include day‑one firmware patches, cumulative or servicing packages that bring the installer up to date, and feature enablement packages used to activate hidden functionality before first sign‑in.
Historically, OOBE has also been a place where Microsoft surfaces offers for OneDrive and Microsoft 365 trials; the new twist is that the company is now experimenting with an interactive Copilot sandbox during that same update pause. The feature has been reported and demonstrated in recent installs: screenshots and first‑hand descriptions show a Copilot panel with a visible “Try now” button, with the Windows installation progress visible beneath it.

What changed: Copilot inside OOBE update pauses​

The core change is simple but important in placement and intent: when the installer determines that OOBE must download and apply an update before completing setup, the OOBE UI can surface a Copilot trial affordance. Selecting “Try now” opens a Copilot chat pane inside the setup environment, providing the familiar chat interface users see on the desktop — text chat, generative image options where enabled, and the usual left‑hand feature icons — while the background update continues to install. This is not the default behavior during a normal, offline Windows install; it appears only in the specific context where OOBE has initiated an installer‑time update.
A few practical details reported from testers and early observers:
  • The Copilot session is available before sign‑in, because OOBE checks for updates prior to account setup. That means you can interact with Copilot without first creating or signing in to a Microsoft account.
  • The Copilot interface appears to be the same web‑wrapped Copilot app experience users encounter on the desktop; it supports conversation, prompts for image generation, and the basic Copilot toolset accessible in an anonymous session.
  • When the update completes, you can return to the rest of OOBE and finish setup; Copilot can remain open through the transition until you choose to back out and complete the device restart.
These changes are delivered through OOBE installer updates — the same mechanism Microsoft uses to push CloudExperienceHost and localized resource updates at setup time. Those packages are intentionally applied only when OOBE updates are enabled and the device has network access during first boot.

How the flow works (step‑by‑step)​

  • Start OOBE: language, keyboard, region and other initial prompts are shown as normal.
  • OOBE checks Windows Update for installer‑time packages; if one is found, OOBE begins the download and apply cycle.
  • While the installer shows progress for update download and installation, the OOBE UI may surface a small promotional/interactive panel that includes a “Try now” button for Copilot.
  • Clicking “Try now” opens the Copilot chat pane inside the OOBE session. The user can interact anonymously for that session.
  • The update continues in the background; when it finishes, OOBE resumes and the device completes the first sign‑in and restart sequence. Users can, if they choose, continue using Copilot after OOBE completes.
This flow emphasizes the captured attention window during updates: rather than showing a static progress spinner for several minutes, Microsoft can offer an interactive demonstration of Copilot while the system brings itself up to date.

Why Microsoft likely did this: product and commercial logic​

Embedding Copilot into OOBE during update pauses aligns with several predictable product and business goals:
  • Captured attention window: OOBE update downloads can take several minutes — sometimes significantly longer on constrained networks — creating a captive moment where new users have time to experiment. Offering an interactive trial in that window reduces discovery friction.
  • Low‑friction trial: Allowing an anonymous session without forcing an MSA sign‑in lowers barriers. Users can see Copilot’s capabilities without creating accounts or changing settings. That design choice increases the chance of a positive first impression.
  • Product discovery during onboarding: Demonstrating features during the earliest moments of ownership increases the chance users adopt Copilot regularly once they reach the desktop. Early exposure tends to flatten the learning curve and reduces later support load.
  • Marketing and ecosystem lock‑in: Copilot adoption benefits Microsoft’s broader ecosystem — cloud services, Microsoft 365 integrations and the Copilot+ hardware story. Nudging uptake at OOBE is a logical commercial lever.
  • Telemetry and signals: OOBE trials provide early feedback on usage patterns: how many users try Copilot during setup, what prompts they use, and whether the experience aids or harms overall onboarding metrics. Those telemetry signals are valuable for product iteration.
Taken together, these incentives make the change unsurprising: the onboarding moment has always been used to surface high‑value features and paid trials; Copilot is now a core platform story Microsoft wants to accelerate.

The user experience: what you’ll see and what you can do​

Reports and screenshots show OOBE’s Copilot trial offering behaves like a sandboxed Copilot session inside the OOBE UI. Key practical points for users:
  • No account required for the demo: Because updates are checked and applied before sign‑in, the demo can run without an MSA. That reduces friction but also raises privacy questions about anonymous cloud processing.
  • Familiar Copilot features: The chat pane includes the left‑hand feature icons, supports text conversation and image generation where enabled, and otherwise mirrors the desktop Copilot web wrapper in look and feel.
  • Installer progress remains visible: The update progress bar stays on the screen beneath the Copilot pane, so users can see how much longer the setup will take while they interact.
  • Session scope and persistence: The OOBE session is temporary. Without signing in, conversation history cannot be tied to a persistent user profile, although the server‑side handling of those sessions is not fully public. This suggests the demo experience is intended as a transient trial rather than a personalised Copilot experience.
Taken at face value, this is a lightweight way to let people experience Copilot early. But lightweight does not mean risk‑free.

Privacy, telemetry and data handling — critical analysis​

Embedding a cloud‑powered assistant into a pre‑sign‑in setup flow raises several privacy and telemetry questions that Microsoft has not fully detailed in public materials available to independent observers. The major concerns are straightforward:
  • What data is collected during anonymous sessions? Even if a user interacts anonymously, Copilot relies on cloud processing. It is unclear what transient logs, prompts or telemetry the service retains and for how long, whether any device identifiers are associated with that data, and whether the data is used to personalize subsequent sessions once a user signs in later. These are material questions because OOBE is a sensitive first‑use context where users may be less likely to scrutinize permissions.
  • Consent and comprehension: The OOBE flow has become dense with choices and offers. There is a risk that users — eager to finish setup — will click through the Copilot demo without reading how their interactions may be collected or used. That consent‑clarity problem is well documented for other onboarding nudges and remains relevant here.
  • Telemetry and retention policies: Public documentation about Copilot’s server‑side retention windows and the linkage between anonymous OOBE sessions and later account‑bound data is limited. Without explicit, easy‑to‑read disclosures, the anonymity of OOBE interactions cannot be guaranteed from a privacy auditing perspective. We flag those claims as currently unverifiable without Microsoft’s explicit disclosure.
  • Edge cases for sensitive content: OOBE occurs before a user configures disk encryption or enterprise controls. If a user pastes or asks about sensitive information in the Copilot demo — credentials, passkeys, or private documents — that content will be processed by the cloud. Good design would warn or block certain inputs in this context; early reports do not confirm such protections. This is an important risk vector to audit.
In short: the convenience of an accountless demo is real, but it must be matched with clear, accessible privacy disclosures and sensible server‑side retention policies to avoid eroding trust at the platform’s most sensitive moment.

Enterprise and deployment implications​

For IT administrators and device builders, the OOBE Copilot experiment intersects with already complex OOBE update behavior and provisioning logistics.
  • Installer‑time updates affect provisioning windows: Devices that download multi‑gigabyte OOBE packages during first boot can take substantially longer to reach the desktop. Organizations that image or stage large fleets must account for this additional time or else use caching/distribution points to avoid heavy internet egress. Microsoft’s documentation and community tests show that some OOBE patches include localized resource bundles and CloudExperienceHost updates, which are applied only when the installer has network access.
  • Autopilot and token expiry risks: Long OOBE update windows can interfere with zero‑touch provisioning flows where enrollment tokens expire. Admins should pilot and adjust token lifetimes and consider staged rollout planning.
  • Network and caching strategy: Large organizations should consider delivering OOBE packages via local distribution points, branch cache, or other WAN‑optimization techniques to reduce setup time and avoid providing a captive window where Copilot trials appear unpredictably across a fleet.
  • Policy control and gating: Enterprises that do not want Copilot surfaced at OOBE — either for privacy, bandwidth or policy reasons — will need Microsoft‑documented controls. As of the current reporting, distinct Copilot behaviors and entry points for managed vs consumer devices exist, but admins should verify and test the enforcement mechanisms in their environment.
These operational realities mean administrators cannot treat OOBE as purely cosmetic anymore; the setup flow is now a policy surface affecting security, enrollment and user privacy.

Historical comparison: Cortana and user sentiment​

Microsoft previously used OOBE as a surface to introduce assistants and services — notably Cortana appeared prominently in Windows 10’s setup and early‑use experiences. That earlier exercise was controversial and often disliked by users, though the scale of backlash varied. The new Copilot experiment is similar in spirit — introducing a built‑in assistant at the earliest moment — but different in that Copilot is a cloud‑backed, multimodal system with reach into many apps and services. Given the broader debate about AI on the desktop, it is likely that Copilot in OOBE will be polarizing: appreciated by users who want to try an assistant immediately, and disliked by those who view it as an upsell or an intrusive default.

Risks, trade‑offs and unanswered questions​

No single feature decision is purely beneficial. The OOBE Copilot integration surfaces several clear trade‑offs:
  • Longer perceived setup time vs engagement: Showing an interactive experience in the middle of an update may reduce perceived waiting time for some users, but it also formalizes a route to market inside a flow that used to be simple and brief. For users who want a quick, privacy‑minimal setup, this feels like unnecessary friction.
  • Privacy vs discovery: Anonymous demos improve trial uptake but risk collecting sensitive prompts during a context when users are least attentive to permissions. Without clear server‑side controls, retention windows and linkage policies, users may be uncomfortable.
  • Commercial nudging vs user autonomy: OOBE is increasingly a sales and discovery surface for subscriptions and services. Adding Copilot to that mix strengthens Microsoft’s ability to nudge users toward its assistant ecosystem — valuable from a business standpoint, but likely to fuel pushback from privacy advocates and some customer segments.
  • Operational unpredictability for admins: If Copilot trials appear unpredictably across hardware and locales depending on server flags and entitlement checks, admins must work harder to document and control the user experience during mass deployments.
Finally, several crucial technical and policy details remain unverified in public reporting: the exact telemetry schema for anonymous OOBE Copilot sessions, the server‑side retention period, and whether any device identifiers are associated with those sessions. These are material items Microsoft should clarify to satisfy privacy‑minded users and large purchasers.

Practical recommendations​

For consumers
  • If you prefer a minimal, private setup: perform OOBE offline (skip network connection) and apply updates after you reach the desktop. This prevents OOBE from downloading installer‑time packages and thereby avoids the Copilot demo during setup.
  • Read the on‑screen disclosures carefully before interacting with Copilot in OOBE. Treat it as a cloud demo — don't input passwords, passkeys, or other sensitive secrets.
For IT administrators
  • Pilot OOBE images with a small set of devices, observe how installer‑time packages behave on your network, and measure provisioning windows. Adjust Autopilot token lifetimes if necessary.
  • Use local distribution or caching to avoid heavy egress and to keep provisioning times predictable.
  • Validate whether your management policies (MDM, group policy, or provisioning tooling) can suppress or control Copilot exposure during OOBE if that is a requirement for your fleet.
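For teams that want a concrete starting point, below is a minimal, hedged sketch (Python, standard winreg module) for reading and setting the long‑standing “Turn off Windows Copilot” policy registry value. The key path and value name come from the published desktop policy; whether that policy also suppresses the OOBE trial panel is not publicly confirmed, so treat this as a pilot aid to verify rather than a guaranteed control.

```python
# Minimal sketch: inspect or set the long-standing "Turn off Windows Copilot"
# policy value. That policy targeted the desktop Copilot experience; whether it
# also gates the OOBE "Try now" panel is unverified -- validate in a pilot.
# Writing HKLM requires an elevated (administrator) process.
import winreg

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsCopilot"
POLICY_VALUE = "TurnOffWindowsCopilot"  # REG_DWORD: 1 = Copilot disabled by policy


def read_policy():
    """Return the current policy value, or None if the key/value is absent."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY) as key:
            value, _ = winreg.QueryValueEx(key, POLICY_VALUE)
            return value
    except FileNotFoundError:
        return None


def set_policy(disabled=True):
    """Create the policy key if needed and set TurnOffWindowsCopilot accordingly."""
    with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY) as key:
        winreg.SetValueEx(key, POLICY_VALUE, 0, winreg.REG_DWORD, int(disabled))


if __name__ == "__main__":
    print("Current TurnOffWindowsCopilot value:", read_policy())
```

In managed fleets the same setting would normally be delivered through Group Policy or MDM rather than a script; the sketch simply makes the pilot check reproducible.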
For privacy‑focused users and advocates
  • Ask Microsoft for explicit, easy‑to‑find documentation that explains what is collected by Copilot during anonymous OOBE sessions, how long the data is retained, and whether it is ever linked to a device or account. If those details are lacking, treat the feature as a potential privacy risk.

Final verdict: intriguing convenience, legitimate concerns​

Microsoft’s decision to offer a Copilot trial during OOBE updates is a clever product move: it exploits an idle moment to demonstrate a flagship feature, reduces trial friction, and increases the chance of adoption. For many users — especially those who want to explore AI on a new device — that convenience will feel helpful and immediate.
But the change also amplifies the existing tension in modern Windows setup flows: the balance between useful onboarding and commercialized, telemetry‑heavy first impressions. OOBE is not merely a tutorial; it sets defaults that cascade into privacy, security and long‑term habits. Any feature that leverages that moment must be accompanied by clear disclosures, configurable controls for enterprise customers, and robust, public privacy guarantees for anonymous sessions. Those safeguards are the missing pieces that will determine whether this experiment is perceived as thoughtful product design or an intrusive marketing nudge.
If Microsoft wants Copilot to win trust as a platform feature rather than a persistent annoyance, the company should:
  • Publish transparent privacy and telemetry documentation for OOBE Copilot demos;
  • Provide enterprise controls to suppress or manage Copilot exposure during provisioning; and
  • Make it easy for consumers to opt out or to run OOBE without network access if they prefer a minimal, privacy‑focused setup.
Until those items are explicit and easily discoverable, Copilot in OOBE will remain an intriguing but contested addition to Windows’ onboarding choreography.

In the end, the feature is a clear example of how platform owners are turning onboarding into a strategic surface: a place for discovery, trials and product nudges. That can be good for adoption and user education — but only if the company doing the nudging also meets users halfway with transparent policies and practical controls. The weeks and months ahead should tell whether Microsoft treats this as a helpful trial tool that respects user choice, or as another persistent nudge in an OOBE that many already find too crowded.

Source: Windows Latest Microsoft is using Windows 11 setup time to get you to try Copilot when you install certain updates
 

Microsoft has quietly started surfacing Copilot inside the Windows 11 Out‑of‑Box Experience (OOBE) so that new users can open and try the assistant while the installer downloads and applies updates.

Windows 11 install screen with a Copilot help card and privacy notice.

Background / Overview​

The Out‑of‑Box Experience (OOBE) is the guided setup flow you see the first time a PC boots after a fresh install or reset: language, region, keyboard, network, account choices and a handful of initial personalization and privacy prompts. Traditionally OOBE has also been a place Microsoft uses for onboarding nudges — trial offers, OneDrive prompts and quick feature callouts — but the recent change makes more active use of that captive moment: an interactive Copilot chat window offered while installer‑time updates are downloaded and applied.
Copilot itself has moved quickly from a sidebar experiment to a system‑level assistant in Windows. Over the past year Microsoft has placed Copilot entry points in search, the taskbar, voice wake words and companion apps, and the company has also marketed a hardware tier called Copilot+ PCs for on‑device AI acceleration. Those earlier moves make OOBE a logical place to try to accelerate discovery and engagement with the assistant.

What changed — the new OOBE Copilot experience​

  • During the OOBE update step — the phase where Windows checks for and applies installer‑time packages such as day‑one firmware or servicing updates — the setup UI may show a panel labeled “Copilot on Windows” with a “Try it now” button. Clicking that button launches a Copilot chat pane inside the setup session while the update continues in the background.
  • The Copilot session available in OOBE appears to be session‑scoped and anonymous, in the sense that the flow does not force a Microsoft account sign‑in before letting you interact. That lowers the barrier to trying the assistant immediately after boot. (windowscentral.com)
  • The embedded Copilot UI behaves like the familiar desktop Copilot wrapper: text chat, the basic toolset, and generative features where enabled (for example, image generation). When the OOBE update finishes you can return to the rest of setup and complete first sign‑in and restart.
The change is small from a technical perspective — it is an additional OOBE affordance surfaced during a known pause in the setup flow — but it is notable for what it signals about Microsoft’s promotional and product strategy: use onboarding windows as live demos for platform features.

Why Microsoft likely added Copilot to OOBE​

Embedding Copilot into OOBE offers several practical advantages.
  • Captured attention window: installer‑time updates can take several minutes or longer depending on network speed, creating a predictable idle period. An interactive demo in that moment is high‑value real estate for product discovery.
  • Low friction trial: allowing Copilot to run before an account sign‑in reduces friction for first impressions. Users who are curious can try the assistant without interrupting setup or creating accounts.
  • Faster feature adoption: users exposed to Copilot during setup may be more likely to use the assistant once they reach the desktop, reducing support friction and increasing engagement metrics Microsoft can measure. OOBE trials are also telemetry‑rich for seeing which prompts or features attract attention.
  • Marketing and ecosystem lock‑in: early exposure to Copilot strengthens the pathway to Microsoft’s broader services (Microsoft 365, OneDrive, cloud features) and to the Copilot+ hardware narrative that emphasizes on‑device AI.
These commercial and product incentives make the OOBE placement unsurprising. Still, because OOBE is a consent and configuration moment, placing interactive demos there is sensitive and invites scrutiny on privacy, consent and user autonomy.

Verifiable details and what we checked​

To make the essential claims concrete and verifiable:
  • The observable UI text and the flow — an OOBE panel offering "Try it now" that opens Copilot while updates download — has been reported and reproduced by independent testers and multiple outlets.
  • The session‑scoped, pre‑sign‑in availability (you can use Copilot in OOBE without signing into a Microsoft account) is explicitly described in initial reporting and in testers’ recollections. That is a material, user‑facing detail.
  • The Copilot experience in OOBE is delivered via the same update / OOBE update mechanism Microsoft has used to push localized and CloudExperienceHost packages; it appears only when OOBE initiates installer‑time updates. That technical delivery path explains why the affordance is not universal in all installs (e.g., offline setups) and why it can be gated via update server flags.
When a platform change like this appears, readers deserve confirmation from more than one independent source; the core claims above are corroborated by both mainstream reporting and community testing.

Privacy, telemetry and the open questions​

The most important unresolved items are not about the UI but about data handling and telemetry.
  • Anonymous does not mean "ephemeral" by default. Even if an OOBE Copilot session is available before sign‑in, Copilot processing runs in Microsoft’s cloud and is subject to server‑side retention and telemetry policies. Public reporting so far does not disclose the exact telemetry schema, retention period, or whether session metadata is ever linked to a device identifier or later account sign‑in. That omission is material.
  • Consent clarity at OOBE is a real concern. OOBE is a moment where users commonly tap through prompts to get to a usable device. Introducing an interactive, cloud‑backed assistant into that flow increases the chance that users will share content they consider private without full awareness of what is transmitted. Microsoft’s general Copilot documentation explains how the assistant works in normal sessions, but it does not yet provide a clear, OOBE‑specific disclosure page describing what is collected during anonymous setup interactions. That gap should be closed.
  • Enterprise and managed deployments need controls. For organizations that manage devices with custom images, Autopilot or other tooling, unpredictable Copilot affordances during OOBE can complicate compliance, bandwidth planning and image validation. There’s early guidance to pilot OOBE images, but as of current reporting there is no single, clearly documented toggle that universally suppresses Copilot during initial setup across all consumer and enterprise SKUs. IT teams should validate their deployment flows immediately.
Where reporting is incomplete or inconsistent, we’ve flagged it as an unanswered question. Microsoft should publish a short, accessible OOBE Copilot privacy and telemetry FAQ aimed at end users and a separate, detailed control matrix for enterprise administrators. Until it does, treat anonymous OOBE Copilot sessions as cloud‑processed interactions governed by Microsoft’s existing Copilot telemetry rules.

Practical recommendations — consumers​

If you want to avoid encountering Copilot during setup, or you want to minimize data sent during the initial device configuration, follow these steps:
  • Set up offline. During OOBE, avoid connecting the PC to the network (choose “I don’t have internet” or physically unplug/disable Ethernet and Wi‑Fi) and complete setup locally. Apply updates from Windows Update after you reach the desktop, when you can review prompts deliberately.
  • Don’t input secrets. If you use Copilot in OOBE, avoid entering passwords, passkeys, account recovery codes, or other sensitive secrets while the session is anonymous and before you have reviewed privacy options. Treat the OOBE session as a public demo.
  • Read the on‑screen disclosures. If the OOBE Copilot panel appears, read any available notice about data use before initiating a session. If the notice is unclear, decline interaction and finish the install offline or after signing in to an account you control.
  • If you’re privacy‑minded, consider a clean install using an older, offline installer (for example an older ISO) and then apply updates under controlled conditions. This removes surprises during first boot. Community tools that customize OOBE behavior (discussed below) may also be used with caution, and once you reach the desktop you can inventory what Copilot components arrived with the image (a hedged sketch follows after this list).
These are practical, defensive moves for consumers who prefer to keep the first‑boot environment minimal and predictable.
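As a small complement to the list above, here is a hedged Python sketch that inventories Copilot‑related Store packages once you reach the desktop, by calling the standard Get-AppxPackage cmdlet. It assumes Windows PowerShell is on PATH (the default on Windows 11) and only reports what is installed; it removes nothing.

```python
# Hedged sketch: enumerate installed Appx packages whose name contains
# "Copilot" so you can decide what to keep, remove, or document.
# Assumes Windows PowerShell is available on PATH (default on Windows 11).
import subprocess


def list_copilot_packages():
    """Return the full package names of installed Appx packages matching *Copilot*."""
    result = subprocess.run(
        [
            "powershell", "-NoProfile", "-Command",
            "Get-AppxPackage | Where-Object Name -like '*Copilot*' | "
            "Select-Object -ExpandProperty PackageFullName",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    return [line.strip() for line in result.stdout.splitlines() if line.strip()]


if __name__ == "__main__":
    packages = list_copilot_packages()
    if packages:
        for pkg in packages:
            print(pkg)
    else:
        print("No Copilot-related Appx packages found for this user.")
```

Knowing exactly which packages arrived with the image makes later decisions (keep, remove, or document) easier to justify.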

Practical recommendations — IT administrators and power users​

For managed environments, Copilot appearing unpredictably in OOBE is operationally meaningful. Consider the following steps:
  • Pilot extensively. Test the latest images and OOBE flows on representative hardware and network conditions. Confirm whether installer‑time packages trigger Copilot affordances in your environment and measure provisioning time and bandwidth impact.
  • Use local caching and distribution. Reduce egress and provisioning time by staging OOBE packages through local distribution points or branch cache. This reduces the likelihood that Copilot trials surface unpredictably and keeps provisioning predictable.
  • Validate Autopilot and MDM settings. Ensure that Autopilot token lifetimes, enrollment flows and MDM policies are consistent with your organization’s preference for surfacing or suppressing assistant features at setup. If necessary, contact your Microsoft account rep for official guidance or enterprise controls.
  • Communicate with end users. Tell new device recipients what to expect during first boot and whether interactive demos (such as Copilot) might appear. Provide a checklist so users know the safe way to proceed if they prefer a local‑only setup.
Enterprises will want a documented, reproducible OOBE path that matches their security posture; assume the default consumer flow is account‑first unless you explicitly control it via tooling or policy.

Could Copilot eventually run setup itself? Scenarios (and risks)​

Industry observers naturally speculate that embedding Copilot in OOBE could be a step toward letting an assistant guide or even perform setup tasks on behalf of users. There are a few realistic possibilities and some clear limits:
  • The near term (plausible): Copilot becomes a conversational helper inside OOBE that answers questions and suggests settings (for privacy, OneDrive, Windows Hello), but human confirmation remains required for any changes. That helps users unfamiliar with both Windows and AI. This is consistent with Microsoft’s current agent‑support model.
  • The medium term (possible): Copilot could offer to preselect or apply recommended configurations via explicit user consent — for example, "Set up as a work profile with BitLocker and require Windows Hello", then execute those steps on user approval. This introduces automation trade‑offs and raises consent, audit and rollback questions.
  • The long term (speculative): An “agentic” Copilot that fully orchestrates provisioning (device configuration, account creation, application installation, privacy toggles) with minimal prompting would require robust enterprise controls, deterministic rollback, auditable logs and high trust in the assistant’s behavior. That level of automation is attractive but it is also a material change in who controls device identity and settings, so expect heavy scrutiny and regulatory interest before it becomes mainstream. This is speculation and should be treated as such; current public reporting does not show Microsoft shipping a fully agentic OOBE controller.
Whichever direction Microsoft chooses, the engineering and policy work to make assistant‑run provisioning safe and auditable will be significant. Administrators and regulators will rightly demand clear opt‑outs, logging and rollbacks before trusting a helper to reconfigure enterprise endpoints at scale.

Community reaction and the precedent of onboarding nudges​

This change does not arrive in a vacuum. OOBE has been used by Microsoft and OEMs for years to introduce services — sometimes to the annoyance of experienced users — and Copilot’s placement echoes earlier contentious moments (for example, the Cortana promotional surfaces in prior Windows versions and the increasing “account‑first” nudges in recent Windows 11 preview builds). Enthusiast communities and privacy advocates have been vocally resistant to forced account sign‑ins and to the steady increase of promotional prompts during setup.

Those communities have produced practical responses: tools and projects aimed at reclaiming control of OOBE or removing bundled AI features (community packages such as Flyoobe and other customization utilities), and widely circulated workarounds for local account creation. These projects reflect demand for clearer user controls and a more transparent onboarding experience. At the same time, they highlight a tension: users who modify or disable components may lose official support or on‑device AI capabilities.

What to do if Copilot appears during your OOBE​

  • If Copilot appears and you want to test it: skim any privacy notice, avoid entering secrets, and close the session before completing OOBE if you want to finish setup offline.
  • If you want to avoid it: disconnect the network during OOBE and proceed with local setup; apply updates after you reach the desktop, when you can review the choices deliberately.
  • If you manage fleets: pilot images, use local caching for OOBE packages, and confirm Autopilot/MDM policies can enforce your organization’s OOBE preferences.
  • If you care about telemetry: ask Microsoft (via support or enterprise channels) for explicit OOBE Copilot telemetry documentation — what is collected in anonymous sessions, how long it’s retained, and whether it is ever linked to device or account identifiers. That documentation is not yet visible in public reporting and is essential for risk assessments.

The editorial verdict — convenience versus consent​

Putting a friendly, interactive assistant in the one place every user sees—the first‑time setup—was inevitable once Copilot matured into a system‑level feature. It is clever product design: demonstration at a captive moment reduces discovery friction and can deliver immediate value. For users who want to sample an assistant right away, this is a plus.
But OOBE is also a consent moment, and the trade‑offs matter. Without clear, OOBE‑specific disclosures and enterprise controls, this kind of placement risks being perceived as an intrusive upsell or as a privacy hazard for people who are not fully attentive during first boot. The balance here is delicate: earned trust through transparent controls or fast adoption via subtle nudges. Microsoft needs to choose the former to avoid long‑running friction with privacy‑minded users and IT professionals.

Final thoughts and what to watch next​

  • Expect this Copilot OOBE affordance to roll out variably (it depends on whether OOBE downloads installer‑time updates and on server‑side flags), so it will appear on some installs and not others.
  • Watch for Microsoft to publish OOBE‑specific Copilot privacy and telemetry guidance; that documentation will materially change how comfortable privacy advocates and enterprises feel about the change.
  • If you manage devices, treat OOBE as part of the policy surface you document and communicate. Use local caching where possible to reduce unintended exposure and to make provisioning predictable.
  • Finally, view this change as part of a broader pattern: Microsoft is systematically embedding Copilot across Windows entry points. That’s sensible for product growth, but it also demands stronger transparency and controls if the company wants the assistant to be embraced rather than resisted.
Copilot in OOBE is a small UI change with outsized implications — for discovery, for privacy, and for who controls the first moments of a Windows PC’s life. The right next steps are straightforward: clearer disclosure, robust enterprise controls, and auditability. Until those arrive, cautious users and administrators should plan their setups accordingly.

Source: XDA Microsoft is adding Copilot to the Windows 11 setup process, proving nothing is sacred
 
