Microsoft’s latest Canary‑channel experiment stitches AI into one of Windows’ oldest workflows: right‑clicking files. The reported Windows 11 Insider Preview Build 27938 surfaces a new AI actions entry in File Explorer’s context menu that lets you run visual search, blur or remove backgrounds, and perform generative erases on common image formats without opening a separate editor — and it pairs that with a returning Notification Center clock that displays seconds and a new Settings surface that lists apps using the OS’s generative AI capabilities. (theverge.com)
Background
Windows has spent the last two years moving AI from siloed apps into system surfaces: Copilot, OneDrive Copilot actions, and “Click to Do” experiments have shown Microsoft’s preference for surfacing intelligence where users already work. The File Explorer integration is a logical next step: the file system is the natural place to ask “do something” with a file. Multiple mainstream outlets and Insider hands‑on reports describe the same set of image‑focused quick actions and a roadmap to extend the concept to Microsoft 365 documents — though some document features will be gated by Copilot/Microsoft 365 licensing at first. (windowscentral.com) (theverge.com)
Microsoft’s official Flight Hub remains the authoritative source for build listings, and community reporting has tied these experiments to a Canary flight commonly reported as Build 27938. Flight Hub’s public list of active builds does not always reflect every community‑reported Canary number in real time, so treat a specific build label as community‑reported until confirmed. (learn.microsoft.com)
What’s in the Canary test: AI actions in File Explorer
The new right‑click menu item
When the feature is available for your device, right‑clicking a supported image will show an AI actions submenu. This menu groups several one‑click workflows that either launch an app with the edit staged, run a rapid model‑driven edit, or route the image to Bing Visual Search for web lookups. Reported image actions at introduction include:
- Bing Visual Search — use the image itself as the search query to identify landmarks, plants, products, or people, or to find visually similar images on the web. (windowscentral.com)
- Blur Background — opens the Photos app with automatic subject/background separation and tools to adjust blur intensity or refine areas with a brush (a conceptual sketch of this kind of separation follows this list).
- Erase Objects — invokes a generative erase flow (Photos) to remove unwanted elements in the scene.
- Remove Background — launches Paint’s automatic background removal to produce a one‑click subject cutout with no background.
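As a concrete illustration of the “separate the subject, then transform the background” idea behind Blur Background and Remove Background, here is a minimal conceptual sketch using OpenCV’s classic GrabCut segmentation. This is not Microsoft’s Photos or Paint pipeline (those presumably use learned segmentation models, possibly running on‑device); the filenames are placeholders.

```python
# Conceptual sketch only: NOT Microsoft's implementation. Illustrates
# subject/background separation followed by a background blur and recomposite.
import cv2
import numpy as np

img = cv2.imread("portrait.jpg")                      # placeholder input file
h, w = img.shape[:2]

# Seed GrabCut with a central rectangle as the probable subject region.
mask = np.zeros((h, w), np.uint8)
rect = (int(w * 0.15), int(h * 0.10), int(w * 0.70), int(h * 0.85))
bgd = np.zeros((1, 65), np.float64)
fgd = np.zeros((1, 65), np.float64)
cv2.grabCut(img, mask, rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)

# Foreground matte = definite + probable foreground labels, with a softened edge.
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype("float32")
fg = cv2.GaussianBlur(fg, (21, 21), 0)[..., None]

blurred = cv2.GaussianBlur(img, (51, 51), 0)          # the "blur intensity" knob
out = (img * fg + blurred * (1 - fg)).astype("uint8")
cv2.imwrite("portrait_blurred.jpg", out)
```

The same separate‑then‑composite structure underlies a one‑click cutout: swap the blurred background for transparency and you have the essence of a Remove Background operation.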
Document and OneDrive Copilot actions (roadmap)
Beyond images, Microsoft has also been adding Copilot actions to OneDrive and File Explorer for Microsoft 365 files stored in OneDrive: Summarize, Ask a question, Create an FAQ, and Compare up to five files. Those Copilot actions operate on Office formats, PDFs and other document types and are explicitly tied to Microsoft 365/Copilot entitlements at launch. The current Canary image actions are separate but part of the same strategic move: bring AI micro‑workflows into shell surfaces and the OneDrive activity center.
How the features work (practical detail)
- The Explorer submenu acts as a launcher: it either passes the file reference to a target app (e.g., Photos/Paint) with the edit preloaded, or it sends the image to Bing Visual Search and returns results directly (see the registry‑inspection sketch after this list for the general hand‑off pattern).
- Some operations may run locally on device hardware if your PC supports on‑device models (Copilot+ devices with NPUs), while others will fall back to cloud processing. The precise locality per action is not guaranteed in public docs and can vary by hardware and account entitlements. That hybrid model is important for privacy, latency and cost considerations.
- For document Copilot features, processing currently occurs in Microsoft’s cloud and is restricted to files in OneDrive when invoked through the OneDrive UI or File Explorer OneDrive submenu. This keeps heavier document analysis centralized but requires Microsoft 365/Copilot licensing for some operations.
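For readers who want to poke at the hand‑off mechanism themselves, the sketch below enumerates the classic per‑extension context‑menu verbs in the registry, which is the traditional way Explorer passes a file path (“%1”) to a target app. A caveat and an assumption: the new AI actions entry is most likely registered as a modern packaged/COM handler rather than a classic verb, so it may not appear here; this is an inspection aid, not a description of Microsoft’s implementation.

```python
# Enumerate classic shell verbs registered for an image extension (Windows only).
# Modern packaged handlers (likely including "AI actions") will not expose a
# classic command line here.
import winreg

def list_shell_verbs(extension=".jpg"):
    base = rf"SystemFileAssociations\{extension}\shell"
    try:
        with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, base) as shell:
            index = 0
            while True:
                try:
                    verb = winreg.EnumKey(shell, index)
                except OSError:
                    break                      # no more subkeys
                index += 1
                try:
                    # Default value of <verb>\command holds the launch command,
                    # usually with "%1" as the file-path placeholder.
                    command = winreg.QueryValue(shell, rf"{verb}\command")
                except OSError:
                    command = "(no classic command; likely a COM/packaged handler)"
                print(f"{verb}: {command}")
    except OSError:
        print(f"No classic shell verbs registered under {base}")

if __name__ == "__main__":
    list_shell_verbs(".jpg")
```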
Enabling and trying it today
- Join the Windows Insider Program (Canary, Dev or Beta as appropriate) and update Windows to the latest preview flight.
- If your build and device are eligible and the server‑side flag is enabled, right‑click a supported image in File Explorer to see AI actions (a quick build‑number check follows this list).
- For the returning Notification Center clock with seconds, go to Settings → Time & language → Date & time and toggle Show time in the Notification Center. If that option is not visible, the feature is still rolling out and may be gated; advanced users can enable internal flags with ViVeTool using feature IDs commonly circulated by the community. Use caution with ViVeTool. (pureinfotech.com)
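Before hunting for the submenu or the clock toggle, it can help to confirm which flight the device is actually running. The sketch below reads the installed build number from the registry; the 27938 threshold is the community‑reported figure discussed above, not an official gate, and server‑side flags can still hide the feature on eligible builds.

```python
# Read the installed Windows build number (and update revision) from the registry.
import winreg

def windows_build():
    key_path = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        build, _ = winreg.QueryValueEx(key, "CurrentBuildNumber")
        try:
            ubr, _ = winreg.QueryValueEx(key, "UBR")   # update build revision
        except OSError:
            ubr = 0
    return int(build), int(ubr)

build, ubr = windows_build()
print(f"Windows build {build}.{ubr}")
if build >= 27938:   # community-reported Canary flight, not an official threshold
    print("Canary-level build detected; AI actions may still be server-gated.")
else:
    print("Below the community-reported flight; AI actions are unlikely to appear.")
```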
The new privacy & visibility controls
Community reports indicate Build 27938 adds a Privacy & security → Text and image generation section in Settings that lists third‑party apps that have recently invoked Windows‑provided generative AI models. The intent is clear: provide visibility into which apps used OS‑provided generative capabilities and allow per‑app control over access. This is an early transparency step toward governance; enterprises and admins will need Group Policy/MDM controls and clearer audit logs for real manageability.
Reliability fixes and known issues in the Canary flight
The Canary flight reporting this work also ships a range of typical fixes and known regressions. Community summaries and Insider notes list:
- Fixes: “Reset this PC” reliability under Settings → System → Recovery; dark‑mode color issues for low‑space drive indicators in This PC; restored thumbnails for some video files with specific EXIF data; WMI Registry scanning performance improvements; Task Manager freeze fixes; and resolution of some green‑screen errors (ntoskrnl.exe CRITICAL_PROCESS_DIED) reported in earlier Canary builds.
- Known issues: installation rollbacks with error codes 0xC1900101‑0x20017 or 0xC1900101‑0x30017 for some Insiders; certain settings pages hanging when scanning temporary files; PIX on Windows unable to play GPU captures until a PIX update; audio device issues showing yellow exclamation in Device Manager; screen flicker in some browsers; and occasional UI inconsistencies tied to server‑side gating. Microsoft is actively investigating many of these.
Why this matters: productivity, discoverability and the shell
Bringing AI into File Explorer is a classic friction‑reduction play. These changes address three common pain points:
- Speed for micro‑tasks: small edits and lookups that previously required launching an app (Photos, Paint, a browser) are now a right‑click away.
- Context retention: users stay in File Explorer, preserving their mental flow while performing quick transformations or lookups.
- Discoverability: exposing AI options in the context menu puts capabilities in front of users who might not open Copilot or Photos to discover them.
Risks, caveats and unanswered questions
While the feature is promising, several important issues deserve scrutiny.
1) Local vs cloud processing and privacy
Microsoft’s hybrid approach — local on Copilot+ hardware vs cloud fallback — is not clearly documented per action. That matters for:
- Data residency and exposure: if an image or document is uploaded to cloud endpoints for processing, that changes the privacy calculus for sensitive content.
- Consent and transparency: the Settings surface listing recent generative AI activity is a start, but it’s not a full audit trail or proof of where the data was processed.
2) Enterprise manageability
Visibility alone isn’t enough for IT:
- Enterprises will need explicit Group Policy/MDM controls to enforce whether devices may use on‑device models or cloud endpoints for generative AI.
- Admins will require robust telemetry and auditability for compliance scenarios.
3) Feature gating, inconsistency and support burden
Canary builds are experimental and heavily server‑gated. That leads to:
- Inconsistent experiences across machines and rings.
- Increased support complexity for organizations testing or piloting the features.
- Potential regressions (installation rollbacks, driver issues) that can disrupt workflows.
4) Intellectual property and content integrity
AI edits like generative erase or background removal may modify image content in non‑obvious ways. Users should be aware that automated edits can alter meaning or remove context; for legal, journalistic or forensic use cases, preserving the original file should be the default. Built‑in “save as” workflows and easy access to originals will be important safeguards.
Recommendations for power users and IT teams
- Treat Canary builds as experimental: do not deploy on production devices without backups and rollback plans. Use virtual machines or test devices for hands‑on evaluation.
- Review the new Settings → Privacy & security → Text and image generation page regularly to understand which apps have invoked generative AI and to toggle app access where necessary.
- For organizations, request explicit MDM/Group Policy controls from Microsoft for:
- Disabling cloud processing for generative actions.
- Controlling which users/groups have access to AI actions in File Explorer.
- Enabling audit logging for generative AI invocations.
- If you need the Notification Center seconds clock but don’t see the toggle, wait for the staged rollout or use ViVeTool only if you understand the risks; community guides document the feature flags and ViVeTool commands that have exposed the toggle in earlier Insider tests. Use ViVeTool with care. (pureinfotech.com)
- Keep Photos and Paint updated from the Microsoft Store — the Explorer quick actions call those apps, and behavior depends on installed app versions (a version‑check sketch follows this list).
- For sensitive images, prefer manual edits in a controlled editor rather than automated cloud‑backed operations until locality and privacy guarantees are clarified.
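For the app‑version recommendation above, a small helper like the one below can report what is currently installed by shelling out to PowerShell’s Get-AppxPackage. The package names used (Microsoft.Windows.Photos, Microsoft.Paint) are the commonly seen Store identifiers and should be treated as assumptions; adjust them if your system reports different names.

```python
# Report installed versions of the Photos and Paint Store apps via PowerShell.
import subprocess

def appx_version(package_name: str) -> str:
    ps_command = f"(Get-AppxPackage -Name {package_name}).Version"
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", ps_command],
        capture_output=True, text=True, check=False,
    )
    version = result.stdout.strip()
    return version if version else "not found"

for name in ("Microsoft.Windows.Photos", "Microsoft.Paint"):
    print(f"{name}: {appx_version(name)}")
```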
How this fits into Microsoft’s larger AI strategy
This shell‑level integration is consistent with Microsoft’s approach of making AI a first‑class OS capability rather than an isolated app feature. The pattern is:
- Surface AI where users are already working (File Explorer, OneDrive, taskbar/Activity Center).
- Provide visibility and initial controls in Settings.
- Gate heavier document and enterprise features behind Copilot/Microsoft 365 entitlements and staged rollouts.
Final verdict — strengths and potential risks
- Strengths
- Productivity gains: one‑click edits and visual search remove friction from many everyday tasks.
- Discoverability: surfacing AI inside the familiar right‑click menu exposes capabilities to users who might not open Copilot or Photos on their own.
- Platform consistency: reusing Photos, Paint and Bing Visual Search leverages existing investments rather than duplicating efforts. (windowscentral.com)
- Risks
- Privacy and data locality ambiguity: the mixed local/cloud model needs clearer documentation and explicit enterprise controls.
- Canary instability and support overhead: early test builds can introduce regressions and inconsistent behavior across devices.
- Licensing and fragmentation: Copilot‑gated document actions create different experiences for commercial and consumer users, complicating IT planning.
Practical checklist for testing Build 27938 (or equivalent preview)
- Run the build on a non‑critical test device with a fresh backup image.
- Confirm whether the AI actions entry appears when right‑clicking a .jpg/.png in File Explorer.
- Test each image action (Visual Search, Blur Background, Erase Objects, Remove Background) and note whether processing appears local (fast, offline) or requires cloud connectivity (see the connection‑snapshot sketch after this checklist).
- Open Settings → Privacy & security → Text and image generation and observe recent app activity.
- Check Settings → Time & language → Date & time for Show time in the Notification Center and test collapsed/expanded flyouts.
- Monitor Device Manager, Task Manager, and Event Viewer for regressions (driver warnings, freezes, error codes).
- If evaluating Copilot/OneDrive document actions, test with a Microsoft 365 account that has Copilot entitlements and files stored in OneDrive.
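For the local‑versus‑cloud observation in the checklist, one crude but useful heuristic is to snapshot outbound network connections before and after triggering an action and diff them, as sketched below. This assumes the third‑party psutil package is installed, may need an elevated prompt for full visibility, and only suggests (never proves) where processing happened.

```python
# Heuristic probe: do new remote endpoints appear while an AI action runs?
# Requires: pip install psutil. Run elevated for the most complete view.
import time
import psutil

def remote_endpoints():
    endpoints = set()
    for conn in psutil.net_connections(kind="inet"):
        if conn.raddr:                       # connection has a remote side
            endpoints.add((conn.raddr.ip, conn.raddr.port))
    return endpoints

before = remote_endpoints()
input("Trigger the AI action in File Explorer now, then press Enter...")
time.sleep(2)                                # let any network activity settle
after = remote_endpoints()

new = after - before
if new:
    print("New remote endpoints observed during the action (possible cloud processing):")
    for ip, port in sorted(new):
        print(f"  {ip}:{port}")
else:
    print("No new remote endpoints observed; the action may have run on-device.")
```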
Conclusion
Embedding AI into File Explorer marks a pragmatic and user‑centered step: Microsoft is not forcing a new app; it’s adding micro‑workflows where users already manage files. For everyday users and content creators, the result will be faster edits and easier visual lookups. For enterprises and privacy‑sensitive users, the move raises clear questions about where processing happens and how to govern access.
The reported Build 27938 is a Canary‑channel experiment — promising, incremental, and still subject to change. Treat it as a preview of what could become a standard part of Windows’ productivity fabric, but expect additional policy controls, clearer locality guarantees, and staged rollouts before this functionality lands broadly and safely in production channels. (windowscentral.com) (theverge.com)
Source: Cyber Press Microsoft Adds AI-Powered Actions to Windows File Explorer