
Microsoft’s Edge is quietly evolving from a Chromium browser with an AI sidebar into a permissioned, agentic browsing surface—and recent test code and community reporting suggest the company is now experimenting with giving Copilot direct access to a user’s Edge profile so the assistant can act as the user inside the browser.
Background / Overview
Since the launch of Copilot Mode and the incremental rollout of Copilot features inside Edge, Microsoft has repeatedly emphasized that the assistant is opt‑in and permissioned—but also that it will be able to do more than answer questions, moving toward automating multi‑step tasks such as price monitoring, bookings, and form fills. Microsoft’s public Copilot Mode pages describe a new tab experience, voice navigation, and task handoff scenarios where Copilot assists with bookings and comparisons when users permit it.
At the same time, a raft of agentic competitors—most notably Perplexity’s Comet browser—has pushed the market toward making agent actions central to the browsing experience. Perplexity opened Comet to a broad audience in early October 2025, turning its AI‑first, sidecar‑assistant browser into a freely available alternative to Chrome and Edge. That competitive pressure is one context for Microsoft’s accelerated experiments in Edge.
Insider and community discoveries over recent weeks show three converging developments inside Edge’s test builds:
- A hidden toggle labelled “Browser Actions” that appears to give Copilot permission to browse using the current Edge profile.
- An Edge feature called Journeys that summarizes recent browsing activity and surfaces jump‑back cards on the New Tab Page.
- Expanded Copilot “Actions” that let the assistant navigate pages, click, and fill forms under explicit consent—capabilities Microsoft has publicly described for the feature set it calls Actions.
What the “Browser Actions” toggle appears to do
The visible evidence
Test screenshots and UI strings discovered in Edge Canary builds show a toggle in Copilot’s privacy settings labelled “Browser Actions” with language that says Copilot can “browse the web and complete tasks using your Edge profile.” That phrasing ties the functionality not just to reading pages or summaries, but to operating within the same profile context—meaning saved passwords, signed‑in sessions, cookies, and browsing history would be available to the agent while it performs actions.
Microsoft’s official policy and support pages already disclose that Copilot can use page content and other contextual clues when the user allows it, and that enterprises can manage access via Group Policy/MDM policies. The Edge policy set includes settings such as CopilotPageContext and EdgeCopilotEnabled that let administrators enable, disable, or force behavior, confirming that Microsoft intends to make Copilot’s page‑access capabilities controllable at scale.
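For administrators curious what these controls look like once applied, here is a minimal sketch, assuming the policy names reported above (EdgeCopilotEnabled, CopilotPageContext) are exposed as DWORD values under the standard machine‑wide Edge policy registry path; exact names, accepted values, and semantics should be confirmed against Microsoft’s published Edge policy documentation before relying on this.

```python
# Sketch: applying Edge Copilot policies on a single Windows machine via the
# registry path that Chromium-based Edge reads for machine-wide policies.
# Policy names (EdgeCopilotEnabled, CopilotPageContext) are the ones reported
# above; confirm exact names and value types against Microsoft's Edge policy
# documentation before using this in production. Requires admin rights.
import winreg

EDGE_POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Edge"

def set_edge_policy(name: str, value: int) -> None:
    """Write a DWORD policy value under the machine-wide Edge policy key."""
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, EDGE_POLICY_KEY,
                            0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

if __name__ == "__main__":
    set_edge_policy("EdgeCopilotEnabled", 0)   # 0 = disable Copilot in Edge
    set_edge_policy("CopilotPageContext", 0)   # 0 = block page-context access
```

In a managed environment these values would normally be delivered through Group Policy or Intune rather than written directly; the script only illustrates what the applied policy looks like on a device.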
How it would work in practice (likely behavior)
Based on UI strings and how similar agent systems operate, when the toggle is enabled Copilot would run its browsing actions in the live environment of the signed‑in Edge profile rather than in a remote “virtual browser” that starts fresh each time. Practically, that means the following (a hypothetical scope model is sketched after this list):
- Sites where the profile is already signed in would open in that state, so Copilot could perform actions that require authentication (e.g., check award‑member pricing or access a saved itinerary), stopping short of bypassing authentication flows that require additional codes or MFA.
- Copilot would be able to open tabs, click buttons, follow links, and fill fields using the saved autofill or password manager entries in that profile—again, only when the user grants permission to do so for a given action.
- Visible indicators are likely to appear while Copilot acts, with action scopes tied to the conversation in which they were requested (Microsoft has emphasized visible cues for Copilot actions in Copilot Mode).
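To make “conversation‑tied scopes” concrete, here is a purely hypothetical sketch of how such a grant could be modeled. None of these class or field names come from Edge; the MFA rule simply encodes Microsoft’s stated guardrail that login codes always require the human user.

```python
# Hypothetical sketch (not Microsoft's implementation): how a "Browser Actions"
# grant could be scoped to a single conversation and to explicit, per-action
# user consent, with a hard stop whenever a site demands MFA or a login code.
# All class and field names here are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class ActionGrant:
    conversation_id: str          # grant is tied to one Copilot conversation
    origin: str                   # e.g. "https://www.example-airline.com"
    capability: str               # "read_only", "suggest", or "perform"
    expires_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc) + timedelta(minutes=15))

def may_perform(grant: ActionGrant, conversation_id: str, origin: str,
                requires_mfa: bool) -> bool:
    """Return True only if the requested action stays inside the granted scope."""
    if requires_mfa:
        return False  # MFA / login codes always go back to the human user
    return (grant.conversation_id == conversation_id
            and grant.origin == origin
            and grant.capability == "perform"
            and datetime.now(timezone.utc) < grant.expires_at)
```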
Journeys: summarizing history and the “7‑day” jumpstart claim
What Journeys promises
Journeys is a UI and feature idea that turns a short window of browsing activity into a handoffable snapshot: cards on the New Tab Page that represent recent tasks or projects, with one‑click resumption and an AI‑generated summary of the work done so far. WindowsLatest and other observers found a help/documentation snippet (later removed) indicating Journeys may use a short window of browsing history to “jumpstart” the feature. The reported line read that Microsoft “may use your past seven days of browsing activity (excluding page content) to jumpstart your Journeys experience.”
Why that matters
Turning browsing history into summarized “journeys” is useful—from resuming travel research to picking up multi‑site shopping and comparison work. But it also raises two immediate concerns (a rough sketch of the kind of data involved follows this list):
- Scope and granularity: If Journeys reads only URLs, titles, and timestamps (as claimed in the snippet), that’s less invasive than reading page content. Still, URLs and titles can reveal highly sensitive signals (medical, legal, financial searches).
- Retention and control: The UI text suggested a first‑use import window; a short retention window (seven days) is less risky than long‑term indexing, but the feature’s defaults, any opt‑outs, and where the data is stored (locally vs cloud) change the privacy calculus dramatically. Microsoft’s public support pages repeatedly say Copilot won’t use personal browsing data for training and that users can toggle context access, but Journeys’ precise storage and processing architecture has not been fully documented in official guidance available at time of reporting.
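As a rough illustration of the data shape the snippet describes (URLs, titles, and timestamps only, over a seven‑day window), the following sketch reads a copy of Edge’s local History database, which uses the standard Chromium schema, and groups visits by domain. This is an assumption‑laden stand‑in, not how Journeys is implemented, and the default profile path may differ on a given machine.

```python
# Rough illustration of the kind of data the reported Journeys snippet
# describes (URLs, titles, timestamps only, no page content) over a 7-day
# window. Assumes Edge's default profile path and the standard Chromium
# "History" SQLite schema; copy the file first, since Edge locks it while
# running. This is not how Journeys itself is implemented.
import os
import shutil
import sqlite3
from collections import defaultdict
from datetime import datetime, timedelta, timezone
from urllib.parse import urlparse

HISTORY = os.path.expandvars(
    r"%LOCALAPPDATA%\Microsoft\Edge\User Data\Default\History")
WEBKIT_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)  # Chromium timestamp base

def recent_history(days: int = 7):
    snapshot = shutil.copy(HISTORY, "History.copy")   # avoid the live file lock
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    cutoff_us = int((cutoff - WEBKIT_EPOCH).total_seconds() * 1_000_000)
    with sqlite3.connect(snapshot) as db:
        return db.execute(
            "SELECT url, title, last_visit_time FROM urls "
            "WHERE last_visit_time > ? ORDER BY last_visit_time DESC",
            (cutoff_us,)).fetchall()

def group_by_domain(rows):
    """Group visits by domain -- a crude stand-in for a 'journey'."""
    journeys = defaultdict(list)
    for url, title, ts in rows:
        when = WEBKIT_EPOCH + timedelta(microseconds=ts)
        journeys[urlparse(url).netloc].append((title, when))
    return journeys

if __name__ == "__main__":
    for domain, visits in group_by_domain(recent_history()).items():
        print(f"{domain}: {len(visits)} visits, latest {visits[0][1]:%Y-%m-%d %H:%M}")
```

Even this crude grouping shows why the “excluding page content” caveat only partly reduces risk: domains, titles, and timestamps alone can reconstruct a sensitive research trail.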
Unverified and deleted material — treat with caution
Because part of the Journeys claim comes from a now‑deleted Microsoft support snippet seen by reporters, it must be treated cautiously. The removed documentation and community screenshots are strong signals that Journeys exists and is being trialed, but any precise numbers or guarantees referenced in third‑party reporting should be verified against Microsoft’s public docs or policy pages once Microsoft publishes final guidance. For now, Journeys is best described as under test with claimed local‑only processing and opt‑in controls, not as a shipped, auditable feature.
Technical and security analysis: capabilities, limits, and attack surface
What Copilot can and cannot do (based on public descriptions)
Microsoft has been explicit on two guardrails:
- Copilot cannot bypass native Windows security or MFA; if a site requires a login code, human intervention is required.
- Copilot’s actions are supposed to be tied to the conversation/session in which they were requested and visible to the user, to allow interruption or rollback.
New attack classes
Agentic browsing introduces specialized risks that security teams and browser engineers must treat seriously (a minimal mitigation sketch follows this list):
- Prompt injection through web content: If an agent interprets page content as instructions, malicious pages could coerce the agent into unintended actions. Security researchers have already demonstrated such attacks against early agentic browsers. Mitigation requires origin‑bound permissions, replayable action logs, and conservative action scopes (read‑only vs suggest vs perform).
- Credential misuse and session hijacking: Even without bypassing MFA, an agent with profile access could interact with sites to trigger unintended transactions or data exposures if the agent is tricked or misdirected.
- Exfiltration via automated workflows: Agents that can save or export data—especially if they connect to cloud connectors—could exfiltrate sensitive material if connectors or permission checks are misapplied.
- Auditing and non‑repudiation gaps: Without persistent, user‑readable action logs, it’s difficult to determine what the agent did and whether to roll back or dispute an automated action.
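The mitigations named above (origin‑bound permissions, conservative action scopes, replayable logs) can be condensed into a small policy‑layer sketch. Everything here is illustrative: the class, scope names, and log format are assumptions, not Edge internals.

```python
# Minimal sketch of the mitigations named above, under the assumption that the
# agent mediates every action through a policy layer: page text is treated as
# untrusted data (never as instructions), actions are bound to an origin
# allowlist and a conservative scope, and every decision lands in an
# append-only, replayable log. Names are illustrative, not Edge internals.
import json
import time

SCOPES = ("read_only", "suggest", "perform")

class ActionPolicy:
    def __init__(self, allowed_origins: set[str], max_scope: str = "suggest",
                 log_path: str = "agent_actions.log"):
        self.allowed_origins = allowed_origins
        self.max_scope = max_scope
        self.log_path = log_path

    def authorize(self, origin: str, scope: str, action: str,
                  instruction_source: str) -> bool:
        allowed = (
            instruction_source == "user"            # never obey page content
            and origin in self.allowed_origins      # origin-bound permission
            and SCOPES.index(scope) <= SCOPES.index(self.max_scope))
        self._log(origin, scope, action, instruction_source, allowed)
        return allowed

    def _log(self, origin, scope, action, source, allowed):
        entry = {"ts": time.time(), "origin": origin, "scope": scope,
                 "action": action, "instruction_source": source,
                 "allowed": allowed}
        with open(self.log_path, "a", encoding="utf-8") as fh:
            fh.write(json.dumps(entry) + "\n")   # append-only, replayable

# Example: an instruction embedded in page content is refused even on an
# allowlisted origin, and the refusal is still recorded for auditing.
policy = ActionPolicy({"https://travel.example.com"}, max_scope="perform")
policy.authorize("https://travel.example.com", "perform",
                 "click 'Confirm booking'", instruction_source="page_content")
```

The key design choice is that only the user’s own prompt can originate commands; anything found on a page is data, and every decision, allowed or refused, is logged so it can be replayed during an investigation.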
Enterprise controls and admin guidance
Microsoft’s Group Policy and MDM options for Copilot exist precisely because enterprises must control these risks. Policies can:
- Disable Copilot entirely for managed devices.
- Restrict Copilot’s access to page context and profile data.
- Deploy per‑profile or per‑tenant rules that prevent Copilot from accessing sensitive sites protected by DLP.
Before any broad rollout, IT teams should also evaluate:
- How Copilot logs actions and whether those logs integrate with SIEM/EDR.
- Whether Copilot’s scope adheres to corporate DLP and SSO policies.
- How user prompts and agent decisions are surfaced to auditors.
Competition and the broader market: Perplexity’s Comet and the AI browser race
Perplexity’s Comet has made agentic browsing a mainstream story by packaging Perplexity’s answer engine and task automation inside a Chromium‑based shell and releasing it broadly in October 2025. Comet’s sidecar assistant emphasizes citation grounding and agentic workflows—summarize, compare, and act—and the product’s rapid, free distribution is a direct counter to incumbent strategies that prefer to retrofit agents into existing browsers. Perplexity’s move forces incumbents to accelerate feature design and governance.
Microsoft’s commercial advantage is distribution and enterprise management: Edge ships with Windows devices, integrates with Microsoft 365, and supports IT controls that matter in regulated environments. Perplexity’s Comet wins on a more aggressive product stance—agentic by default—but that stance naturally demands more transparent, auditable safety controls. The two paths—retrofit with guardrails (Edge) and agentic default (Comet)—represent different risk and product trade‑offs that will be decided by adoption, regulation, and high‑profile security incidents.
Practical guidance for power users and IT teams
- For individuals: Keep Copilot features opt‑in and review Copilot settings under Settings > Copilot and sidebar. If a feature mentions “use profile” or “browser actions,” assume it will have access to signed‑in sessions and adjust privacy preferences accordingly. Microsoft’s support pages document toggles for disabling page context and personalization that privacy‑conscious users should review.
- For administrators (recommended rollout checklist):
- Run a pilot with a small set of users and monitor agent activity logs.
- Configure Edge policies (EdgeCopilotEnabled, CopilotPageContext) to define default behavior in the org.
- Integrate audit logs with SIEM and require human confirmation for transaction‑critical flows (finance, HR, legal); a minimal confirmation‑gate sketch follows this checklist.
- Use DLP and site whitelisting to restrict agentic actions to known good domains.
- On feature economics: If Journeys or other features move behind Copilot Pro or a $20/month paywall as early reporting suggested, IT procurement should consider whether those paid features introduce further compliance obligations (e.g., data residency or contractual guarantees not covered by free offerings).
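The confirmation‑gate item in the checklist can be sketched as a simple allowlist‑plus‑category check. Domains, categories, and the ask_user callback are placeholders; a real deployment would rely on the organization’s DLP and web‑categorization tooling rather than string matching.

```python
# Hedged sketch of the "human confirmation for transaction-critical flows"
# checklist item: allow agentic actions only on an approved domain list, and
# force a human approval step for finance/HR/legal categories. Domain lists
# and the category logic below are placeholders, not real policy.
from urllib.parse import urlparse

APPROVED_DOMAINS = {"payroll.contoso.example", "travel.contoso.example"}
CONFIRMATION_REQUIRED = {"finance", "hr", "legal"}

def categorize(domain: str) -> str:
    # Placeholder: a real deployment would use the org's DLP / web categories.
    if "payroll" in domain or "bank" in domain:
        return "finance"
    return "general"

def gate_agent_action(url: str, ask_user) -> bool:
    """Return True if the agent may proceed on this URL."""
    domain = urlparse(url).netloc
    if domain not in APPROVED_DOMAINS:
        return False                                  # allowlist, not blocklist
    if categorize(domain) in CONFIRMATION_REQUIRED:
        return ask_user(f"Copilot wants to act on {domain}. Approve?")
    return True

if __name__ == "__main__":
    # The payroll domain is approved but categorized as finance, so a human
    # confirmation prompt is required before the agent may proceed.
    ok = gate_agent_action("https://payroll.contoso.example/runs/new",
                           lambda msg: input(msg + " [y/N] ").lower() == "y")
    print("Proceed:", ok)
```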
Strengths, trade‑offs, and what to watch next
Strengths
- Convenience and productivity: Agents that can resume tasks across tabs, read signed‑in content, and perform form fills promise real time savings for research, travel booking, and shopping workflows. Copilot’s deep ties to Microsoft 365 magnify productivity value for enterprise users.
- Enterprise controls: Microsoft’s policy surface enables managed deployment and selective enablement—an advantage over smaller challenger browsers without the same enterprise tooling.
Trade‑offs and risks
- Privacy and trust costs: Even with promises of on‑device processing, short retention windows, and opt‑in toggles, the mere ability of an assistant to read history or use profile credentials will draw scrutiny and possible pushback from privacy advocates and regulators.
- Security complexity: Agentic features increase the attack surface and add novel failure modes (prompt injection, automated exfiltration). Robust mitigations must become standard browser security primitives.
- Web economics and publisher friction: Agentic summaries and reduced pageviews may reignite publisher debates about compensation; Perplexity’s Comet already proposes revenue sharing mechanisms for publishers, foreshadowing further commercial pressure.
What to watch
- Microsoft’s formal documentation on Journeys and Browser Actions—particularly whether Microsoft publishes a whitepaper describing local storage, encryption, and threat models.
- The final behavior and default settings in Edge stable releases and the enterprise policy guidance that will accompany them.
- Responses from browser and security communities to any high‑profile incidents involving agentic browsing.
Conclusion
Microsoft’s Copilot and Edge teams are testing a new frontier: giving an assistant the ability to operate inside a user’s real browsing profile and to synthesize short‑term browsing activity into resumable “Journeys.” The approach promises meaningful productivity gains and reinforces Edge’s strategy of integrating AI as a core browsing surface rather than a bolt‑on sidebar. But turning a browser into an agentic assistant also raises hard questions about consent, auditing, and the new attack classes that follow when an AI can act on behalf of a signed‑in user.
Public Microsoft documentation confirms Copilot Mode’s direction and the administrative controls that will govern access, and independent reporting shows Perplexity’s Comet pushing the market toward agentic defaults. That competitive pressure is driving rapid experimentation in Edge—but it also increases the urgency for transparent, machine‑readable technical details about how agent scope, storage, and actions are implemented and audited.
Until Microsoft publishes definitive, auditable technical documentation for Journeys and any Browser Actions pipeline, the current signals—hidden toggles in test builds, deleted support snippets, and community screenshots—are best read as early product exploration rather than final commitments. The safest posture for cautious users and administrators is to treat these capabilities as opt‑in, validate them in controlled pilots, and insist that vendors provide clear, technical guarantees about on‑device processing, encryption, and non‑use of personal data for model training.
Source: Windows Latest Exclusive: Microsoft tests Perplexity Comet-killer Copilot integration in Edge for Windows 11