Edge Goes AI First: Copilot Tabs, Journeys, and Phi-4-mini

Microsoft Edge is quietly being rebuilt around Copilot. Recent hands‑on tests show you can now point the browser’s AI at the tabs you have open — literally sending those tabs into a Copilot conversation — and Microsoft is preparing a complementary feature called Journeys that would summarize days of your browsing activity. In early previews, Journeys asks for up to seven days of history to “jump‑start” the experience.

Background / Overview​

Microsoft has been explicit about a long‑term plan to make Edge an “AI‑powered browser” rather than merely a browser with an AI accessory. The company’s Copilot Mode surfaces a chat‑centric New Tab Page (NTP) and promises deeper, permissioned access to browser context — tabs, page content and, with opt‑in, history and credentials — so Copilot can act on your behalf for tasks like comparisons, bookings and multi‑page summarization. This shift is already visible in Edge Canary builds and in Microsoft’s product pages.
At the same time, Microsoft is testing complementary on‑device model infrastructure in Edge — notably the Phi family, including a compact “Phi‑4‑mini” — so some AI work can run locally without round trips to the cloud. Those on‑device models power new Web AI APIs (the Prompt API, the Writing Assistance APIs and others) and enable features such as in‑browser summarization and rewriting without necessarily sending full page content to Microsoft servers.
Two developments coming together define the current phase of this evolution:
  • The ability to send tabs to Copilot — adding open tabs as direct context for a Copilot conversation (reported in hands‑on tests).
  • The “Journeys” concept — an AI summary layer over your recent browsing that surfaces cards on the New Tab Page and helps you resume work or revisit past research, which preview documentation suggested may ingest seven days of history on first use.
Both are early, gated, and opt‑in — but they mark a clear product direction: Edge is moving from sidebar‑ and extension‑style AI toward a first‑class, task‑oriented AI browsing experience.

What’s new: sending tabs to Copilot and “tab tagging”​

What was observed in testing​

A recent hands‑on report identified a new “Add tabs” affordance on Edge’s New Tab Page (NTP). Tapping it lists your open tabs and lets you add selected pages directly into the Copilot conversation. In the example shown by the testers, Copilot immediately incorporated the selected web pages into the chat context — the assistant could answer follow‑ups that referenced both pages at once. The experiment was described as a form of tab tagging: marking tabs as contextual inputs for the AI to use in a single conversation.
Microsoft’s own Copilot Mode marketing already promises multi‑tab context — the company states Copilot can “evaluate content across tabs and get more relevant responses” — so the observed behavior aligns with Microsoft’s product goals. The key difference is that hands‑on tests suggest this functionality is moving past marketing language into a concrete workflow (select tabs → add to Copilot → ask questions that span pages).

How it works in practice (based on reports and product pages)​

  • You opt into Copilot Mode or enable the Copilot‑in‑NTP flags in Edge Canary.
  • The NTP shows a Copilot compose box and an “Add tabs” (+) affordance. Selecting it lists your open tabs.
  • You choose one or more tabs to “tag” and Copilot imports relevant context from those pages into the conversation. The assistant then uses that context to answer follow‑ups, summarize, or compare information.
Microsoft’s guidance emphasizes that these capabilities are permissioned (the user explicitly enables them) and that usage limits may apply while features remain experimental.

Journeys: one feature to summarize days of browsing​

The pitch​

Journeys is a proposed Edge feature that turns browsing activity into curated cards and AI summaries on the New Tab Page. It’s pitched as a productivity layer: when you return to the browser you can be nudged back toward active projects, find previously read resources, and get a single‑page summary of a topic you’ve been researching. Microsoft frames this as a way to “revisit past work, resume tasks, and get timely recommendations.”

Paywall and pricing — the $20 context​

Early reporting tied Journeys’ availability to Microsoft’s paid Copilot tier (Copilot Pro), which historically cost $20/month. Multiple outlets explained that Journeys — at least in previews — would be gated behind Copilot Pro or equivalent paid access, and that the feature leveraged on‑device models like Phi‑4 as part of its local processing stack. Note: Microsoft announced a subscription restructure (Microsoft 365 Premium) on October 1 that folds Copilot Pro into a broader consumer tier, complicating the simple “$20 per month” framing.

The seven‑day browsing history claim — verified or not?​

WindowsLatest reported that a support page revealed Journeys may request the prior seven days of browsing activity “on first use” to jump‑start the experience, and that the setting would appear as a toggle during setup. PCWorld and other outlets repeated that finding. However, attempts to locate the original Microsoft support doc now yield either higher‑level privacy pages or product pages that don’t repeat the exact “seven days” language — Microsoft has been updating its documentation rapidly as features roll through preview. Because the underlying Microsoft support item referenced by early reports is no longer discoverable in the same form, the exact seven‑day wording should be treated as provisionally reported and subject to confirmation from Microsoft.

The technology beneath: on‑device models, Web AI APIs and Phi‑4‑mini​

Microsoft has baked a new set of Web AI APIs into Edge — the Prompt API and Writing Assistance APIs — that provide access to a built‑in, compact model named Phi‑4‑mini. These APIs are explicitly designed to let sites and extensions call a local model for tasks like summarization, rewriting and prompt‑driven composition. In practice, that means some of Edge’s AI features can run without shipping page content to a remote cloud model, improving latency and (arguably) privacy.
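As an illustration of how a page or extension might call these APIs, here is a minimal sketch modeled on the draft Writing Assistance APIs (the Summarizer surface). The `Summarizer` global, its availability states, and the option values are assumptions drawn from the experimental specs; the exact shape varies across Edge preview channels and flag settings, so treat this as a sketch rather than a stable contract.

```typescript
// Hedged sketch: summarizing text with Edge's built-in on-device model.
// The experimental `Summarizer` global may be absent or shaped differently
// depending on channel, flags, and whether the local model is installed.
async function summarizeLocally(text: string): Promise<string | null> {
  const SummarizerApi = (globalThis as any).Summarizer;
  if (!SummarizerApi) return null;                    // API not exposed in this build

  const availability = await SummarizerApi.availability();
  if (availability === "unavailable") return null;    // disabled by flag/policy or unsupported device

  // create() may trigger a one-time download of the compact model (e.g. Phi-4-mini).
  const summarizer = await SummarizerApi.create({
    type: "key-points",                               // assumed option from the draft spec
    length: "short",
  });

  // Inference runs locally; the page text is not sent to a cloud model by this call.
  return summarizer.summarize(text);
}
```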
Key technical constraints and design choices:
  • Phi‑4‑mini is a deliberately small, few‑billion‑parameter model that Microsoft positions for on‑device inference. The model downloads on first use and the same copy is shared across domains.
  • Edge’s Web AI APIs ship behind controls and flags; administrators and advanced users can disable them to prevent automatic model downloads. The model also requires sufficient free disk space (Edge’s documentation lists minimum storage thresholds) and a supported platform (Windows 10/11 or a recent macOS release); a feature‑detection sketch follows this list.
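Because the model is fetched on first use and the APIs can be switched off by flags or policy, calling code is expected to feature‑detect and degrade gracefully. The sketch below assumes the draft Prompt API shape (a `LanguageModel` global with availability checks and a download‑progress monitor); earlier Edge previews exposed a similar surface under window.ai, so verify the current names before depending on them.

```typescript
// Hedged sketch: feature-detect the experimental Prompt API and observe the
// one-time local model download. Names follow the draft spec and may differ
// (or be missing entirely) depending on Edge channel, flags, and admin policy.
async function createLocalSession(): Promise<unknown | null> {
  const LM = (globalThis as any).LanguageModel;
  if (!LM) return null;                               // API disabled or not shipped in this build

  const availability = await LM.availability();       // e.g. "unavailable" | "downloadable" | "available"
  if (availability === "unavailable") return null;    // blocked by policy, flag, storage, or platform limits

  return LM.create({
    monitor(m: any) {
      // Fires while the compact model is downloaded on first use;
      // subsequent sessions reuse the locally cached copy.
      m.addEventListener("downloadprogress", (e: any) => {
        console.log(`local model download: ${Math.round(e.loaded * 100)}%`);
      });
    },
  });
}
```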
That technical approach — small, local models invoked by JS APIs — is a meaningful privacy design shift compared with the simple “send everything to Copilot” approach. Nevertheless, it is not a silver bullet: local models still need input, and Edge must decide which data is supplied to those models and when.

Privacy, security, and control: dissecting the risks​

This is the most consequential part of the story for users and enterprises. Edge’s new features promise convenience, but they also introduce new risk vectors. The risks fall into four categories:

1) Scope creep and default settings​

The power of Copilot comes from context: tabs, pages, history, even credentials. That context is useful but sensitive. While Microsoft says Copilot features are opt‑in, product design choices — where toggles live, whether NTP prompts encourage enabling Journeys, and feature defaults across Canary/Dev/Stable channels — will determine how many users actually give permission. If an unfamiliar prompt or an NTP “Explore” card nudges users into sharing history, the privacy calculus changes quickly.

2) Local processing is not the same as perfect privacy​

Running a model on device is a clear improvement over mandatory cloud inference, but it’s not an ironclad guarantee. Software can still:
  • upload telemetry or metadata if the user enables certain connected experiences;
  • sync summaries or indices to cloud accounts when users are signed into Microsoft services; and
  • be subject to future policy changes that broaden server‑side processing.
Microsoft’s privacy pages emphasize controls and opt‑ins, but they also list scenarios where browsing activity is used for personalized experiences if permission is granted. Users must understand precisely which toggles control local versus cloud behavior.

3) Sensitive history leakage and lack of fine‑grained filtering​

Early previews suggest Journeys may request seven days of history to get started. That is useful for surfacing active projects — but it also means accidental inclusion of sensitive pages (banking, health, legal research) in a dataset that will be analyzed by an AI model. Current reporting does not show robust, easy filtering for specific sites or categories, nor a simple “exclude these domains” control in the Journeys setup flow. Until Microsoft provides granular filters, organizations and privacy‑sensitive users will need to rely on manual clearing, InPrivate sessions, or disabling the feature entirely.

4) UI transparency and trust signals​

When a Copilot answer references content from an open tab or a history snippet, users should be able to see what the assistant used as evidence. Early tests show Copilot adding tabs to the conversation as context, but it’s not yet clear whether the assistant will always disclose the exact source text it used or provide direct links to the passages it cites. For users accustomed to traceable citations, the difference between an AI “answer” and a verifiable summary will matter. Microsoft’s Copilot product pages promise contextual awareness, but disclosure practices will make or break user trust.

Where Microsoft stands on data use and training​

Microsoft’s public statements have repeatedly stressed that data from local Edge features won’t be used to train Microsoft models, and that connected experiences require explicit permissions. For example, Microsoft’s privacy support pages explain how browsing activity is used for personalization only if the user explicitly opts in, and that local browsing history can be managed or deleted. Those are important protections, but they rely on accurate documentation and on settings that are discoverable and understandable to users.
At the same time, some of Microsoft’s recent product messaging — and the rapid launch cadence across Canary/Dev/Stable channels — has meant documentation trails sometimes lag feature appearances, which fuels confusion and concern among privacy‑conscious observers. The reporting around Journeys’ seven‑day history requirement is an example: early support pages were spotted and then changed, leaving third‑party outlets to report on ephemeral wording. That’s a normal part of software preview cycles, but it calls for caution when interpreting early claims.

Competition and strategic context: why Microsoft is pushing this now​

The browser space is following an AI playbook: make the browser not just a lens to the web, but a task platform — an assistant that can reassemble the web’s scattered information into actionable outputs. Competitors such as Perplexity (Comet), Opera Neon and browser vendors experimenting with Gemini or other AI integrations are pushing similar directions: agentic features that assist with comparisons, shopping, writing and task automation. Microsoft’s advantage is the depth of its app ecosystem (Office, OneDrive, Windows) and the ability to combine on‑device and cloud models under a single account umbrella.
There’s also a commercial motive: Microsoft’s consumer subscription strategy has been evolving. Copilot Pro (historically $20/mo) was the natural gate for premium Copilot functionality. On October 1, Microsoft announced Microsoft 365 Premium, a consolidated consumer tier that combines Office apps and Copilot capabilities, and signaled the end of Copilot Pro as a standalone SKU. That pricing and packaging shift changes the simple “Journeys costs $20/month” narrative and suggests Microsoft is steering users toward combined productivity/AI tiers rather than single‑feature add‑ons.

Practical advice for Windows and Edge users​

For readers who want to try the new features or who want to protect privacy while keeping options open, here are pragmatic steps:
  • Opt‑in consciously: only enable Copilot Mode and Journeys if you understand the permissions being requested. Experimental flags are not permanent defaults, but they can change quickly.
  • Use InPrivate for sensitive work: Edge does not collect InPrivate session data into connected experiences. Use InPrivate for banking, sensitive research, and any browsing you don’t want analyzed.
  • Check and control Web AI APIs and local models: if you don’t want Phi‑4‑mini to download, Edge has flags and administrative policies that can disable the Web AI APIs or prevent local models from being used. Advanced users and admins should audit those flags.
  • Manage browsing activity and history settings: review Settings → Privacy, search, and services to ensure “Allow Microsoft to save your browsing activity” is set according to your comfort level. You can also manually delete history or use account.microsoft.com to manage synced history.
  • Watch disclosure behavior: if Copilot summarizes or cites content, check whether it shows the source page and the excerpt it used. If it doesn’t, treat long summaries as helpful but not authoritative until citation behavior improves.

Strengths: where Microsoft’s approach shines​

  • Convenience and productivity: Bringing multi‑tab reasoning into a cohesive chat flow solves a real pain point for research and comparison tasks. Users can ask an assistant to synthesize information across several topically related open pages without manual tab switching and copy/paste. Early tests show Copilot can integrate multiple pages as context and produce accurate, coherent answers.
  • On‑device models reduce some privacy risks: Deploying Phi‑4‑mini and Web AI APIs that run locally reduces the need to ship raw page content to remote servers for certain operations. This lowers the surface for cloud exfiltration and reduces latency for common tasks.
  • Deep product integration: Microsoft can tie Copilot into Windows, Office and Edge simultaneously, which enables multi‑app workflows (e.g., extract content from web pages and draft a document in Word with Copilot’s assistance). That level of integration is a distinct advantage for generative AI‑enabled workflows.

Weaknesses and risks: what to watch​

  • Opaque defaults and discoverability: If opt‑in flows and toggles are buried or worded in promotional language on the NTP, users may enable features without understanding the privacy tradeoffs. Early reports about ephemeral support pages and changing wording emphasize the need for clearer, stable documentation.
  • Granularity of controls: There’s limited evidence so far that Journeys or similar features will offer easy, fine‑grained exclusion lists for domains or categories. Users need per‑site opt‑outs so research into sensitive topics isn’t accidentally ingested.
  • Auditability and provenance: For honest, trustworthy AI outputs, the assistant must show where it got its information. Without clear provenance and easy link‑back to source passages, summaries can be persuasive without being verifiable. Early experiences suggest the UI can import tab content into a conversation, but the disclosure model needs strengthening.
  • Policy and commercial churn: Subscription packaging changed with Microsoft 365 Premium, which folded Copilot Pro into a larger tier; that commercial churn can confuse users about what’s free, what’s paid, and which features are gated. Feature availability may shift as Microsoft aligns the browser with broader commercial plans.

What regulators and IT admins should consider​

Enterprises and privacy regulators will focus on how much browsing context is used and where it is stored. For organizations that require strict controls on what content leaves managed devices:
  • Clarify whether Journeys or Copilot Mode are available under managed policies and whether admins can centrally block local model downloads or disable connected experiences.
  • Insist on logging and audit trails for any automated summarization that becomes part of corporate knowledge: if an AI assistant produces a summary that employees act on, organizations need to track the data lineage.

Verdict: promising, but privacy‑sensitive​

Microsoft’s push to make Edge an AI‑first browser is a logical next step in the generative AI era. Multi‑tab context, tab tagging, and on‑device summarization would solve day‑to‑day problems for researchers, students, and knowledge workers who juggle many open pages. The technical move to ship compact, on‑device models like Phi‑4‑mini is smart: it reduces latency, cuts cloud costs, and offers a plausible privacy benefit.
However, the features — especially anything that ingests browsing history — create new privacy choices that must be explicit and granular. Early reporting around Journeys’ “seven‑day” history ingestion is compelling but not yet fully corroborated by persistent Microsoft documentation; treat that claim as provisionally reported and expect Microsoft to clarify as the feature matures. Users, IT administrators and privacy advocates should demand:
  • simple, clearly visible toggles for any feature that ingests history;
  • per‑site exclusions or category filters;
  • transparent provenance for AI answers that cite the exact sources used; and
  • clear documentation that distinguishes local processing from server processing.

Closing note​

AI is now moving from “assistant as plugin” to “assistant as platform.” Microsoft Edge’s experiments with sending tabs to Copilot and with Journeys are an important signal of where browsers are headed: toward a workspace that understands context, curates it, and helps you act. Those capabilities are powerful — and they require robust, easy to use privacy controls and strong transparency to earn users’ trust. Until Microsoft publishes stable, long‑lived documentation and admin controls that match the features in preview, cautious users and organizations should treat these early releases as optional—and explore settings to keep browsing data scoped to the uses they explicitly approve.

Source: Windows Latest, “Windows 11’s MS Edge tests send tabs to Copilot, AI feature that needs 7 days of browsing history”