Mico: Microsoft's Expressive Copilot Avatar for Voice on Windows

Microsoft’s Copilot just got a face: an animated, expressive avatar called Mico that appears in voice interactions and aims to make AI on the PC feel more like a helpful companion than a cold tool.

(Image: Copilot UI with Mico, a friendly gradient blob, asking “How can I help you?”)

Background / Overview

Microsoft’s Copilot Fall Release reframes the company’s consumer AI strategy around what it calls human‑centered AI, emphasizing productivity, creativity, and social collaboration rather than attention‑grabbing engagement. The centerpiece of the consumer rollout is Mico, a simple, blob‑shaped avatar that listens, reacts, and changes color to reflect conversational context when you use Copilot’s voice mode. This update is one of a dozen new Copilot features that also include long‑term memory, group chats for up to 32 participants, a “real talk” conversational style that can push back on bad assumptions, improved health responses, and richer integrations with cloud services.
The feature set is being delivered as a staged rollout: the Copilot Fall Release and Mico are live in the United States, with other English‑language markets to follow in the coming weeks. Some capabilities, such as health features and certain early experiments, are explicitly region‑restricted during the initial release. Device and platform compatibility may vary by feature.

What is Mico?

A visual identity for voice interactions

Mico is an optional visual layer for Copilot’s voice mode. It’s not a photorealistic avatar — it’s an abstract, animated presence designed to convey listening, emotion, and responsiveness in a lightweight way. It animates while Copilot listens, shows subtle expressions while the system reasons, and can adjust colors and motion to match conversational tone.
Key characteristics:
  • Expressive but minimal: simple animations, color shifts, and shape changes rather than humanlike faces.
  • Voice‑mode centric: appears by default during Copilot voice sessions unless the user opts out.
  • Customizable and optional: users can disable the avatar if they find it distracting.

A nod to the past — with an Easter egg

The design intentionally channels a playful connection to Microsoft’s old Office assistant, Clippy. There’s a lighthearted Easter egg: repeatedly tapping the avatar in some builds can transform Mico into the familiar paperclip. That move is clearly an exercise in nostalgia — a way to acknowledge a bygone era without repeating the same user experience mistakes.

Where Mico appears and how to use it

Platforms and entry points

Mico shows up in Copilot’s voice interactions across Microsoft’s consumer surfaces where voice mode is available. The primary, most integrated experience is on modern Windows devices with the Copilot app and Copilot voice mode enabled. Other Copilot endpoints (mobile apps and the web Copilot experience inside browsers) will receive similar conversational features, but availability and exact behavior may differ by platform and region.
General steps to see Mico:
  • Ensure Copilot is installed and up to date on your device.
  • Enable Copilot’s voice mode (and the wake‑word option if available and desired).
  • Start a voice session; the Mico avatar should appear and animate during the conversation.
  • If Mico is distracting, disable the avatar in Copilot settings.
Note: exact menu names and the presence of Mico may vary by OS version and device; some advanced Copilot features are being prioritized for devices running the latest Windows releases. The sketch below shows one way to script the first of these checks.
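If you want to automate that installation check, the snippet below queries installed packages for anything Copilot‑branded. It is a minimal sketch: Get-AppxPackage is a real PowerShell cmdlet, but the '*Copilot*' name pattern and the returned package names are assumptions that may differ by device and app version.

```python
import json
import subprocess

def find_copilot_packages() -> list[dict]:
    """List installed AppX packages whose name mentions Copilot.

    Uses the Get-AppxPackage cmdlet; the '*Copilot*' wildcard is an
    assumption, since actual package naming may vary by build.
    """
    completed = subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         "Get-AppxPackage -Name '*Copilot*' | "
         "Select-Object Name, Version | ConvertTo-Json"],
        capture_output=True, text=True, check=True,
    )
    out = completed.stdout.strip()
    if not out:
        return []
    parsed = json.loads(out)
    # ConvertTo-Json emits a bare object for one match, a list for several.
    return parsed if isinstance(parsed, list) else [parsed]

if __name__ == "__main__":
    packages = find_copilot_packages()
    if packages:
        for pkg in packages:
            print(f"{pkg['Name']} {pkg['Version']}")
    else:
        print("No Copilot package found; install or update it from the Store.")
```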

The Copilot Fall Release: features that matter

Long‑term memory and personalization

Copilot now offers a Memory & Personalization facility that can remember user preferences, ongoing projects, and personal details you authorize. Memories are editable and deletable by users. The intent is to reduce repetition and maintain conversational continuity across sessions.
Benefits:
  • Faster follow‑ups without re‑explaining context.
  • Personalized suggestions tied to your long‑running tasks and calendar events.
Risks and caveats:
  • Long‑term memory expands the attack surface for privacy and data‑leak incidents if misconfigured.
  • Users must actively manage and audit stored memories to avoid unwanted retention of sensitive details; the sketch after this list shows what a simple audit pass could look like.
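Microsoft has not published a client API for Copilot memory, so the following is a purely conceptual sketch of that audit habit: model each remembered item, flag entries matching obviously sensitive patterns, and delete what should not persist. All class names and patterns here are illustrative, not Microsoft's implementation.

```python
import re
from dataclasses import dataclass, field

# Patterns that commonly indicate sensitive content; illustrative, not exhaustive.
SENSITIVE_PATTERNS = {
    "card number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "password mention": re.compile(r"password|passphrase", re.IGNORECASE),
}

@dataclass
class Memory:
    text: str
    source: str  # e.g. "chat" or "connector"

@dataclass
class MemoryStore:
    """Toy stand-in for an assistant's editable memory list."""
    entries: list[Memory] = field(default_factory=list)

    def audit(self) -> list[tuple[Memory, str]]:
        """Return (entry, reason) pairs a user should review before keeping."""
        flagged = []
        for entry in self.entries:
            for reason, pattern in SENSITIVE_PATTERNS.items():
                if pattern.search(entry.text):
                    flagged.append((entry, reason))
        return flagged

    def forget(self, entry: Memory) -> None:
        self.entries.remove(entry)

store = MemoryStore([
    Memory("Prefers concise answers", "chat"),
    Memory("Card on file: 4111 1111 1111 1111", "chat"),
])
for entry, reason in store.audit():
    print(f"Review ({reason}): {entry.text}")
    store.forget(entry)  # delete anything you would not want retained
```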

Groups: collaborative Copilot sessions

Copilot Groups lets up to 32 people join a shared AI session where Copilot keeps everyone aligned, tallies votes, summarizes threads, and helps split tasks. It’s a collaboration layer aimed at classrooms, study groups, and small teams.
Use cases:
  • Group brainstorming and co‑editing.
  • Synchronous planning with a thread that the AI can summarize after a debate.
Security considerations:
  • Shared sessions raise questions around access control, link sharing, and the potential for data leakage between participants; a toy model of those controls follows below.
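Microsoft has not documented how Groups issues or expires invites, so the sketch below is a hypothetical model of those access controls: a session capped at the published 32‑participant limit, joinable only with a valid, unexpired invite token. The one‑hour expiry is an assumption.

```python
import secrets
import time

MAX_PARTICIPANTS = 32        # the published Groups participant cap
INVITE_TTL_SECONDS = 3600    # expiry window is an assumption, not a documented value

class GroupSession:
    def __init__(self) -> None:
        self.participants: set[str] = set()
        self.invite_token = secrets.token_urlsafe(16)
        self.invite_expiry = time.time() + INVITE_TTL_SECONDS

    def join(self, user: str, token: str) -> bool:
        """Admit a user only with a valid, unexpired invite and free capacity."""
        if token != self.invite_token or time.time() > self.invite_expiry:
            return False  # stale or forged link: the leakage vector to guard against
        if len(self.participants) >= MAX_PARTICIPANTS:
            return False
        self.participants.add(user)
        return True

session = GroupSession()
assert session.join("alice", session.invite_token)
assert not session.join("mallory", "guessed-token")
```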

“Real Talk” conversational style

The “real talk” mode is an optional tone modifier that makes Copilot more willing to challenge assumptions and take a firmer stance when a user’s reasoning appears faulty. This is an explicit move away from uniformly agreeable assistants toward one that can nudge users to think differently.
Pros:
  • Encourages critical thinking.
  • Helps puncture the illusion of correctness when a model hallucinates.
Cons:
  • Tone and pushback must be calibrated carefully; poorly tuned responses can frustrate users or be misread as adversarial. One way such calibration could be gated is sketched below.
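Microsoft has not described how “real talk” decides when to push back. One plausible calibration, sketched here purely as an assumption, gates the firmness of a reply on two signals: how strongly the user's premise conflicts with available evidence, and how confident the system is in that evidence. The thresholds are illustrative, not Microsoft's tuning.

```python
def choose_tone(premise_conflict: float, evidence_confidence: float) -> str:
    """Map disagreement signals to a response style.

    Both inputs are scores in [0, 1]; the thresholds are illustrative.
    """
    if premise_conflict < 0.3:
        return "agreeable"       # no grounds to push back
    if evidence_confidence < 0.5:
        return "hedged"          # disagree, but flag the uncertainty
    return "firm-pushback"       # well-grounded correction

assert choose_tone(0.9, 0.9) == "firm-pushback"
assert choose_tone(0.9, 0.2) == "hedged"
assert choose_tone(0.1, 0.9) == "agreeable"
```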

Learn Live and tutoring

A voice‑first tutoring mode — Learn Live — is designed to guide learners interactively instead of simply providing answers. Copilot uses voice, visuals, and questioning techniques to scaffold learning.
Advantages:
  • Active learning scaffolding, not just answers.
  • Potential classroom and personal tutoring applications.
Limitations:
  • Educational content must be accurate and sourced; the AI must avoid overconfident errors.
  • Access and moderation are critical where minors are involved.

Privacy, data, and compliance

Memory management and user control

The system provides controls for editing, viewing, and deleting memories. These controls are necessary but not sufficient; users should treat stored memories like a personal notes database and audit them periodically.
Recommendations:
  • Review Memory & Personalization settings after enabling them.
  • Remove any highly sensitive data (financial credentials, personal identifiers) from stored memories.
  • Use platform privacy dashboards to see which connectors (OneDrive, Gmail, Google Drive) are linked.

Connectors and cross‑service access

Copilot’s connectors let it reason over content stored across cloud services. That’s powerful — it can surface an old email, summarize a file in OneDrive, or merge calendar context — but it increases the number of third‑party systems that hold permissioned access to your Copilot context.
Best practice:
  • Grant the minimum permissions needed.
  • Periodically review and revoke connectors not in active use, as the audit sketch below illustrates.
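There is no public API for enumerating Copilot connectors, so the sketch below simply formalizes the checklist: record each grant with its scopes and last use, then flag broad or stale grants for revocation. The service names, scope strings, and 90‑day window are assumptions for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=90)  # illustrative review window

@dataclass
class ConnectorGrant:
    service: str          # e.g. "OneDrive", "Gmail"
    scopes: list[str]
    last_used: datetime

def review(grants: list[ConnectorGrant], now: datetime) -> list[str]:
    """Return human-readable findings for a periodic connector audit."""
    findings = []
    for g in grants:
        if now - g.last_used > STALE_AFTER:
            findings.append(f"{g.service}: unused for 90+ days, consider revoking")
        if any(s.endswith(".readwrite") for s in g.scopes):
            findings.append(f"{g.service}: write scope granted, is read-only enough?")
    return findings

now = datetime(2025, 10, 25)
grants = [
    ConnectorGrant("OneDrive", ["files.read"], now - timedelta(days=3)),
    ConnectorGrant("Gmail", ["mail.readwrite"], now - timedelta(days=200)),
]
for finding in review(grants, now):
    print(finding)
```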

Local vs cloud processing

Some elements of the voice activation pipeline (wake‑word spotting) may run locally to reduce telemetry and latency, while full conversational processing is cloud‑based. This hybrid model aims to balance privacy and performance but means that once the session is active, content is processed in the cloud under whatever Microsoft policies govern Copilot.
Implication:
  • Users who need strong local‑only processing should avoid enabling cloud voice flows or should use devices with on‑device acceleration options if available. The gating pattern behind the hybrid design is sketched below.
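That hybrid split can be sketched as a simple pipeline in which a local detector gates the cloud session. The detector below is a string‑matching stub standing in for a real on‑device keyword model, and the “Hey Copilot” phrase reflects the opt‑in wake word Microsoft has described; treat the rest as illustrative.

```python
from typing import Iterator

WAKE_WORD = "hey copilot"  # Copilot's opt-in wake phrase

def local_wake_word(frames: Iterator[str]) -> Iterator[str]:
    """Discard audio locally until the wake word is heard.

    Nothing before (or including) the wake frame leaves this generator,
    which is the privacy property the hybrid design is after.
    """
    armed = False
    for frame in frames:
        if not armed:
            armed = WAKE_WORD in frame.lower()
            continue  # pre-wake audio never leaves the device
        yield frame   # only post-wake audio reaches the cloud session

def cloud_session(frames: Iterator[str]) -> None:
    for frame in frames:
        print(f"[cloud] processing: {frame}")  # governed by cloud policies

audio = iter(["background chatter", "Hey Copilot", "what's on my calendar?"])
cloud_session(local_wake_word(audio))
```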

Security and Windows 10 end of support

Windows 10 lifecycle and upgrade choices

The ecosystem context matters. Standard support for Windows 10 concluded on October 14, 2025. From that date, Windows 10 no longer receives routine free security updates unless the user enrolls in an extended support scheme. Microsoft has provided options to extend critical security updates for a limited period, with conditions that vary by region and enrollment path.
Important points to know:
  • Extended security updates (ESU) are available as a consumer option for a limited time, often with enrollment requirements such as syncing settings to a cloud backup.
  • Pricing and qualifiers differ by region; some markets have special enrollment rules.
  • Staying on an unsupported OS increases exposure to new vulnerabilities and reduces eligibility for new features like Copilot Fall Release components.
Caution: a specific pricing figure reported in some outlets (a one‑off “£22” charge) cannot be consistently verified across official channels and appears to be a regional conversion of the $30 consumer fee Microsoft has published, or simply an inaccurate summary. The documented consumer ESU options include free enrollment under certain cloud‑backup conditions, redemption of Microsoft Rewards points, or a one‑time paid purchase. Users should confirm exact ESU pricing and enrollment steps through official system settings or their Microsoft account portal.

The nostalgia factor: does bringing Clippy back matter?

The Clippy Easter egg is a safe, playful nod rather than a resurrection of the old Office assistant. Mico is nonintrusive by design: an optional, small visual presence that can be turned off.
Why the Easter egg matters:
  • It signals Microsoft understands its history and user sentiment.
  • It’s a marketing‑friendly touch of nostalgia that helps conversations about Copilot spread.
Why it doesn’t guarantee acceptance:
  • Clippy’s historical failure was not simply about personality — it was about intrusiveness and poor context sensitivity. Mico’s long‑term acceptance will depend on how well Microsoft lets users control attention and data.

Critical analysis: strengths, opportunities, and risks

Strengths and potential upside

  • Improved discoverability and presence: Voice + an expressive avatar can lower the barrier to using AI features, making help feel more immediate and friendly.
  • Productivity multipliers: Memory, connectors, and agentic actions (when responsibly permissioned) can speed complex workflows and reduce friction across apps.
  • Social and educational value: Groups and Learn Live offer new collaborative and pedagogical affordances that can be genuinely useful for teams and learners.
  • Human‑centered rhetoric backed by controls: The release emphasizes user control — memory deletion, disabling the avatar, and permissioned connectors — which is the right design direction for adoption at scale.

Risks, ambiguities, and regulatory exposure

  • Privacy and data leakage: Long‑term memory and cross‑service connectors make it easy to accumulate sensitive context. Misconfiguration or unclear defaults could expose data.
  • Anthropomorphism and emotional design pitfalls: Mico’s warmth improves engagement but risks encouraging undue trust or social attachment, particularly among vulnerable populations or children.
  • Moderation and misinformation: “Real talk” is a valuable capability, but the system’s ability to adjudicate disputed facts or provide safe pushback is only as good as its grounding. In high‑stakes domains (health, legal, financial), the assistant must remain conservative and cite sources.
  • Platform fragmentation and compatibility claims: Messaging that suggests Mico or specific Copilot features are “Windows 11‑only” is oversimplified. Copilot is a multi‑surface product; feature availability varies by endpoint, device hardware, and region. Users on older OSes may find feature gaps.
  • Enterprise and compliance complexity: Organizations must consider data residency, audit trails, and legal discovery if Copilot memory and group sessions are allowed on managed devices.
  • Attention economy concerns: Any visual avatar is a potential attention sink. Microsoft’s promise to build “AI that gives you back time” must be demonstrated by default settings that favor minimalism and user control.

Practical advice for users and IT pros

For everyday users (personal & small business)

  • If interested, try Mico in a private, low‑risk workflow first. Use it for drafting, summarizing, and learning tasks rather than sharing passwords or legal/medical details.
  • Keep Copilot’s Memory & Personalization settings under review. Periodically delete memories you don’t want retained.
  • Revoke unused connectors and only grant Copilot access to cloud services you trust.
  • If you prefer a distraction‑free workspace, disable Mico in Copilot settings; voice mode and text interactions remain available without the avatar.

For IT administrators and privacy officers

  • Evaluate whether Copilot Group sessions are permissible under your data governance policies before enabling them on managed endpoints.
  • Define acceptable use: which apps or data types can be shared with Copilot Vision or Connectors.
  • Audit and monitor Copilot access tokens and connectors; establish revocation processes.
  • Treat Copilot memory as a new data store: include it in retention, deletion, and e‑discovery policies.

If you’re still on Windows 10

  • Recognize that mainstream Windows 10 support has ended; apply mitigations immediately. Options include upgrading hardware to meet Windows 11 requirements, enrolling in an Extended Security Update program under your region’s rules, or migrating to a supported OS.
  • Use official update/backup enrollment paths if you wish to receive any temporary consumer ESU offerings — free enrollment conditions sometimes require cloud backup or similar steps.
  • Avoid relying on press summaries for precise pricing; verify costs and eligibility through the OS update settings or your Microsoft account page. The sketch below shows a quick way to confirm which OS build you are actually running.
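As a concrete example of that verification habit, the build number in the Windows registry reliably distinguishes the two operating systems: builds of 22000 and above are Windows 11. A minimal sketch using only the standard library:

```python
import winreg  # Windows-only standard-library module

def windows_build() -> int:
    """Read the OS build number from the registry (Windows only)."""
    key = winreg.OpenKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\Microsoft\Windows NT\CurrentVersion",
    )
    try:
        build, _ = winreg.QueryValueEx(key, "CurrentBuild")
    finally:
        winreg.CloseKey(key)
    return int(build)

build = windows_build()
if build >= 22000:
    print(f"Build {build}: Windows 11, still in mainstream support.")
else:
    print(f"Build {build}: Windows 10, plan for ESU enrollment or an upgrade.")
```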

How this changes the PC experience

Mico represents a design bet: consumers are more likely to embrace AI when it feels human enough to be friendly but not human enough to mislead. The Copilot Fall Release advances several distinct ideas at once — shared AI sessions, persistent memory, a voice‑first assistant with a face, and more permissive cross‑service integrations.
If executed well, users will gain a more conversational, context‑aware assistant that shortens the gap between thought and action on the PC. If executed poorly — with defaults that favor convenience over privacy, or with inconsistent regional rollouts and compatibility claims — user trust could erode quickly.
The company’s stated philosophy — to “build AI that empowers human judgment” — is an encouraging guiding principle. The real test will be whether Microsoft’s defaults, UI affordances, and admin controls consistently put user control first as these features scale beyond early adopters.

Conclusion

Mico is not a gimmick; it’s a deliberate attempt to humanize the next generation of desktop AI. It packages complex capabilities — voice, memory, connectors, group collaboration, and contextual vision — behind a simple avatar that can be turned off or ignored. The release is both an experiment in user interface design and a statement of product direction: Microsoft wants Copilot to be an assistant that feels personal, social, and capable without being coercive.
For users, the takeaway is pragmatic: try Mico if you want a warmer, voice‑driven Copilot experience, but treat the new memory and connector features like any service that stores personal context — review settings, minimize permissions, and keep sensitive information out of remembered data. For organizations, the rollout reinforces the need to update governance and endpoint controls as AI surfaces become part of routine workflows.
Finally, anyone still clinging to Windows 10 should be conscious that support has changed and that new Copilot experiences will continue to be developed with modern platforms in mind. The future of the PC interface is voice, context, and collaboration — Mico simply gives that future a face.

Source: GB News Meet Mico, the new face of Copilot AI on your Windows PC and the successor to much-loathed Clippy
 

Microsoft's newest conversational face for Copilot arrives as a colorful, expressive orb named Mico, a deliberately humanized AI avatar that aims to make talking to your PC feel familiar, tactile, and — for better or worse — emotionally resonant. The rollout centers on Copilot’s voice mode and brings together several of Microsoft’s recent bets: persistent memory, voice-first interactions, a Socratic-style tutoring mode called Learn Live, and a set of safeguards and conversational profiles that try to balance helpfulness with pushback. Mico is optional, enabled by default in the initial rollout, and carries intentionally playful nods to Microsoft’s past (there’s a hidden Clippy surprise), even as it signals a broader push to normalize voice-driven computing on Windows 11 and mobile devices.

(Image: A colorful gradient emoji floats beside memory prompts and a Learn Live card on a blue desktop.)

Background

Microsoft has been steadily reshaping Copilot from a feature into a persistent, persona-driven assistant. Over the past year the company introduced voice and vision capabilities, expanded memory and actions in Copilot, and previewed the concept of giving the assistant a lasting identity — a “digital patina” that can accumulate context and continuity across interactions. Mico is the visible manifestation of that strategy: an animated avatar that reacts in real time to tone and content, backed by the same memory and reasoning features that power Copilot’s answers.
The introduction of Mico is part product update and part design experiment. It bundles a set of complementary features that include:
  • Real-time expressive animation that mirrors conversational tone.
  • Longer-term memory so Copilot can recall personal preferences and project details.
  • Voice-first interactions, with Copilot’s voice mode promoted as the preferred pathway.
  • Learn Live, a guided Socratic tutoring mode using voice and visual whiteboards.
  • Conversational modes such as “Real Talk” that can push back on false assumptions and surface reasoning.
  • Collaborative Groups, which let multiple people interact with Copilot in shared sessions.
Mico is launching first for U.S. users in Copilot’s voice mode and is enabled by default there; users can disable the avatar in Copilot settings if they prefer a more minimal interface. This availability and the exact regional rollout have varied in early reporting, so expect Microsoft to update geographic availability as the deployment expands.

What Mico is — and what it isn’t

A face for Copilot, not a new assistant

Mico is a visual and interactive front end for Copilot — not a separate AI engine. It leverages Copilot’s underlying models, memory system, and actions framework. Think of Mico as the persona layer: the combination of animation, timing, and emotional cues that aim to make conversations feel less mechanical.

Real‑time reactions and expressive design

When you speak to Copilot in voice mode, Mico animates. It changes expression, color, and posture to reflect listening, confusion, affirmation, or empathy. The goal is to create an intuitive feedback loop: users see that Copilot is attending to tone and intent even before a spoken reply appears. The design intentionally emphasizes micro‑expressions and softness rather than literal human likeness, aiming to avoid uncanny-valley pitfalls while still signaling responsiveness.

Memory and continuity

A key claim behind Mico is improved continuity across sessions. Copilot’s opt‑in memory stores facts you explicitly allow it to remember — preferences, ongoing projects, calendar rhythms, and some contextual cues. Mico uses that memory to make interactions feel more personal and less transactional, surfacing previously shared details when helpful. The memory system includes UI controls for reviewing, editing, and deleting stored items.

Learn Live — a Socratic tutor

One of the more concrete use cases Microsoft showcased is Learn Live, a voice‑led mode that turns Copilot/Mico into a tutor. Rather than offering one‑shot answers, Learn Live scaffolds learning through guided questioning, step-by-step practice, and whiteboard-style visuals. It’s explicitly positioned for students and language learners, and it uses iterative prompts and explanations to build understanding rather than rote recitation.
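Microsoft has not published Learn Live's internals. The sketch below only illustrates the Socratic pattern the mode is described as using: ask, check the attempt, offer a hint on a miss, and reveal the answer only after the hints run out. The function and the scripted dialogue are invented for illustration.

```python
def socratic_round(question: str, answer: str, hints: list[str],
                   ask=input, tell=print) -> bool:
    """One guided round: hint on a miss, reveal only after hints run out."""
    for hint in hints:
        attempt = ask(f"{question} ")
        if attempt.strip().lower() == answer.lower():
            tell("Right. Now, can you explain why?")  # push for reasoning, not recall
            return True
        tell(f"Not quite. Hint: {hint}")
    tell(f"The answer is {answer}; let's walk through it step by step.")
    return False

# Scripted attempts so the sketch runs without a live learner.
attempts = iter(["6", "9"])
socratic_round("What is 3 squared?", "9",
               ["'Squared' means multiplied by itself.", "Compute 3 x 3."],
               ask=lambda prompt: next(attempts), tell=print)
```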

Playfulness and Easter eggs

Mico includes playful touches: animated responses, personality toggles, and an Easter egg that transforms the orb into a Clippy‑like surprise when prodded repeatedly. These nods to nostalgia are designed to soften the introduction of a new interaction model and generate social buzz.

Verified claims and points of caution

Microsoft’s product materials and multiple independent reports agree on the broad strokes of Mico’s capabilities: expressive avatar, memory features, Learn Live, and a U.S.-first rollout. Two important clarifications and caveats are worth flagging:
  • Availability: Early reporting briefly listed multiple launch regions, but Microsoft clarified that the initial rollout is limited to the United States; international availability is slated to follow in phases. This kind of staggered regional release is common for features that touch voice, data sovereignty, and regulatory boundaries.
  • Opt‑in memory controls: Copilot’s memory features are presented as opt‑in and editable; however, memory persistence and retention policies vary by account type (consumer vs. organizational), platform, and account settings. Users should verify the memory controls in their Copilot profile if they want to restrict what is stored.
Other, less-certain claims that surfaced in secondary reporting — for example, exhaustive lists of new voice names or precise participant limits for Groups — have been reported variably across outlets. Where details differ, treat specifics as provisional until you see them reflected in Microsoft’s official Copilot release notes or support documentation. Any claim that appears in a single non‑primary source should be treated with caution.

Why Microsoft built Mico: design and strategy

Humanizing AI to increase adoption

Microsoft’s strategic problem is adoption: past efforts to make voice a mainstream way to use computers — remember Cortana’s push on Windows 10 — produced mixed results. Voice assistants are widely used on phones and smart speakers, but talking to your PC hasn’t become mainstream. Mico tackles this by adding visual, emotional, and conversational cues to reduce friction and make voice feel social rather than transactional.

A persistent digital companion

Leadership at Microsoft AI has repeatedly described Copilot’s future as personalized and persistent, with the idea of an assistant that remembers, adapts, and develops an identity over time. Mico gives that vision a concrete surface: if Copilot can “have a room” and a face, users may form more durable mental models of the assistant, potentially increasing trust — or dependence.

Educational and accessibility aims

Learn Live targets clear product opportunities: students cramming for exams, language learners practicing spoken conversation, and anyone who benefits from spoken scaffolding rather than text. Separately, voice-first interfaces and expressive avatars can improve accessibility for users with vision or mobility impairments by providing richer multimodal feedback.

Strengths: what Mico gets right

  • Faster, clearer conversational feedback. The combination of voice plus expressive animation reduces uncertainty in voice interactions: you see when Copilot is listening, thinking, or uncertain.
  • Human-centered tutoring. Learn Live’s Socratic approach is a pedagogically sound move — guiding through questions often yields better retention than single-pass answers.
  • Personalization with controls. The visible memory UI and explicit edit/delete options give users direct control over what Copilot remembers, which is a good privacy-first design pattern when implemented well.
  • Integration with existing Copilot features. Mico doesn’t reinvent the wheel; it layers personality onto Copilot’s existing strengths (vision, web access, actions), making the experience feel coherent rather than bolted-on.
  • Marketing and familiarity. Nostalgia (the Clippy nod) combined with approachable design lowers the barrier for users who might otherwise resist corporate AI pushes.

Risks and concerns

Privacy and data persistence

Any system that stores long‑term memory about users raises privacy questions. Even with opt‑in toggles, users may be surprised by how much contextual inference Copilot can make, particularly if memory is enabled by default during rollout. The main risks:
  • Unintended persistence of sensitive details.
  • Ambiguity around retention windows and backups across devices.
  • Organizational policies vs. consumer defaults: enterprise admins may need fine‑grained controls to prevent leakage of corporate data.
Recommendation: users should review Copilot memory settings immediately after the update, and enterprises should update device management policies to cover Copilot memory and voice mode.

Behavioral influence and over-reliance

A characterful assistant that pushes back or challenges the user can be valuable, but it also increases the risk of social engineering and over‑trust. A warm avatar may make users accept advice without scrutiny — the classic “authority bias” problem — even when the AI is mistaken or uncertain.

Safety for minors and mental health implications

Persistent conversational agents that feel like companions raise complicated questions for children, teenagers, and people prone to substituting digital relationships for human interaction. Companies introducing emotionally resonant AI need robust age‑gating, parental controls, and transparent labeling of persona capabilities.

Misinformation and hallucination

Copilot has improved, but all generative systems remain susceptible to hallucination. Visual and expressive cues may amplify the perceived credibility of a confident‑sounding response, so Microsoft’s “Real Talk” mode — designed to push back when Copilot senses inaccuracies — will need careful calibration and clear user signaling when the confidence is low.

Security surface area

Voice interactions and new browser/in‑device actions expand the attack surface. Microphone permissions, clipboard access, and multi‑step browser actions that Copilot can perform must be tightly permissioned and auditable. Enterprises should monitor for related telemetry that shows unexpected or automated actions being executed.

How this affects Windows and PC users

For consumers

  • Expect a more conversational experience when you click the Copilot microphone in Windows 11 or the Copilot app on mobile.
  • Mico will be enabled by default in the initial U.S. rollout; you can turn it off through Copilot’s Voice or Appearance settings.
  • Check the Copilot memory controls to review what the assistant stores. Use the delete or forget options for anything sensitive.
  • Use Learn Live for study sessions, but treat its output like a guided tutor rather than an infallible authority.

For enterprise and IT admins

  • Review your organization’s device configuration policies for Copilot features. Depending on your M365 management tooling, you may be able to disable voice mode, block Copilot’s memory, or restrict Copilot to web-only access.
  • Update employee guidance on the use of Copilot for work: set expectations about what types of data should never be shared with Copilot or external AI features, especially under regulatory constraints (health, finance, IP).
  • Monitor Microsoft’s administrative controls and tenant-level privacy and data retention options as they evolve; the registry sketch below checks one legacy control that exists today.
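As one concrete example, Windows has long shipped a Group Policy for the earlier Copilot sidebar, TurnOffWindowsCopilot under Software\Policies\Microsoft\Windows\WindowsCopilot. Whether it affects the newer Copilot app varies by build, so treat the check below as a starting point rather than a guarantee.

```python
import winreg

POLICY_PATH = r"Software\Policies\Microsoft\Windows\WindowsCopilot"

def copilot_policy_state() -> str:
    """Report the legacy TurnOffWindowsCopilot policy, if it is set."""
    for hive, name in ((winreg.HKEY_LOCAL_MACHINE, "machine"),
                       (winreg.HKEY_CURRENT_USER, "user")):
        try:
            with winreg.OpenKey(hive, POLICY_PATH) as key:
                value, _ = winreg.QueryValueEx(key, "TurnOffWindowsCopilot")
                if value == 1:
                    return f"Copilot disabled by {name} policy"
        except FileNotFoundError:
            continue  # key or value not present in this hive
    return "No Copilot policy set"

print(copilot_policy_state())
```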

Practical tips: manage Mico and Copilot safely

  • Open Copilot settings (profile → Voice or Appearance) to toggle off Mico if you don’t want the avatar.
  • Navigate to Copilot memory controls and:
      • Review saved items,
      • Delete anything you don’t want retained,
      • Turn memory off if you prefer stateless interactions.
  • Check microphone permissions at the OS level: revoke access when you don’t intend to use voice features (see the registry sketch after this list).
  • For shared devices, require account sign-in to use Copilot — this prevents cross-user memory bleed.
  • Use Learn Live with a verification habit: if an explanation is critical (medical, legal, financial), cross-check with authoritative sources.
  • Enterprises should push documentation and training modules that clarify what Copilot is allowed to do and what information employees must keep out of AI prompts.
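For the microphone item above, the OS-level consent state is readable from the per-user registry under the CapabilityAccessManager consent store, a location Windows has used for several releases; verify the path on your build before relying on it. A minimal sketch:

```python
import winreg

CONSENT_KEY = (r"Software\Microsoft\Windows\CurrentVersion"
               r"\CapabilityAccessManager\ConsentStore\microphone")

def microphone_consent() -> str:
    """Read the current user's global microphone consent ('Allow' or 'Deny')."""
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, CONSENT_KEY) as key:
            value, _ = winreg.QueryValueEx(key, "Value")
            return value
    except FileNotFoundError:
        return "Unknown"  # key layout can differ across builds

state = microphone_consent()
print(f"Microphone access for this account: {state}")
if state == "Allow":
    print("Revoke in Settings > Privacy & security > Microphone when not in use.")
```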

The bigger picture: human interfaces and the future of voice on PCs

Mico is part of a larger shift in which major platform vendors are making human‑centered AI interfaces a first-class interaction model. The move from text‑only chat windows to expressive, voice‑first companions is driven by advances in speech recognition, real‑time animation pipelines, and model architectures that can hold longer context.
Two important social dynamics will determine whether Mico is widely adopted:
  • Normalization: Whether people feel comfortable talking to their PCs outside of private settings. Background noise, public spaces, and social norms all influence adoption.
  • Trust calibration: Whether users develop accurate mental models of Copilot’s capabilities and limits. Too much anthropomorphism and users may over-trust; too little and the feature won’t gain traction.
Designers must balance warmth with transparency. That means clear signals for when the assistant is uncertain, easily accessible memory controls, and explicit cues whenever Copilot takes an action on behalf of the user.

Final assessment: promising, but proceed deliberately

Mico represents a bold, well‑executed step toward making Copilot feel like a partner rather than a tool. The combination of expressive animation, voice-first interactions, memory, and structured learning modes is a coherent set of features that could meaningfully improve productivity, accessibility, and learning experiences on Windows and mobile devices.
At the same time, the move raises familiar but serious concerns: privacy, safety for vulnerable users, the risk of misplaced trust, and the need for strong administrative controls in enterprise contexts. The balance Microsoft strikes between default convenience and user control will determine whether Mico becomes a welcome helper or an interface that users turn off quickly.
For practical use today: treat Mico as an optional enhancement. Try Learn Live for study sessions, but keep memory turned off until you’ve reviewed what it stores and how to manage it. Organizations should take a proactive stance: audit Copilot settings, push clear guidance, and monitor how Copilot actions intersect with internal security policies.
Mico is more than a cute orb; it’s a litmus test for whether users are ready to accept emotion‑aware assistants as part of everyday computing. The design is smart and the vision ambitious; the real question now is whether a bit of personality and a Socratic tutor are enough to make people speak to their PCs without second-guessing the privacy and safety trade‑offs.

Source: techcityng.com Microsoft Introduces Mico: The New AI Assistant Replacing Clippy and Cortana
 
