Microsoft has pushed Microsoft 365 Copilot deeper into the employee experience by making Copilot for Viva Engage generally available to eligible customers, embedding a context-aware generative AI assistant that summarizes conversations, surfaces trends, improves discoverability, and provides dynamic, pre-populated prompts across community feeds and campaigns.

Background

Viva Engage (previously Yammer) is Microsoft’s enterprise social layer for companywide announcements, communities of practice, campaign-driven outreach, and organic colleague-to-colleague conversations. The Copilot integration transforms Engage from a largely manual comms and community platform into an AI-augmented experience where employees can “catch up” on conversations, ask natural-language questions about internal posts, and get personalized summaries and suggestions.
Microsoft first previewed Copilot in Viva Engage during earlier rollouts and now signals a strategic shift: rather than only offering composition tools for draft posts and reply suggestions, the general availability release places heavier emphasis on contextual insight and navigation — surfacing trending themes, leader posts, and personalized updates that help users find the right conversations faster.

What Microsoft announced — the headline features​

Microsoft describes Copilot in Viva Engage as a context-aware assistant that draws on permitted Microsoft Graph data to ground responses and provide timely, relevant information inside the Engage experience. The general availability release highlights several practical capabilities:
  • Context-aware prompts and conversation starters that adapt to the section a user is viewing (feed, community, campaign) and suggest who to engage with or what topics are trending.
  • Summaries and smart catch‑up cards that condense activity into digestible updates so users can “catch up” without reading every thread.
  • Enhanced contextual search powered by natural-language queries scoped to a user’s permissions to find older posts, follow-ups, or leader communications.
  • In-app feedback controls (thumbs up/down) to rate Copilot responses and feed refinement signals back into Microsoft’s operational processes for improvement.
  • Integration with Outlook and Teams context so Copilot can ground summaries and replies in calendar, email, and chat data that the user is permitted to access.
These features are intended to reduce friction in internal comms, help leaders stay visible without drafting every post manually, and improve cross-team visibility in large enterprises that can generate thousands of Engage posts each day.

How Copilot in Viva Engage works — technical overview​

Microsoft positions this Copilot experience as a grounded AI assistant. The sequence is straightforward:
  • A user invokes Copilot inside Engage (via home feed, a community, or a conversation).
  • Engage forwards the prompt to Microsoft 365 Copilot, which gathers relevant Graph data (emails, calendar, files, chats, Engage posts) that the user has permission to access.
  • The gathered context is sent to an LLM hosted through Azure OpenAI services to generate a response.
  • Copilot returns the answer in the Engage interface, and the user can accept, edit, or discard generated summaries and drafts.
Microsoft emphasizes that Copilot only uses data the user has access to for grounding and asserts that prompts, responses, and Graph-derived context are not used to train Microsoft’s public foundational models — all interactions occur within the Microsoft 365 service boundary and are protected by encryption and access controls. Organizations are encouraged to validate these privacy claims against their compliance posture before enabling Copilot widely.
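To make the sequence concrete, the sketch below expresses the same flow in Python. It is illustrative only and not Microsoft's implementation: the fetch_permitted_graph_context helper is a hypothetical stand-in for the permission-scoped Graph retrieval step, and the Azure OpenAI endpoint, key, and deployment name are placeholders.

```python
# Illustrative sketch of the grounding sequence described above -- not Microsoft's code.
# fetch_permitted_graph_context() is a hypothetical stand-in for the permission-scoped
# Microsoft Graph retrieval step; the Azure OpenAI endpoint, key, and deployment name
# are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example.openai.azure.com",  # placeholder endpoint
    api_key="<key>",                                     # placeholder credential
    api_version="2024-02-01",
)

def fetch_permitted_graph_context(user_id: str, prompt: str) -> list[str]:
    """Hypothetical stand-in: return only the Engage posts, mail, chats, and calendar
    items this user can already read, ranked by relevance to the prompt."""
    return ["[Engage post] Launch retro moved to Friday", "[Mail] Q3 launch summary attached"]

def answer_in_engage(user_id: str, prompt: str) -> str:
    context = "\n".join(fetch_permitted_graph_context(user_id, prompt))  # step 2: gather permitted Graph data
    response = client.chat.completions.create(                           # step 3: Azure-hosted LLM call
        model="gpt-4o",                                                  # placeholder deployment name
        messages=[
            {"role": "system", "content": "Answer using only the supplied workplace context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {prompt}"},
        ],
    )
    return response.choices[0].message.content                           # step 4: shown in the Engage UI
```
The point of the sketch is the ordering: grounding data is gathered under the user's own permissions before the model is called, and the model sees only that context.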

Licensing, pricing, and rollout details — what IT needs to know​

The Copilot experience in Viva Engage requires a separate Microsoft 365 Copilot license; it is not included with the older Viva premium drafting assistant that ships with some Viva suites. Microsoft’s documentation and product pages make this distinction clear: there are two different experiences — the earlier Engage Copilot v1 tied to Viva premium drafts, and the full Microsoft 365 Copilot in Viva Engage that is Microsoft Graph‑grounded and requires the Copilot add-on.
On pricing, Microsoft has publicly listed Microsoft 365 Copilot for enterprise at $30 per user per month (annually billed) as an add-on for the enterprise SKU; organizations should factor that cost into tenant-wide rollouts. This price point has been widely reported and appears on Microsoft’s pricing pages, but because pricing and packaging can change, IT procurement teams should confirm current rates and volume discounts directly with their Microsoft account teams during planning.

Rollout mechanics are similarly administrative: admins can control Copilot availability with tenant-level and group-level policies, and there are controls to opt out or restrict AI summarization where needed. Microsoft has published admin guidance and step-by-step setup instructions to help tenant administrators configure Copilot for Engage.
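For pilot planning, entitlement can also be checked programmatically. The hedged sketch below reads each candidate user's license details through the standard Microsoft Graph licenseDetails endpoint; the bearer token, user list, and the substring match on the SKU name are illustrative assumptions, so confirm the exact Copilot SKU identifier against Microsoft's licensing reference.

```python
# Hedged pilot-planning sketch: check whether candidate pilot users already hold a Copilot SKU.
# The /users/{id}/licenseDetails endpoint is standard Microsoft Graph; the bearer token,
# user list, and the "COPILOT" substring match are assumptions -- confirm the exact SKU
# part number against Microsoft's licensing reference before relying on it.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<app-token-with-User.Read.All>"  # placeholder; acquire via MSAL in practice

def has_copilot_license(user_upn: str) -> bool:
    resp = requests.get(
        f"{GRAPH}/users/{user_upn}/licenseDetails",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    skus = [d.get("skuPartNumber", "") for d in resp.json().get("value", [])]
    return any("COPILOT" in sku.upper() for sku in skus)

pilot_users = ["comms.lead@contoso.com", "community.mgr@contoso.com"]  # hypothetical pilot list
for upn in pilot_users:
    print(upn, "licensed" if has_copilot_license(upn) else "needs a Microsoft 365 Copilot seat")
```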

Why organizations will want Copilot in Viva Engage — practical benefits​

For internal comms teams, community managers, and HR/people leaders, the integration delivers several measurable advantages:
  • Time savings: Drafting and responding to posts becomes faster with pre-populated prompts, reply suggestions, and automated summaries that reduce the time required to stay present across communities.
  • Improved discoverability: Natural-language, contextual search and theme grouping let employees find relevant conversations and prior decisions without manual hunting across disparate threads.
  • Higher participation: Conversation starters and suggested engagement targets can lower the barrier to posting for quieter employees, potentially increasing participation metrics and inclusivity.
  • Consistent messaging: By pulling in Outlook and Teams context, Copilot can help ensure announcements reflect current schedules and attached documents, reducing the likelihood of follow‑ups and corrections.
  • Actionable telemetry: Built-in feedback and analytics give comms and IT teams a way to measure adoption, track quality signals, and tune guardrails or prompt behavior.
These benefits are particularly compelling for distributed and large organizations where asynchronous communication and information discoverability are critical to alignment.

The material risks and tradeoffs — governance, authenticity, and hallucination​

While productivity gains are clear, embedding generative AI in internal communications introduces real risks that organizations must manage proactively.

Authenticity and the “fake enthusiasm” problem​

Automated recognition posts, leader statements, or overly polished celebration messages can feel formulaic and erode trust if overused. Microsoft attempts to mitigate this with an “Authenticity Coach” and nudges to preserve genuine tone, but the onus remains on human communicators to review and personalize AI drafts for authenticity. Over-reliance on AI-generated content can produce quantity without quality.

Hallucinations and factual errors​

Large language models can generate plausible but incorrect details — dates, names, or outcomes — which is especially dangerous when posts reference personnel actions, legal statements, or regulatory matters. Microsoft warns users to validate critical facts before publishing and provides admin controls to require human review for sensitive posts, but tight operational processes must still be enforced by the organization.

Data governance and compliance boundaries​

The integration depends on Microsoft Graph access and Azure-hosted LLMs. Even though Microsoft asserts that Copilot operates within the Microsoft 365 service boundary and that prompts/Graph data aren’t used to train public foundational models, organizations with strict data residency, sector-specific compliance (healthcare, finance, government), or contractual secrecy obligations should perform a legal and technical review before enabling Copilot tenant-wide. Data residency and regulatory compliance are not inferred automatically — they require explicit validation against corporate policy and applicable law.

Cost and licensing implications​

At an approximate list price of $30 per user per month for Microsoft 365 Copilot (enterprise add-on), a broad rollout can be expensive. A pilot limited to comms teams, leaders, and community managers will be a common pattern to control costs while measuring value. Procurement teams must confirm current pricing and consider whether per-seat licensing or scoped access (e.g., only leaders and moderators) achieves the required ROI.

Practical deployment checklist for IT and communications teams​

Deploying Copilot in Viva Engage responsibly requires both technical setup and change management. Below is a recommended sequential checklist to guide rollout.
  • Confirm licensing and budget: validate Microsoft 365 Copilot license availability and pricing with your Microsoft account team.
  • Define scope and pilot group: start with a limited pilot (comm leaders, HR, internal comms, community managers) to measure impact and tune prompts.
  • Review data residency and compliance constraints: consult legal and security teams to confirm that Graph-derived data processing aligns with regulations and internal policies. Flag any sectors with special requirements.
  • Set tenant and group policies: use the admin controls Microsoft provides to gate Copilot features by group, enforce review workflows for sensitive posts, and disable features where needed.
  • Configure monitoring and feedback loops: capture thumbs up/down metrics, monitor adoption via Copilot analytics, and collect qualitative feedback from pilot users.
  • Train users and leaders: teach best practices for AI-augmented comms — how to use drafts, verify facts, preserve tone, and personalize Copilot output.
  • Iterate guardrails and expand: after pilot measurement (4–8 weeks), expand access in phases while continuously refining prompts and admin settings.

Governance and technical controls to demand from Microsoft and your internal teams​

Effective governance combines product-level controls and operational policies:
  • Enforceable admin policies that can disable summarization or drafting at the tenant or group level.
  • Audit logs that record when Copilot was used, what content was summarized, and by whom.
  • Human-in-the-loop requirements for leadership or sensitive communications.
  • A mechanism to opt users out and to prevent Copilot from accessing certain Graph scopes where necessary.
Administrators should validate that these controls meet industry-specific compliance demands and map to the organization’s own data classification scheme.

Measuring value — metrics to track during and after rollout​

To justify Copilot investment and to manage risk, track a combination of quantitative and qualitative metrics:
  • Adoption and usage: number of Copilot interactions, weekly active users within Engage, and feature usage breakdown (search, summaries, draft generation).
  • Quality signals: percentage of Copilot responses receiving positive feedback (thumbs up) and the volume of edits required before publishing.
  • Time savings: average time to draft announcements or reply to community threads before and after Copilot.
  • Engagement quality: changes in meaningful interactions (replies, sustained conversations, polls participation), not just raw post counts.
  • Compliance incidents: any misstatements, privacy exposures, or policy violations traced to Copilot-generated content.
Pair telemetry with periodic user surveys to capture sentiment about authenticity and trust in AI-generated communications.
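As a rough illustration of how those signals could be rolled up for a monthly review, the snippet below computes adoption, feedback, and edit-volume figures from a hypothetical telemetry export; the file name and column names are assumptions rather than a documented Microsoft schema.

```python
# Hedged example: roll up the signals above from a hypothetical telemetry export.
# The file name and columns (user, feature, feedback, edits_before_publish) are
# assumptions, not a documented Microsoft schema; adapt them to whatever your
# analytics or survey tooling actually produces.
import csv
from collections import Counter

with open("copilot_engage_telemetry.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

active_users = len({r["user"] for r in rows})
feature_usage = Counter(r["feature"] for r in rows)  # search / summary / draft breakdown
rated = [r for r in rows if r["feedback"] in ("up", "down")]
positive_rate = sum(r["feedback"] == "up" for r in rated) / max(len(rated), 1)
edited = [int(r["edits_before_publish"]) for r in rows if r["edits_before_publish"]]
avg_edits = sum(edited) / max(len(edited), 1)

print(f"Active Copilot users: {active_users}")
print(f"Feature usage: {dict(feature_usage)}")
print(f"Positive feedback rate: {positive_rate:.0%}")
print(f"Average edits before publishing: {avg_edits:.1f}")
```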

Real-world scenarios: where Copilot shines and where it stumbles​

Where Copilot adds clear value​

  • Rapid executive updates: leaders can use Copilot to draft clear, consistent messages that they then personalize—saving time while preserving voice.
  • Cross-team discoverability: staff can ask, “What did the marketing team say about the new launch?” and receive a grounded summary of relevant posts and links.
  • Community management at scale: moderators can get summaries of high-volume threads, identify themes, and escalate issues efficiently.

Where to be cautious​

  • Sensitive HR or legal matters: never publish AI-generated drafts on personnel changes, terminations, or compliance matters without legal and HR sign-off.
  • High-stakes factual claims: AI hallucinations can introduce false attributions or incorrect dates — verification processes should be mandatory.
  • Over-automation of culture-building: using Copilot to automate employee recognition or celebration posts risks hollowing out authentic connections if humans do not curate and personalize the output.

Cross-checking the key claims — verification and caveats​

Multiple documentation summaries and Microsoft’s own guidance converge on several key technical and commercial points: that Copilot in Viva Engage is generally available to customers with Microsoft 365 Copilot licenses; that the feature leverages Microsoft Graph to ground responses; and that interactions are processed within the Microsoft 365 service boundary with encryption and access controls. These claims are reflected across Microsoft’s docs, TechCommunity posts, and product support pages summarized in the available briefing material.

Two important caveats remain and should be treated as verification priorities for any IT team considering rollout:
  • Pricing and SKU details can change; the commonly reported $30/user/month enterprise add-on was listed in Microsoft product pages at the time of the public documentation summary — verify current pricing and licensing with Microsoft directly during procurement.
  • Microsoft’s statement that prompts, responses, and Graph‑derived context are not used to train public foundational models is a strong privacy assurance, but organizations with strict compliance requirements should perform their own legal, security, and technical validation to confirm that the interaction model and data residency align with contractual and regulatory obligations.
Where evidence in internal documentation or specific regulatory guidance is missing, treat the claim as unverified until confirmed with Microsoft or through proof-of-concept testing.

Recommended governance policy template (high level)​

  • Access control: limit Copilot for Engage to defined pilot groups for an initial period.
  • Human review: require explicit human approval for posts tagged as “leadership,” “HR,” or “legal.”
  • Data scope: restrict Graph scopes available to Copilot in line with data classification policies.
  • Audit and retention: enable audit logging of Copilot interactions and retain logs according to retention policies.
  • Feedback and improvement: monitor thumbs up/down signals and collect qualitative feedback to tune prompts and guardrails.
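One way to keep a template like this auditable is to hold it as data that review scripts and onboarding checklists can read. The sketch below is a hedged illustration; the group names, tags, and retention values are invented and do not correspond to any Microsoft configuration format.

```python
# Hedged illustration: the high-level template above captured as reviewable data.
# Group names, tags, and retention values are invented, not a Microsoft schema;
# map each entry to your tenant's actual groups, sensitivity labels, and policies.
COPILOT_ENGAGE_POLICY = {
    "access_control": {"pilot_groups": ["sg-internal-comms", "sg-community-managers"]},
    "human_review": {"required_for_tags": ["leadership", "HR", "legal"]},
    "data_scope": {"blocked_classifications": ["Highly Confidential"]},
    "audit": {"log_copilot_interactions": True, "retention_days": 365},
    "feedback": {"monitor_thumbs_signals": True, "review_cadence_weeks": 4},
}

def print_policy_summary(policy: dict) -> None:
    """Emit a one-line summary per control area for change-review records."""
    for area, settings in policy.items():
        print(f"{area}: {settings}")

print_policy_summary(COPILOT_ENGAGE_POLICY)
```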

Final analysis — balancing productivity and prudence​

Microsoft 365 Copilot in Viva Engage represents a meaningful advance for internal communications: it reduces friction, improves findability, and helps leaders and community managers scale their efforts. The general availability release refocuses Copilot on insight and navigation rather than pure content authoring, which is a pragmatic shift that aligns with the most immediate pains in enterprise social platforms — discovery, summarization, and prioritization.

However, the integration also amplifies governance responsibilities. Organizations must treat Copilot not as a simple feature flip but as an operational program: pilot, measure, codify rules, train users, and enforce human oversight in scenarios where accuracy and authenticity matter most. Cost and compliance are manageable with deliberate rollout designs, but they are non-trivial and deserve leadership attention.

In short, Copilot in Viva Engage can be a productivity multiplier — but only when paired with strong governance, careful cost controls, and persistent human judgment to preserve trust and accuracy across internal communications.

Source: Redmondmag.com Microsoft 365 Copilot Boosts Viva Engage -- Redmondmag.com
 

Microsoft has pushed a major productivity pivot into the heart of workplace social networking with the general availability of Microsoft 365 Copilot in Viva Engage, delivering context-aware AI summaries, smarter search, and integrated writing assistance designed to help employees “catch up” faster and surface what matters across communities, storylines, and campaigns.

Background

Viva Engage has long been Microsoft's employee social platform for conversation, recognition, and internal communications. Over the past two years Microsoft rolled incremental AI features into Viva Engage to aid post creation and engagement, but the newest update marks a step change: Viva Engage now integrates directly with Microsoft 365 Copilot, enabling deeper contextual signals from across Microsoft 365 and introducing a suite of features focused on catch‑ups, trending themes, and natural‑language search. The move reflects Microsoft’s broader strategy of embedding Copilot across productivity touchpoints—from Outlook and Teams to SharePoint and Viva—to reduce information overload and help knowledge workers act on the most relevant signals without wasting time digging through feeds.
This release is positioned as a premium capability for organizations that hold a Microsoft 365 Copilot entitlement, and it layers new AI-driven behaviors on top of existing Viva Engage functionality — turning passive feeds into actionable, summarized intelligence.

Overview of the new capabilities​

The announcement bundles multiple, tightly related features that change how people discover and interact with internal conversations and campaigns. The key capabilities are:
  • Context‑aware productivity: Copilot adapts answers and prompts depending on whether you’re viewing a community, a storyline, a campaign, or a single conversation.
  • Smart catch‑up cards: Compact, weekly prompts in the home feed, communities, and conversations that summarize recent activity and let users open a Copilot pane to read an organized summary.
  • Themes feed and trending grouping: Trending topics are grouped into themes; tapping a theme surfaces a feed of the conversations that contribute to it.
  • Enhanced natural‑language search: You can ask questions such as “show me updates from last week” or “find where I was mentioned” and receive direct results without manual filtering.
  • Improved writing assistance: Copilot can pull context from Teams and Outlook to help craft more complete and informed posts and articles inside Viva Engage.
  • Admin control via Viva Feature Access Management: Tenants retain governance over Copilot availability with tenant, group, and user level policies.
These features are designed to be discovery-first: Copilot surfaces what’s new and important and then offers direct paths into the underlying conversations or items.

What changed from previous Copilot in Viva Engage iterations​

To understand the significance, it helps to separate two phases:
  • The earlier Copilot experiences in Viva Engage (the initial waves) focused primarily on content creation — helping leaders and employees draft posts, suggest topics, and recommend conversational engagement based on trending items inside Viva Engage communities.
  • The current Microsoft 365 Copilot integration extends that capability outward and inward at the same time: outward to include signals from Teams and Outlook for better context when composing posts, and inward to deliver AI-powered catch‑ups, theme grouping, and search across Engage using natural language.
Put simply, the product has matured from being an assistant for composing content to becoming an assistant for staying informed and finding the right content — a critical distinction for organizations wrestling with signal vs. noise in enterprise social systems.

Deep dive: How the features work in practice​

Context‑aware productivity — AI that follows your flow​

One of the headline improvements is context awareness. Copilot recognizes the surface you are on and adapts the prompt set and summarization scope accordingly. If you’re inside a campaign, Copilot will prioritize campaign posts and analytics; in a storyline it will focus on longitudinal updates; inside a single conversation it will summarize thread activity and identify unanswered questions.
This means less friction: rather than asking Copilot to “summarize the campaign,” you can tap a context prompt and receive a summary tuned to that exact content and audience.

Smart catch‑up cards — weekly, scannable briefings​

Smart catch‑up cards appear in the home feed, communities, and conversation lists. On the home feed they are surfaced by relevance algorithms and typically appear on a weekly cadence; within a community or conversation they appear only when a meaningful threshold of activity has occurred.
The cards provide a compact preview of what’s new and the option to “open Copilot” for a single-click, organized summary. For employees who check feeds irregularly, these cards are intended to convert hours of scroll time into minutes of high-value reading.

Themes feed and trending grouping — find the signal​

Trending posts are grouped into themes, and selecting a theme opens a dedicated Themes Feed that aggregates all contributing conversations. This thematic grouping is useful for communications teams and leaders to identify emergent issues or celebration points without chasing individual posts.
The Themes Feed is also designed to help comms teams drill into sentiment and community reach, and it can accelerate campaign analysis by exposing commonly discussed topics across multiple communities.

Natural language search — ask and receive​

Search in Viva Engage takes a more conversational turn. Users can compose natural‑language queries like “show me posts about the Q3 launch from last month” or “where did John mention the bug in his post” and receive targeted results. The integration intends to remove frequent, time‑consuming tasks like filtering by date, community, or author.
This is a notable addition because earlier Copilot experiences in Engage were more composition-focused and lacked the cross‑surface search depth now available with Microsoft 365 Copilot.

Writing assistance with cross‑app context — better posts, faster​

When drafting a post, Copilot can now bring in context from Teams messages and Outlook threads to help build a more complete update. That can be a productivity gain when summarizing project status, extracting key decisions from a meeting, or assembling a digest of customer feedback. The assistant offers “Inspire” and “Draft” flows: one to generate ideas and the other to produce initial text you can edit.

Enterprise governance, admin controls, and licensing​

Enterprises will appreciate that governance remains a central design point:
  • Access to Microsoft 365 Copilot functionality in Viva Engage is controlled through existing feature access management and tenant policies.
  • Admins can set tenant‑wide, group‑level, or individual user policies to enable or disable Copilot features.
  • Some Copilot features require a Microsoft 365 Copilot license; organizations must confirm entitlement before enabling the full experience.
These controls are important because they allow IT and comms leaders to balance AI productivity gains against compliance, privacy, and cultural concerns.
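Because pilots are usually gated on a security group, admins may also want to enumerate exactly who is in scope. The hedged sketch below lists a pilot group's members via the standard Microsoft Graph /groups/{id}/members endpoint; the group ID and token are placeholders, and the feature gating itself is still done with Viva feature access policies in the admin tooling rather than by this script.

```python
# Hedged scoping helper: enumerate the members of the security group that gates the
# Copilot pilot so they can be cross-checked against training and licensing lists.
# /groups/{id}/members is standard Microsoft Graph; the group id and token are
# placeholders, and feature gating itself still happens through Viva feature access
# policies in the admin tooling, not through this script.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<app-token-with-GroupMember.Read.All>"           # placeholder credential
PILOT_GROUP_ID = "00000000-0000-0000-0000-000000000000"   # placeholder group id

def pilot_members() -> list[str]:
    url = f"{GRAPH}/groups/{PILOT_GROUP_ID}/members?$select=userPrincipalName"
    members: list[str] = []
    while url:
        resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        members += [m.get("userPrincipalName", "") for m in payload.get("value", [])]
        url = payload.get("@odata.nextLink")  # follow Graph paging until exhausted
    return members

for upn in pilot_members():
    print(upn)
```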

Strengths and practical benefits​

The Copilot integration delivers measurable, practical benefits if adopted and governed well:
  • Time savings and cognitive load reduction: Weekly catch‑up cards and one‑click summaries reduce time spent scrolling and searching through feeds.
  • Faster onboarding to conversations: New hires or people returning from leave can use Copilot catch‑ups to quickly rejoin active threads.
  • Better leader communications: Contextual prompts and cross‑app context help leaders craft clearer, more informed posts that drive engagement.
  • Improved discoverability: Natural language search reduces the friction of finding mentions, posts, or campaign items buried in busy feeds.
  • Actionable trend visibility: The Themes Feed makes it simpler to spot trending organizational topics and respond proactively.
  • Admin controls: Fine‑grained policies mean organizations can pilot features with subsets of users before broad enablement.
For communications teams and community managers, these features could noticeably accelerate content planning and crisis response workflows. For knowledge workers, the integration promises to convert passive social listening into action faster.

Risks, limitations, and governance considerations​

No enterprise AI rollout is without hazards. The Viva Engage Copilot experience carries several practical and policy risks that organizations must manage thoughtfully.

Data exposure and privacy​

Because Copilot can pull context from Teams and Outlook, there is an elevated risk of exposing sensitive content in generated summaries or prompt responses. While the system is designed to only surface content users already have access to, administrators must review access controls and audit policies to prevent accidental disclosure across groups or to unauthorized users.

Accuracy, hallucination, and trust​

Generative models can produce convincing but incorrect information. When Copilot summarizes threads or generates a post using cross‑app context, organizations must be aware of potential hallucinations. Suggested mitigations include:
  • Requiring human review for leader posts and campaign communications.
  • Coaching users to verify facts when Copilot sources content from multiple places.
  • Turning off auto‑publish or auto‑insertion behaviors until workflows are proven.
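The review-related mitigations above can be operationalized as a simple publishing gate. The following sketch is conceptual only: Viva Engage does not expose this workflow as a public API, and the Draft type, tag names, and routing outcomes are hypothetical, but it shows the human-in-the-loop pattern the list describes.

```python
# Conceptual guardrail, not a Viva Engage API: hold AI-assisted drafts on sensitive
# topics for human approval instead of auto-publishing. The Draft type, tag names,
# and routing outcomes are hypothetical and exist only to show the pattern.
from dataclasses import dataclass, field

SENSITIVE_TAGS = {"leadership", "hr", "legal", "campaign"}

@dataclass
class Draft:
    author: str
    body: str
    tags: set[str] = field(default_factory=set)
    ai_assisted: bool = False

def route(draft: Draft) -> str:
    """Queue AI-assisted drafts on gated topics for a reviewer; publish the rest."""
    if draft.ai_assisted and (draft.tags & SENSITIVE_TAGS):
        return "queued_for_review"
    return "published"

print(route(Draft("comms.lead@contoso.com", "Q3 org update...", {"leadership"}, ai_assisted=True)))
# -> queued_for_review
```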

Compliance and regulated environments​

Certain environments (e.g., government clouds, regulated industries) may impose restrictions on AI services. Historically, availability of Copilot features in high‑security clouds has been staggered; organizations operating in regulated contexts should confirm whether the new Copilot capabilities meet their compliance and data residency needs before enabling them broadly.

License cost and procurement complexity​

Microsoft 365 Copilot is a premium add‑on. Although the productivity gains can offset the cost, organizations should run a pilot and build a cost justification that measures time saved, engagement uplift, or reduction in manual tasks.

Cultural impact and moderation​

Automated prompts that suggest where to engage and what to say can subtly shape organizational voice. Over-reliance on AI‑generated drafts risks homogenizing communications and reducing authenticity. Communications leaders should combine Copilot outputs with editorial guidance to preserve tone and ensure posts align with organizational values.

Admin overhead​

Feature Access Management provides control, but it introduces new admin responsibilities: policy definition, rollout sequencing, user education, and monitoring feedback from Copilot’s thumbs‑up/thumbs‑down telemetry. Without investment in governance, the new capabilities could create uneven experiences across teams.

Recommended rollout and operational checklist​

Successful rollouts marry technology with governance and user enablement. A recommended phased approach:
  • Define goals and KPIs:
      • Identify target use cases (internal comms, campaign analytics, employee catch‑ups).
      • Set measurable KPIs (reduced time to catch up, engagement lift, reduced internal queries).
  • Pilot with a controlled group:
      • Start with communications teams and community managers.
      • Use Feature Access Management to scope the pilot.
  • Validate compliance and data residency:
      • Confirm the Copilot behaviors meet legal, security, and regulatory requirements.
      • Engage compliance and security teams early.
  • Train users and editors:
      • Publish guidance on when to use Copilot, verification steps, and editorial standards.
      • Create templates and best‑practice prompts for common scenarios.
  • Monitor and iterate:
      • Use feedback telemetry and user surveys.
      • Adjust tenant policies and guardrails as real usage data emerges.
  • Scale with governance:
      • Expand by role or region after success metrics are met.
      • Maintain a review cadence with stakeholder owners.

Practical examples and use cases​

  • A product manager returning from leave asks Copilot to “catch me up on the Q2 launch conversation” and receives a short, bulleted summary of decisions, open actions, and links to relevant threads.
  • A comms lead queries “show trending themes across the company this week” and is presented with grouped themes that highlight a cross‑company initiative gaining momentum — enabling a timely leader post.
  • A team member drafting an update on Storyline prompts Copilot to pull relevant Teams discussion points and the latest email thread to create a first draft that the author then refines.
These examples show how the integration shifts value from manual search and aggregation to curation and response — a useful reorientation for time‑pressed employees.

What administrators and IT leaders should watch for​

  • Check entitlement: Confirm which Copilot features in Viva Engage require a Microsoft 365 Copilot license and plan procurement accordingly.
  • Review Feature Access Management: Determine who will have access during pilot and production phases.
  • Audit logs and telemetry: Ensure logging and monitoring capture Copilot interactions for compliance and troubleshooting.
  • User training plans: Provide what/when/how guidance; supply standard prompts and editorial rules.
  • Localization and language coverage: If your workforce is global, validate support for the languages your teams require and test outputs for cultural appropriateness.

The competitive and strategic angle​

Embedding Microsoft 365 Copilot into Viva Engage is part of a broader tactical play: make AI an invisible productivity layer that connects employee communication with the rest of the digital workplace. For Microsoft, the value is twofold — increasing the utility and stickiness of Viva Engage while creating another compelling reason for organizations to subscribe to Microsoft 365 Copilot.
For customers, the strategic decision becomes less about whether AI can help and more about how to integrate AI responsibly into internal communications operations. Organizations that adopt with clear governance and strong editorial processes stand to benefit from faster situational awareness and more relevant internal engagement, while those who rush to enable Copilot without guardrails may encounter privacy, accuracy, or cultural problems.

Final assessment: balance of opportunity and caution​

The arrival of Microsoft 365 Copilot in Viva Engage is an important step toward reducing the time employees spend searching and catching up in internal social feeds. The combination of context‑aware prompts, smart catch‑up cards, thematic grouping, and natural‑language search is well aligned with real workplace pain points: information overload and fragmented context.
However, the benefits hinge squarely on governance, licensing clarity, and user training. The technology reduces friction, but it also introduces new responsibilities for admin teams and communications leaders. The most effective deployments will be gradual, measured, and tightly coupled with compliance reviews and editorial oversight.
Organizations should treat Copilot in Viva Engage as a force multiplier — powerful when governed and guided properly, risky when deployed as a “set-and-forget” productivity feature.

The implementation of AI in employee experience tools is accelerating; the practical question for IT and comms leaders now is not whether to use AI, but how to design policies, workflows, and cultural norms so that AI strengthens human communication rather than dilutes it. The new Copilot capabilities in Viva Engage bring considerable promise — and a clear set of responsibilities — for any organization that wants to make internal conversations more discoverable, more actionable, and more aligned with business priorities.

Source: Windows Report Microsoft 365 Copilot Arrives in Viva Engage
 
