Grand Traverse County Explores Microsoft 365 Copilot Pilot with Entra ID P2

Grand Traverse County is preparing a cautious entry into generative AI: commissioners are being asked to approve a nearly $400,000 renewal of the county’s Microsoft 365 subscription that includes a proposal to add 100 Microsoft 365 Copilot licenses as a controlled pilot, accompanied by an identity upgrade to Entra ID P2 and a formal governance package intended to minimize risk while measuring productivity gains.

Background / Overview

Grand Traverse County’s Information Technology Department has placed a one‑year Microsoft 365 renewal on the commissioners’ agenda that totals $398,083.80, with the incremental Copilot add‑on increasing the request by about $36,000 for the year. The county says the extra seats are intended as a pilot cohort — roughly 100 seats selected from an employee population of about 580 and spread across some 27 offices and departments — not an enterprise‑wide enablement.
Those procurement decisions did not arise in a vacuum. A ransomware incident struck local government networks on June 12, 2024, and the county’s response accelerated a shift toward cloud hosting for critical services and a more aggressive posture on identity and access protections. The broader package now couples cloud migrations, the Copilot pilot, and an Entra ID (formerly Azure Active Directory) licensing upgrade to P2 as complementary moves.
To make these numbers tangible: Microsoft publishes enterprise pricing for Microsoft 365 Copilot at approximately $30 per user per month (annual commitment), which matches the county’s budget math — 100 seats at that price level equals about $36,000 per year. That pricing point is why the county frames the 100‑seat request as a reasonable initial investment to test AI in day‑to‑day workflows.

What the county is buying — the technical picture​

Microsoft 365 Copilot: how it fits into a tenant​

Microsoft 365 Copilot is not a separate desktop app; it’s an add‑on to eligible Microsoft 365 tenants that can reason over work data accessible through the Microsoft Graph — including SharePoint, OneDrive, Exchange/Outlook and Teams — when permitted by tenant controls. Copilot features include chat‑style assistance inside apps (Word, Excel, PowerPoint, Teams) and agent‑style automation via Copilot Studio. Tenant administrators retain controls over features such as web grounding, file upload, and retention of Copilot chat histories.
Key technical attributes the county must weigh:
  • Copilot's access surface: Graph connectors, Exchange mailboxes, and SharePoint and Teams content (see the sketch below).
  • Tenant controls: the ability to disable web grounding, restrict file uploads, and configure chat retention.
  • Admin tooling: Copilot is managed through the Microsoft 365 admin center and is subject to tenant security policies.
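To make the access‑surface point concrete, here is a minimal sketch that queries the Microsoft Graph search API with a pilot user's delegated token to see what SharePoint and OneDrive content that identity can already reach. Copilot's grounding is permission‑trimmed in the same way, so anything such a query returns is content the assistant could draw on. The token handling, query string, and result handling are illustrative assumptions, not part of the county's packet.

```python
import requests

GRAPH_SEARCH = "https://graph.microsoft.com/v1.0/search/query"
ACCESS_TOKEN = "<delegated token for a pilot user>"  # placeholder; obtain via MSAL or similar

def search_as_user(query: str, top: int = 5) -> list[dict]:
    """Return the SharePoint/OneDrive items this user's permissions expose for a query.

    Copilot reasons over the same permission-trimmed view of the tenant, so this is a
    cheap way to audit what a pilot participant could pull into a prompt.
    """
    body = {
        "requests": [
            {
                "entityTypes": ["driveItem", "listItem"],  # files and SharePoint list items
                "query": {"queryString": query},
                "from": 0,
                "size": top,
            }
        ]
    }
    resp = requests.post(
        GRAPH_SEARCH,
        json=body,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    hits = []
    for container in resp.json().get("value", []):
        for hit_container in container.get("hitsContainers", []):
            hits.extend(hit_container.get("hits", []))
    return hits

# Example: check whether HR or legal material surfaces for an ordinary pilot account.
# for hit in search_as_user("termination settlement draft"):
#     print(hit["resource"].get("name"), hit.get("summary", ""))
```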

Entra ID P2 — identity as the first line of defense​

The county packet pairs the Copilot pilot with a planned upgrade from Entra ID (Azure AD) P1 to Entra ID P2, a move that brings Privileged Identity Management (PIM), risk‑based Conditional Access, identity protection telemetry, and access review capabilities into scope. PIM enables just‑in‑time elevation for administrative roles, time‑bounded role activation, and an approvals workflow — practical controls that materially reduce the attack surface exposed by standing privileged accounts. Upgrading to Entra ID P2 is standard advice wherever an AI assistant can reason over sensitive tenant content, because identity compromise is often the simplest path for lateral movement into cloud resources.
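To illustrate what just‑in‑time elevation looks like in practice, the sketch below submits a time‑boxed PIM activation request through Microsoft Graph's roleAssignmentScheduleRequests endpoint. The role and principal IDs, activation window, and token acquisition are placeholders, and the actual approval and notification behavior depends on how PIM role policies are configured in the tenant.

```python
import datetime
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token with RoleAssignmentSchedule.ReadWrite.Directory>"  # placeholder

def request_jit_activation(principal_id: str, role_definition_id: str,
                           justification: str, hours: int = 4) -> dict:
    """Ask PIM for a time-boxed activation of an eligible directory role.

    With Entra ID P2, the role is active only for the requested window and the
    request can be routed through an approval workflow and recorded for audit.
    """
    body = {
        "action": "selfActivate",
        "principalId": principal_id,
        "roleDefinitionId": role_definition_id,
        "directoryScopeId": "/",  # tenant-wide scope
        "justification": justification,
        "scheduleInfo": {
            "startDateTime": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "expiration": {"type": "afterDuration", "duration": f"PT{hours}H"},
        },
    }
    resp = requests.post(
        f"{GRAPH}/roleManagement/directory/roleAssignmentScheduleRequests",
        json=body,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# Example (IDs are placeholders):
# request_jit_activation("<admin user object id>", "<directory role definition id>",
#                        "Investigating a Copilot content-exposure report", hours=2)
```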

Governance, training and the Center of Excellence​

Grand Traverse County is explicitly tying Copilot access to a governance package that includes:
  • Formation of a Center of Excellence (COE) in AI — presented as a governance framework to standardize rollout decisions, playbooks, and policies rather than a new physical office.
  • Mandatory AI technical training for pilot participants, focused on prompt safety, model limitations, and which categories of content must never be shared with the tool.
  • Signed AI use agreements that set ground rules for permitted use, handling of sensitive data, and disciplinary consequences for misuse.
These are sensible governance building blocks: a COE centralizes policy and measurement; training reduces operator errors; and formal agreements create enforceable boundaries. The real test will be whether those commitments are operationalized — i.e., actual training completion records, enforceable conditional access policies, tenant configuration checklists, and published KPIs for the pilot.

Public‑safety pilots and adjacent AI activity​

The Copilot item sits alongside other AI experiments the county is already running. Earlier in 2025 the county approved a trial of an AI call‑handling system (Aurelian AI) for non‑emergency Central Dispatch calls, intended to triage lower‑risk calls and free dispatchers to focus on true emergencies. That procurement carried a first‑year setup cost of roughly $60,000 plus recurring fees thereafter — a concrete example of the county using vendor AI to offload repetitive tasks in public‑safety triage.
The sequence is relevant: the June 2024 ransomware incident pushed the county toward vendor‑hosted cloud solutions with continuous patching and redundancy for mission‑critical services — an approach that underpins the county’s simultaneous decisions on Copilot, cloud hosting of permitting systems, and justice center planning.

Benefits the county is chasing​

  • Faster administrative workflows: drafting, summarization, and redlining for legal documents, HR memos and public notices.
  • Data‑upskilling at scale: Copilot in Excel can translate natural‑language questions into formulas and generate narrative summaries for presentations and audits.
  • Meeting and collaboration efficiency: Teams Copilot can produce notes, action items and concise meeting summaries.
  • Knowledge discovery: indexing cross‑departmental content to help staff rapidly locate policy or case information.
These are credible, measurable benefits — provided the county enforces human‑in‑the‑loop checks for outputs destined for legal, fiscal or public‑safety use.

Risks and trade‑offs — what commissioners must weigh​

No enterprise AI pilot is risk‑free. Grand Traverse County’s plan appropriately recognizes major hazards; successful execution depends on the discipline of follow‑through.

1. Hallucinations and factual accuracy​

Generative models can fabricate citations, misstate legal references, or invent data. For government communications, any AI‑generated language that affects legal obligations, budgets, or public safety must be verified by a human before publication. The county’s AI Agreement and review gates should require human sign‑off for external documents.

2. Data governance and exposure​

Copilot reasons over tenant data when authorized. If employees paste or upload content containing Social Security numbers, health records, law‑enforcement investigative details, or attorney–client privileged material, that content becomes part of the Copilot context unless tenant controls prevent it. Strict data classification, disabling file uploads for sensitive repositories, and administrative controls around connectors are essential.
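Tenant controls and data‑loss‑prevention policies should do the heavy lifting here, but a lightweight, locally run pre‑flight check can still catch obvious mistakes before staff paste material into a prompt. The sketch below is purely illustrative: the patterns and keywords are assumptions, and it is no substitute for the county's sensitivity labels and DLP configuration.

```python
import re

# Illustrative patterns only; a real deployment would rely on Purview sensitivity
# labels and DLP policies rather than ad-hoc regexes.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
BLOCKED_KEYWORDS = ("attorney-client", "privileged", "juvenile record", "CJIS")

def preflight_check(text: str) -> list[str]:
    """Return a list of reasons this text should not be pasted into an AI prompt."""
    findings = []
    if SSN_PATTERN.search(text):
        findings.append("possible Social Security number")
    lowered = text.lower()
    for keyword in BLOCKED_KEYWORDS:
        if keyword.lower() in lowered:
            findings.append(f"blocked keyword: {keyword}")
    return findings

# Example:
# issues = preflight_check(draft_memo)
# if issues:
#     raise ValueError(f"Do not submit to Copilot: {', '.join(issues)}")
```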

3. Public records, discoverability and legal exposure​

AI‑generated prompts and outputs can be subject to Freedom of Information Act (FOIA) or state public‑records laws. The county must decide whether Copilot chat histories, prompts, and generated drafts are recordable and how they will be retained, audited and produced under legal request. The simplest response is to treat those artifacts as records and bake them into records management and retention policies from day one.
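One way to operationalize that decision is to give every Copilot interaction an explicit records entry with retention metadata from day one. The sketch below is a hypothetical index schema, not a Microsoft API; in practice the county would rely on Microsoft Purview retention and audit tooling, with something like this serving only as the catalog its records staff and counsel work from.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class CopilotRecord:
    """Minimal public-records index entry for one Copilot interaction (hypothetical schema)."""
    user: str                    # county account that ran the prompt
    department: str
    prompt: str
    output_summary: str          # or a pointer to the stored output
    created: date
    retention_years: int = 3     # placeholder; counsel sets the real schedule
    foia_exempt: bool = False    # flags a claimed exemption for legal review
    tags: list[str] = field(default_factory=list)

    @property
    def dispose_after(self) -> date:
        """Earliest date the record may be disposed of under its retention schedule."""
        return self.created + timedelta(days=365 * self.retention_years)
```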

4. Identity and attack surface​

Copilot’s Graph connectors and integrations multiply the consequences of an identity compromise. Upgrading to Entra ID P2 is the right move, but licenses alone aren’t enough — P2 features must be configured and enforced: Conditional Access policies, PIM with time‑based activation for administrators, MFA enforcement, and periodic access reviews. Those are operational tasks that require clear ownership and auditing.
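As a concrete example of "configured and enforced", the sketch below creates a risk‑based Conditional Access policy through Microsoft Graph that requires MFA for medium‑ and high‑risk sign‑ins across all applications. It is deliberately created in report‑only mode so IT can observe the impact before enforcement; the break‑glass exclusion, policy name, and token handling are placeholders rather than the county's actual configuration.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token with Policy.ReadWrite.ConditionalAccess>"  # placeholder

def create_risky_signin_policy(break_glass_account_id: str) -> dict:
    """Create a report-only Conditional Access policy requiring MFA on risky sign-ins.

    Report-only mode lets IT watch the would-be impact in sign-in logs before
    switching the state to "enabled" and actually enforcing the control.
    """
    policy = {
        "displayName": "Require MFA for medium/high sign-in risk (Copilot pilot)",
        "state": "enabledForReportingButNotEnforced",
        "conditions": {
            "users": {
                "includeUsers": ["All"],
                "excludeUsers": [break_glass_account_id],  # emergency-access account
            },
            "applications": {"includeApplications": ["All"]},
            "clientAppTypes": ["all"],
            "signInRiskLevels": ["medium", "high"],
        },
        "grantControls": {"operator": "OR", "builtInControls": ["mfa"]},
    }
    resp = requests.post(
        f"{GRAPH}/identity/conditionalAccess/policies",
        json=policy,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```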

5. Ongoing operational cost​

Subscription costs scale with seat count. One hundred seats at $30/user/month translates to ~$36,000/year — reasonable for a pilot but potentially burdensome if scaled to the majority of the county workforce. Commissioners should require a 6–12 month pilot with explicit KPIs and a gating decision before additional seats are purchased. The county packet correctly notes that some procurement scenarios show little or no per‑seat discount at smaller volumes, so disciplined budgeting is necessary.
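A quick back‑of‑the‑envelope projection makes the scaling concern explicit. The sketch below simply multiplies out the figures already cited in the packet ($30 per user per month, roughly 580 employees) and assumes no discounts, taxes, or partner pricing.

```python
COPILOT_MONTHLY_PER_SEAT = 30.00   # Microsoft's published enterprise price (annual commitment)
COUNTY_EMPLOYEES = 580             # approximate county workforce

def annual_copilot_cost(seats: int, monthly_rate: float = COPILOT_MONTHLY_PER_SEAT) -> float:
    """Annual license cost for a given seat count, ignoring discounts and taxes."""
    return seats * monthly_rate * 12

for seats in (100, COUNTY_EMPLOYEES // 2, COUNTY_EMPLOYEES):
    print(f"{seats:>4} seats: ${annual_copilot_cost(seats):>10,.2f} per year")

# Approximate output: 100 seats -> $36,000; 290 seats -> $104,400; 580 seats -> $208,800 per year
```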

Practical checklist: what the county should require before enabling Copilot users​

  • COE charter and membership published, with reporting cadence to commissioners and an explicit budget line for AI oversight.
  • Signed AI‑use agreements for pilot participants and role‑based training completion tracked in a learning management system.
  • Entra ID P2 features operationalized:
      • PIM applied to privileged roles, with time‑bound activation and approval flows enabled.
      • Risk‑based Conditional Access policies (e.g., block risky sign‑ins, require MFA on risky behavior).
  • Tenant configuration locked down by default: disable web grounding unless explicitly required, disable file upload for non‑approved groups, and enable comprehensive audit logging for Copilot activity.
  • Pilot metrics and evaluation gates: collect time saved per task, accuracy/error rates, human remediation incidents, and a cost‑per‑task analysis. Require a 3‑month and a 12‑month report to the Board (one way to structure that log is sketched after this checklist).
  • Records management policy updated to include prompts, outputs and logs; legal counsel sign‑off on retention and discoverability practices.
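As referenced in the metrics item above, one way to keep the pilot honest is to log each evaluated task in a consistent shape and roll the log up for the 3‑ and 12‑month reports. The structure below is a hypothetical sketch: the field names map to the KPIs in the checklist, and the only real figure in it is the packet's roughly $36,000 annual license line.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class PilotTask:
    """One Copilot-assisted task evaluated during the pilot (hypothetical log entry)."""
    department: str
    task_type: str               # e.g., "meeting summary", "draft memo"
    minutes_saved: float         # participant's estimate versus doing it manually
    output_accurate: bool        # did the output pass human review unchanged?
    required_remediation: bool   # did a human have to fix an error before use?

def quarterly_report(tasks: list[PilotTask], annual_license_cost: float = 36_000.0) -> dict:
    """Roll task logs up into the KPIs the checklist asks the Board to see."""
    if not tasks:
        return {}
    return {
        "tasks_evaluated": len(tasks),
        "avg_minutes_saved": round(mean(t.minutes_saved for t in tasks), 1),
        "accuracy_rate": round(sum(t.output_accurate for t in tasks) / len(tasks), 3),
        "remediation_incidents": sum(t.required_remediation for t in tasks),
        "license_cost_per_task": round(annual_license_cost / 4 / len(tasks), 2),
    }
```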

Procurement and contract caveats​

  • Ask vendors for explicit egress and portability language. Cloud hosting shifts risk to vendor contracts; negotiate the right to export data in usable form and reasonable egress pricing.
  • Validate pricing: the county’s $36,000 line aligns with Microsoft’s $30/user/month enterprise price, but final invoiced totals can differ because of taxes, enterprise agreements, or partner discounts. The procurement packet notes this uncertainty, and prudent commissioners should confirm the final contract math before approval.
  • Insist on tenant‑level auditability and contractual obligations to support FOIA or legal discovery requests for Copilot logs if needed.

Political and community angles the board should anticipate​

AI pilots that touch public‑safety, records, or direct public interaction (for example, triaging non‑emergency calls) will invite public scrutiny and demand transparency. The county’s earlier ransomware episode already heightened local awareness of cybersecurity and continuity concerns; coupling AI pilots with visible security investments (Entra ID P2, cloud hosting, public reporting) helps create a defensible narrative, but the county must show outcomes and safeguards, not just promise them.
Community expectations to manage:
  • Clear opt‑outs and human fallback for any AI call‑handling or public‑facing automation.
  • Public reporting on pilot metrics and incidents, ideally in anonymized, aggregate form.

Measurable outcomes that justify scaling​

If the county intends to scale Copilot beyond the pilot, commissioners should demand evidence tied to explicit KPIs:
  • Time‑saved per task and estimated annualized labor cost reduction.
  • Accuracy/error rates for outputs tied to human remediation and incidents logged.
  • Reduction in time‑to‑decision for common administrative approvals.
  • Changes in incident response mean time to recovery (MTTR) for IT or public‑safety workflows attributable to cloud migration.
  • Financial model showing five‑year total cost of ownership for both licensing and operational overhead.

Final assessment — pragmatic, conditional, and reversible​

Grand Traverse County’s approach is pragmatic: it couples a limited pilot with an identity upgrade and a governance framework, and it acknowledges the security scar tissue left by last year’s ransomware incident. The stepwise posture — pilot seats, COE, training, signed agreements, and an Entra ID P2 upgrade — follows many expert recommendations for public‑sector AI adoption.
That said, benefits are not automatic. Guardrails must be enforced, not just purchased. Entra ID P2 features must be configured and audited; tenant controls for Copilot must be conservative by default; records management must be updated to capture AI artifacts; and the county must demonstrate metrics that justify ongoing spending. Commissioners should treat the 100‑seat Copilot purchase as a time‑boxed experiment with mandatory reporting and clear stop/continue gates.
If executed with discipline — human review, strict tenant controls, transparent KPIs, and legally sound records management — the pilot could deliver real administrative value. If executed only as a licensing add‑on without the operational work, the county risks accuracy failures, privacy incidents, and a public backlash that will be harder to repair than a single line item in next year’s budget.

Grand Traverse County’s request to test Microsoft 365 Copilot with 100 seats is a noteworthy case study for municipalities balancing the promise of generative AI against the operational realities of identity risk, public‑records law, and procurement discipline. The county’s packet and public reporting make the trade‑offs explicit: a modest recurring cost to seed an AI pilot, paired with an identity licensing upgrade and governance commitments — a defensible path, but one that will only prove wise if the county matches the purchase with meticulous implementation and transparent measurement.

Source: Traverse City Record-Eagle, "County to test AI 'Copilot,' downsize request for 100 licenses"
 
