Austria’s DSB restricts Microsoft 365 Education cookies on pupils’ devices

At the start of 2026, Austria’s data protection authority (DSB) ordered Microsoft to stop deploying certain tracking cookies on a school‑issued device after finding the cookies were placed without a valid legal basis while a pupil used Microsoft 365 Education. The ruling ties together technical telemetry, corporate responsibility and children’s data protection in a high‑stakes regulatory precedent.

Background​

The dispute began with complaints filed by the European digital‑rights group None of Your Business (noyb) against Microsoft’s education offering. The complaints, lodged in mid‑2024, alleged that Microsoft 365 Education was setting cookies and collecting telemetry on pupils’ devices in ways that went beyond what was strictly necessary for providing educational functionality. The DSB’s procedure produced two connected decisions: an October 2025 ruling focused on data‑subject access rights under Article 15 of the GDPR, and a follow‑up order in early 2026 addressing unlawful cookie deployment on a minor’s device.
That twin‑track enforcement — demanding transparency about what “internal reporting,” “business modelling,” and “improvement of core functionality” really mean, and then targeting concrete cookie identifiers — turned an abstract privacy argument into an operational directive with a short compliance window. The regulator ordered Microsoft to stop the contested processing for the affected pupil within weeks.

What the DSB actually found​

Concrete technical findings​

The DSB’s decision identifies multiple cookie identifiers and telemetry markers that it treats as not strictly necessary for the core educational service. Among the names cited by the regulator are MUID, MC1, MSFPC, FPC, MicrosoftApplicationsTelemetryDeviceId and ai‑session. The authority’s reasoning was that those identifiers can — depending on configuration and downstream uses — support analytics, profiling or advertising, and therefore require an independent legal basis such as informed consent. Because no valid consent for pupil‑level tracking was recorded, the deployment was judged unlawful in that instance.
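For administrators who want to check whether these identifiers appear on their own managed devices, one rough approach is to export a HAR capture from the browser’s developer tools during a normal lesson and scan it for the names cited in the decision. The sketch below illustrates that idea; the file path is a placeholder, and the identifier list is taken from public summaries of the ruling, so the exact spellings in a live capture may differ.

```python
import json

# Identifiers cited in public summaries of the DSB decision; treat this as a
# starting point for an audit, not an exhaustive or authoritative list.
FLAGGED = {"MUID", "MC1", "MSFPC", "FPC",
           "MicrosoftApplicationsTelemetryDeviceId",
           "ai-session", "ai_session"}  # spelling varies between sources

def flag_cookies(har_path: str) -> list[dict]:
    """Scan a DevTools HAR export and report cookies matching flagged names."""
    with open(har_path, encoding="utf-8") as f:
        har = json.load(f)

    findings = []
    for entry in har.get("log", {}).get("entries", []):
        url = entry.get("request", {}).get("url", "")
        cookies = (entry.get("request", {}).get("cookies", []) +
                   entry.get("response", {}).get("cookies", []))
        for cookie in cookies:
            if cookie.get("name") in FLAGGED:
                findings.append({
                    "cookie": cookie.get("name"),
                    "url": url,
                    "expires": cookie.get("expires"),  # lifetime, if recorded
                })
    return findings

if __name__ == "__main__":
    # "classroom-session.har" is a hypothetical capture taken while a pupil
    # account used Teams, Word for the web and SharePoint.
    for hit in flag_cookies("classroom-session.har"):
        print(hit)
```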

Dual‑purpose cookies: a legal and engineering fault line​

A recurring technical concept in the ruling is the dual‑purpose cookie: an identifier that can be used both for necessary operational functions (session stability, basic authentication) and for optional analytics or marketing. The DSB emphasised a decisive legal point: if a cookie can be used for a purpose that requires consent (for example, behavioural advertising or profiling), that cookie cannot be treated globally as “strictly necessary” to avoid consent requirements. In practice this forces vendors and administrators to either split functionality into distinct, purpose‑bound identifiers or to obtain consent for the broader uses.
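A minimal sketch of that purpose‑splitting approach, using Flask purely for illustration (the cookie names, route and consent lookup are hypothetical, not Microsoft’s implementation): the strictly necessary session cookie is always set, while the analytics identifier is only set when an explicit opt‑in is on record.

```python
from uuid import uuid4
from flask import Flask, request, make_response

app = Flask(__name__)

def has_analytics_consent(account_id: str) -> bool:
    """Hypothetical lookup against a consent store; deny by default."""
    return False  # conservative default for pupil accounts

@app.get("/start")
def start_session():
    resp = make_response("ok")

    # Strictly necessary: short-lived session identifier, single purpose.
    resp.set_cookie("edu_session", uuid4().hex,
                    max_age=3600, httponly=True, secure=True, samesite="Lax")

    # Optional analytics identifier: only set with a recorded, explicit opt-in.
    account = request.args.get("account", "unknown")
    if has_analytics_consent(account):
        resp.set_cookie("edu_analytics_id", uuid4().hex,
                        max_age=90 * 24 * 3600, secure=True, samesite="Lax")

    return resp

if __name__ == "__main__":
    app.run()
```

The structural point is that withholding the optional identifier never affects the one the service actually needs, which is exactly what a globally labelled dual‑purpose cookie cannot guarantee.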

Jurisdictional and responsibility findings​

Microsoft attempted to frame the issue as a school deployment/configuration problem and pointed to its European subsidiary as the relevant party for EU supervision. The DSB rejected that framing as a paper‑thin jurisdictional shield and stressed that where the vendor retains decisive control over product defaults, telemetry design and downstream flows, the vendor carries substantive responsibility. The regulator therefore treated the Microsoft group’s decision‑making structure as relevant, rather than allowing enforcement to be blocked behind a subsidiary label.

Why this matters for schools and student privacy​

The DSB’s rulings are notable because they convert high‑level GDPR principles into actionable obligations for vendors and customers in education settings. Schools traditionally regard cloud suites like Microsoft 365 Education as a procurement item: they buy tenant subscriptions and expect the vendor to deliver a compliant product. But the regulator’s approach pulls vendor design choices — cookie defaults, telemetry pipelines, and admin‑facing controls — into the compliance equation. That shift has three immediate consequences:
  • Operational responsibility for vendors: Cloud vendors cannot fully insulate themselves behind “we provide the tools” arguments when their defaults set data‑collection behavior on pupil devices.
  • Procurement and contractual change: Schools and ministries must demand verifiable technical controls and auditable inventories from suppliers as part of procurement terms.
  • Higher standard for minors: When children are involved, regulators expect age‑appropriate transparency, conservative defaults, and stronger safeguards for profiling‑type processing.
For IT directors, parent groups and education ministries, the practical takeaway is simple: vendor defaults matter legally, not just technically.

A technical breakdown: the cookies and telemetry named by the DSB​

The DSB named specific cookies and telemetry elements to illustrate the kinds of identifiers it considered non‑essential in the pupil’s context. While the presence of a cookie name does not by itself prove an unlawful transfer, naming these technical artifacts makes the decision far more operationally useful to administrators and auditors. Key identifiers flagged include:
  • MUID, MC1, MSFPC — cross‑site browser identifiers historically associated with Microsoft’s session and cross‑site tracking infrastructure.
  • MicrosoftApplicationsTelemetryDeviceId, ai‑session — telemetry and session identifiers used by Microsoft’s in‑product instrumentation.
The regulator’s emphasis is not on the labels alone but on the purposes attached to those identifiers. If the identifier is used for analytics, profiling, or advertising, it requires explicit legal grounding beyond “service necessity.” That requirement forces a technical response: either segregate telemetry channels by purpose, generate distinct cookie names for strictly necessary tasks, or shift optional uses behind explicit opt‑ins.
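Read as an engineering requirement, that amounts to a purpose‑tagged telemetry pipeline: every event carries a purpose label, and anything outside the strictly necessary category is withheld unless consent is recorded. The skeleton below is an illustrative sketch under that assumption, not a description of Microsoft’s actual pipeline; the purpose categories and consent handling are invented for the example.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Any

class Purpose(Enum):
    ESSENTIAL = "essential"      # crash reporting, security, session continuity
    ANALYTICS = "analytics"      # product-improvement metrics
    ADVERTISING = "advertising"  # marketing / profiling

@dataclass
class TelemetryEvent:
    name: str
    purpose: Purpose
    payload: dict[str, Any] = field(default_factory=dict)

class TelemetryRouter:
    """Routes events by purpose; non-essential purposes need recorded consent."""

    def __init__(self, consented_purposes: set[Purpose]):
        self.consented = consented_purposes | {Purpose.ESSENTIAL}
        self.sent: list[TelemetryEvent] = []
        self.dropped: list[TelemetryEvent] = []

    def emit(self, event: TelemetryEvent) -> None:
        if event.purpose in self.consented:
            self.sent.append(event)      # in reality: forward to a purpose-specific endpoint
        else:
            self.dropped.append(event)   # auditable record of what was withheld

# A pupil tenant with no opt-ins: only essential events go out.
router = TelemetryRouter(consented_purposes=set())
router.emit(TelemetryEvent("app_crash", Purpose.ESSENTIAL))
router.emit(TelemetryEvent("feature_usage", Purpose.ANALYTICS))
assert len(router.sent) == 1 and len(router.dropped) == 1
```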

The contested question of third‑party flows and AI training​

A significant theme in the proceedings was whether telemetry from education deployments ever reached third‑party services (for instance, advertising intermediaries or external AI model pipelines). The DSB explicitly asked Microsoft to clarify whether telemetry referenced parties such as LinkedIn, OpenAI or specific ad intermediaries, and required more precise disclosures following the earlier Article 15 access ruling. However, public summaries of the DSB’s order do not publish an itemised, independently verified log demonstrating each transfer; instead, the regulator demanded that Microsoft disclose the facts. Until Microsoft or the DSB publishes definitive logs or contracts, claims about specific downstream uses remain unresolved and should be treated with caution.
This is an important caution: the regulator’s order expands the obligation to be transparent and to demonstrate lawful bases, but asserting that pupil data definitely trained external models would remain an allegation until verified by vendor disclosures or regulatory evidence. The DSB’s public posture is that it expects suppliers to prove lawful bases or to stop disputed transfers.

Microsoft’s immediate compliance window and likely responses​

The DSB ordered Microsoft to stop the contested tracking for the complainant within a short statutory timeframe (reportedly four weeks in the DSB’s order). Microsoft said at the time that Microsoft 365 for Education meets required data‑protection standards and that institutions can continue to use it in compliance with GDPR, adding that it would review the DSB’s decision and decide on next steps. The regulator’s tight timeframe — particularly when a child’s data is at issue — is designed to push rapid remediation rather than protracted argument.
Microsoft’s practical options are straightforward but politically and technically significant:
  • Rapid product changes: Adjust defaults for education tenants, segregate telemetry, and issue an auditable cookie inventory to schools.
  • Targeted configuration changes: Provide admin‑level toggles that default to conservative telemetry settings for pupils.
  • Legal appeal: Launch an administrative appeal to contest the DSB’s interpretation — a move that buys time but risks escalating attention across member states.
Whichever path Microsoft takes, the costs are real: engineering work to separate telemetry channels across millions of tenants, contractual re‑negotiations, and potential reputational fallout.

Practical checklist for school IT teams and procurement​

For schools, the ruling converts legal obligations into a short, tactical to‑do list. Administrators should treat these steps as urgent compliance and risk‑management tasks:
  • Audit: perform a cookie capture session on managed devices during typical classroom workflows (Teams, Word for the web, SharePoint). Log cookie names, endpoints and lifetimes.
  • Inventory: demand a machine‑readable cookie inventory from vendors mapping each identifier to purpose, retention, and any downstream recipients (a minimal example of such an inventory appears after this list).
  • Default to privacy: configure tenant and device policies so non‑essential telemetry is disabled by default for pupil accounts.
  • Contract: insist on contractual audit rights, notification requirements, and change‑control provisions that prevent vendor‑side telemetry changes from silently rolling into education tenants.
  • Transparency to parents: provide child‑friendly privacy notices and clear explanations to parents about what is collected and why.
These steps are both protective and pragmatic: they preserve service continuity while making compliance demonstrable to regulators.
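On the inventory point, the shape of the data matters more than the file format. Below is a minimal sketch, with hypothetical entries, of the fields worth demanding from a vendor (identifier, purpose, retention, recipients, legal basis) and a simple consistency check an administrator could run over whatever the vendor supplies.

```python
# Hypothetical inventory entries; the fields, not the values, are the point.
COOKIE_INVENTORY = [
    {"name": "edu_session", "purpose": "strictly necessary",
     "retention": "session", "recipients": [], "legal_basis": "contract"},
    {"name": "edu_analytics_id", "purpose": "analytics",
     "retention": "90 days", "recipients": ["vendor analytics service"],
     "legal_basis": "consent"},
]

REQUIRED_FIELDS = {"name", "purpose", "retention", "recipients", "legal_basis"}

def validate_inventory(inventory: list[dict]) -> list[str]:
    """Return a list of problems: missing fields or purposes without a matching basis."""
    problems = []
    for entry in inventory:
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            problems.append(f"{entry.get('name', '?')}: missing {sorted(missing)}")
        if (entry.get("purpose") != "strictly necessary"
                and entry.get("legal_basis") != "consent"):
            problems.append(f"{entry.get('name', '?')}: non-essential purpose "
                            "without a consent basis")
    return problems

print(validate_inventory(COOKIE_INVENTORY) or "inventory looks complete")
```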

What parents and guardians should know and do​

Parents often lack visibility into the telemetry emitted by learning platforms. The DSB’s rulings strengthen parental and pupil rights under the GDPR: access to processing records, clearer explanations, and the ability to challenge unlawful processing. Parents should:
  • Request a formal description of processing from the school under data‑subject access rights.
  • Ask whether non‑essential telemetry is disabled for pupil accounts and request screenshots or exportable logs.
  • If unsatisfied, lodge a complaint with the national data protection authority.
Parents should also be careful not to conflate telemetry used for product reliability with profiling or advertising uses. Regulators will draw that line — but parents should insist on clear, written answers from schools and vendors.

Broader implications for cloud providers, advertising and AI​

This is not only an education case. The DSB’s approach is a practical signal to cloud providers that default telemetry choices and how telemetry is described matter legally. Vendors that reuse production telemetry for product improvement, analytics or AI training must either justify those uses with a lawful basis that fits the sensitivity of the data (especially for minors) or implement strict segregation and opt‑in flows. The ruling therefore tightens the legal corridor for:
  • Advertising pipelines that rely on cross‑site identifiers emitted from cloud services.
  • AI model training workflows that ingest production telemetry from sensitive contexts.
  • Product teams that treat telemetry as “invisible” product improvement data without mapping legal bases.
If other EU supervisory authorities adopt the DSB’s posture, cloud vendors may face multi‑jurisdictional demands to change product architecture or face corrective measures.

Strengths of the regulator’s approach — and its limits​

Notable strengths​

  • Technical specificity: By naming cookie identifiers and explaining the dual‑purpose problem, the DSB provided actionable guidance administrators can implement now.
  • Child‑focused protection: The rulings treat minors as a specially protected group and apply a conservative reading of consent and profiling limits.
  • Reduced jurisdictional wriggle room: The DSB’s refusal to accept a narrow subsidiary shield undercuts a common vendor defence strategy.

Limitations and open questions​

  • Fact‑specific scope: The decision applies to the facts of the complaint — a particular pupil’s device and tenant configuration at a point in time. Whether identical configurations across other tenants will produce the same finding depends on audits and further DPA assessments.
  • Complexity of telemetry: Modern cloud stacks emit voluminous telemetry where operational and optional metrics mix. Distinguishing strictly necessary from optional telemetry is non‑trivial and resource‑intensive for vendors and customers alike.
  • Unresolved downstream uses: The DSB demanded clarification about transfers to third parties, but public briefs do not conclusively document all downstream transfers; regulators are still seeking those disclosures. Until those logs are presented, some transfer allegations remain pending verification.

What vendors should do now​

The DSB’s order is a practical roadmap for product and legal teams at any cloud provider offering services to schools:
  • Default to privacy in education tenants: disable non‑essential telemetry by default and make opt‑ins auditable (see the consent‑record sketch at the end of this section).
  • Separate cookies by purpose: use distinct identifiers for strictly necessary operations and for analytics/marketing.
  • Provide a machine‑readable cookie inventory and a one‑click export for tenant admins and regulators.
  • Supply child‑appropriate privacy summaries that schools can reuse.
Implementing these changes will reduce regulatory exposure and, crucially, restore trust with public sector customers and guardians.
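“Auditable opt‑ins” can be as modest as an append‑only record of who agreed to what, when, and under which privacy‑notice version, so that a school or regulator can reconstruct the basis for every optional identifier. A minimal sketch, with hypothetical field names and storage:

```python
import json
import time
from pathlib import Path

LEDGER = Path("consent_ledger.jsonl")  # hypothetical append-only store

def record_consent(tenant_id: str, account_id: str, purpose: str,
                   granted: bool, notice_version: str) -> dict:
    """Append one consent decision; never overwrite earlier entries."""
    entry = {
        "timestamp": time.time(),
        "tenant": tenant_id,
        "account": account_id,
        "purpose": purpose,          # e.g. "analytics"
        "granted": granted,
        "notice_version": notice_version,
    }
    with LEDGER.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

def has_consent(tenant_id: str, account_id: str, purpose: str) -> bool:
    """Latest recorded decision wins; no record at all means no consent."""
    decision = False
    if LEDGER.exists():
        with LEDGER.open(encoding="utf-8") as f:
            for line in f:
                e = json.loads(line)
                if (e["tenant"], e["account"], e["purpose"]) == (tenant_id, account_id, purpose):
                    decision = e["granted"]
    return decision
```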

Risk assessment and the road ahead​

This DSB decision raises the legal stakes for telemetry design in sensitive sectors. The order itself is narrow, covering a single complainant on a short compliance window, but the broader risk is systemic: other European DPAs will watch whether Microsoft complies, appeals, or uses the ruling as a test case. Non‑compliance could invite corrective orders, administrative fines or parallel probes elsewhere. Even if Microsoft appeals, the reputational cost and the engineering burden of auditing and reconfiguring millions of education tenants will be significant.
Predictable near‑term outcomes include:
  • Vendor product updates and conservative education defaults.
  • Procurement changes that require auditable telemetry inventories.
  • Increased parental and school‑level scrutiny, including more access‑requests and DPA complaints.

Conclusion​

The Austrian DSB’s rulings — first forcing transparency on how Microsoft defined nebulous product‑level terms, then ordering an end to certain cookie deployments on a pupil’s device — are a clear regulatory signal: when children’s data are involved, cloud providers must do more than publish privacy promises. They must design telemetry with privacy‑by‑default, provide auditable cookie inventories, and give schools the technical levers to disable non‑essential tracking.
For school IT teams and parents the path forward is practical and immediate: audit, document, demand inventories and apply the most restrictive defaults for minors. For vendors the message is equally plain: product defaults and telemetry architectures are now a legal front line. The ruling will not only reshape how Microsoft configures its education service — it will influence procurement language, product roadmaps and the rules of the road for telemetry across education, advertising and AI training pipelines.
The DSB ordered corrective action for a single pupil’s account, but its technical specificity and child‑centred framing make this a de facto blueprint for regulators, schools and vendors alike. The next phase will be revealing: will vendors comply quickly and rebuild defaults around privacy, or will they litigate and force a wider regulatory test? Either course will reshape the invisible plumbing of modern digital education.

Source: Windows Central, “Austria’s Microsoft ruling is a warning for all cloud providers”