St Hilda’s Anglican School for Girls and Hale School have quietly done something rare in competitive education: they pooled people, policies and intellectual property to co‑build AI tools that tailor learning to individual students—using Microsoft’s Azure OpenAI platform, Power BI and Microsoft Fabric as the plumbing behind a set of teacher‑ and student‑facing services.
Background / Overview
In late 2023, leaders from two long‑established independent schools in Western Australia—St Hilda’s (a girls’ school) and Hale (a boys’ school)—agreed to collaborate on generative‑AI solutions rather than build in isolation. That collaboration was formalised with a short legal Statement of Intent that allowed the schools to safely share code, prompts and implementation notes while keeping each school’s student data under its own tenant controls. The project was intentionally pragmatic: combine standard, widely supported Microsoft tooling so the work would remain maintainable and portable across school administrations.
This story sits at the intersection of three trends that educators and IT leaders watch closely: (1) the mainstreaming of enterprise‑grade large language models (LLMs) into everyday institutional software, (2) a renewed emphasis on data governance and on‑prem/cloud hybrid architectures for sensitive student data, and (3) pragmatic, cross‑institutional sharing of IP to accelerate delivery while limiting vendor lock‑in. The Microsoft customer profile describing this work is detailed and explicit about architecture, governance choices, and early impact.
What the schools built — two complementary products
Hilda360: a teacher‑centric, AI‑assisted student dashboard
St Hilda’s team consolidated years of siloed data—academic grades, attendance, pastoral notes and interventions, and program participation—into a single dashboard named Hilda360. The dashboard uses a Microsoft Fabric lakehouse as the data store, Power BI for visualisation and Azure OpenAI to produce concise, AI‑generated summaries that help teachers and leaders surface trends and pastoral flags. AI‑generated summaries are explicitly labelled and accompanied by clear reminders to review and personalise outputs before taking action—a simple but crucial design choice to maintain human oversight.
Key technical and operational points about Hilda360 (a minimal sketch of the labelling pattern follows this list):
- Nightly ingestion from on‑prem systems through Microsoft’s On‑Premises Data Gateway into a Fabric lakehouse.
- AI summaries and explainable highlights surfaced in Power BI workspaces.
- Emphasis on teacher workflows rather than automated decisioning—reports are prompts for teacher action, not automatic interventions.
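To make the labelling design concrete, here is a minimal sketch of how a summary service of this kind could call Azure OpenAI and prepend an explicit AI label before anything reaches a teacher. It is an illustration under assumptions, not the schools’ code: the deployment name, environment variables and record shape are all hypothetical.

```python
# Illustrative only: generate a labelled student summary with Azure OpenAI.
# Deployment name, environment variables and record shape are assumptions.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

def summarise_student(record: dict) -> str:
    """Return a short, explicitly labelled summary for teacher review."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical deployment name
        temperature=0.2,      # keep summaries conservative and repeatable
        messages=[{
            "role": "user",
            "content": (
                "Summarise this student record in three to four sentences "
                "for a teacher, flagging attendance or pastoral trends that "
                f"merit a follow-up conversation:\n\n{record}"
            ),
        }],
    )
    summary = response.choices[0].message.content
    # Label the output so staff know to review and personalise before acting.
    return "[AI-generated summary - review before acting]\n" + summary
```

The key design point is the last line: the label travels with the text, so wherever the summary surfaces in Power BI, the human‑oversight reminder surfaces with it.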
Hale GPT: a student‑facing, personalised tutor assistant
Hale’s team built a personalised chatbot—Hale GPT—designed as a study assistant and flow‑helper for students. It authenticates students with Microsoft Entra ID, pulls in individualised data (grades, assignments, cohort averages and deadlines), and refreshes that context hourly so the assistant can generate personalised schedules, practice questions and explainers. The model backbone reported by the school is GPT‑4o mini, chosen for its balance of performance, multimodal potential and cost efficiency. Hale monitors interactions in Power BI to detect off‑track conversations and tune prompts and safety filters. A minimal sketch of the hourly context‑refresh pattern appears after the list below.
Why these two approaches are important together:
- Hilda360 amplifies teacher capacity by aggregating longitudinal insight.
- Hale GPT supports learner autonomy while reserving judgement and high‑stakes decisions for educators.
- Combining both approaches creates a feedback loop—teacher insights can inform student assistant behaviour and vice‑versa.
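As a rough illustration of the hourly refresh described above, the sketch below caches each student’s context with a one‑hour time‑to‑live and rebuilds the assistant’s system prompt from it. Everything here is assumed for illustration: the function names, context fields and stub data source stand in for whatever Hale actually queries.

```python
# Illustrative sketch of an hourly context refresh for a personalised assistant.
# All names and fields are hypothetical stand-ins, not Hale GPT's implementation.
import time

CONTEXT_TTL_SECONDS = 3600  # the case study reports hourly refresh
_context_cache: dict[str, tuple[float, dict]] = {}

def fetch_context(student_id: str) -> dict:
    """Stub for the data-access layer (grades, deadlines, cohort averages)."""
    return {
        "grades": {"Maths": "B+", "Science": "A-"},
        "deadlines": ["History essay: Friday"],
        "cohort_averages": {"Maths": "B"},
    }

def get_context(student_id: str) -> dict:
    """Return cached context, refreshing it once it is more than an hour old."""
    now = time.time()
    cached = _context_cache.get(student_id)
    if cached and now - cached[0] < CONTEXT_TTL_SECONDS:
        return cached[1]
    context = fetch_context(student_id)
    _context_cache[student_id] = (now, context)
    return context

def build_system_prompt(student_id: str) -> str:
    """Compose the personalised system prompt sent with every chat turn."""
    ctx = get_context(student_id)
    return (
        "You are a study coach, not an answer machine; prompt the student "
        "to reason rather than handing over solutions.\n"
        f"Grades: {ctx['grades']}\n"
        f"Deadlines: {ctx['deadlines']}\n"
        f"Cohort averages: {ctx['cohort_averages']}"
    )
```

A time‑to‑live cache like this bounds staleness without hammering source systems on every chat turn; a one‑hour window matches the cadence the schools report.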
The technical architecture and design choices (what was verified)
The schools intentionally built on mainstream Microsoft services to keep the solution supportable by an IT team familiar with the Microsoft ecosystem. The primary components cited by the schools are: Microsoft Fabric (lakehouse), Power BI, Azure OpenAI Service and Microsoft Entra ID for authentication. The Microsoft customer profile explicitly describes nightly data ingestion from on‑prem systems through the On‑Premises Data Gateway into Fabric, and the use of Power BI both as a monitor and as the teacher UX.
Several independent Microsoft documents and announcements confirm the core platform capabilities the schools rely on:
- Azure OpenAI Service supports models such as GPT‑4o mini, which Microsoft lists as an available, lower‑cost and high‑throughput model suitable for assistants, retrieval‑augmented generation (RAG) and real‑time experiences. Microsoft’s Azure blog provides pricing and throughput guidance for GPT‑4o mini and describes safety features enabled by default.
- Microsoft’s integration story for enterprise identity and workload authentication confirms that Entra ID (formerly Azure AD) and managed identities are supported patterns for authenticating applications and services that call Azure resources, including the OpenAI endpoints. That means a school can use single sign‑on and identity‑based tokens—preferred for auditability—rather than long‑lived API keys; a minimal sketch of this pattern follows the list.
- Microsoft’s Education blog and customer case studies document multiple instances where Azure OpenAI, Fabric and Power BI are combined in education scenarios; this corroborates the platform pattern used by St Hilda’s and Hale.
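For readers who want to see what identity‑first authentication looks like in practice, here is a minimal sketch using the azure-identity and openai Python packages. The endpoint URL is a placeholder, and this illustrates the general Entra ID token pattern rather than either school’s code.

```python
# Illustrative only: call Azure OpenAI with an Entra ID token instead of an
# API key. DefaultAzureCredential resolves a managed identity in Azure, or a
# developer login locally, so no long-lived secret is stored with the app.
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",  # Azure OpenAI token scope
)

client = AzureOpenAI(
    azure_endpoint="https://example-school.openai.azure.com",  # placeholder
    azure_ad_token_provider=token_provider,  # tokens are short-lived and auditable
    api_version="2024-06-01",
)
```

Because tokens are minted per request and tied to an identity, access shows up in audit logs under a named principal rather than an anonymous key, which is exactly the auditability property the bullet above describes.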
Pedagogical impact so far — benefits, early outcomes and real‑world examples
The Microsoft narrative highlights immediate, practical outcomes reported by teachers and students at both schools:
- Teachers at St Hilda’s can surface insight into a student’s performance and pastoral record within minutes rather than manually compiling fragmented records. This is reported to improve the quality and speed of parent‑teacher conversations and to increase staff data literacy across the school.
- At Hale, students reported tangible improvements in daily organisation and study depth; for example, a Year 8 student used the assistant to convert a cumbersome diary into a reliable, AI‑generated study schedule and personalised practice questions. Monitoring and iterative prompt engineering are used to prevent the assistant from becoming an answer machine rather than a learning coach.
A commissioned Forrester analysis for Azure OpenAI also reports that organisations can see operational improvements and cost efficiencies when deploying Azure OpenAI for customer and constituent engagement—findings that are relevant when schools seek to scale assistance services and reduce repetitive staff tasks. While Forrester’s study is not limited to K‑12, it provides cross‑sector evidence that Azure OpenAI deployments can contribute to measurable efficiency gains.
Governance, privacy and the legal framework: what the schools did differently
The collaboration’s governance model is a standout feature. Rather than treat IP and implementation notes as secret competitive advantages, the two schools signed a Statement of Intent that explicitly allowed sharing of intellectual property between them while keeping student data under separate tenancy and control. That legal instrument is a pragmatic enabler: it reduces duplicative work, accelerates fixes to prompt engineering, and creates a collaborative environment for ethical guardrails and safety practices.
Operational governance measures described include:
- Identity‑based student authentication using Microsoft Entra ID so the assistant can be certain of the requesting user and grant the correct scope of personalised information.
- Hourly context refresh for personalised assistants to limit stale or inaccurate data.
- Logging and monitoring using Power BI to track assistant interactions and identify off‑track or unsafe responses for iterative prompt correction and mitigation (a minimal logging sketch follows this list).
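The sketch below shows one plausible shape for such interaction logging: a structured, append‑only record that a Power BI report could ingest. The file sink, column names and the choice to log message lengths rather than full content are all assumptions made for illustration, though the last reflects a common data‑minimisation practice.

```python
# Illustrative interaction log for assistant monitoring. Schema and sink are
# hypothetical; logging message lengths instead of content is a deliberate
# data-minimisation choice, not necessarily what the schools do.
import csv
import datetime
import pathlib

LOG_PATH = pathlib.Path("assistant_interactions.csv")  # placeholder sink

def log_interaction(student_id: str, prompt: str, reply: str, flagged: bool) -> None:
    """Append one row per chat turn; `flagged` marks off-task or unsafe replies."""
    is_new = not LOG_PATH.exists()
    with LOG_PATH.open("a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(
                ["timestamp_utc", "student_id", "prompt_chars", "reply_chars", "flagged"]
            )
        writer.writerow([
            datetime.datetime.now(datetime.timezone.utc).isoformat(),
            student_id,
            len(prompt),  # lengths only: enough for drift/volume dashboards
            len(reply),
            flagged,
        ])
```

In production such rows would more likely land in the Fabric lakehouse than a CSV, but the principle is the same: every turn leaves a queryable trace that reviewers can slice in Power BI.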
Strengths and notable innovations
- Collaborative IP sharing in a competitive sector. The Statement of Intent is a small legal document with outsized practical value: it removes protectionist friction that normally slows innovation in independent school networks. This model is replicable—schools with non‑overlapping markets can share implementation IP and accelerate safe innovation.
- Platform pragmatism. Building on Microsoft Fabric, Power BI and Azure OpenAI meant the schools could rely on standard connectors, documented identity patterns, and familiar support channels—reducing long‑term technical debt. This increases the probability that future teams (if personnel change) can maintain the work.
- Human‑centred workflows. Both solutions emphasise augmentation—making teachers more effective and students more independent—rather than replacing core pedagogical judgments. AI outputs are labelled and presented as starting points for human action.
- Operational observability and continuous improvement. Using Power BI both to deliver insights and to monitor assistant conversations creates a tight feedback loop for prompt engineering and safety tuning. This is a low‑cost, high‑impact approach to keeping generative AI systems on‑task.
Risks, limitations and open questions
No responsible analysis would ignore the real risks and unresolved questions that accompany school deployments of generative AI.
- Over‑reliance and academic integrity. If students use assistants to produce answers rather than practice thinking, learning quality can suffer. The schools attempt to mitigate this with instruction design and monitoring, but long‑term behavioural shifts require systemic policy changes and assessment redesign. The academic community is still grappling with how to assess learning when AI is ubiquitous.
- Safety and hallucinations. LLMs can produce plausible but incorrect outputs. While Azure OpenAI and platform content‑safety features reduce some classes of harmful output, the risk of hallucinated or misleading feedback—especially when models ingest student textbooks or pastoral notes—remains. Robust monitoring and conservative prompt design are necessary but not sufficient; there will always be residual risk.
- Data minimisation and privacy culture. Pulling together pastoral notes, health flags and behavioural records into an AI‑summarised profile raises privacy questions. Even with tenancy separation and identity controls, the social effect of more detailed, AI‑amplified teacher knowledge requires careful policy about who can see what, retention policies, and parental/guardian consent. The schools’ approach to labelling outputs and urging human review is good practice, but governance must extend to policy and legal review.
- Vendor dependency and model lifecycle. The schools used GPT‑4o mini for cost and performance reasons. Model availability, pricing and retirement timelines can change—product teams should budget for migration and versioning. Microsoft and OpenAI publish retirement windows and notices; operational teams should treat model selection as a life‑cycle decision rather than a one‑time technical choice.
- Equity of access. Not all families and students will have equal access to devices or internet at home. School‑wide AI assistants can widen the gap if off‑campus access is uneven. Schools must design blended strategies so AI supplements in‑person supports rather than skewing advantage toward students with better home connectivity.
Practical playbook: how other schools can adopt a similar approach
- Start with purpose, not technology. Define the specific teacher or learner pain point you want AI to address (e.g., cut repetitive marking, summarise pastoral history, strengthen study planning).
- Use identity and tenancy control from day one. Prefer Entra ID or similar SSO mechanisms so actions are auditable and data access is governed by roles.
- Build observability into the UX. Log interactions and visualise them—Power BI or similar tools are invaluable for surfacing off‑task behaviour and tuning prompts.
- Label AI outputs and require human sign‑off for sensitive decisions. This preserves accountability and clarifies the AI’s role to staff and families.
- Share IP where appropriate. If your institution isn’t a market competitor with other schools, consider legal frameworks that allow sharing of prompts, connectors and architectures to accelerate safe adoption.
Broader sector implications and why this matters beyond two schools
There are three wider consequences worth highlighting:
- A template for cooperative innovation. The Statement of Intent model demonstrates how institutions that are not direct competitors can accelerate progress by sharing IP while keeping sensitive data local. This may be particularly valuable for district consortia, church/independent school networks and collegiate systems.
- Proof that enterprise LLMs can be operationalised safely in K‑12. The federation of identity (Entra ID), standard telemetry (Power BI) and a data lakehouse (Fabric) shows an operational design that balances agility with governance—an important counterweight to “AI hype” that imagines impossible technical futures. Microsoft and independent studies show that when designed correctly, AI assistants can be both scalable and responsible.
- Pressure on policy and assessment design. As assistants shift routine questions away from teachers, schools must redesign assessments, feedback loops and student accountability models. The classroom will adapt: teachers can spend more time on relational and creative work, and assessments must measure the skills that matter most in an AI‑enabled world. Independent research shows that AI tools can help tutors and educators become more effective—but only if the pedagogy adapts alongside the technology.
Conclusion — why the St Hilda’s / Hale approach is worth watching
St Hilda’s and Hale offer a highly practical playbook for K‑12 institutions thinking seriously about generative AI: choose proven platform components, prioritise identity and observability, label AI outputs, enforce human oversight for sensitive actions, and—importantly—be willing to share non‑competitive IP with trusted partners. Their approach converts AI from a risk‑laden novelty into an augmentation platform that amplifies teacher capacity and strengthens learner independence.
There is no silver bullet: models change, new risks will appear, and policy frameworks must evolve. But the core lesson is simple and transportable: when schools combine clear purpose, sound governance and pragmatic platform choices, they can responsibly harvest the benefits of generative AI at scale. The combination of concrete technical detail (Fabric + Power BI + Azure OpenAI), identity‑first authentication, and an IP‑sharing legal mechanism makes this an instructive case for educators worldwide to study and adapt.
Quick reference — verified claims and corroboration
- St Hilda’s and Hale co‑developed AI tools and signed a Statement of Intent enabling IP sharing and joint development.
- St Hilda’s Hilda360 was built using Microsoft Fabric lakehouse, Power BI and Azure OpenAI; data is ingested nightly via the On‑Premises Data Gateway.
- Hale GPT authenticates users with Microsoft Entra ID, uses GPT‑4o mini as a model choice, and refreshes personalized context hourly; Power BI is used to monitor conversations for tuning and safety.
- GPT‑4o mini is an available Azure OpenAI model designed for high throughput and lower cost relative to earlier frontier models; Microsoft documents pricing/throughput and safety‑by‑default features.
- Education‑facing Azure OpenAI deployments have been studied and show potential operational and educational benefits in independent analyses and academic research; the Tutor CoPilot randomized trial demonstrates measurable learning gains from human‑AI hybrid tutoring approaches.
Source: Microsoft St Hilda’s and Hale co-develop Azure OpenAI tools to personalize learning | Microsoft Customer Stories