As schools across England are now being issued with AI-generated minimum attendance targets, the practical experiment described by a Welsh headteacher—who used Microsoft Copilot to turn raw register data into per-pupil “attendance profiles”—is an instructive case study in both the potential and the pitfalls of using AI to tackle persistent absenteeism.
Background
Attendance has been a stubborn policy problem since the pandemic, with overall absence rates still above pre‑2019 levels and a worrying rise in severe absence (pupils missing more than half their sessions). National policymakers have moved from incentives and enforcement alone towards data‑driven interventions: the Department for Education now says every school will be issued an individual attendance improvement baseline, generated using AI and local data, and linked to targeted support. At the same time, schools and trusts around the world—already under severe workload pressure—are piloting productivity and analytics tools such as Microsoft 365 Copilot and Power BI to compress admin time and surface actionable patterns in pupil data. Early case studies report large time savings for teachers and leaders, while also flagging governance, privacy and equity questions that demand careful mitigation.

How one Welsh school used AI to dig into attendance
The problem: too much data, too little insight
The headteacher described a familiar situation: registers and attendance records generate vast tables—“one year group over one academic year in our school alone takes up 300,000 rows in Excel”—but conventional spreadsheets and human inspection miss patterns that only emerge when datasets are aggregated and modelled. The school had previously used Copilot for exam-results analysis and expanded that workflow to attendance.

Instead of blanket communications and ad‑hoc rewards or sanctions, the school aimed to use a series of short, consistent prompts to produce a concise, parent‑friendly “attendance profile” for each pupil that highlighted strengths, recurring absence patterns, and predicted risks. The school iterated prompts, simplified data, and translated outputs into plain English before sharing them with parents.
What the AI surfaced
The AI-driven analysis revealed patterns that are easy to miss manually, including:
- Recurrent family holidays taken in the same weeks each year.
- Stereotyped absence spikes (e.g., Fridays before school holidays, Mondays after).
- Geographic or demographic clustering — higher absence rates from specific neighbourhoods or certain year groups.
- Correlations with local events, weather patterns, or online cultural phenomena.
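Spike patterns like the Friday/Monday clustering above can be checked with ordinary code before any model is involved. The sketch below runs on invented records with an arbitrary flagging factor, not the school's actual data or method:

```python
from collections import Counter
from datetime import date

# Illustrative records only: (pupil_id, absence_date).
absences = [
    ("p1", date(2024, 9, 6)),   # Friday
    ("p2", date(2024, 9, 6)),   # Friday
    ("p3", date(2024, 9, 13)),  # Friday
    ("p1", date(2024, 9, 9)),   # Monday
    ("p4", date(2024, 9, 10)),  # Tuesday
]

def weekday_spikes(records, factor=1.5):
    """Flag weekdays whose absence count exceeds `factor` times the
    mean across weekdays. The 1.5 factor is an arbitrary illustration,
    not a recommended threshold."""
    counts = Counter(d.strftime("%A") for _, d in records)
    mean = sum(counts.values()) / len(counts)
    return {day: n for day, n in counts.items() if n > factor * mean}

flagged = weekday_spikes(absences)  # {"Friday": 3} for the sample data
```

The point of such a baseline is diagnostic: if a plain count already surfaces the pattern, the AI step earns its keep on the harder cases, such as free‑text reasons and cross‑dataset correlations.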
Why the DfE’s AI‑generated targets make this timely — and complicated
What the DfE has announced
The Department for Education has published a roadmap that includes issuing each school an AI‑powered Attendance Baseline Improvement Expectation (ABIE), calculated from local context (deprivation levels, school characteristics and historical trends), and pairing schools with high‑performing peers for targeted support. The stated purpose is to reduce variation across the system and return attendance to pre‑pandemic norms. The policy is explicitly framed as supporting improvement rather than formal accountability.

Why the move is provocative
The policy has prompted immediate pushback from teacher unions and headteacher groups who warn that algorithmic targets can feel like a top‑down diktat that ignores root causes beyond schools’ control — poverty, housing instability, mental health and transport. Critics argue that targets without investment in the underlying social supports will simply add administrative burden and moral blame to already overstretched leaders. Media coverage and sector commentary have echoed these concerns.

Technical realities: how Copilot, Excel and Power BI fit together
From spreadsheets to AI prompts
The Welsh school’s process followed a common pattern:
- Export raw registers and contextual data (pupil attributes, postcode, attainment, term calendars) to a simplified table.
- Use Copilot or a similar LLM to summarise, clean and code the data (e.g., flagging recurring same‑day absences, converting free‑text absence reasons into categories).
- Generate human‑readable narratives—short attendance profiles—for parents and staff.
- Produce dashboards and scheduled alerts for leaders using a BI tool.
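The middle steps of that pipeline can be approximated without an LLM at all. The sketch below is a hypothetical, deterministic stand-in for the Copilot step the school describes: invented keyword rules code free‑text absence reasons into categories, and a template renders a short, parent‑friendly profile (names, rules and wording are all illustrative):

```python
from collections import Counter

# Hypothetical keyword rules standing in for the LLM coding step.
CATEGORIES = {
    "holiday": ("holiday", "vacation", "trip"),
    "illness": ("ill", "sick", "cold", "flu"),
    "appointment": ("dentist", "doctor", "appointment"),
}

def categorise(reason):
    """Map a free-text absence reason to a coarse category."""
    reason = reason.lower()
    for category, keywords in CATEGORIES.items():
        if any(k in reason for k in keywords):
            return category
    return "other"

def profile(name, reasons):
    """Render a short, plain-English attendance profile."""
    counts = Counter(categorise(r) for r in reasons)
    top, n = counts.most_common(1)[0]
    return (f"{name} has {len(reasons)} recorded absences this term; "
            f"the most common reason is {top} ({n} sessions).")

text = profile("Sam", ["Family trip", "Feeling sick", "flu symptoms"])
```

Whichever tool generates the narrative, the school's human‑in‑the‑loop practice still applies: a named member of staff checks each profile before it reaches a parent.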
Why schools turn to Power BI next
A common next step is migrating from Excel to Power BI because it handles larger data volumes, scheduled refreshes and interactive dashboards. Power BI supports import and DirectQuery modes: imported datasets offer fast, feature‑rich modelling but are capped in shared capacity (commonly 1 GB per dataset), while Premium or Premium‑per‑user tiers raise those limits substantially and enable the large‑dataset storage format. For many schools, the practical choice is either summarising and importing compressed extracts or using DirectQuery against an underlying database so the BI layer queries only slices of interest. These technical trade‑offs shape whether a school can analyse multiple years and multiple institutions together.

The benefits: what the technology can deliver for attendance work
- Precision interventions: AI can flag likely problem trajectories earlier—so pastoral staff meet families before absence becomes entrenched.
- Workload reduction: Automated summarisation and templated reports free leaders from endless spreadsheet work, returning time for relationship‑building. Case studies show notable time savings when Copilot is used for admin and marking tasks.
- System learning: Shared analytics across schools in a region can reveal community‑level drivers and allow coordinated responses—targeted holiday messaging, community outreach or joint transport solutions.
- Personalised family engagement: Simple, evidence‑based attendance profiles give headteachers and heads of year an objective conversation starter with parents that focuses on facts rather than feelings.
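The "precision interventions" point need not mean anything exotic: even a simple rolling threshold gives pastoral staff an earlier prompt than waiting for termly figures. The sketch below uses invented numbers; the window and threshold are placeholders a school would calibrate, not recommended values:

```python
def at_risk(weekly_rates, window=3, threshold=0.9):
    """Flag a pupil whose mean attendance over the last `window` weeks
    falls below `threshold`. Both parameters are illustrative."""
    recent = weekly_rates[-window:]
    return sum(recent) / len(recent) < threshold

# A pupil drifting downward is flagged before absence becomes entrenched.
flag = at_risk([0.98, 0.95, 0.88, 0.85, 0.80])  # True
```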
The risks and unresolved governance questions
1. Data privacy and pupil protection
AI services and vendor platforms differ in how they treat tenant data. Education‑grade contracts and enterprise SKUs may include non‑training clauses and stronger deletion and audit rights, but districts must not assume default consumer protections apply. Sensitive pastoral notes, special‑educational‑needs information and free‑text comments are particularly high‑risk if fed into a public model or misconfigured service. Procurement must insist on explicit contractual protections, data‑minimisation and tenant isolation.

2. Hallucinations and factual reliability
LLMs produce fluent text but can hallucinate or misclassify cases when prompts or the input data are messy. Attendance profiles for families must be checked for accuracy; an erroneous pattern (e.g., incorrectly attributing absences as unauthorised) can damage trust and escalate conflict. Schools must preserve human‑in‑the‑loop verification as an explicit step.

3. Deskilling and overreliance
If pastoral teams come to rely on model outputs without understanding the underlying logic, they risk atrophy of judgement. Leaders must design workflows where AI augments record‑keeping and pattern detection, while professional discussion and casework remain human‑led.

4. Equity and the digital divide
Some schools will have Premium BI capacity and managed Microsoft tenancy, while others—often in deprived areas—will lack licences, devices and bandwidth. This creates an AI‑enabled attendance improvement divide unless central funding or shared regional platforms are provided. Surveys show gaps in AI readiness between private and state sectors, reinforcing the need for targeted support.

5. Policy and accountability design
Algorithmically generated targets must be transparent about inputs, assumptions and limits. Where targets are used to triage support, that’s defensible. Where they are used to punish or rank schools without context, they risk unfair outcomes. The DfE has said ABIE is for support rather than formal accountability, but the sector remains sceptical; clear safeguards, published methodologies and appeal routes are essential.

Practical implementation checklist for IT leaders and heads
- Clarify objectives: decide which attendance problems you’re solving—holiday absence, persistent absence, or short‑term spikes—and what success looks like.
- Inventory data: collate registers, pastoral notes, FSM (free school meals) status, safeguarding flags and local event calendars, and decide which fields are necessary—minimise sensitive free text where possible.
- Choose the right pipeline:
- For small, rapid projects: export trimmed extracts to Copilot or a local model with a human review step.
- For multi‑year or multi‑school programmes: adopt Power BI with a database back‑end (DirectQuery or Premium where needed) to enable scalable queries and governed sharing.
- Procurement must demand:
- Education SKUs with non‑training and tenant‑isolation clauses.
- Clear SLAs for data deletion and audit access.
- On‑prem or regionally hosted options where law or policy requires.
- Embed human checks: every AI‑generated profile must be reviewed by a named pastoral lead before parent communication. Log the review.
- Train staff: short, practical PD on prompt design, verification workflows and sensitive‑data handling is higher‑leverage than an hour‑long demo.
- Co‑design communications with parents: test parent‑facing language for clarity and tone; offer an appeal process if a family disputes the AI‑derived summary.
- Start small, publish results: run a bounded pilot with measurable KPIs (e.g., reduction in unauthorised holiday absences for the pilot cohort), evaluate, then scale.
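For that final checklist item, the evaluation is kept honest by a comparison cohort rather than a raw before/after number. The sketch below shows a difference‑in‑differences calculation on invented counts; it is one simple evaluation design among several, not a method the DfE or the school prescribes:

```python
from statistics import mean

# Invented unauthorised-absence counts per pupil, before/after the pilot.
pilot_before, pilot_after = [4, 6, 3, 5], [2, 3, 2, 3]
comp_before, comp_after = [4, 5, 4, 5], [4, 4, 3, 5]

def did_effect(t_before, t_after, c_before, c_after):
    """Difference-in-differences: the pilot cohort's change minus the
    comparison cohort's change, netting out drift (seasonality, policy
    shifts) that affects both groups alike."""
    return ((mean(t_after) - mean(t_before))
            - (mean(c_after) - mean(c_before)))

effect = did_effect(pilot_before, pilot_after, comp_before, comp_after)
# effect == -1.5: roughly 1.5 fewer unauthorised absences per pupil
# beyond the comparison cohort's drift, on these invented numbers.
```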
What systemwide use could look like — and what to watch
If regional clusters or local authorities pooled anonymised attendance data and ran joint AI analyses, they could spot area‑wide drivers (transport strikes, major concerts, persistent local illness, or cyclical holiday patterns) and coordinate pre‑emptive action. The Welsh head’s idea of working with feeder primaries to track year‑4/5/6 patterns before they arrive in year 7 is a textbook preventive strategy. But for cross‑school models to be ethical and legal, the following must be in place:
- Robust anonymisation and data‑sharing agreements.
- Clear governance over model inputs and outputs.
- Independent validation of model accuracy and fairness.
- Transparent rules on whether targets will be used for support only or for performance management.
A cautionary note on metrics and causality
Data can show correlation but rarely prove causation without careful design. A drop in absences after an intervention does not necessarily mean the intervention caused the change—seasonality, cohort effects, or concurrent policy shifts can explain results. Rigorous pilots, with matched comparison groups where feasible, are the right evaluation approach before scaling. Policy designers should be especially wary of single‑metric incentives: they can create perverse behaviours (e.g., shifting absence to other categories). Researchers and procurement teams should demand transparent methodology and, where possible, independent evaluation.

Verdict: practical, not magical
AI is a tool that can make attendance work smarter by surfacing patterns and freeing staff time for conversation and tailored support. The Welsh head’s experiment shows how modest investments—structured prompts, clear outputs and a named senior lead—can transform a tangle of registers into targeted parental conversations. But the tool does not eliminate the core levers of attendance: relationships, local services, transport, poverty mitigation and mental‑health support. Policymakers should avoid substituting algorithmic targets for those investments.
- Short term: roll out pilots that prioritise privacy, human review and teacher professional development.
- Medium term: fund shared analytics platforms or Premium BI capacity for deprived local authorities and mandate education‑grade contracts.
- Long term: commit to independent evaluation, transparent methodology for any national target‑setting model, and integration of attendance improvement into wider social support programmes.
Practical next steps for readers
- Audit what attendance data you already collect and simplify it to the minimum fields required for pattern detection.
- Run a two‑term pilot: one year group, one senior lead, scheduled Copilot‑generated profiles plus mandatory human verification before parent circulation.
- Prepare a procurement checklist: education SKU, non‑training clause, data deletion rights, audit logs, regional hosting options.
- Publicly publish your pilot’s methodology and findings so peers can learn what worked and what didn’t.
Artificial intelligence offers powerful affordances for attendance work: faster pattern detection, personalised family communications, and dashboards that turn mass registers into targeted action. The Welsh school’s experience is a practical blueprint—start with clear objectives, preserve human oversight, secure data rights, and scale only after rigorous pilot evaluation. The DfE’s AI‑generated target programme makes that local capability immediately relevant; success will depend on whether schools are given the tools, funding and governance needed to use AI responsibly, not merely another top‑down target to meet.
Source: Schools Week Lessons for England on using AI to boost attendance