The school photo problem just changed


The National Crime Agency is telling schools to take pupil photos offline. Here's what Heads should do this week.

By Susan Burton, CEO and co-founder, Classlist


I read this morning's Guardian as a CEO. But first, I read it as a parent.

Are photos of my children on their school's website? What about the older year groups? Are alumni still up there? What is the school's image policy? When was it last reviewed? Has anyone written to current and former parents to tell them what's being done?

I don't know the answers. And that, I think, is the point.

What the Guardian reported

Dan Milmo's front-page piece in today's Guardian (8 May 2026) reports that the National Crime Agency and child safety experts are now advising headteachers to remove identifiable photos of pupils from school websites and social media. Or to consider not using them at all. Criminals are using AI to turn those photos into child sexual abuse material.

The Internet Watch Foundation has confirmed a recent blackmail attempt against an unnamed UK secondary school. Criminals took photos of pupils from the school's website and social media. They manipulated the images into CSAM using AI. They then sent the images to the school and demanded payment. 150 of the images could be classified as CSAM under UK law.

Jess Phillips, the safeguarding minister, called it a "deeply worrying emerging threat." The IWF says it is "only a matter of time" before more schools are targeted.

This is no longer a theoretical risk at the edge of the safeguarding agenda. It is now NCA advice. It is an active criminal pattern. It is on the front of a national newspaper. By Monday morning, parents will be asking questions.

Why this is harder than it looks

It is tempting to read the headline and think: take the photos down, update the policy, move on. The website work is the easy part. The hard part is what comes next.

A school with 800 pupils and an active alumni community could receive several hundred parent queries in a week. Current parents will ask what's on the site. Former parents will ask whether their now-adult child's image has been removed. Prospective families will ask what your safeguarding posture looks like.

Without a system, those answers will be inconsistent. They will be handled by whoever picks up the phone. The parents who feel ignored will be the ones who escalate. And escalation on a topic like this does not stay inside the school gates.

For British international schools the reputational stakes are higher still. Photos in the prospectus and on the website do real work in admissions. Pulling all identifiable images affects marketing as well as safeguarding. The decision needs to be made deliberately, not in panic. An incident at one British international school travels through expat parent networks faster than through any official channel. The cost of acting late is significantly higher than the cost of acting early.

Five steps for Heads, this week

These are not original to me. They are the steps I would want my own children's school to take. Written down in the order I would want them taken.

1. Remove identifiable photos of pupils from the public website and social channels. All of them. Today, or by Monday at the latest. This is now National Crime Agency advice, not a matter of opinion. The cost of doing this is low. The cost of being the next school in the Guardian is not. You can rebuild your photo galleries later, in a closed environment, with proper consent and access controls. The website does not need to be photo-free forever. It needs to stop being a public face-database by the end of the weekend.

2. Publish a clear photo policy and send it to every current and former parent. Don't wait to be asked. The school that gets ahead of the question earns trust. The school that waits to be challenged loses it. The policy should cover what images you collect, where they are stored, how long you keep them, who has access, how parents withdraw consent, and how alumni images are handled when a child leaves. A short, plain-English version sent to parents is worth more than a long policy document filed on the website.

3. Name one route for questions. One named person. One inbox. One form. Not "contact the office." Parents need to know exactly who is accountable for this and how to reach them. This is a small detail that does an enormous amount of work. It concentrates the workload. It lets you brief one person properly. And it stops the same question being answered five different ways by five different members of staff.

4. Put a ticketing system behind that route. This is the step most schools will skip. It is also the one most will regret skipping. The volume of questions will be higher than you expect. The questions themselves will be more detailed than you expect. "What about the photo from the 2021 sports day?" "What about the Year 6 leavers' video on YouTube?" "What about the prospectus from 2019 that's still cached?" A simple ticketing workflow, where every query is logged, assigned, tracked and closed, is the difference between a manageable week and a reputational one. It also gives you a record. That matters if a parent later asks why their question wasn't answered.

5. Move photo sharing into a closed, parent-only environment. Parents do want photos of their children. On trips, in the school production, scoring goals. That hasn't changed and shouldn't. What needs to change is where those photos live. A closed community, where photos sit on the platform rather than on a teacher's phone, where access is restricted to verified parents, and where alumni access lapses when a child leaves. That is what "sharing photos" should look like in 2026. Public Facebook pages and open Instagram accounts were never built for this threat model. They are not the answer.

Why a closed community matters here

It is tempting to treat this as an IT job. Strip the gallery. Update the privacy notice. Move on. That is necessary but not sufficient.

The deeper issue is that schools have used public channels for years to do work that should have been done in a trusted community. Photos. Event invitations. Parent updates. Class lists. Much of it has drifted onto platforms that were not designed with safeguarding at their core.

In our experience at Classlist, schools that bring this work into a closed parent community see complaints fall, not rise. Parents feel informed and in control. Teachers aren't carrying hundreds of children's photos on personal devices. Alumni access lapses when it should. The community itself becomes a safeguarding asset rather than a leak.

This is the reframing I think Heads need to make this week. The question is not "where do we store the photos now that the website is off-limits." The question is "what does our community infrastructure look like for the next decade, given that the open internet is no longer a safe place to share children's faces."

A note for Heads at COBIS this weekend

The 44th COBIS Annual Conference runs from tomorrow at Convene 155 Bishopsgate. The theme is Listen. Learn. Lead. That feels apt for a weekend that is going to start with a lot of Heads having difficult conversations about exactly this issue.

I'm not at COBIS this year, but my colleagues Clare Wright and Toby Burton are. If you are working through any of this with your leadership team, please come and find them. The policy. The parent communications. The platform question. The sequencing of the next ten days. We have spent years thinking about how schools share photos and information with parents safely. The patterns we've seen across 500+ schools in 40+ countries are likely to be useful to you. No pitch required. The conversation matters more than that.


Frequently asked questions

What is the National Crime Agency advising schools to do about photos on school websites? The NCA, alongside child safety experts, is advising headteachers to remove identifiable photos of pupils from school websites and social media accounts. Or to consider not using them at all. The advice was reported on the front page of the Guardian on 8 May 2026. It follows a blackmail attempt against a UK secondary school in which criminals used AI to turn pupil photos into child sexual abuse material.

Has a UK school actually been blackmailed using AI-generated images? Yes. The Internet Watch Foundation has confirmed that an unnamed UK secondary school was recently targeted. Criminals took photos of pupils from the school's website and social media. They manipulated the images into sexually explicit content using AI. They then demanded payment to prevent publication. The IWF classified 150 of the images as CSAM under UK law and used digital fingerprinting ("hashing") to help leading platforms block the images from being uploaded.

What should be in a school photo policy in 2026? A current photo policy should cover what images the school collects and for what purpose. Where images are stored and for how long. Who has access to them. How consent is given and how parents can withdraw it. How alumni images are handled when a pupil leaves. And the school's position on AI image abuse and how it will respond to incidents. The policy should be written in plain English, sent directly to current and former parents, and reviewed at least annually.

How should schools manage the volume of parent questions about photos? By naming a single route. One named person. One inbox. One form. And by putting a ticketing system behind it. This concentrates the workload with someone properly briefed. It ensures every query is logged and answered. And it prevents the same question being handled inconsistently by different staff members. Parents who feel ignored escalate. Parents who feel heard generally don't.

What is the safest way for schools to share photos with parents? Through a closed, parent-only platform. Access should be restricted to verified parents. Photos should sit on the platform rather than on individual teachers' devices. And access should lapse automatically when a child leaves the school. Public social media, school websites with open photo galleries, and generic messaging apps were not designed with this threat model in mind. They should not be the primary channel for sharing children's images.


Susan Burton is CEO and co-founder of Classlist, the community layer for school technology. Classlist is used by 500+ schools across 40+ countries to share photos, events and updates safely inside a verified parent community. 



Are you on Classlist yet?

Classlist's award-winning parent communications app is the safer alternative to public social networks. It's easy to set up! Be amongst more than 400,000 parents using Classlist in 40+ countries. Get started today!

Find your school