Safeguarding, Online Harmful Sexual Behaviour and Image Sharing
In recent sessions with Designated Safeguarding Leads (DSLs), a clear theme has emerged: image sharing among pupils is increasing, and it’s changing shape. Alongside familiar concerns around nude image sharing, schools are now navigating deepfakes, AI-generated images, and “nudification” apps that digitally remove clothing from photos.
For safeguarding and pastoral leads, knowing the right terminology, and how to respond, is no longer optional. It’s essential.
Key Terms Every Safeguarding Team Should Know
Online Harmful Sexual Behaviour (OHSB)
Sexual behaviours involving digital technology that are harmful, abusive, or exploitative. This includes coercing someone to share images, creating sexualised content of peers, or distributing intimate images.
Youth Produced Sexual Imagery (YPSI) (often still referred to as “sexting”)
Images or videos created by young people under 18 that are sexual in nature. Even when “consensual,” these remain illegal to possess or share and must be handled through safeguarding procedures, not discipline alone.
Deepfakes
AI-generated or manipulated images or videos that place someone’s face onto another body or scenario, often sexual. These can be created without consent and are increasingly realistic; whatever their level of realism, they can be deeply humiliating for the victim.
Nudification Apps
AI tools that digitally remove clothing from images. These are being used to create fake nude images of pupils, often from innocent photos taken in school or on social media.
Image-Based Abuse
The creation, sharing, or threat to share intimate images without consent, including AI-generated content.
So-called ‘Revenge Porn’
Non-consensual sharing of intimate images or films.
Upskirting/Downblousing
Non-consensual taking of intimate images or films.
Cyberflashing
Sending unsolicited sexual images or films without consent.
Sextortion
Coercion and threats to release intimate images or films.
Understanding these terms helps staff recognise concerns earlier and respond consistently.
Why This Matters
For young people, the impact is real: shame, anxiety, isolation and often emotionally based school avoidance. Because AI-generated content can be created from any image, any pupil can become a victim, even if they’ve never shared a photo themselves.
Practical Guidance for Staff
Here’s how pastoral and safeguarding teams can support effectively:
1. Respond calmly and without judgement
Young people are more likely to disclose when adults stay neutral. Avoid blame-based language such as “Why did you send it?” and instead use “I’m glad you told me.” Use TED open prompts: Tell me, Explain to me, Describe to me.
2. Never ask to view or forward images
Follow your safeguarding policy. If devices need checking, involve the DSL immediately and use approved procedures.
3. Reassure pupils about support and next steps
Explain what will happen, who will be involved, and that their safety is the priority. Reporting is really hard, and it is OK to acknowledge this to pupils. You could say: “Reporting is really tough; this is what we will do to support you.”
4. Record concerns accurately
Use correct terminology and log incidents clearly. Detail whether content is real, AI-generated, shared, or threatened. If you are concerned, contact Social Services for advice.
5. Act quickly on takedown
Support pupils to report content to platforms and use tools such as Report Remove. Early action limits harm.
6. Teach proactively
Embed learning about consent, digital manipulation, AI risks, and respectful online behaviour into PSHE and assemblies. Pupils need to understand that creating fake sexual images is abuse. Ask open questions and discuss the topic with pupils.
7. Look beyond the image
Ask: Is there coercion? Peer pressure? Power imbalance? Emotional harm? Safeguarding is about context, not just content. If you are concerned, contact Social Services for advice, and contact the police if you are concerned that laws have been broken.
A Shared Responsibility
Safeguarding in the age of AI requires confidence with language, curiosity about new technologies, and a trauma-informed approach. When staff understand the terms and feel equipped to respond, pupils are more likely to seek help, and harm can be reduced earlier.
In our Resources section we have included CEOP’s guidance, “Financially motivated sexual extortion: guidance for further education professionals.”