Anyone watching the Netflix series Adolescence will likely feel a deep sense of dismay about the dangers young people face in an increasingly unregulated digital world.
The series depicts a warped model of masculinity - often referred to as toxic masculinity - that appears disturbingly pervasive. One episode highlights how emojis serve as a coded form of bullying, instantly understood by young people but often invisible to adults.
The threats children face are no longer confined to the physical world; they have become virtual, manifesting through the addictive lure of digital screens behind closed bedroom doors.
This aligns with recent findings from a Teacher Tapp survey, commissioned by the Association of School and College Leaders, which asked teachers and school leaders in England what social media-related issues they had noticed since the start of the academic year.
The results were stark. Nearly three-quarters of secondary school teachers reported that students had been bullied by peers on social media.
Almost a third noticed signs that students had accessed pornographic or violent content. Shockingly, nearly one in five noticed signs of students engaging with extremist material.
‘This chaos must end’
ASCL president Manny Botwe put it bluntly at our annual conference: “This chaos must end. For too long, tech billionaires have been given immense power without accountability. They hide behind the defence that they are champions of free speech while profiting from platforms that allow harm to fester.”
If anything, his comments feel even more urgent in light of Adolescence and the broader realities of the digital world.
How did we let this happen? The warning signs have been there for years. No previous generation has been subjected to such powerful, profit-driven technologies - ones which prioritise algorithms, engagement metrics and monetisation above all else.
And yet, we have failed to act decisively.
We need action, not minor policy tweaks
The legislative process has been painfully slow. The long-delayed Online Safety Act, which aims to enhance child protection, took years to pass. Even now, its provisions are only beginning to take effect, and it remains unclear whether they will be robust enough to make a real difference.
Meanwhile, it was recently reported that the government is expected to update its relationships, sex and health education guidance with additional content that “will enable schools to tackle harmful behaviour and ensure that misogyny is stamped out”.
But is new wording really the answer? The existing guidance already urges schools to address these issues. What we need is meaningful action, not just minor policy tweaks.
We should be considering far bolder interventions.
A social media ban?
In February, MPs debated an e-petition signed by over 130,000 people calling for social media to be banned for children under 16. The government has dismissed such a ban, but should it reconsider?
At the very least, we must enforce the existing minimum age requirements, which social media platforms generally set at 13.
ASCL’s survey found that nearly three-quarters of teachers - across both primary and secondary schools - reported pupils using social media below the required age.
The Online Safety Act gives Ofcom the power to fine companies up to £18 million or 10 per cent of their qualifying worldwide revenue (whichever is greater) for failing to enforce these rules. Perhaps issuing a few such fines would sharpen minds.
Furthermore, a major issue in policing online platforms is that the responsibility lies in the wrong place. Currently, it is generally up to individuals to complain about harmful content, leaving tech companies to decide whether it breaches their house rules.
Tech companies must take responsibility
The burden must shift. Tech companies, which profit enormously from these platforms, should be required to prevent damaging content from appearing in the first place. Governments must ensure that compliance is not optional.
Some argue that regulating global tech giants is difficult, given that many are based outside the UK. But as a society, we must insist on the standards we expect and find ways to enforce them.
Other industries - finance, pharmaceuticals, food safety - are held to strict regulatory standards. Why should the tech sector be any different when the risks to children are so severe?
A moral duty
Children and young people need us to step up. They need us to act with confidence and determination, to demand better protections and to hold tech firms accountable.
We have a moral duty to act far more decisively than we have so far managed.
The time for half-measures and platitudes is over.
If we fail to act, we will look back with regret, wondering why we allowed a generation of children to be raised in a digital world where harm was not just predictable but inevitable.
Pepe Di’Iasio is general secretary of the Association of School and College Leaders