Children’s online safety is a growing concern for parents, governments, regulators, and young people themselves, many of whom report negative experiences on social media platforms. In the US, President Biden has called for stricter regulation of social media companies, while the UK has announced new draft rules under the Online Safety Act that will hold tech companies accountable for keeping children safe online. The rules include enforcing age verification and improving moderation to filter out harmful content such as pornography and material referencing self-harm, suicide, and eating disorders. Companies that fail to comply could face fines, and in serious cases their executives could face jail time.

Social media platforms have been criticized for contributing to mental health problems among children, with tragic consequences such as the death of British teenager Molly Russell, who died by suicide after viewing harmful content on Instagram and Pinterest. Tech companies have stepped up moderation and introduced new features to protect children, but children are still exposed to harmful content and remain at risk of exploitation. Feedback from children has shaped the new UK rules, which aim to give young users more control over their online experience and to improve reporting channels for safety concerns.

Ofcom, the UK regulator, consulted more than 15,000 children to gather input for the new child safety code, which is expected to be published within a year. Tech companies will then have three months to assess the risks they pose to children and publicly report how they plan to mitigate them. Continued consultation with young people will help identify emerging threats, such as those posed by generative AI. Ofcom emphasizes the need for effective reporting channels and plans to work with tech firms to address any feedback or concerns raised by children about their online safety.

The severity of the sanctions in the UK’s Online Safety Act is intended to push social media platforms to treat children’s safety as a genuine priority. The rules require tech companies to take proactive measures to protect young users from harmful content and exploitation, and by building children’s own feedback into the regulations, Ofcom hopes the resulting protections will reflect their actual experiences online. The regulator will continue to monitor the effectiveness of the measures tech companies introduce and to address any inadequacies or emerging threats to children’s online safety.
