Social media services are "failing girls at every stage" and allowing grooming, abuse, and harassment, children's charity the NSPCC has warned. Investigators created fake profiles of a teenage girl and found they were swiftly identified by adult predators who sent unsolicited messages to the accounts.
The charity also says social media, messaging applications and gaming services are designed in a way that encourages girls to increase their online activity. It is now urging watchdog Ofcom to tighten up a code of conduct which came into force in March as a result of the Online Safety Act.
Polling by the charity also shows that a strong majority of adults, 86%, believe more should be done to protect girls from harm on social media.
Parents of girls aged 4 to 17 highlighted contact from strangers, online grooming, bullying from other children and sexual abuse or harassment as their top concerns.
The NSPCC has heard from young girls about their negative experiences online through its confidential helpline, Childline.
One 15-year-old who contacted Childline said: "I've been sent lots of inappropriate images online recently, like pictures of naked people that I don't want to see. At first I thought they were coming from just one person, so I blocked them. But then I realised the stuff was coming from loads of random people I don't know. I'm going to try disable ways people can add me, so hopefully I'll stop getting this stuff."
Rani Govender, NSPCC Policy Manager for Child Safety Online, said: "Parents are absolutely right to be concerned about the risks their daughters are being exposed to online, with this research making it crystal clear that tech companies are not doing nearly enough to create age-appropriate experiences for girls.
"We know both on and offline girls face disproportionate risks of harassment, sexual abuse, and exploitation. That's why it's so worrying that these platforms are fundamentally unsafe by design - employing features and dark patterns that are putting girls in potentially dangerous situations.
"There needs to be a complete overhaul of how these platforms are built. This requires tech companies and Ofcom to step up and address how poor design can lead to unsafe spaces for girls.
"At the same time Government must layout in their upcoming Violence against Women and Girls (VAWG) Strategy steps to help prevent child sexual offences and tackle the design failures of social media companies that put girls in harm's way."
The charity's proposals include allowing users to take a screenshot when they report abuse, and increasing restrictions to stop adults from being able to video call young users.
An Ofcom spokesperson said: "No girl should have to face abuse or sexual harassment simply for being online. Yet this is the reality for too many girls growing up in the UK - and it's time for tech companies to act.
"Under the , platforms are legally required to protect all users from criminal activity. This includes violent threats, harassment, stalking, and coercive or controlling behaviour targeting girls. Children must also be protected against harmful content promoting violence, abuse and hate. Companies that fail to meet these new requirements can expect to face enforcement action.
"But we also know that many online harms disproportionately affect women and girls. So there is a moral imperative for tech firms to take action, and we've proposed practical guidance on what more they can do to ensure women and girls can live safer lives online."