It’s been a year since the Online Safety Act. Did it keep children safe?

An edited image showing a teenage girl looking at her phone, surrounded by emojis (BBC)

There’s an internet forum I bet you’ve never heard of. I don’t think even your loved ones know about this place. It has tens of thousands of members and millions of posts. This is a forum where anyone, including children, can go to discuss one topic – suicide.

But it’s not just a talking shop. There are detailed instructions on how to take your own life, a thread where members announce their own deaths, and even a “contact” section where members can find others to die with. The forum encourages, fosters and normalises suicide. Most of its users are young, sad and vulnerable.

This site provides an alarming insight into the state of online regulation and the limits of the Online Safety Act (OSA), which received Royal Assent on 26 October 2023 and is now one year old. Its aim was to make Britain the safest place in the world to be online.

Ofcom, the communications regulator, has acknowledged the dangers the site poses. Following one of our reports in November last year, it wrote to the site warning that, once the OSA is fully in force, the site would be breaking UK law by encouraging suicide and self-harm. It advised the forum’s administrators to take action or face consequences in future.

The forum is believed to be based in the US, but the location of the admins and servers is unknown.

In response, the admins posted a message on the forum saying they were blocking UK users. The “block” lasted only two days.

The forum is still active and accessible to young people across the UK. Indeed, my research shows that at least five British children have died after coming into contact with the site.


Claiming responsibility

Smaller websites based overseas, run by anonymous administrators, may remain beyond Ofcom’s reach even once the OSA is in full force.

But big tech may find it harder to ignore the new legal duties the OSA places on platforms available in the UK. Crucially, though, before Ofcom can enforce any of these duties, it must consult the public on codes of practice and guidance. We are still in that consultation stage – for now, the Act’s real power lies in the threat it holds over tech firms.

And while much of the law has yet to take effect, it could bring real change. This sprawling piece of legislation is designed to address everything from access to pornography and terrorist content to fake news and child safety.

The OSA has been many years in the making: its origins stretch back through five prime ministers and at least six digital ministers. Its roots lie in a period when the public and lawmakers began to realise that big tech’s social media platforms had gained enormous influence without being held accountable.

Then came the death of schoolgirl Molly Russell, which galvanised Parliament more than any other event. Molly’s story was profoundly moving. She could have been anyone’s daughter, sister, niece or friend.


Molly Russell took her own life after being bombarded with dark content online

Molly was just 14 when she ended her life in November 2017. Her father, Ian Russell, discovered after her death that she had been bombarded with dark, depressing content on Instagram and, to a lesser extent, Pinterest. The coroner at her inquest ruled that social media had contributed to her death in “more than a minimal way”.

She had been fed graphic, disturbing content – when some of it was played in open court during the inquest, people left the room.

Her family decided something had to change, and Mr Russell launched a campaign for legislation to rein in Silicon Valley.

Ian took his campaign to Parliament and to Silicon Valley. He spoke to tech insiders, including Sir Tim Berners-Lee, the inventor of the World Wide Web, and even raised the case with the Duke and Duchess of Cambridge.

Ian Russell with the Duke and Duchess of Cambridge

Last October, he got his wish. As he said recently, it was a bittersweet moment.

“Seven years after Molly’s death, I am convinced that effective regulation is the best way to protect children from the harms of social media and that the Online Safety Act is the only way to prevent so many families from suffering unimaginable pain,” he said.

The OSA is billed as the most comprehensive law of its kind in any country. It is backed by the threat of multi-million pound fines against platforms – and even criminal penalties against tech bosses themselves if they repeatedly refuse to comply.

It sounds tough. But the truth is, many campaigners say it is neither tough enough nor fast enough. The law is being introduced in three stages, each coming into force only after Ofcom has spent months consulting the government, campaigners and big tech leaders.

All of these rules apply to the behaviour of platforms – but when it comes to individuals and how they behave online, the law has already moved from talk to action. New criminal offences covering cyberflashing, spreading false information and encouraging self-harm came into force in January this year.

But while many of the law’s provisions have yet to be implemented, Silicon Valley seems to be paying attention.

The Silicon Valley shakeup

On 17 September, a press release landed in my inbox. It was from Meta, the company that owns Instagram, WhatsApp, Facebook and Messenger. That in itself is nothing remarkable – it happens all the time. But this one was different. It announced the biggest shakeup in Instagram’s short history: the creation of “teen accounts”.

In short, this means that all existing accounts owned by under-18s will be converted to new accounts with built-in restrictions, including enhanced parental controls for children under 16. Any child signing up from that week onwards automatically receives one of the new “secure” accounts.

The reality of the new teen accounts may not live up to the hyperbole of the press release, but Meta didn’t need to make the change. Was it the specter of OSA that forced their hand? Yes, but only partially.

It would be a mistake to think that all the positive changes to protect children online are down to the prospect of the OSA. Britain is just one player in a global move to limit the power of big tech. In February this year, the EU’s Digital Services Act came into full force, imposing transparency obligations on large firms and making them liable for illegal or harmful content.

In the US, federal legislation appears to have stalled, but lawsuits targeting the biggest social media platforms are under way. Families of children affected by harmful content, school boards and the attorneys general of 42 states are suing the platforms under consumer protection laws. These cases claim that social media is addictive by design and lacks adequate protections for children. They are seeking billions of dollars in damages.

Molly Russell’s story has had a particular effect here too. I have met some of the plaintiffs and their lawyers, and they all know her name. So do the veterans of Silicon Valley. Companies began tightening content moderation long before the OSA became law.

That said, Ian Russell believes more needs to be done.

“As firms continue to move fast and disrupt things, cowardly regulation can cost lives … We need to find a way to move faster and be bolder.”

He has a point. End-to-end encryption means law enforcement is blind when it comes to child exploitation material. If they cannot see the material, they cannot identify the suspects and victims. On some platforms, the spread of misinformation goes unchecked. Age verification is not yet enforced. And a major emerging problem is the misuse of AI, such as sextortion scams targeting young people.

The law is intended to be “technology neutral” and regulate the harmful effects of any new technology. But how it will deal with new AI products remains to be seen.

Dame Melanie Dawes, chief executive of Ofcom, rejects much of the criticism.

“From December, tech firms will be legally required to start taking action, meaning 2025 will be a pivotal year in creating a safer life online.”

“Our expectations are going to be high, and we will come down hard on those who fall short.”

So, are children safer online thanks to OSA?

In reality, large technology platforms are transnational and only a global approach can force significant change. OSA is only one piece of the global puzzle of laws and legal actions.

But pieces of that puzzle are still missing – and it is in those gaps that children remain at risk.

Lead Image: Getty

