The Online Safety Bill is close to becoming law.
It is expected to be one of the government's flagship pieces of legislation this term, but it has faced several delays due to controversy over its potential privacy implications.
Sky News understands it will clear its final parliamentary hurdle on Tuesday, passing through the House of Lords without further amendments before gaining royal assent.
Ahead of its long-awaited passage into law, here's what you need to know about the Online Safety Bill.
What does the Online Safety Bill aim to do?
The government never misses an opportunity to showcase the UK as a world leader, and said the bill would make the country “the safest place in the world to go online”.
It aims to achieve this by imposing rules on businesses like Meta, Apple and even Wikipedia, in an effort to keep inappropriate and potentially harmful content away from vulnerable users.
This includes material such as self-harm content, which a coroner ruled last year contributed to the death of teenager Molly Russell.
The bill also aims to hold platforms accountable for illegal content such as images of child sexual abuse, require adult websites to properly enforce age restrictions, and prevent underage children from creating accounts on social networks.
Perhaps most controversially, one of the proposals would require platforms like WhatsApp and Signal to undermine message encryption so that private chats can be checked for illegal content.
I’ve been reading about this for ages – why did it take so long?
As that last point suggests, this is a very broad bill.
Other illegal content it wants to crack down on includes the sale of drugs and weapons, the incitement or planning of terrorism, sexual exploitation, hate speech, scams and revenge porn.
Then there is potentially dangerous but legal material, such as eating disorder content and bullying.
There have been concerns within the Conservative Party that it simply goes too far, potentially to the point of threatening free speech online.
Those concerns weren't enough to deter the bill's former chief advocate, then culture secretary Nadine Dorries.
Indeed, the proposals became even stricter between the bill's first presentation in 2019 and its eventual parliamentary debut in 2022, with the addition of measures such as the criminalisation of cyber-flashing.
That three-year gap was blamed on the pandemic, and subsequent delays were exacerbated by the downfall of prime ministers Boris Johnson and then Liz Truss.
The bill now falls under the remit of technology secretary Michelle Donelan, who has made some changes to soften criticism while satisfying its supporters.
Who is for it?
Supporters of the bill include charities like the NSPCC, safety group Internet Watch Foundation (IWF), bereaved parents who say harmful online content contributed to their child’s death, and survivors of sexual abuse.
Before the bill enters its final stages in Parliament this week, a woman who suffered years of abuse on an encrypted messaging app was one of more than 100 people who signed a letter to big tech bosses highlighting the need for action.
A recent NSPCC campaign cited reports of a rise in online child grooming cases, which the charity says shows the legislation is “desperately needed”.
And the IWF published new figures the day before the bill is expected to pass through the House of Lords, warning of “unprecedented” numbers of children falling victim to online sexual extortion.
Molly Russell's father is one of several parents who have expressed support for the bill and welcomed an amendment tabled in committee that could allow coroners and bereaved families to access the phone data of children who have died.
Four in five British adults would also support making top executives of tech companies legally responsible for children harmed by what they see on their platforms.
Who opposes it?
In addition to Conservative MPs, the main opposition comes, unsurprisingly, from technology companies.
They have long expressed concerns about rules on legal but harmful content, arguing they would be made unfairly responsible for content on their platforms.
Ms Donelan recognised the problem and removed the requirement, but under the bill companies are still responsible for protecting children from harmful content such as material promoting suicide and eating disorders.
The update also saw material encouraging self-harm made illegal.
Much of tech companies' recent criticism has focused on message encryption, with major platforms like WhatsApp even threatening to leave the UK if they are forced to allow message scanning.
Encryption prevents messages from being seen by people outside the chat.
Supporters of the technology say any attempt by the government to allow a “backdoor” would compromise people’s privacy and potentially allow bad actors to break in as well.
Ministers have sought to downplay the chances of the measure actually being used, but it remains in the bill.
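For readers wondering what encryption means in practice here: in an end-to-end encrypted chat, only the participants hold the key that unlocks a message, so a platform (or anyone else) sitting in the middle sees only scrambled data. Below is a minimal, simplified sketch of that idea using a shared symmetric key, assuming Python and the third-party cryptography package; it is not the actual protocol WhatsApp or Signal use, and the messages and names are invented for the example.

```python
# Minimal sketch: why an encrypted chat cannot simply be "checked" in transit.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()            # shared secret known only to chat participants
sender = Fernet(key)
ciphertext = sender.encrypt(b"See you at 6pm")

# A platform or interceptor without the key sees only opaque bytes.
print(ciphertext)

# A recipient holding the same key can recover the message...
recipient = Fernet(key)
print(recipient.decrypt(ciphertext))   # b'See you at 6pm'

# ...but anyone holding a different key cannot.
outsider = Fernet(Fernet.generate_key())
try:
    outsider.decrypt(ciphertext)
except InvalidToken:
    print("Unreadable without the correct key")
```

Any scanning requirement therefore means either giving someone else access to the keys or checking messages on the device before they are encrypted, which is why critics describe it as a “backdoor”.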
How will the bill be implemented?
Enforcement of these measures will fall to media regulator Ofcom.
Businesses that breach the bill can be fined up to £18 million or 10% of their annual global turnover, whichever is greater (and in the case of a business like Meta, it’s comfortably the latter).
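To make that cap concrete, here is a minimal worked sketch in Python of the “whichever is greater” rule; the turnover figures are hypothetical examples, not real company accounts.

```python
# Minimal sketch of the bill's stated fine cap: the greater of £18m
# or 10% of annual global turnover. Figures below are hypothetical.
def max_fine(annual_global_turnover_gbp: float) -> float:
    """Return the maximum possible fine under the cap described above."""
    return max(18_000_000, 0.10 * annual_global_turnover_gbp)

print(f"£{max_fine(50_000_000):,.0f}")        # small firm: the £18,000,000 floor applies
print(f"£{max_fine(100_000_000_000):,.0f}")   # large platform: £10,000,000,000 (10% of turnover)
```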
Companies and senior executives could also be held criminally liable if they are found not to be doing enough to protect children.
In extreme cases, platforms may even be blocked from operating in the UK altogether.