Social media companies told to stop algorithms recommending harmful content to children

Social media platforms must do more to prevent their algorithms from recommending harmful content to children, Ofcom has said.

The regulator has published its draft children’s safety codes of practice, setting out the new standards it will expect tech giants to meet to protect children under the Online Safety Act.

But two mothers who believe their children died after copying dangerous challenges on social media say they feel “belittled” by Ofcom for its failure to listen to grieving parents.

Sky News spoke to the mothers of Archie Battersbee, who died aged 12 after a “prank or experiment” went badly wrong, and Isaac Kenevan, a 13-year-old who reportedly died after taking part in a social media chokehold challenge.

“They should listen to us as grieving parents,” said Lisa, Isaac’s mother.

“Ofcom have the power, they are the police, and we feel like we’ve been belittled. They’ve said some things but there’s just no action at the moment.”

Archie’s mother Hollie explained: “I’ve seen a handful of parents now go through what we’re going through and it’s heartbreaking… in a civilized society this shouldn’t happen.”

When the Government passed the Online Safety Act last October, it gave Ofcom new enforcement powers.

Hollie and Lisa campaigned tirelessly to get the bill passed and both are frustrated by the painfully slow process.

Ms Kenevan said: “This law has been put in place but nothing has really changed, which is frustrating for us, it’s almost like an insult to us because we’ve worked so hard.

“It’s too late, our guys are gone… but Ofcom should really step in and hold their feet to the fire… step in quickly to stop the content from being there in the first place.”

Tame the algorithms

Ofcom’s draft codes of practice include robust age checks, improved complaints procedures and a commitment by social media platforms to take steps to rein in algorithms that recommend harmful content to children.

If they fail to comply, they could in theory be fined up to 10% of their global turnover.

Ofcom’s chief executive, Dame Melanie Dawes, told Sky News: “In less than a year we will be able to enforce these codes, and what I am saying today to the technology industry is not to wait for that moment.

“Over the next few years we will see this change and we will push it forward with every possible tool we have.”

“A step change for the industry”

Ofcom denies excluding people from its consultations, insisting that victim groups and bereaved families are among the 15,000 children and 7,000 parents it has already spoken to.

Dame Melanie insisted: “To those families who have lost children because of what happened to them online, we ask them to continue to work with us.

“What we are proposing today is a step change for the industry. Please work with us and talk to us, so we can make this happen.”

Fears for children still at risk

For Archie and Isaac’s parents, the fear of how many children are still at risk is ever-present.

Ms Kenevan said: “While these laws are trying to be put in place, unfortunately more and more children are dying, and that’s the most frustrating thing, because we’re in a club that we don’t want to belong to, and we don’t want anyone else to join this club.”

To avoid every parent’s worst nightmare, change can’t come soon enough.
