British police are reportedly investigating sexual abuse of a child’s avatar in the Metaverse, prompting the NSPCC to warn that tech companies must do more to protect young users.
Online abuse is linked to physical abuse in the real world and can have a devastating impact on victims, the association’s campaigners said.
The comments were made in response to a report by Mail Online that police are investigating a case in which a young girl’s digital persona was sexually assaulted by a gang of adult men in an immersive video game.
It is believed to be the first virtual reality sexual offense investigation carried out by a UK police force.
The report states that the victim, a girl under the age of 16, was traumatized by the experience, during which she was wearing a virtual reality headset.
The metaverse is a 3D model of the internet in which users exist and interact as avatars – digital versions of themselves that they create and control.
Around 21% of children aged five to ten owned their own virtual reality (VR) headset in 2022 – and 6% used VR regularly, according to the latest figures released by the Institution of Engineering and Technology.
Richard Collard, associate head of child safety online policy at the NSPCC, said: "Online sexual abuse has a devastating impact on children – and in immersive environments, where the senses are heightened, the harm can feel very similar to that of the 'real world'."
He added that tech companies are rolling out their products at a rapid pace without prioritizing the safety of children on their platforms.
“Companies must act now and step up their efforts to protect children from abuse in virtual reality spaces,” Collard said.
"It is crucial that tech companies can see and understand the harm taking place on their services, and that law enforcement has access to all the evidence and resources necessary to protect children."
In a report published in September, the NSPCC urged the government to provide guidance and funding to officers dealing with VR crimes.
The charity also called for the Online Safety Act to be regularly reviewed to ensure emerging harms are covered by the law.
Ian Critchley, child protection and abuse lead for the National Police Chiefs’ Council, said the grooming tactics used by offenders are constantly evolving.
He added: “That’s why our collective fight against predators, as in this case, is essential to ensure young people are protected online and can use technology safely, without threat or fear.
“The passage of the Online Safety Act is instrumental in this, and we need tech companies to do much more to make their platforms safer.”
The law, passed by Parliament last year, will give regulators the power to sanction social media companies over content posted on their platforms, but its provisions are not yet in force.
Ofcom, the communications regulator, is still developing its guidelines on how the rules will work in practice.
A spokesperson for Meta, which owns Facebook and Instagram and operates a metaverse platform, said: “The kind of behaviour described has no place on our platform, which is why for all users we have an automatic protection called personal boundary, which keeps people you don’t know a few feet away from you.
“While we were not provided with any details about what happened prior to the publication of this story, we will look into it as soon as details become available.”