How Roblox is being used by groomers

Image caption: James was eight when he was groomed on Roblox

By Emma Hallett

BBC News, West of England

Concern about sexual content in children’s online games is growing. Ofcom, the regulator for online safety, has told tech firms to hide “toxic” content from children and published draft codes of practice. “James” shared his experience of being approached on Roblox and being asked for sexual images.

Eight-year-old James is sitting in his bedroom playing on Roblox – a game aimed at children that gives them access to “the ultimate virtual universe”.

A message pops up: “Do you want to be friends?”

Just a couple of weeks later, this person will ask James for intimate, sexual photos of himself.

“I was young and I thought these people were my friends,” James, not his real name, said.

“It started off very innocent… then it was like, ‘let’s take this conversation to Kik [a mobile chat app]’.

“They asked for sexual pictures, they wanted to see my naked body, that was the words they used, and they tried to portray it as being OK and normal.”

Image caption: James said the conversation started innocently, but became more explicit when it moved to a chat app

For those unfamiliar with the platform, Roblox allows people to play and create a variety of games in a 3D world.

There are no age restrictions, allowing adults and children to play and communicate together.

Police forces across the country count it as one of a number of children’s gaming sites where grooming is a growing problem.

If you have been affected by issues in this story you can find help and support at the BBC Action Line.

James, who is from Gloucestershire and now in his 20s, is one of thousands of children who have been asked for sexual photos online by someone pretending to be their own age.

“At a young age, all you want is friends, so I spoke to that person,” James said.

“They asked questions like how old am I, where am I from, my name, which I answered because at a young age you don’t think anything of that information, you don’t think people will use that maliciously against you.

“It was a kids’ game, I didn’t think any adults would play it.”

Image source: Getty Images
Image caption: Roblox allows users to program and play games created by themselves or other users

Asking for intimate or sexual photos of a child is an offence under the Sexual Offences Act 2003 – even if the photos are not sent.

James said he knew this was wrong even aged eight and quickly blocked the person he now believes to have been an adult man.

But that was not the end of it.

“They were persistent in trying to communicate with me back on Roblox,” James said.

“They would come on the account I originally met them on… they would then harass me by following me into the games I was playing and continuously talking to me to the point I couldn’t play for a while.

“I felt vulnerable because I was constantly being followed by this person who was just horrible. It was terrifying.”

Feeling embarrassed about what happened, James did not tell anyone.

“I wanted to lock it away,” he said.

“You think ‘how did I let that happen’, ‘why did I do that’, but I do now realise that none of it was in my control.

“You get told as a kid don’t talk to strangers, but when you’re online you think everyone is your friend.

“They knew what they were doing, they knew how to work around it.”

Despite what happened, James did not feel turning away from Roblox was an option.

It was where his real-life friends were and he did not want to miss out. Although he felt pressured and uncomfortable, he kept on using the game until he was about 13.

In its purest form, Roblox has a lot to offer. There are more than 4.4 million “active immersive experiences”, from gaming to social hangouts, to sports, education, and entertainment.

Roblox said: “The wellbeing of the Roblox community is our first priority, and we are deeply saddened to hear about this case.

“We are continually investing in and evolving the safety tools, teams and policies we dedicate to catching and preventing attempts at malicious or harmful activity on our platform to protect members of the community.”

What can parents do?

Image caption: Megan Haldane has been helping students and parents stay safe online

Being online is part of everyday life for the majority of children.

Data from Ofcom shows nearly all children (99%) spend time online, and that nine in 10 own a mobile phone by the time they reach 11.

Ofcom also found three in five secondary-school-aged children (11-18 years) have been contacted online in a way that potentially made them feel uncomfortable.

Some 30% have received an unwanted friend or follow request. Around one in six secondary students (16%) have either been sent naked or half-dressed photos, or been asked to share these themselves.

Avon and Somerset Police is one of many forces proactively reaching out to help.

It said there were 93 reported offences of sexual grooming in the past 12 months that carried a “cyber flag”.

Of those, four were also identified as involving child abuse, and a further 39 included reference to child sexual exploitation.

The force has been holding workshops at schools for children and parents in an effort to raise awareness of risks varying from online grooming to bullying and financial loss.

In the past year, more than 150 schools were visited, reaching more than 25,000 people.

The BBC was invited to Bailey’s Court Primary School in Bradley Stoke, near Bristol, where children were quizzed and taught about online risks while parents listened to the potential dangers and how best to mitigate them.

Megan Haldane, cyber protect officer for the force, said: “Over the last couple of years there has been a steady increase [in cases of online grooming] as children do rely more and more on technology and that is not going away, they will always rely on technology, it is part of everyday life.”

Image caption: Megan Haldane spoke to Year 5 and 6 pupils at Bailey’s Court Primary School in Bradley Stoke

Ms Haldane, who has been working with the cyber team for about two years, said as technology had evolved and changed, so had the way children are targeted.

“There are lots of different platforms out there and each one goes through a phase of being the most concerning at that given time,” she said.

“It tends to circle around, as with James with Roblox. That is where he was targeted and unfortunately he is not alone for having experienced that, a lot of stories I hear are in relation to that particular platform, but that doesn’t mean that every other platform is safe, unfortunately it is not.

“If it has a chat function or the ability to communicate with other individuals it is instantly a concern, because that is where the risk comes from.”

Ms Haldane believes the risk will “never go away” entirely, adding: “As we get a handle on one platform, and learn to control it, or it becomes better regulated, criminals will move on to another platform and start to utilise that one instead.”

That said, she hopes the workshops she leads will help children and parents navigate the online world. She emphasised the importance of reporting any concerns to either the police or charities such as Childline and Crimestoppers.

“An important message is that parents and children shouldn’t feel embarrassed or ashamed – there is only one person to blame and that is the offender,” she said.

“The one take-home tip I would give is to always keep that conversation going with your children and keep that door open.

“Knowledge is key, it gives you the power to spot potential crimes and protect you and your loved ones.”

Image source: Roblox
Image caption: Roblox said it continues to update its text chat filters, with stricter filters automatically applied to children under 13

On its website, Roblox says all parents, caregivers, and teachers need “a little support sometimes to help them guide and empower their kids and teens in a world that is so different from the one they grew up in”.

As such, it has produced a number of guides for parents.

A spokesperson for Roblox said: “We have never allowed user-to-user image sharing, and we have rigorous text chat filters that block inappropriate words and phrases.

“We block user behaviour that clearly indicates an attempt to bring someone off our platform, including the sharing of personal information; and we proactively scan for and investigate potential cases of grooming.”

Roblox also said it now offers a feature where parents can limit text chat to selected contacts or turn it off altogether.

However, part of the problem is when, as in James’s case, conversations are taken to a chat app.

For James, it was Kik, a mobile chat app founded in Canada.

For years it was known for features preserving users’ anonymity, and a BBC News investigation in 2018 suggested it had featured in 1,100 UK child sexual abuse cases police had investigated in the previous five years.

A year later, Kik said it was shutting down – but it is still available for download, with a version update three weeks ago announcing “Kik is here to stay”.

The BBC could not get hold of anyone at Kik Interactive for comment, but on its website, it said: “We understand that our Kik users need to feel safe and respected when they use our services and we take the safety of our users very seriously.

“Unfortunately, inappropriate behaviour is a risk with any kind of communication platform.

“We’ve designed Kik to help users manage the messages they see, and we provide tools to help.”

‘False identity’

And Kik is just one of multiple chat apps and sites that can be used.

In February, Ofcom said the sexual exploitation and abuse of children online was “a persistent and growing threat with devastating consequences for those affected”.

The NSPCC found UK police forces recorded more than 34,000 online grooming crimes against children across 150 different platforms between 2017 and 2023.

It said the risk was particularly high where offenders could abuse online with features like anonymity or where they could create a false identity to manipulate a child.

In October, the Online Safety Act became law. Under the Act, online services – such as social media apps, instant messaging platforms and gaming sites, as well as search engines and pornography sites – are required to take steps to better protect children online in a range of ways.

Image caption: Ofcom is now working on how to keep children safer online

Outlining more than 40 steps which services must take to keep children safer, Ofcom chief executive Dame Melanie Dawes said: “We want children to enjoy life online, but for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control.

“In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms.

“Our measures – which go way beyond current industry standards – will deliver a step-change in online safety for children in the UK.

“Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms to account. That’s a promise we make to children and parents.”

Roblox is one of a number of tech, gaming and social media companies which say they are working to make the online world safer for children.

A spokesperson for the company said: “We strongly believe the industry needs to work together to move the needle on safety, which is why we actively collaborate cross-industry to provide others with better tools for detecting grooming chat.”

But James, who even now struggles to trust and communicate with people online, has a warning for parents.

“You should be able to trust people, but unfortunately you just can’t.”
