Roblox to ban young children from messaging others
Roblox has announced it will block under-13s from messaging others on the online gaming platform as part of new efforts to safeguard children.
Child users will not be able to send direct messages by default unless a verified parent or guardian gives them permission.
Parents will also be able to view and manage their child’s account, including seeing their list of online friends, and setting daily limits on their play time.
Roblox is the most popular gaming platform for eight to 12 year olds in the UK, according to Ofcom research, but it has been urged to make its experiences safer for children.
The company said it would start rolling out the changes from Monday, and they will be fully in place by the end of March 2025.
It means young children will still be able to access public conversations seen by everyone in games – so they can still talk to their friends – but cannot have private conversations without parental consent.
Matt Kaufman, Roblox’s chief safety officer, said the game is played by 88 million people each day, and more than 10% of the company’s employees – equating to thousands of people – work on the platform’s safety features.
“As our platform has grown in scale, we have always recognised that our approach to safety must evolve with it,” he said.
As well as banning children from sending direct messages (DMs) across the platform, it will give parents more ways to easily view and manage their child’s activity.
Parents and guardians must verify their identity and age with a form of government-issued ID or a credit card in order to access parental permissions for their child, via their own linked account.
But he acknowledged identity verification is a challenge faced by many tech companies, and called on parents to make sure a child has the correct age on their account.
“Our goal is to keep all users safe, no matter what age they are,” he said.
“We encourage parents to work with their kids to create accounts and hopefully ensure that their kids are using their correct age when they sign up.”
Maturity guidelines
Roblox also announced it planned to simplify descriptions for content on the platform.
It is replacing age recommendations for certain games and experiences with “content labels” that simply outline the nature of the game.
It said this meant parents could make decisions based on the maturity of their child, rather than their age.
These range from “minimal”, potentially including occasional mild violence or fear, to “restricted” – potentially containing more mature content such as strong violence, strong language or lots of realistic blood.
By default, Roblox users under the age of nine will only be able to access “minimal” or “mild” experiences – but parents can allow them to play “moderate” games by giving consent.
But users cannot access “restricted” games until they are at least 17 years old and have used the platform’s tools to verify their age.
It follows an announcement in November that Roblox would be barring under-13s from “social hangouts”, where players can communicate with each other using text or voice messages, from Monday.
It also told developers that from 3 December, Roblox game creators would need to specify whether their games are suitable for children, and block games for under-13s that do not provide this information.
The changes come as platforms accessed and used by children in the UK prepare to meet new rules around illegal and harmful material on their platforms under the Online Safety Act.
Ofcom, the UK watchdog enforcing the law, has warned that companies will face punishments if they fail to keep children safe on their platforms.
It will publish its codes of practice for companies to abide by in December.