Popular online gaming platform Roblox is preparing to introduce new rules to protect children and young users. Criticism of the platform's safety measures has long been on the agenda, and Roblox management has now responded with new policies aimed primarily at users under the age of 13. The company will roll out two separate restrictions intended to make users' experiences safer. It is worth noting that Roblox remains blocked in Turkey.
First, starting December 3, Roblox users under the age of 13 will no longer be able to search for, discover, or play unrated content on the platform. Even with a direct link, these users will see only general information on the content page. Accordingly, Roblox emphasizes that content creators must keep the visual and textual elements of experience pages, such as the title, description, and thumbnail, appropriate for the applicable age rating. These elements must meet the standards set for content rated "suitable for all ages" or "suitable for ages 9 and above".
A second rule, taking effect November 18, will also close "Social Hangout" and "Free-form User Creation" experiences to users under the age of 13. These are areas of the platform where users can interact with one another and have the freedom to create content. With this regulation, the company aims to keep young users away from potentially risky or unsupervised interactions.
Roblox, whose structure appeals to younger audiences, has faced various safety criticisms in the past. In the USA in particular, legal action has been taken in recent years against individuals accused of abusing or kidnapping children they contacted through the platform, incidents that have called the company's safety measures into question. These developments heightened concerns, especially among parents, and pushed the company to accelerate its efforts to protect its young users. The latest regulations are among the new measures taken within this framework.
Roblox has launched an artificial intelligence-supported moderation system
Roblox management has also deployed an artificial intelligence-supported moderation system to keep young users safe. The system checks user-generated content for age appropriateness and aims to block potential dangers automatically. Whether these changes and new safety measures will prove fully effective remains to be seen. Security experts note that protecting child users on a platform as large and fast-growing as Roblox is a complex and challenging process.
Roblox's new policies place responsibility not only on content creators but also on parents, who need to be informed and attentive about the game content their children encounter. The parental control tools Roblox offers, for example, stand out as an important aid in ensuring that children access age-appropriate content. In addition, parents monitoring what kind of content their children interact with on the platform plays a critical role in improving the safety of young users.