Roblox deploys AI system to shut down servers violating the game’s policies: All details


Roblox has introduced a real-time AI moderation system. The company claims that this new system scans entire in-game scenes simultaneously and has been shutting down around 5,000 servers per day that violate Roblox’s Community Standards since its deployment.

Unlike conventional moderation tools that check one object or piece of text at a time in isolation, the new system evaluates the entire scene from the user's perspective at a given moment, including avatars, 3D objects, and text. Once the system recognises a pattern of violating behaviour in one instance of a game, it closes that specific server rather than the entire game, so users on other servers can continue playing without interruption.

This fills a gap in the way user-generated content platforms police changing gameplay. Users can combine approved elements such as avatars, clothing, and movements in ways that are acceptable individually but not collectively. In games with drawing capabilities, for example, a user could draw offensive images out of otherwise acceptable elements. The system aims to detect these combinations in real time, often before other users encounter them.

Roblox says it is working to scale the system to cover 100% of playtime and is developing tools to identify and remove specific bad actors without disrupting the broader player base.

Roblox gets new dashboard and industry training programme

Alongside the moderation system, Roblox has added a chart to its Creator Dashboard that shows developers how many of their game's servers have been shut down for abusive user behaviour on a given day. The tool is intended to help creators identify spikes in problematic activity and assess whether changes to in-game features, such as custom emotes or avatar editing tools, are needed.

Roblox is also participating in a certification programme for digital community managers, developed in partnership with Keywords Studios, Riot Games, and research psychologist Rachel Kowert, who serves as Research Director at Games for Change. The programme aims to address the absence of standardised training for online moderators and community managers in the gaming industry. Kowert said the initiative seeks to "translate research on gaming communities and online behaviour into practical tools that digital leaders can use to build more resilient and sustainable online communities."