Australia targets gaming platforms in child safety campaign

Australia’s online safety regulator on Wednesday ordered major online gaming platforms including Roblox, Minecraft, Fortnite and Steam to explain how they are protecting children from sexual predators and online radicalization.

The eSafety Commission said it had issued legally enforceable transparency notices seeking details on security systems, staffing and moderation practices. Companies that fail to comply may face penalties and civil action.

Most Australian youth play online games

eSafety Commissioner Julie Inman Grant said online games have become a social hub for young people, with nine in 10 Australians aged eight to 17 playing online games.

She warned that predators use gaming platforms to “make contact with children in an online game environment, then they lead the children to private messaging services.”

Inman Grant said that “predatory adults” also “target children through framing or embedding terrorist and violent extremist narratives in gameplay, increasing the risks of contact violations, radicalization, and other off-platform harms.”


The move comes as Australia steps up efforts to prevent harm to minors online after banning children under 16 from major social media platforms last year.

However, the online safety watchdog found that “a large proportion of Australian children” were still scrolling through banned platforms three months after the ban.

Roblox accused of failing to protect children

Roblox faces more than 140 US lawsuits alleging it failed to prevent the sexual exploitation of children.

On Tuesday, Roblox agreed to a settlement with the US states of Alabama and West Virginia worth more than $23 million. A week ago, the company announced customized accounts for younger users.

Edited by: Louis Olofse