Roblox, Discord, OpenAI and Google found new child safety group

Roblox, Discord, OpenAI and Google are launching a nonprofit organization called ROOST, or Robust Open Online Safety Tools, which hopes “to build scalable, interoperable safety infrastructure suited for the AI era.”

The organization plans to provide free, open-source safety tools that public and private organizations can use on their own platforms, with an initial focus on child safety. The press release announcing ROOST specifically calls out plans to offer “tools to detect, review, and report child sexual abuse material (CSAM).” Partner companies are providing both the funding and the technical expertise to build these tools.

The operating

→ Continue reading at Engadget