The White House released a statement today outlining commitments that several AI companies are making to curb the creation and distribution of image-based sexual abuse. The participating businesses have laid out the steps they are taking to prevent their platforms from being used to generate non-consensual intimate images (NCII) of adults and child sexual abuse material (CSAM).
Specifically, Adobe, Anthropic, Cohere, Common Crawl, Microsoft and OpenAI said they’ll be:
“responsibly sourcing their datasets and safeguarding them from image-based sexual abuse”
All of the aforementioned except Common Crawl also agreed they’d be:
“incorporating feedback loops and iterative stress-testing strategies in their development processes, to guard against AI models outputting image-based sexual abuse”
And “removing nude images from AI training datasets when appropriate and depending on the purpose of the model.”