The UK’s Children’s Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. It states that such “nudification” apps have become so prevalent that many girls have stopped posting photos on social media. And though creating or sharing CSAM is illegal, the apps used to create deepfake nude images remain legal.
“Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone — a stranger, a classmate, or even a friend — could use a
→ Continue reading at Engadget