iOS 18.2 has a child safety feature that can blur nude content and report it to Apple

In iOS 18.2, Apple is adding a new feature that resurrects some of the intent behind its halted CSAM scanning plans — this time, without breaking end-to-end encryption or providing government backdoors. Rolling out first in Australia, the company’s expansion of its Communication Safety feature uses on-device machine learning to detect and blur nude content, adding warnings and requiring confirmation before users can proceed. If the child is under 13, they can’t continue without entering the device’s Screen Time passcode.
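The article describes a simple gating flow: content is always blurred and a warning shown, any user must confirm to proceed, and users under 13 additionally need the device's Screen Time passcode. Apple has not published this logic or any API for it, so the sketch below is purely illustrative; every name in it (`handle_nude_content`, `ViewDecision`, the passcode value) is a hypothetical stand-in for the behavior the article reports.

```python
# Illustrative sketch only -- not Apple's implementation or API.
# Models the reported flow: detected nude content is blurred and a
# warning is shown; proceeding requires confirmation, and under-13
# users must also enter the device's Screen Time passcode.
from dataclasses import dataclass
from typing import Optional

SCREEN_TIME_PASSCODE = "1234"  # hypothetical stand-in for the device passcode


@dataclass
class ViewDecision:
    blurred: bool        # content is always blurred once flagged
    warning_shown: bool  # warning is always displayed
    can_proceed: bool    # whether the user may view the content


def handle_nude_content(age: int,
                        confirmed: bool,
                        passcode_entry: Optional[str] = None) -> ViewDecision:
    """Decide whether a user may proceed past flagged content."""
    if age < 13:
        # Under-13: confirmation alone is not enough; the Screen Time
        # passcode must also match.
        unlocked = confirmed and passcode_entry == SCREEN_TIME_PASSCODE
    else:
        # 13 and over: confirming the warning is sufficient.
        unlocked = confirmed
    return ViewDecision(blurred=True, warning_shown=True, can_proceed=unlocked)
```

For example, a 15-year-old who confirms the warning may proceed, while a 10-year-old who confirms but enters no passcode remains blocked.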

→ Continue reading at Engadget
