Apple openly defends its CSAM monitoring
The company took serious criticism after it announced it would begin scanning photos uploaded to iCloud for child abuse imagery. Now, it's going on the offensive.
Last week, Apple announced a major initiative to combat the spread of child abuse imagery. Beginning with the company’s next round of major software updates, Apple says everything from your iPhone to your Apple Watch will automatically scan your photos once they are uploaded to iCloud to see if they match any stored hashes in a database of CSAM (or child sexual abuse material).