There are 1B+ iPhone users in the world.
Unfortunately, a small fraction of those users traffic in child pornography (AKA CSAM, or child sexual abuse material).
Last week, Apple made a big move to stop the spread of CSAM.
The tech giant announced that it will scan users' iCloud photo uploads for illicit photos, as reported by Daring Fireball's John Gruber. Here is a simplified breakdown of the steps:

1. Before a photo is uploaded to iCloud Photos, the iPhone computes a "NeuralHash" of it on-device (a fingerprint designed to survive resizing and re-compression).
2. That hash is compared against a database of known CSAM fingerprints compiled by the National Center for Missing & Exploited Children (NCMEC).
3. Each upload carries an encrypted "safety voucher" with the match result, which Apple cannot read until an account crosses a threshold of matches (reportedly ~30).
4. Past the threshold, Apple human-reviews the flagged images and, if CSAM is confirmed, disables the account and reports it to NCMEC.
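To make the threshold idea concrete, here is a minimal Python sketch of per-account fingerprint matching. It is a toy under loud assumptions: Apple's real system uses a proprietary perceptual hash (NeuralHash) plus cryptography (private set intersection, threshold secret sharing) so match results stay hidden until the threshold is crossed. Every name and value below is illustrative, not Apple's code.

```python
# Toy sketch of threshold-based fingerprint matching. NOT Apple's system:
# Apple uses a proprietary perceptual hash ("NeuralHash") plus private set
# intersection and threshold secret sharing, so the device never learns
# whether an image matched. All names and values here are hypothetical.
import hashlib

KNOWN_HASHES: set[str] = set()  # the real list comes from NCMEC and is not public
MATCH_THRESHOLD = 30            # Apple cited a threshold of roughly 30 matches

def toy_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real one maps visually similar
    images to the same value; SHA-256 (used here for simplicity) does not."""
    return hashlib.sha256(image_bytes).hexdigest()

def process_upload(match_count: int, image_bytes: bytes) -> tuple[int, bool]:
    """Update an account's running match count for one uploaded photo and
    report whether the account now crosses the human-review threshold."""
    if toy_fingerprint(image_bytes) in KNOWN_HASHES:
        match_count += 1
    return match_count, match_count >= MATCH_THRESHOLD
```

The key design choice: no single match triggers review; only the aggregate count does, which is Apple's hedge against false positives.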
While combating CSAM is the noblest of goals, not everyone agrees with Apple’s unilateral approach.
Will Cathcart — the head of Facebook’s WhatsApp, which reported 400k+ CSAM cases last year — says Apple built a “surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.”
For example: China could ask Apple to flag all users who have photos of Tiananmen Square.
That criticism cuts deep for Apple, which has built its brand on privacy; Tim Cook has even said privacy is a "human right."
Can it maintain this promise with the new CSAM initiative? To further clarify its position, Apple put out a 6-page FAQ on Monday (in it, Apple says it will “refuse government demands” to track non-CSAM images).
Given the seriousness of this matter, from all angles, expect many more updates.