According to credible reports, Apple plans to start installing software on iPhones in the United States that will automatically scan local photos for child abuse imagery. Apple will reportedly limit the scanning to photos uploaded to iCloud, at least initially.
The reports of Apple’s plans originated from the Financial Times and Johns Hopkins University professor Matthew Green, both generally reliable sources. Apple reportedly demonstrated the system to some US academics earlier this week. Still, until Apple confirms the plans itself, there’s a chance they won’t materialize.
Green, for his part, was blunt about the news: “I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.”