According to AppleInsider, after receiving a great deal of criticism, Apple recently announced that it will no longer launch its CSAM child-safety feature as planned.
Apple had previously planned to introduce a new feature that would scan photos saved on users' phones, as well as images sent through iMessage and uploaded to iCloud, in order to identify child sexual abuse material (CSAM) and combat its spread. The feature was originally slated to ship with iOS 15 and roll out first in the US. Following the announcement, it drew criticism from numerous nonprofit organizations, customers, and security researchers. Apple said it will continue to gather feedback and is delaying the feature's launch.