
Too controversial! Apple announces delay in launching CSAM child protection features

Sina Tech ·  Sep 3, 2021 21:42

Sina Tech News, evening of September 3, Beijing time. Apple announced several new child safety features last month, some of which proved controversial, including scanning iCloud Photos for CSAM content. Apple said today that it will take more time to improve these features before releasing them to the public.

"Last month, we announced functional plans to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse materials," Apple Inc said in a statement. Based on feedback from customers, advocacy groups, researchers and others, we decided to spend more time collecting opinions and making improvements in the coming months before releasing these critical child safety features. "

Under the original plan, Apple's new child safety features were to be released later this year as part of the iOS 15, iPadOS 15, and macOS Monterey updates. It is now unclear when these features will launch, and Apple's statement today gave no details on how they will be improved.

Last month, after announcing the new CSAM detection technology, Apple faced strong resistance and criticism from privacy advocates. Even some of Apple's own employees worried that governments could force Apple to use the technology for censorship by searching for content other than CSAM. Other employees worried that Apple was damaging its industry-leading reputation for privacy.

However, Apple remains optimistic about the feature, saying that, if implemented, it would protect privacy better than the technologies used by other companies such as Alphabet (Google) and Facebook. Apple also confirmed that it has in fact been scanning iCloud Mail for child sexual abuse material (CSAM) since 2019, but has not scanned iCloud Photos or iCloud backups.


