Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its "Communication Safety" features, which the company initially announced in August 2021 and launched last December. Parents and guardians can turn the protections on through family iCloud accounts. The features work in Siri, Apple's Spotlight search, and Safari search to warn users if they are viewing or searching for child sexual abuse material and to provide resources on the spot for reporting the content and seeking help. Also at the heart of the protections is Communication Safety for Messages, which parents and caregivers can opt into to warn children and offer them resources if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched and to reduce the creation of new CSAM.
"After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021," the company told WIRED in a statement. "We have also decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."
Apple's CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding protection for backups and photos stored in the cloud service. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging. Research has consistently shown, though, that end-to-end encryption is a vital safety tool for protecting human rights and that the downsides of its implementation do not outweigh the benefits.
Communication Safety for Messages is opt-in and analyzes image attachments users send and receive on their devices to determine whether a photo contains nudity. The feature is designed so that Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple never even learns that a device has detected nudity.
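Apple has not published implementation details, but the general pattern the feature describes, classifying an image entirely on the device and surfacing only a local warning, can be sketched with public Apple frameworks. The following is a minimal, hypothetical Swift example using Vision with an assumed bundled Core ML model; the NudityClassifier model name, the "nudity" label, and the confidence threshold are placeholders for illustration, not Apple's actual pipeline.

```swift
import Vision
import CoreML

// Sketch of the on-device pattern described above: a hypothetical bundled
// classifier ("NudityClassifier", a placeholder) evaluates an image attachment
// locally, and only a boolean decision leaves this function. Nothing is
// uploaded, so the end-to-end encryption of the message itself is untouched.
func attachmentLikelyContainsNudity(at imageURL: URL) throws -> Bool {
    // Load the hypothetical on-device Core ML model and wrap it for Vision.
    let mlModel = try NudityClassifier(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: mlModel)

    var flagged = false
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        // Threshold is illustrative; a real feature would tune this carefully.
        flagged = (top.identifier == "nudity" && top.confidence > 0.8)
    }

    // Perform the classification entirely on-device; the completion handler
    // above runs synchronously inside perform(_:).
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])
    return flagged
}
```

In this kind of design, the caller would use the returned flag only to decide whether to blur the image and show a local warning and resources, which is consistent with Apple's claim that it never learns what a device has detected.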
The company told WIRED that while it isn't ready to announce a specific timeline for expanding its Communication Safety features, it is working to add the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications. The more the features can proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.
"Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications," the company said in its statement. "Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage."
Similar to other companies that have grappled publicly with how to address CSAM, including Meta, Apple told WIRED that it also plans to continue working with child safety experts to make it as easy as possible for its users to report exploitative content and situations to advocacy organizations and law enforcement.
Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for kids everywhere, and it's still unknown how much traction Apple's bet on proactive intervention will get. But tech giants are walking a fine line as they work to balance CSAM detection and user privacy.