Apple: Read what the company has to say about not scanning iCloud Photos

iPhone maker Apple announced its plans to roll out three new child safety features in August 2021. These features included a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri.
In December 2021, Apple launched the Communication Safety feature in the US with iOS 15.2. The feature was later expanded to other regions, including the UK, Canada, Australia, and New Zealand. The company also made the Siri resources available, but the CSAM detection feature was never rolled out.
According to a report by Wired, the Cupertino-based tech giant has shared a new statement explaining why the CSAM detection feature was never adopted. The response comes as child safety group Heat Initiative demands that Apple “detect, report, and remove” CSAM from iCloud and offer more tools for users to report such content to the company.
Read what Apple has to say about the CSAM detection feature
“Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote in the company’s response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.
“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit,” Neuenschwander wrote. “It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”
Why Apple changed its CSAM detection plans
Earlier, Apple said that the CSAM detection feature would be included in an update to iOS 15 and iPadOS 15 by the end of 2021. However, the rollout of this feature was later delayed based on “feedback from customers, advocacy groups, researchers, and others.”

The CSAM detection feature was also criticised by a long list of individuals and organisations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.
Apple facing an encryption issue in the UK
Along with the CSAM detection issue, Apple is also involved in an encryption debate with the UK government, which is planning to amend its surveillance legislation. The proposed law would require tech companies to disable security features like end-to-end encryption without telling the public. Apple has warned that it will pull services like FaceTime and iMessage out of the UK if the legislation is passed in its current form.
