Apple’s Photo-Scanning Plan Sparks Outcry From Policy Groups

More than 90 policy groups from the US and around the world signed an open letter urging Apple to drop its plan to have Apple devices scan photos for child sexual abuse material (CSAM).

“The undersigned organizations committed to civil rights, human rights, and digital rights around the world are writing to urge Apple to abandon the plans it announced on 5 August 2021 to build surveillance capabilities into iPhones, iPads, and other Apple products,” the letter to Apple CEO Tim Cook said. “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

The Center for Democracy and Technology (CDT) announced the letter, with CDT Security and Surveillance Project codirector Sharon Bradford Franklin saying, “We can expect governments will take advantage of the surveillance capability Apple is building into iPhones, iPads, and computers. They will demand that Apple scan for and block images of human rights abuses, political protests, and other content that should be protected as free expression, which forms the backbone of a free and democratic society.”

The open letter was signed by groups from six continents (Africa, Asia, Australia, Europe, North America, and South America). Some of the US-based signers are the American Civil Liberties Union, the Electronic Frontier Foundation, Fight for the Future, the LGBT Technology Partnership and Institute, New America’s Open Technology Institute, STOP (Surveillance Technology Oversight Project), and the Sex Workers Project of the Urban Justice Center. Signers also include groups from Argentina, Belgium, Brazil, Canada, Colombia, the Dominican Republic, Germany, Ghana, Guatemala, Honduras, Hong Kong, India, Japan, Kenya, Mexico, Nepal, the Netherlands, Nigeria, Pakistan, Panama, Paraguay, Peru, Senegal, Spain, Tanzania, and the UK.

Scanning of iCloud Photos and Messages

Apple announced two weeks ago that devices with iCloud Photos enabled will scan images before they are uploaded to iCloud. Because an iPhone with iCloud Photos turned on uploads each photo shortly after it is taken, the scanning would happen almost immediately after a photo is captured.

Apple said its technology “analyzes an image and converts it to a unique number specific to that image” and flags a photo when its hash is identical or nearly identical to the hash of any image in a database of known CSAM. An account can be reported to the National Center for Missing and Exploited Children (NCMEC) when about 30 CSAM photos are detected, a threshold Apple set to ensure that there is “less than a one in 1 trillion chance per year of incorrectly flagging a given account.” That threshold could be changed in the future to maintain the one-in-one-trillion false-positive rate.
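For a concrete picture of the matching step, the sketch below mimics the described flow in Swift: hash each photo before upload, count matches against a database of known hashes, and flag the account only once the threshold is crossed. It is an illustration under loose assumptions, not Apple’s implementation; the SHA-256 digest, the empty hash set, and the function names are placeholders for Apple’s perceptual NeuralHash and cryptographic matching, which the company has described only at a high level.

```swift
import Foundation
import CryptoKit

// Illustrative sketch, not Apple's code: the real system uses a perceptual
// "NeuralHash" that tolerates small image changes plus cryptographic blinding.
// Here a plain SHA-256 digest stands in for the image hash, the hash set is an
// empty placeholder, and exact set membership stands in for
// "identical or nearly identical".

let knownHashes: Set<String> = []   // placeholder for the known-CSAM hash database
let reportThreshold = 30            // approximate threshold described by Apple
var matchCount = 0

func hash(of imageData: Data) -> String {
    // Cryptographic digest used only as a stand-in for a perceptual hash.
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

func scanBeforeUpload(_ imageData: Data) -> Bool {
    if knownHashes.contains(hash(of: imageData)) {
        matchCount += 1
    }
    // Only after the account crosses the threshold would it be flagged for
    // human review and a possible report to NCMEC.
    return matchCount >= reportThreshold
}

let flagged = scanBeforeUpload(Data("example image bytes".utf8))
print(flagged ? "Account flagged for review" : "No report")
```

In Apple’s published description, the device does not learn the result of any individual comparison; matches are encoded in cryptographic “safety vouchers,” and Apple can decode them only after the threshold is exceeded, which the simple counter above does not capture.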

Apple is also adding a tool to the Messages application that will “analyze image attachments and determine if a photo is sexually explicit” without giving Apple access to the messages. The system is optional for parents and, when enabled, will “warn children and their parents when receiving or sending sexually explicit photos.”
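As a rough illustration of that opt-in flow, the hypothetical Swift sketch below gates an incoming attachment on a parental setting and an on-device classifier. The type names, the classifier stub, and the decision logic are invented for illustration and are not Apple’s Messages code.

```swift
import Foundation

// Hypothetical sketch of the opt-in flow described above; everything here is
// a stand-in, not Apple's implementation.

enum AttachmentAction {
    case show               // display the image normally
    case warnBeforeViewing  // blur the image and warn before viewing
}

struct ChildSafetySettings {
    var communicationSafetyEnabled: Bool   // the opt-in parental setting
}

// Stand-in for the on-device model that evaluates an attachment; a real
// classifier would run locally so the image never leaves the device.
func looksSexuallyExplicit(_ imageData: Data) -> Bool {
    false  // placeholder result
}

func handleIncomingAttachment(_ imageData: Data,
                              settings: ChildSafetySettings) -> AttachmentAction {
    guard settings.communicationSafetyEnabled,
          looksSexuallyExplicit(imageData) else {
        return .show
    }
    return .warnBeforeViewing
}

let action = handleIncomingAttachment(Data(),
                                      settings: ChildSafetySettings(communicationSafetyEnabled: true))
print(action)
```

The open letter’s objection, quoted below, is aimed at exactly this kind of classifier and the parental-notification step layered on top of it.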

Apple has said the new systems will roll out later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and will be available in the US only at first.

Both scanning systems concern the signers of the open letter. On the Messages scanning that parents can enable, the letter said:

Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery. Children’s rights to send and receive such information are protected in the UN Convention on the Rights of the Child. Moreover, the system Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that those individuals have a healthy relationship. This may not always be the case; an abusive adult may be the organizer of the account, and the consequences of parental notification could threaten the child’s safety and well-being. LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk. As a result of this change, iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit.

