Apple's new CSAM detection update sparks privacy controversy

Since the early 2000s, Apple has been a leading manufacturer in the technology world thanks to its wide range of features and constant updates. The company's newest update seeks to combat Child Sexual Abuse Material (CSAM) and has sparked great controversy across the country.

The update was inspired by a report which found that over 45 million photos and videos containing CSAM were being spread among users. To end this proliferation, Apple CEO Tim Cook states that he plans to “incorporate algorithmic scanners of user’s devices and messages and take legal action from there”.

Government officials and tech companies have called this practice “unethical” and a violation of privacy. According to reporter Lucas Ropek, they fear that these features “could one day be repurposed to search for different kinds of material other than CSAM. Such a shift could open the door to new forms of widespread surveillance and serve as a potential blueprint for encrypted communications—one of privacy’s last, best, and strongest hopes”.

The feature itself will consist of a tool that scans photos uploaded to iCloud from an Apple device to check for CSAM. More specifically, Cook explains it is a “neural functioning machine that assesses whether images or videos on a user’s iPhone match known digital fingerprints or hashes of CSAM”. It compares each uploaded image to a database of CSAM hashes maintained by the National Center for Missing and Exploited Children (NCMEC). If enough matches are discovered, Apple can file a report with the NCMEC about the material and, in turn, with the FBI.
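As a rough illustration of this kind of matching, the sketch below shows threshold-based hash comparison in Python. The hash function, database contents, and threshold are placeholders; Apple's real system relies on a perceptual hash and on-device matching that is far more involved than this.

```python
import hashlib

# Simplified, hypothetical sketch of threshold-based hash matching.
# An ordinary SHA-256 digest stands in for Apple's perceptual hash, and the
# database entries and threshold are placeholder values, not real figures.

KNOWN_CSAM_HASHES = {
    "placeholder-digest-1",
    "placeholder-digest-2",
}

REPORT_THRESHOLD = 30  # matches required before a report is generated


def image_hash(image_bytes: bytes) -> str:
    """Return a digest for an image (a stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploaded_images: list[bytes]) -> int:
    """Count uploaded images whose hash appears in the known database."""
    return sum(1 for img in uploaded_images if image_hash(img) in KNOWN_CSAM_HASHES)


def should_report(uploaded_images: list[bytes]) -> bool:
    """A report is filed only once the match count crosses the threshold."""
    return count_matches(uploaded_images) >= REPORT_THRESHOLD
```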

The company stresses that the feature does not “learn anything about images that do not match [those in] the known CSAM database”. This means that, tentatively, the feature is not designed to scan through people’s personal photos or photo albums. Many still fear, however, that the technology is unreliable and that a glitch could cause these photos and other personal information to be divulged over the Internet.

In addition, the update will include an iMessage feature which “warns children and parents whether they or their child are receiving or sending CSAM”. Users receive a notification alerting them that the algorithm has deemed the content “sexually explicit”. It also encourages them not to look at the image, which remains blurred until the user consents to viewing it. If the child is under 13, a notification will always be sent to the parent’s Apple device.
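Taken at face value, the flow described here amounts to a few conditional checks. The sketch below uses made-up names in Python rather than Apple's own frameworks, and mirrors only what the article describes.

```python
from dataclasses import dataclass

# Hypothetical sketch of the iMessage warning flow described above.
# The field names and the age cutoff follow the article, not Apple's real API.

PARENT_NOTIFICATION_AGE_CUTOFF = 13


@dataclass
class IncomingImage:
    flagged_explicit: bool  # whether the on-device classifier deemed it explicit


def handle_incoming_image(image: IncomingImage, child_age: int) -> dict:
    """Decide whether to blur the image, warn the child, and notify a parent."""
    actions = {"blur": False, "warn_child": False, "notify_parent": False}
    if image.flagged_explicit:
        actions["blur"] = True        # image stays blurred until the user consents
        actions["warn_child"] = True  # the user is encouraged not to view it
        if child_age < PARENT_NOTIFICATION_AGE_CUTOFF:
            actions["notify_parent"] = True  # parents of under-13s are always alerted
    return actions
```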

This feature has been met with much controversy as well. Questions have been raised about how exactly information is encrypted, how much protection encryption provides, and how the update circumvents that protection. Security companies such as Norton and McAfee describe encryption as “protecting the contents of a user’s message by scrambling it into unreadable cryptographic signatures before it is sent, essentially nullifying the point of intercepting the message because it’s unreadable”. The problem, however, is that Apple scans children’s accounts for CSAM before encryption takes place, meaning the information remains accessible.
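To see why the ordering matters, here is a toy sketch in Python. The “encryption” and matching below are placeholders, not Apple's or anyone's actual implementation; the point is only that a check performed before encryption operates on readable content.

```python
import base64
import hashlib

# Toy illustration of why scanning "before encryption" matters: the scan runs
# on readable plaintext, even though the transmitted payload is scrambled.
# The XOR "encryption" below is a placeholder, not real cryptography.


def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Placeholder 'encryption': XOR with a repeating key, then base64-encode."""
    xored = bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))
    return base64.b64encode(xored)


def client_side_scan(plaintext: bytes, flagged_hashes: set[str]) -> bool:
    """The scan sees the plaintext, so matching works regardless of encryption."""
    return hashlib.sha256(plaintext).hexdigest() in flagged_hashes


def send_message(plaintext: bytes, key: bytes, flagged_hashes: set[str]) -> bytes:
    # 1. The scan happens first, on the unencrypted content.
    if client_side_scan(plaintext, flagged_hashes):
        print("content matched before encryption")
    # 2. Only afterwards is the message scrambled for transmission.
    return toy_encrypt(plaintext, key)
```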

The Center for Democracy and Technology has deemed this Apple update an invasion of the privacy provided by its end-to-end encryption: “The mechanism that will enable Apple to scan images in iMessages is not an alternative to a backdoor—it is a backdoor,” they stated. “Client-side scanning on one ‘end’ of the communication breaks the security of the transmission, and informing a third-party (the parent) about the content of the communication undermines its privacy.”

Privacy advocates further note that the feature is unreliable in that its tools are un-auditable, making its accuracy impossible to verify. “There is no way for outside groups like ours or anybody else—researchers—to look under the hood to see how well it’s working, is it accurate, is this doing what it’s supposed to be doing, how many false-positives are there. Once they roll this system out and start pushing it onto the phones, who’s to say they’re not going to respond to government pressure to start including other things—terrorism content, memes that depict political leaders in unflattering ways, all sorts of other stuff.”

While detecting CSAM is widely encouraged, especially in today’s society, the tools used to detect these occurrences are quite concerning. Privacy advocates and security experts have written letters to Apple asking that the company reconsider its updates and have created petitions to that effect. Currently, over 5,000 people have signed the petition in hopes of avoiding the threat of a malfunction exposing their personal information.

As a society, it may be best to turn our attention away from technology and address the other issues affecting America.
