OPINION: Apple’s new privacy features could set a dangerous new precedent


Computers like these could be used to steal your information without your knowledge. Photo by Maxim Hopman on Unsplash.

Aiden Bonilla

Apple has rolled out iOS 15, its latest all-device update, over the past two weeks. The official release of the system isn’t due until next month, but the features detailed in the latest patch notes are causing widespread concern among Apple consumers.

This comes from Apple’s announcement that all iCloud photos will be scanned for possible child sexual abuse material (CSAM). Many have praised the move, saying the tech giant has finally taken action against the heinous acts committed using its devices. Others have criticized the decision, calling it an invasion of privacy and a flawed plan that will do more harm than good.

The algorithm used to locate this material, “neuralMatch,” only checks existing photos against a database of known CSAM, so original photos taken by users will not be processed or flagged. While someone’s pictures of a newborn will not register in the system, it also means that explicit photos taken with Apple products that are not already in the database won’t register either. Unless someone already has the database photos in their iCloud, the system won’t detect anything.
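To make that matching step concrete, here is a minimal sketch in Swift of a database lookup, written under the assumption of a plain cryptographic hash; Apple’s actual “neuralMatch” relies on perceptual hashing and on-device private set intersection rather than an exact lookup, and the hash values and function name below are made up for illustration.

```swift
import Foundation
import CryptoKit

// A minimal sketch, not Apple's system: the real design uses a perceptual
// hash and private set intersection, not a plain SHA-256 lookup, and the
// database entries below are placeholder values.

/// Hashes of already-catalogued CSAM supplied by child-safety organizations
/// (hypothetical placeholder strings, not real data).
let knownHashes: Set<String> = ["a3f1c2...", "9bc04d..."]

/// Returns true only if a photo's hash is already in the known database.
/// A brand-new, never-catalogued photo produces a hash that matches nothing,
/// which is why original images are never flagged under this scheme.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

The point of the sketch is the design choice itself: only images already catalogued in the database can ever produce a match, which is exactly why new, unregistered material slips past the system.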

In addition, Apple plans to use this same system to scan iMessages for users recognized as ages thirteen and under. It will look for possible sexually explicit images and alert their parents if such an image is sent or received. Any picture that triggers the system will be blurred and sent to a human reviewer to determine whether it is indeed criminal. Any legitimate CSAM detected will result in a thirty-day account suspension and a notification to the National Center for Missing and Exploited Children.
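For readers who want the sequence spelled out, here is a rough Swift sketch of that workflow exactly as described above; the type name, age check, and action strings are my own illustration, not anything Apple has published.

```swift
import Foundation

// Illustrative only: models the steps described in this article, not Apple's code.
struct MinorMessagePolicy {
    let accountHolderAge: Int

    /// Returns the actions the article describes for a flagged image.
    func actions(confirmedAsCSAM: Bool) -> [String] {
        // Only accounts recognized as thirteen or under are scanned at all.
        guard accountHolderAge <= 13 else { return [] }
        // On any flag: blur the picture, alert the parents, queue human review.
        var actions = ["Blur the image", "Notify the parents", "Send to a human reviewer"]
        // On confirmed CSAM: suspend the account and report it.
        if confirmedAsCSAM {
            actions.append("Suspend the account for thirty days")
            actions.append("Report to the National Center for Missing and Exploited Children")
        }
        return actions
    }
}

// Example: a twelve-year-old's flagged image that a reviewer does not confirm as CSAM.
let policy = MinorMessagePolicy(accountHolderAge: 12)
print(policy.actions(confirmedAsCSAM: false))
```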

My primary concern is that Apple now has an outside system reading into and analyzing encrypted messages. The move does not take into account any of the possibilities being laid bare by its critics, which is ironic for a company that runs entire commercials detailing its supposedly high-quality privacy.

Relying on an outside system gives hackers a brand-new pathway into our phones. Even more alarming is the reality that the government now has another way to monitor us. Edward Snowden, the famous National Security Agency (NSA) whistleblower and one of the most staunch critics of Apple’s new system, called it a form of “mass surveillance to the entire world.” One of the main selling points of the iPhone is how well it protects an individual’s privacy, but that line of PR has now been fractured by a system that could open a backdoor to foreign threats as well as domestic ones.

Some are commending Apple for finally taking a stand against child pornography, as the company has never before taken measures against the transmission of CSAM through its devices. The issue with this praise is that there is a middle ground between doing nothing about these images and violating users’ privacy. While some might argue it’s only speculation whether someone could hack your device, we must realize it’s only a matter of time before big companies or skilled individuals find their way in through this backdoor.

Other tech companies have taken a different, much safer approach to CSAM in the past few years. Both Facebook and Microsoft scan images stored on their own company servers, so that everything (in online terms) is kept in one place, leaving fewer openings for malicious entry. If Apple had gone with this type of system, there would be less worry about potential security risks and, in turn, less backlash surrounding the update as a whole.

That’s the difference right there. 

Apple’s system is an untested, unique approach to a problem the company should have addressed long ago. It’s still early in beta testing, so no one knows exactly how well it’ll hold up in practice; only time will tell whether Apple’s bold move works wonders or backfires spectacularly.

We must show Apple that we won’t stand for such a relentless violation of our right to privacy. The most effective way to push back is to email Tim Cook, Apple’s CEO, at his public address, [email protected].