Apple Will Scan Your iPhone Photos For Child Abuse Pics: What About Privacy?
Apple will soon roll out a system in the US that scans photos on users' iPhones for known images of child sexual abuse.
To implement this, Apple will use a hashing algorithm that runs in the background on the device, checking the photos stored in a user's library against a list of known child sexual abuse images.
The system works by comparing images to a database of known child sexual abuse material compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.
These images are converted to hashes -- essentially numerical codes -- that can be matched against photos on any Apple device: iPad, iPhone or Mac. The system can also detect edited variants of an original image.
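To make the matching idea concrete, here is a minimal sketch in Python. It is illustrative only: Apple's actual system uses its proprietary NeuralHash perceptual hash, which maps visually similar images (including edited copies) to the same value, whereas this sketch substitutes an ordinary cryptographic hash just to stay self-contained. All names and sample data below are hypothetical.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. Apple's NeuralHash is designed so that
    edited variants of an image still produce a matching value; SHA-256 is
    used here only to keep the example runnable."""
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical database of hashes for known abuse images, as compiled by
# child-safety organisations. A real database stores only hashes, never images.
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}


def matches_known_image(photo_bytes: bytes) -> bool:
    """On-device check: does this photo's hash appear in the known database?"""
    return image_hash(photo_bytes) in known_hashes


# Example: photos on the device are checked before being stored in iCloud Photos.
print(matches_known_image(b"known-image-1"))   # True  -> would be flagged
print(matches_known_image(b"family-holiday"))  # False -> ignored
```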
Apple explained the process in a press release: "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes." According to Apple, the system has an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account."
Apple says that when the system finds a match, each report will be reviewed manually. If the match is confirmed, the user's account will be disabled and the authorities will be informed.
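The flow from match to report can be pictured as a simple threshold-then-review pipeline. The sketch below is an assumption-heavy illustration, not Apple's implementation: the threshold value, function names and return strings are hypothetical, and the real system relies on cryptographic "safety vouchers" rather than a plain counter.

```python
# Hypothetical threshold: Apple says its parameters are tuned for a claimed
# one-in-one-trillion yearly chance of incorrectly flagging an account, but
# the exact number is not given in this article.
MATCH_THRESHOLD = 30


def human_review_confirms(matched_hashes: list[str]) -> bool:
    """Stub for Apple's manual review of flagged matches (placeholder logic)."""
    return len(matched_hashes) > 0


def handle_account(account_id: str, matched_hashes: list[str]) -> str:
    """Illustrative flow: threshold check -> human review -> disable and report."""
    if len(matched_hashes) < MATCH_THRESHOLD:
        return "no action"                      # below threshold, nothing is surfaced
    if not human_review_confirms(matched_hashes):
        return "dismissed as false positive"    # manual review rejects the match
    # Confirmed match: the account is disabled and authorities are informed.
    return f"account {account_id} disabled and reported"


print(handle_account("user-123", ["hash"] * 31))  # above threshold -> reported
print(handle_account("user-456", ["hash"] * 2))   # below threshold -> no action
```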
What about user privacy?
This is alarming, given that Apple has built its reputation on putting user privacy first. Even in service of a good cause, scanning people's private photos is still an intrusion into their privacy.
Matthew Green, a cryptography researcher at Johns Hopkins University, warns (as highlighted by the Guardian) that the system could be abused to frame an innocent person: an attacker could send them seemingly harmless images engineered to trigger matches against the child abuse database, fooling Apple's algorithm and alerting law enforcement.
He also highlights that this could lead to other abuses, such as government surveillance of protesters or activists. He says, "What happens when the Chinese government says: 'Here is a list of files that we want you to scan for.' Does Apple say no? I hope they say no, but their technology won't say no."
The Washington-based Center for Democracy and Technology has also asked Apple to abandon the changes, arguing that they would undermine the company's guarantee of end-to-end encryption: scanning messages on the device for sexually explicit content would itself break that security.
The group also questioned the technology's accuracy in distinguishing dangerous content from something as harmless as a meme, noting that such scanning technologies are known to be error-prone.
Apple, however, has denied that the changes amount to a backdoor that degrades its encryption. It says it has carefully considered innovations that do not disturb user privacy but rather strongly protect it.
Do you think a feature like this would dethrone Apple from being the most user-privacy-centric brand in the tech industry? Let us know in the comments below. And for more cool tech news, keep checking Indiatimes.com.