Is Apple Scanning Your iPhone Photos? Here’s What You Need to Know

Apple has long been renowned for its security and privacy policies, but its recent plan to scan iPhone photos has been raising alarm in the tech world. Apple is planning to implement this system to curb the rising spread of child sexual abuse material (CSAM).

Sexually explicit content involving children is unacceptable both morally and legally, and its spread has forced companies like Apple to take hard steps. Apple will scan photos on the iPhone to determine whether a user is storing or spreading child sexual abuse material, a measure intended as a step toward ending this evil in society.

However, Apple's new plan is being looked down upon by its users, who are concerned about their privacy. Many see it as a privacy breach: the data on your phone will no longer be entirely private, because Apple's new photo-scanning system on the iPhone will have access to your information.

Is Apple Scanning Your iPhone Photos? Here’s How It Scans

CSAM detection technology will arrive with the iOS 15 update, which includes three features to control CSAM.

  • The first feature is parental control. When it is enabled, it will scan the photos in the Messages app on a child’s phone, and if any explicit content is detected, a notification will be sent.
  • The second feature will scan photos in iCloud for child sexual abuse material.
  • The third feature will warn users when they use Siri or the search bar in Safari to look for explicit content.

All three features will help eradicate the evil of CSAM from society by detecting and flagging sexually explicit content involving children on a device. CSAM is spreading like a virus, with millions of videos and images in circulation; over time, law enforcement has identified approximately 19,000 victims.

Apple said that its new features aim “to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.”

How Will This Feature Work?

Apple’s CSAM detection technology is based on a computer algorithm that will scan iCloud images and compare them against hashes (digital fingerprints) of known child sexual abuse images. These hashes are stored in the databases of child safety organizations, including the National Center for Missing and Exploited Children (NCMEC).

If a photo matches the database, a human reviewer will examine the image to confirm that it is CSAM. If it is determined to be CSAM, Apple will shut down your account and report it to law enforcement and NCMEC.
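To make the matching step concrete, here is a minimal, hypothetical sketch in Swift. It is not Apple’s actual implementation: Apple’s system reportedly uses a perceptual “NeuralHash” and a blinded on-device database, whereas this sketch uses an ordinary SHA-256 digest and a plain set of hex strings purely to illustrate the idea of comparing a photo’s fingerprint against a list of known hashes.

```swift
// A minimal, hypothetical sketch of hash-based photo matching.
// NOTE: this is NOT Apple's implementation. Apple reportedly uses a
// perceptual "NeuralHash" and a blinded on-device database; SHA-256 and
// the plain hash set below are stand-ins used only for illustration.
import Foundation
import CryptoKit

// Hypothetical database of known-image fingerprints (hex-encoded hashes)
// of the kind maintained by child-safety organizations such as NCMEC.
let knownHashes: Set<String> = [
    // SHA-256 of an empty file, used here as a dummy entry.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Returns true if the photo's fingerprint appears in the known-hash set.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: an empty Data blob hashes to the dummy entry above, so it "matches".
print(matchesKnownDatabase(Data()))   // prints: true
```

Note the design difference: a cryptographic hash like SHA-256 only matches exact copies of a file, while a perceptual hash is built to also match resized or slightly edited versions of an image, which is why Apple’s system relies on the latter.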

Will This New Technology Become a Security Threat?

With the induction of these new features, Apple will certainly compromise the security of its users, because the system affects end-to-end encryption. The scanning feature could become a loophole for governments, hackers, or even Apple’s own employees, who could gain unauthorized access to your data and use it for their own motives. It is a serious threat to user privacy, because such sensitive data could be compromised by any authority and lead to a larger collapse of privacy.

According to Hijazi, “I believe Apple’s heart is in the right place, and it is trying to do the right thing here, but this measure is simply too wide-ranging because it sacrifices the privacy of everyone in the process.”

Is There a Way to Stop Apple From Scanning the Photos?

“If you care about your privacy, you should assume that any iOS device is vulnerable to end-to-end encryption bypasses and full photo access,” Hijazi says. “There is no such thing as privacy online or on any connected device, and you should always assume that is the case and behave accordingly.”

That said, there is a way to stop Apple from scanning your iCloud photos and protect your privacy: disable iCloud storage for your photos.

Follow these steps to disable iCloud Photos:

  • Select Settings > Photos
  • Turn off the iCloud Photos slider
  • Select Download Photos & Videos in the pop-up window to download the photos from iCloud to your device instead.


Conclusion

CSAM detection is a good step toward preventing child sexual abuse, but the fact that privacy is compromised along the way cannot be ignored. Apple could have taken other steps that do not infringe on users’ privacy while still preventing child sexual abuse.

However, if an individual has nothing suspicious in their data, they should not give ear to any rumors and should support this noble cause and the step taken by Apple.
