Apple Starts Scanning All Personal Photos And Images Uploaded To iCloud

Apple is now taking on the role of policeman as it starts scanning all of your photos uploaded to its iCloud service. The first wave of extraction targets photos depicting child exploitation, and they will report you to the police! Their Technocrat AI algorithms will not be made public. You can turn off iCloud on your Apple devices, but you will lose the ability to sync them.

The excuse is to trap child abusers and sexual predators, but in the process 100 percent of your photos must be scanned as well. Once the surveillance door is opened, any other type of photo can be targeted for any other purpose. This could include political protests, location-tracking information, psychological profiles (i.e., what you are taking pictures of), etc. ⁃ TN Editor

Apple will report images of child exploitation uploaded to iCloud in the U.S. to law enforcement, the company said on Thursday.

The new system will detect images known as Child Sexual Abuse Material (CSAM) using a process called hashing, in which each image is transformed into a unique number corresponding to that image.
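In simplified terms, a hash works like a fingerprint: the same input always produces the same number, and a different input produces a different one. The sketch below illustrates the idea with a standard cryptographic hash; Apple's actual system uses a proprietary perceptual hash designed to also match visually similar images, so this is only an analogy, not Apple's algorithm:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Map an image's raw bytes to a fixed-size fingerprint (hex string)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Identical inputs yield identical hashes; any change yields a new one.
original = image_hash(b"fake-image-data")
copy = image_hash(b"fake-image-data")
edited = image_hash(b"fake-image-datb")

print(original == copy)    # True
print(original == edited)  # False
```

Crucially, the hash reveals nothing about the picture itself, which is the basis of Apple's claim that only numbers, not photos, are compared.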

Apple started testing the system on Thursday, but most U.S. iPhone users won’t be part of it until an iOS 15 update later this year, Apple said.

The move brings Apple in line with other cloud services which already scan user files, often using hashing systems, for content that violates their terms of service, including child exploitation images.

It also represents a test for Apple, which says that its system is more private for users than previous approaches to eliminating illegal images of child sexual abuse, because it uses sophisticated cryptography on both Apple’s servers and user devices and doesn’t scan actual images, only hashes.

But many privacy-sensitive users still recoil from software that notifies governments about the contents on a device or in the cloud, and may react negatively to this announcement, especially since Apple has vociferously defended device encryption and operates in countries with fewer speech protections than the U.S.

Law enforcement officials around the world have also pressured Apple to weaken its encryption for iMessage and other software services like iCloud to investigate child exploitation or terrorism. Thursday’s announcement is a way for Apple to address some of those concerns without giving up its engineering principles around user privacy.

How it works

Before an image is stored in Apple’s iCloud, Apple matches the image’s hash against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). That database will be distributed in the code of iOS beginning with an update to iOS 15. The matching process is done on the user’s iPhone, not in the cloud, Apple said.
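The matching step described above amounts to a set lookup: the device hashes each photo before upload and checks the result against the database shipped with iOS. A minimal sketch, assuming a placeholder hash function and made-up database entries (the real database comes from NCMEC and uses Apple's perceptual hash, not SHA-256):

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder for the hash database distributed inside iOS;
# the real entries are supplied by NCMEC, not sample strings.
known_csam_hashes = {
    image_hash(b"known-flagged-image-1"),
    image_hash(b"known-flagged-image-2"),
}

def check_before_upload(image_bytes: bytes) -> bool:
    """Runs on the device: compares hashes only, never image content."""
    return image_hash(image_bytes) in known_csam_hashes

print(check_before_upload(b"known-flagged-image-1"))  # True
print(check_before_upload(b"family-vacation-photo"))  # False
```

Because only hashes are compared, a photo that isn't already in the database can never match, no matter what it depicts.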

If Apple then detects a certain number of violating files in an iCloud account, the system will upload a file that allows Apple to decrypt and see the images on that account. A person will manually review the images to confirm whether or not there’s a match.
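The threshold behavior can be pictured as a simple counter: nothing becomes reviewable until enough matches accumulate in one account. The value below is purely hypothetical, since Apple did not disclose the actual number in this announcement:

```python
MATCH_THRESHOLD = 10  # hypothetical; Apple's real threshold is undisclosed here

def review_unlocked(match_count: int) -> bool:
    """Only past the threshold can Apple decrypt matched images for review."""
    return match_count >= MATCH_THRESHOLD

print(review_unlocked(3))   # False: below threshold, images stay unreadable
print(review_unlocked(12))  # True: manual review can begin
```

The point of the threshold is to keep a single false match from exposing anything: Apple gains decryption capability only after repeated hits.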

Apple will only be able to review images that match content that’s already known and reported to these databases — it won’t be able to detect parents’ photos of their kids in the bath, for example, as these images won’t be part of the NCMEC database.

If the person doing the manual review concludes the system did not make an error, then Apple will disable the user’s iCloud account and send a report to NCMEC or notify law enforcement if necessary. Users can file an appeal with Apple if they think their account was flagged by mistake, an Apple representative said.

The system only works on images uploaded to iCloud, which users can turn off, Apple said. Photos or other images on a device that haven’t been uploaded to Apple servers won’t be part of the system.

Read full story here…

About the Editor

Patrick Wood
Patrick Wood is a leading and critical expert on Sustainable Development, Green Economy, Agenda 21, 2030 Agenda and historic Technocracy. He is the author of Technocracy Rising: The Trojan Horse of Global Transformation (2015) and co-author of Trilaterals Over Washington, Volumes I and II (1978-1980) with the late Antony C. Sutton.
Jeremy Lamb

Good. It’s about time that people wake up to what’s been going on the whole time. The only change now is they’ll start going after people to set a precedent. But if you think they don’t know what’s in your iCloud or on your phone, you’re not very smart. They created the thing. And I worked in telecom, FYI. I could just go up the next floor and see the server room. At the time, the FBI apparently wanted to get in there…because the phones were of a particular type that was supposedly very useful for dealers and not of… Read more »


Best response is to use a third-party cloud storage provider that has no interest in your private affairs. They’re out there, but you need to look for them. You can encrypt your content with them. Just know that encrypted content cannot be retrieved by the company if you lose your password.

Jeremy Lamb

To be honest, you have no idea who these companies are. I’ve found, in doing research, that all “privacy” companies have some pretty shady funding. And that’s without exception.


bishop penster

Better not take any pictures of babies or kids taking baths or showers….you’re a pervert if you do now I guess.
