With more children online these days, reports of kids being exploited online are on the rise. It’s also hard to admit that you can’t monitor all of your child’s online activity. As a parent, protecting them from this kind of exploitation is a top priority.
Apple recently announced that it will start testing a system called neuralMatch that automatically “matches” photos against known images of child sexual abuse. Once an image is flagged, a team from Apple reviews it, and if the match is verified, Apple alerts the authorities.
How Does neuralMatch Work?
The neuralMatch system was trained using 200,000 images from the National Center for Missing & Exploited Children. Pictures uploaded to iCloud will be hashed and compared against a database of hashes of known child sexual abuse images.
Once you upload images to iCloud, these photos are given a “safety voucher”. These vouchers contain information about the pictures, including their hashes. If an account accumulates multiple red-flag hashes, it is marked as suspect. Apple will then decrypt the matching photos, and if they turn out to be illegal, send them to the authorities.
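The hash-and-threshold flow described above can be sketched roughly in code. This is a simplified illustration, not Apple’s actual implementation: Apple uses a perceptual hash (NeuralHash) and cryptographic threshold techniques, whereas this sketch substitutes plain SHA-256 and a simple counter, and the function names and threshold value are hypothetical.

```python
import hashlib

THRESHOLD = 3  # illustrative: matches required before an account is flagged


def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash like NeuralHash.

    A real perceptual hash still matches after resizing or recompression;
    SHA-256 does not, and is used here only to keep the sketch simple.
    """
    return hashlib.sha256(data).hexdigest()


def account_flagged(uploads, known_hashes, threshold=THRESHOLD):
    """Count uploads whose hash appears in the known-image database;
    the account is flagged only once the count reaches the threshold."""
    matches = sum(1 for data in uploads if image_hash(data) in known_hashes)
    return matches >= threshold


# Usage: two matches stay below the threshold of three, so no flag is raised.
known = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}
uploads = [b"known-image-1", b"known-image-2", b"holiday-photo"]
print(account_flagged(uploads, known))  # prints False
```

The threshold is the key design choice: a single accidental hash collision is not enough to expose anyone’s photos, only an accumulation of matches is.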
This new system will arrive as part of updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and will first roll out in the US. You can learn more about this feature from Apple here.
Privacy Advocates’ Concerns
The announcement raised concerns among some privacy advocates. According to Johns Hopkins University professor and cryptographer Matthew Green, “This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?”
Another statement came from Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology: “Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US but around the world.” He added, “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
WHAT DO YOU THINK?
Apple has always committed itself to the security and privacy of its devices. But since this technology scans the photos stored on our devices and in iCloud, the big question remains… Will our photos remain private?
What do you think? Is the trade-off of 100% privacy worth the attempt to stop child pornography on iPhones?
Help your friends and family be informed about Apple’s New Scanning System by using the Share Buttons below. I really appreciate you helping spread the word about my Free Daily iPhone Tips! 🙂
No. We cannot give up liberty and accept censorship. These are steps to the end of democracy.
Totally agree that my photos are mine! Do not want it happening to my phone and I have been an iPhone person for several years – please don’t give me a good reason to leave.
Just what part of private don’t they understand?
Mission creep is inevitable! Slippery slope!
This is a bad idea and the potential for abuse and unforeseen issues is huge. I object to surveillance without cause, especially from Apple.
I can’t see how this will stop child porn and I can see how easily this can be abused and end up in a big mess. We thought our phones were secure by joining the Do Not Call list, and now we get bombarded by solicitations for car warranties and IRS scams! This has proven that security isn’t so secure.
If folks are serious about porn, they’re not going to use iCloud ! There are all kinds of Secret Apps to store photos that aren’t ICloud and are secure. So this decision looks very much like a precursor to overall surveillance. And that scares me.
I agree with this
ABSOLUTELY, agreed!!!
I seriously doubt that this will help catch pornographers. What it will do is alienate the Apple customers who trust Apple’s security.
I totally agree with Ninah!! This has too much “Big Brother” in it for me.
Time to switch away from iPhone. They would not cooperate with our government to find criminals who killed, but they will open this feature to their left wing valley workers to monitor! Smacks of possible abuse by many government agencies.
This is just another fake claim by our government to take away our freedoms, implemented under the guise of protecting the children. It’s likened to the Taliban and ISIS using children as human bombs!!
I agree with Ninah; Apple has no reason to mess with my photos.
No, I don’t want anyone monitoring my phone or iCloud, Period!
No way! That is Too Much Big Brother
Guess I will not upgrade any more! I have nothing to hide, do not believe in child pornography, and think those that mistreat children should be punished to the max, including those that take child pornography pictures, but do not want my privacy invaded! I will not be using my phone for photography in the future, even tho it is handy! I have 4 great cameras that take better pictures than the iPhone and will go back to using them!
I won’t be upgrading my iPhones, iPads or Macs with any Apple products in the future if this is the case.
I agree with Greg. This is a potentially very slippery slope. It is Apple’s responsibility to provide device security, not policing users.
If monitoring pictures helps children, yes, I agree!!!
Did I see an IF in there? That’s the problem!!
Potential for abuse is high…do not like this.
I agree, Apple is not responsible for policing everyone, only security on the iPhone, etc. This is an infringement on freedom & the right of liberty!
I don’t need my phone monitored
No trade off in privacy is acceptable to me. You give them an inch and they’ll take a mile.
This would be the one reason to switch to android.
I think it is just one more way for Apple and or the government to surveil iPhone users. Like Ninah stated above, serious porn viewers/users aren’t going to use iCloud.. I absolutely disagree with Apple using surveillance to monitor our iCloud and photos, and our iPhones in general.
I agree!!!
This technology is amazing and could save millions of children worldwide from horrific abuse. That said, it belongs with law enforcement. Sell it to them, because if you put it on iPhone, the criminals committing these heinous crimes just got notified of what Apple is proposing to do and are already ceasing iPhone activity and deleting iCloud photos anyway.
All this while losing law abiding customers over personal, legal privacy. Think Apple, think!
Well said!! Thanks!
It’s all about the money and government control! They give a rat’s a$$ about Jane or John Doe consumer.
Might be time to consider replacing iPhone/Apple service, due to opening the customer’s phone to the world for potential identity theft & who knows what else!
I don’t agree with this picture monitoring
So how do we get around this? Or once again are we being forced to go along with something we don’t agree with? Don’t load photos on to the cloud? Someone tell us something. This is Bull. Plain and simple.
They give you all these humanitarian reasons to do this but we all know they are blowing smoke. They want more control. Plain and simple.
Stop this now!!!
I agree that this is not an effective way to spot traffickers. I am against this process.
I have a question to ask!
I keep getting threats on my iPhone 12 about shutting down my iPhone if I don’t get rid of scam texts or scam messages!
How do I get rid of my text messages?
Please help me find out what’s going on!!!