With more children online these days, reports of kids being exploited online are rising. It's hard to admit, but there are times you simply can't monitor all of your child's online activity. As a parent, protecting them from this kind of exploitation is a top priority.
Apple recently announced that it will start testing a system called neuralMatch that automatically matches photos against known images of child sexual abuse. Once an image is flagged, a team at Apple reviews it, and if the match is verified, Apple will alert the authorities.
How Does neuralMatch Work?
The neuralMatch system was trained using 200,000 images from the National Center for Missing & Exploited Children. Pictures uploaded to iCloud will be hashed and compared against a database of hashes of known child sexual abuse images.
When you upload images to iCloud, each photo is given a "safety voucher". These vouchers contain information about the picture, including its hash. If an account accumulates multiple red-flag matches, its photos are marked as suspect. Apple can then decrypt those photos, and if they turn out to be illegal, Apple will send them to the authorities.
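To make the hash-and-threshold idea concrete, here is a minimal, purely illustrative sketch in Python. Apple's actual NeuralHash algorithm, hash database, and match threshold are not public, so this stands in SHA-256 exact hashing, made-up "known bad" entries, and a hypothetical threshold of 2; it only shows the general shape of the matching logic, not Apple's implementation.

```python
import hashlib

# Hypothetical stand-ins for the real NCMEC-derived hash database,
# which is not public.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-1").hexdigest(),
    hashlib.sha256(b"known-bad-image-2").hexdigest(),
}

# Hypothetical threshold: an account is flagged for human review only
# after multiple matches, not after a single one.
MATCH_THRESHOLD = 2

def hash_image(image_bytes: bytes) -> str:
    """Stand-in for Apple's perceptual NeuralHash; here, plain SHA-256."""
    return hashlib.sha256(image_bytes).hexdigest()

def review_needed(uploaded_images: list[bytes]) -> bool:
    """Count uploads whose hash matches the database; flag at the threshold."""
    matches = sum(1 for img in uploaded_images
                  if hash_image(img) in KNOWN_BAD_HASHES)
    return matches >= MATCH_THRESHOLD

uploads = [b"vacation-photo", b"known-bad-image-1", b"known-bad-image-2"]
print(review_needed(uploads))  # two matches meet the threshold, so True
```

The key design point this illustrates is that no single match triggers anything: only an accumulation of matches past the threshold marks an account for human review.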
This new system will arrive as an iMessage update feature in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, and will first roll out in the US. You can learn more about this feature from Apple here.
Privacy Advocates' Concerns
The announcement raised concerns among some privacy advocates. "This sort of tool can be a boon for finding child pornography in people's phones," said Johns Hopkins University professor and cryptographer Matthew Green. "But imagine what it could do in the hands of an authoritarian government?"
Another statement came from Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology: "Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US but around the world." He added, "Apple should abandon these changes and restore its users' faith in the security and integrity of their data on Apple devices and services."
What Do You Think?
Apple has always committed itself to security and privacy on its devices. But since this technology scans photos stored on our devices and in iCloud, the big question remains: will our photos stay private?
What do you think? Is giving up complete privacy worth the attempt to stop child pornography on iPhones?
Help your friends and family be informed about Apple’s New Scanning System by using the Share Buttons below. I really appreciate you helping spread the word about my Free Daily iPhone Tips! 🙂