Apple Announced Their Big Tracking Plan And IMMEDIATELY Alarm Bells Went Off!

Tech giant Apple is facing a firestorm of criticism after announcing plans to install software on its devices that will scan for child sexual abuse material and report offenders to law enforcement. Does this move have good intentions? Sure it does. But security experts warn that it could easily open a dangerous gateway to corporate abuse and government overreach, violating people’s privacy rights and civil liberties.

“We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM),” Apple announced on their website last Thursday.

Apple plans to roll out the new features later this year on its Macs, Apple Watches, iPhones, and iPads.

The move is not only supposed to give parents more oversight of their children’s electronic devices; it will also give Apple the ability to analyze image attachments and determine whether they are sexually explicit.

“This will enable Apple to report these instances to the National Center for Missing and Exploited Children,” the company said. “Siri and Search will also intervene when users try to search for CSAM-related topics.”

Of course, Apple continues to claim that the new software will not scan users’ private photos, videos, and messages, but many security experts are alarmed at the potential for abuse, even though the goal of catching pedophiles is commendable.

Ross Anderson, a professor of security engineering at the University of Cambridge, told the Financial Times that it is “an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of…our phones and laptops.”

Yes, the new software is designed to catch child sex abuse, and that is definitely a good thing. However, the problem is that this could easily be adapted to spy on private citizens for other reasons.

Matthew Green, a computer science professor at Johns Hopkins University who specializes in privacy-preserving protocols, described Apple’s new features as “a really bad idea.”

“These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear,” he tweeted. He also warned that this technology could become “a key ingredient in adding surveillance to encrypted messaging systems,” and pointed out that “the ability to add scanning systems like this to E2E [end-to-end] messaging systems has been a major ‘ask’ by law enforcement the world over.” That may seem all well and good to us as Americans, but it is a different story for citizens of authoritarian regimes such as China or North Korea. The implication is clear: authoritarian governments could misuse this technology to further oppress political dissenters and private citizens.
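For readers unfamiliar with the term, a perceptual hash maps an image to a short fingerprint so that visually similar images produce similar or identical fingerprints, which is what allows matching against a database of known images. Below is a minimal sketch of one of the simplest perceptual hashes, the “average hash.” This is an illustration of the general idea only, not Apple’s actual NeuralHash algorithm, and the tiny hand-made 4x4 “image” is a stand-in for real pixel data:

```python
# Toy illustration of perceptual hashing (average hash).
# NOT Apple's NeuralHash; this just shows the general idea that
# near-duplicate images map to nearby fingerprints.

def average_hash(pixels):
    """pixels: 2-D list of grayscale values (0-255).
    Returns a bit string: '1' where the pixel is above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A small 4x4 "image" and a slightly brightened copy of it.
img = [[ 10,  20, 200, 210],
       [ 15,  25, 205, 215],
       [190, 195,  30,  40],
       [185, 200,  35,  45]]
brighter = [[p + 5 for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(brighter)
# Brightening shifts every pixel AND the mean by the same amount,
# so the hash is unchanged; a match is declared when the Hamming
# distance falls below some threshold.
print(hamming(h1, h2))  # prints 0
```

Real systems use far more robust hashes and compare them against a database of fingerprints of known abusive images. The critics’ concern is precisely that this matching machinery is content-agnostic: it works just as well against any database of hashes a government might supply.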

We’re heading into 1984, folks, and I’m not the only one who thinks so. Alec Muffett, a security researcher who previously worked at Facebook, described Apple’s latest foray into violating our privacy as a “tectonic” shift and a “huge and regressive step for individual privacy.”

“Apple are walking back privacy to enable ‘1984,’” Muffett told the Financial Times. And what about when the parents are the abusers themselves? Heather Burns, a tech policy adviser, is all too familiar with drug-addicted, abusive parents who can “batter their child senseless during regular hours” and then put on an Oscar-worthy acting job that fools both teachers and social workers.

There is no doubt about it: Apple might have good intentions, but this is going to lead to some serious problems down the road.
