Commentary

Apple will implement new tech to fight child porn. It’s cause for genuine privacy concerns.


How much privacy do you have on your iPhone?

In light of new changes that Apple is implementing, not as much as you used to.


In an effort to fight child abuse, Apple is reworking the technology in its phones to identify any images uploaded to iCloud storage that it says are known child pornography. It will also allow parents to turn on a feature that will flag any nude images that turn up in their children’s messages.

In a final touch, Siri will “intervene” if someone asks their phone an inappropriate question about child pornography or sexual abuse.

There’s no question that fighting child pornography is an important and worthwhile effort. The problem is that the technology that Apple is implementing is custom built for other, less noble purposes.

“We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” the Electronic Frontier Foundation, a nonprofit advocating for digital privacy, said. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

The EFF and other data privacy experts note that the technology that Apple is putting in place can be readily adapted to look for other things.

“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone,” WhatsApp chief Will Cathcart tweeted. “That’s not privacy.”

WhatsApp said it will not duplicate Apple’s technology.

Governments looking to crack down on political speech, or even on LGBTQ people, could find the technology handy. All they would have to do is tweak the parameters of what the technology looks for, and they could collect all kinds of information.

The threat is hardly a theoretical one. A study released last year by Recorded Future, a cybersecurity firm, found that LGBTQ people outside the U.S. are often subject to government surveillance and tracking via apps and the internet. Egypt, Iran and Lebanon have all used technology to identify and trap LGBTQ people.

Apple insists that it would never allow the technology to be used for nefarious purposes. But Apple was also the tech company that took the hardest line on user privacy. It had long insisted that what happens on your iPhone stays on your iPhone. In fact, it even used that very line in advertising.

Of course, that was never as true as Apple wanted you to believe. Still, compared to companies like Google and Facebook, Apple has been far better at protecting its users and far less interested in monetizing their every click.

Perhaps Apple will hold the line at child pornography, as it insists it will. But Apple has been willing to bend its rules for China, so there is precedent for exceptions.

In building technology to fight child pornography, Apple built a slippery slope. How far the company stays from the slope’s edge remains to be seen.
