Unintended Risks of Apple Child Protection Features

Aug 6, 2021

Apple wants to protect children online. But could there be unintended consequences?

Apple is taking steps to combat child sexual abuse material (CSAM), including technology that will detect known CSAM uploaded to iCloud; an iMessage feature that will alert parents if their child sends or receives an image containing nudity; and a block on CSAM-related search terms in Siri and Search. The changes, which Apple says will roll out in the US later this year, were first revealed in a tweet thread by a Johns Hopkins University cryptography professor who heard about them from a colleague. Apple has since confirmed the reports.

The scanning technology Apple is implementing — called NeuralHash — doesn’t compare images the way human eyes do. Instead, it creates a string of numbers and letters — called a “hash” — for each image and then checks it against a database of hashes of known CSAM, according to reporting by TechCrunch. And because editing an image changes a conventional hash, NeuralHash is designed so that “visually similar” images still match, with an additional cryptographic layer called “threshold secret sharing” that only flags an account once it passes a threshold number of matches. The technology, Avast Global Head of Security Jeff Williams says, sounds similar to the PhotoDNA project that was developed during his time at Microsoft.

“With threshold secret sharing, there’s some kind of scoring in place for an image,” Williams says. “Say two images are completely identical — that’s a 100 percent match. But if someone alters it — say they change it by cropping — then maybe it’s only a 70 percent match. Apple is saying that if it’s above a certain percentage score, they’ll move on to the next review phase.”
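
To make that “percentage match” idea concrete, here is a minimal sketch of perceptual-hash matching, using a classic average hash and a Hamming-distance similarity score. It is a toy illustration rather than Apple’s NeuralHash; the file names and the 70 percent threshold are assumptions borrowed from Williams’s example, not figures Apple has published.

```python
# Toy perceptual-hash matching: NOT Apple's NeuralHash.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale grid; each bit records
    whether that pixel is brighter than the image's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def similarity(a: int, b: int, nbits: int = 64) -> float:
    """Percentage of matching bits; 100.0 means identical hashes."""
    distance = bin(a ^ b).count("1")  # Hamming distance
    return 100.0 * (nbits - distance) / nbits

# A cropped or recompressed copy keeps most of its bits, so it still
# scores far above an unrelated image.
known = average_hash("known_image.jpg")         # hypothetical file
candidate = average_hash("uploaded_image.jpg")  # hypothetical file
if similarity(known, candidate) >= 70.0:        # assumed threshold
    print("Above the match threshold: move to the next review phase")
```

Real perceptual hashes are far more robust to edits than this toy, but the scoring logic Williams describes has the same shape: compare fingerprints, then act only above a threshold.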

The potential unintended consequences of NeuralHash

NeuralHash processes photos on the user’s device, before they’re uploaded to iCloud. That method, Avast Chief Privacy Officer Shane McNamee says, is the right one.

“If you are going to check something on someone’s device, a really good way of doing that is to not pull that data off their phone and onto your servers,” McNamee says. “It’s really minimizing the data you’re sending. So, from a privacy perspective, that’s technically a very pro-privacy way to go.”
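
As a loose illustration of that data-minimization point, the sketch below runs the match check entirely on the device against a locally stored database, so nothing about non-matching photos ever reaches a server. It deliberately omits Apple’s real cryptography (blinded hash databases, private set intersection, encrypted “safety vouchers”), and every file name and value in it is an assumption for illustration.

```python
# On-device checking, sketched: the photo's analysis stays local
# unless it matches. This greatly simplifies the flow McNamee
# describes; it is not Apple's implementation.
import hashlib

def fingerprint(photo_bytes: bytes) -> str:
    # Stand-in for an on-device perceptual hash such as NeuralHash.
    # (An exact SHA-256 match tolerates no edits; a real perceptual
    # hash would.)
    return hashlib.sha256(photo_bytes).hexdigest()

def check_before_upload(photo_bytes: bytes, known: set[str]) -> bool:
    # Runs entirely on the device: the server learns nothing about
    # photos that do not match.
    return fingerprint(photo_bytes) in known

known_hashes = {"0f1e2d..."}  # toy database shipped to the device
with open("photo.jpg", "rb") as f:  # hypothetical local photo
    flagged = check_before_upload(f.read(), known_hashes)
print("attach a match voucher" if flagged else "upload with nothing extra")
```

The privacy benefit McNamee highlights has exactly this shape: the sensitive comparison happens where the data already lives, and only a match result ever needs to travel.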

While combating CSAM is extremely important, privacy and security experts are concerned about the possible unintended consequences of this technology. McNamee questions whether companies should scan people’s devices at all. 

“Now that this is possible to have access, authorities will push for more access,” he says. “It’s like we’re peeking over your shoulder, but we’re wearing sunglasses and saying the sunglasses can only see bad things. And you have this little snooper on the device that’s just reading everything and checking it, not sending it to Apple unless you’re doing something wrong. That’s the problem — the definition of ‘doing something wrong’ could be broadened.”

Brianna Wu — a computer programmer, video game creator, online advocate, and Executive Director of Rebellion PAC who describes herself as “an Apple fan” — points out that the US government could theoretically create legislation giving it permission to use this technology without the general public ever knowing. There are, she says, “far less checks and balances” on behind-the-scenes deals between the US government and tech companies, made in the name of national security, than the general public may believe.

“This would allow agencies to spy on our phones to find, say, pictures that the Pentagon says compromise national security or belong to terrorists,” Wu tells Avast. “And if you look at the specifics of Edward Snowden’s revelations, it’s clear that our national security agencies may stick to certain rules in the US, but outside there are no rules at all. I feel very confident this technology could be used to spy on people in other countries.”

According to Williams, the same thing happened when PhotoDNA was publicly licensed in 2014.

“As soon as we brought it out for CSAM, we heard ‘Why not use it for terrorism?’” he says. “Pragmatically, there’s no reason why they couldn’t. It’s a neutral technology that doesn’t require the image to be CSAM-related in order to work.”

However, Williams also points out that NeuralHash could actually weaken the push for encryption backdoors that has gained momentum in recent years. By moving CSAM detection off the cloud and onto the device, Williams says, Apple has effectively set up a platform that can remain end-to-end encrypted, “removing law enforcement’s excuse to go in [our devices].”

“I commend Apple for taking steps in this regard, both to protect individual privacy and to support law enforcement in a manner that does not require backdoor encryption,” he adds.

Stalkerware and danger to LGBTQIA+ kids

The second big change is that Apple will let parents enable a feature on their children’s iMessage accounts that blurs any images containing nudity. It will also alert parents if the child chooses to view such an image or sends one themselves. While Wu says she “can live with the iCloud part” of these new changes, she feels that the message-scanning part leads down “a deeply Orwellian road” and she “would beg Apple to reconsider.”

“The thought that any time you transmit a nude photo, your parents might be alerted to that? Obviously there are good use cases, like if a predator is involved,” Wu says. “But I can’t help thinking of the kid who’s going to be busted starting a romance with a school mate.” 

Wu points to the fact that the majority of US teens are sexually active before the age of 18 — and that “sexting” is not uncommon among teenagers. This technology, then, potentially infringes on teens’ right to sexual autonomy. It could also expose the children themselves to charges of distributing child pornography if a parent reports an image, or expose the parents to the same charges if they share the image with the other parents involved.

But even more concerning to Wu is the possibility that this technology could “out” LGBTQIA+ kids to their parents, potentially placing them in both psychological and physical danger. Drawing on her own experience as a queer kid in Mississippi in the ’90s, she worries that “Silicon Valley forgets what the rest of America is like.”

“When I was trying to figure out that I was queer, I would actually take my bike and go to the University of Southern Mississippi library just to read books and find more information,” Wu says. “I needed that space to figure out who I was. And, thank God, there are laws for libraries that mean it’s very hard for parents to find that information. This is taking away the privacy a child might have had 20 to 30 years ago in a library.”

She also points out that this tool could theoretically be used as stalkerware in situations of intimate partner violence.

“I love my husband dearly, but he knows nothing about technology,” Wu says. “I could install this on his phone tomorrow and he’d have no idea.” 

“Or let’s say a woman wants a divorce from her husband,” she continues. “He grabs her phone, activates this, sees she’s sexting a new partner, and uses the images against her in revenge porn. I can promise you that will happen with this technology.”

Apple has hung its hat on the consumer right to privacy — and it has made significant moves to prove that commitment, from giving users the right to opt out of tracking to letting users hide their email addresses. But while all three experts consulted here agree that NeuralHash appears to protect user privacy to the best of Apple’s technological ability, the potential unintended consequences of the iMessage alerts may well outweigh the potential benefits.

“Name a product from Apple, I’ve got it,” Wu says. “I’m all-in on the Apple ecosystem because of privacy. I root for them to succeed, but this is by far the worst plan I’ve seen them put into effect. I think they’re going down a wrong path and it’s extremely concerning.”

