EU plan to scan private messages for child abuse images puts encryption at risk

“If that’s the case, there is only one logical solution: client-side scanning, where the content is scanned as it is decrypted on the user’s machine for them to view or read,” says Woodward. Last year, Apple announced that it would introduce client-side scanning (performed on people’s iPhones rather than on Apple’s servers) to check photos being uploaded to iCloud against known CSAM. The move sparked protests from civil rights groups, and even Edward Snowden, over the potential for surveillance, leading Apple to pause the plans a month after announcing them. (Apple declined to comment for this story.)
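
To see why critics frame this as a question of architecture rather than of encryption itself, here is a minimal sketch, in Python, of where a client-side scan would sit in an end-to-end encrypted messaging pipeline. Everything in it is a hypothetical placeholder (the fingerprint database, the encryption and reporting functions are all invented for illustration), and real proposals, including Apple’s, use far more elaborate cryptographic matching schemes.

```python
# Hypothetical sketch of where client-side scanning sits in an E2EE
# messaging flow. All names here (KNOWN_CSAM_FINGERPRINTS,
# encrypt_for_recipient, report_match) are illustrative placeholders,
# not any vendor's real API.
import hashlib

# In a real deployment this would be an opaque database of perceptual
# fingerprints supplied by a clearinghouse, not plain SHA-256 digests.
KNOWN_CSAM_FINGERPRINTS: set[str] = {"<fingerprint-1>", "<fingerprint-2>"}

def encrypt_for_recipient(data: bytes) -> bytes:
    """Stand-in for the messenger's end-to-end encryption step."""
    return data  # placeholder; a real client would use e.g. the Signal protocol

def report_match(fingerprint: str) -> None:
    """Stand-in for the reporting channel back to the provider."""
    print(f"match reported: {fingerprint}")

def send_photo(photo: bytes) -> bytes:
    # The crux of the debate: the scan runs *before* encryption, on the
    # user's own device, while the content is still in the clear, even
    # though the transport itself remains end-to-end encrypted.
    fingerprint = hashlib.sha256(photo).hexdigest()
    if fingerprint in KNOWN_CSAM_FINGERPRINTS:
        report_match(fingerprint)
    return encrypt_for_recipient(photo)
```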

For tech companies, detecting CSAM on their platforms and scanning some communications is nothing new. Companies operating in the US are required to report any CSAM they find, or that users report to them, to the National Center for Missing and Exploited Children (NCMEC), a US-based nonprofit. More than 29 million reports, containing 39 million images and 44 million videos, were submitted to NCMEC last year alone. Under the new EU rules, the proposed EU center will receive CSAM reports from tech companies.

“A lot of companies are not detecting today,” Johansson said at a press conference introducing the legislation. “This is not a proposal on encryption, this is a proposal on child sexual abuse material,” she said, adding that the law is “not about reading communications” but about detecting illegal abuse content.

Currently, tech companies find CSAM online in different ways, and the amount of CSAM found is growing as companies get better at detecting and reporting abuse, though some are far better than others. In some cases, AI is used to track down previously unseen CSAM. Duplicates of existing abuse photos and videos can be detected using “hashing systems,” in which abuse content is given a digital fingerprint that can be spotted when it is uploaded to the web again. More than 200 companies, from Google to Apple, use Microsoft’s PhotoDNA hashing system to scan millions of files shared online. To do this, however, systems need access to the messages and files people are sending, which is not possible when end-to-end encryption is in place.
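
To make the idea concrete, here is a minimal sketch of one well-known perceptual-hashing technique, the “average hash.” It is a toy stand-in: PhotoDNA’s actual algorithm is proprietary, and the threshold below is an arbitrary illustrative choice, but the underlying principle of fingerprinting an image and comparing fingerprints by Hamming distance is the same.

```python
# A minimal "average hash": a toy stand-in for industrial systems like
# Microsoft's PhotoDNA, whose actual algorithm is proprietary. The idea
# is the same: reduce an image to a compact fingerprint that survives
# small edits, then compare fingerprints by Hamming distance.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))  # grayscale thumbnail
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:  # one bit per pixel: brighter or darker than average
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Two images are likely duplicates if only a few of the 64 bits differ.
# The threshold of 5 is an arbitrary illustrative choice.
def is_probable_duplicate(path_a: str, path_b: str, threshold: int = 5) -> bool:
    return hamming(average_hash(path_a), average_hash(path_b)) <= threshold
```

Because such hashes can only flag content that has already been fingerprinted, previously unseen material still requires other techniques, such as the AI classifiers mentioned above.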

“On top of detecting CSAM, there will be obligations to detect the solicitation of children (‘grooming’), which can only mean that conversations will need to be read 24/7,” says Diego Naranjo, head of policy at the civil liberties group European Digital Rights (EDRi). “This is a disaster for the confidentiality of communications. Companies will be required (via detection orders) or incentivized (via risk mitigation measures) to offer less secure services for everyone if they want to comply with these obligations.”

Debates about protecting children online, and about how that can be done when end-to-end encryption is in place, are highly complex, technical, and entangled with the horrors of crimes against vulnerable young people. Research published in 2020 by UNICEF, the UN’s children’s fund, says encryption is necessary to protect people’s privacy, including children’s, but adds that it “hampers” efforts to remove abusive content and identify who is sharing it. For years, law enforcement agencies around the world have pushed for ways to bypass or weaken encryption. “I’m not saying privacy at any cost, and I think we can all agree that child abuse is abhorrent,” Woodward says, “but there needs to be a healthy, public, and fair discussion about whether the risks that might arise are worth the real efficacy in fighting child abuse.”

Researchers and technology companies are increasingly focusing on safety tools that can exist alongside end-to-end encryption. Proposals include using the metadata of encrypted messages (the who, how, what, and why of messages, rather than their content) to analyze people’s behavior and potentially spot criminality. One recent report by the nonprofit Business for Social Responsibility (BSR), commissioned by Meta, found that end-to-end encryption is an overwhelmingly positive force for people’s human rights. It offered 45 recommendations for how encryption and safety can go together without requiring access to people’s communications. When the report was published in April, Lindsey Andersen, BSR’s associate director for human rights, told WIRED: “Contrary to popular belief, there actually is a lot that can be done even without access to messages.”
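
As a rough illustration of what a metadata-only signal could look like, the sketch below scores an account using traffic patterns alone, never message content. Every feature, weight, and threshold here is hypothetical, invented for this example rather than taken from any deployed system or from the BSR report.

```python
# Toy illustration of content-free, metadata-only risk signals.
# All features, weights, and thresholds are hypothetical, invented
# for this example, not drawn from any deployed system.
from dataclasses import dataclass

@dataclass
class AccountMetadata:
    account_age_days: int           # "who": how new is the sender?
    distinct_recipients_7d: int     # "who": fan-out to many strangers?
    median_msgs_per_thread: float   # "how": short bursts vs. conversations
    share_of_outbound_media: float  # "what": fraction of messages with attachments

def risk_score(m: AccountMetadata) -> float:
    score = 0.0
    if m.account_age_days < 30:
        score += 1.0
    if m.distinct_recipients_7d > 100:
        score += 1.5
    if m.median_msgs_per_thread < 2:
        score += 0.5
    if m.share_of_outbound_media > 0.8:
        score += 1.0
    return score

# Accounts above some threshold might be surfaced for human review.
# Note that nothing here ever touches the encrypted message bodies.
flagged = risk_score(AccountMetadata(10, 250, 1.0, 0.9)) > 2.0
```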
