2 Visions Clash Over How to Fight Online Child Abuse in Europe
Except on Tuesdays when she’s in the Dutch Senate, Arda Gerkens spends her time helping tech companies delete child sexual abuse material buried in their platforms. For seven years, the senator has run the nonprofit foundation Online Child Abuse Expert Agency, known by its Dutch acronym EOKM. With her 20-person team, Gerkens offers judgment-free advice and free automated detection tools to businesses, from image-hosting to file-hosting sites. Most companies want to keep their networks clean, she says. “They just need help.”
Lawmakers in the European Union, however, have lost patience with this coaxing approach and say platforms have failed to tackle this problem voluntarily. This week, the European Commission’s home affairs department put forward new rules that would enable courts to force tech companies to scan their users’ images, videos, and texts in search of child abuse or grooming. But the proposed law has no exemptions, meaning encrypted services like WhatsApp and Telegram could be forced to scan their users’ private messages.
Companies in the Netherlands host more child sexual abuse content than those in any other EU country. But Gerkens thinks the Commission’s proposal goes too far. She likes the idea of a central European Center to coordinate the crackdown. But she’s worried that scanning any platform for text would risk too many posts being flagged by mistake and that forcing encrypted services to scan private messages would compromise the security of some of the most secure spaces on the internet.
Encrypted messengers protect children as well as adults, she says. Every year, EOKM’s help line receives several pleas from minors who have been blackmailed into creating and sending explicit images after their non-encrypted social media accounts were hacked. Gerkens is worried that breaking encryption would make these cases more common. “[If] you have a backdoor into encryption, it works both ways,” she says.
The debate over encrypted spaces exposes a deep rift in Europe over how to crack down on a problem that is only getting worse. Every year investigators find more child sexual abuse material online than the year before. From 2020 to 2021, the British nonprofit Internet Watch Foundation recorded a more than 60 percent jump in this type of content. The urgent need to address this growing problem has created extra tension in what is already a bitter debate that hinges on one question: Is it disproportionate to scan everybody’s private messages to root out child sexual abuse?
“If you want to search somebody’s house as a police officer, you can’t just go and do that willy-nilly; you need good grounds to suspect [them], and in the online environment it should be exactly the same,” says Ella Jakubowska, a policy adviser at the Brussels-based digital rights group European Digital Rights.
Others see scanning tools differently. This technology operates more like a police dog in an airport, argues Yiota Souras, senior vice president and general counsel at the US National Center for Missing and Exploited Children. “That dog is not learning about what I have in my suitcase or communicating that in any way. It is alerting if it smells a bomb or drugs.”
via Wired https://ift.tt/PhHlI5Y
May 13, 2022 at 04:08AM