This article contains references to the sexual exploitation and abuse of children.
Eleven-year-old Iris was lonely and looking for friends when she started chatting with a man online. She thought he was finally someone she could trust. So when he threatened to cut off contact unless she sent him a nude photo of herself, Iris agreed – afraid of being left alone again.
The man then said that if Iris refused to send more photos, he would share them with her family.
When Iris, whose name has been changed to protect her identity, finally shared her story of sexual harassment and online abuse, she said: "For three years he did whatever he wanted, whenever he wanted."
Cracking down on child abuse
Her case and others like it are recorded on sheets of paper laid out on school desks outside the EU headquarters in Brussels. The desks are part of a rally by children's rights campaigners who are angry that the bloc's member states have once again delayed a decision on controversial online safety laws that would force tech companies to scan images, videos and links for evidence of child sexual abuse.
“In these three years of delay – that is, more than 1,200 days of negotiations – too many children have fallen into the hands of criminals,” Fabiola Bas Palomares, who leads policy work at the campaign group Eurochild, told DW.
Nearly two-thirds of all child sexual exploitation webpages discovered by the Internet Watch Foundation last year were found in an EU country, and more than 60 million images and videos linked to the sexual exploitation of minors were flagged online globally.
Privacy at risk?
If you scroll through TikTok or do a quick Google search, you'll immediately find claims that the EU is about to start reading your texts. That is not the case – but Brussels' proposal does test the limits of the bloc's existing privacy norms.
Germany is among the EU states refusing to support the planned laws.
German Justice Minister Stefanie Hubig said in a statement on October 8, “Private communications should never be the subject of general suspicion. Nor can the state force messenger services to massively scan messages for suspicious content before sending them.”
Under the latest proposal, authored by Denmark as current holder of the EU presidency, high-risk tech companies could be ordered to scan all links, images and videos shared on their platforms – though not the text of messages – and report suspected child sexual abuse material to law enforcement.
Encrypted content in headlines
Most controversially, these rules would also apply to content shared on messengers like WhatsApp, which use end-to-end encryption – a technical promise that a message can be read by no one other than its intended recipient.
The EU says the measures are necessary to catch predators exploiting the digital environment. When Facebook’s parent company Meta began encrypting some messages in 2023, it flagged 6.9 million fewer cases of suspected online child exploitation to the US watchdog than the previous year – although it remained the top incident reporter.
Encrypted messaging platform Signal has now threatened to leave the EU market, claiming that the EU's latest proposals would amount to "mass surveillance."
‘We cannot compromise’
While privacy campaigner Ella Jakubowska usually spends her days trying to rein in the power of big tech in Europe, this time she finds herself on the same side of the debate as the tech companies.
“This is an unprecedentedly undemocratic proposal,” Jakubowska, head of policy work at the nonprofit European Digital Rights, told DW.
"It's going to undermine the vital digital security protections that we all rely on day in and day out," she said. "And that's not a compromise we can make, even if it's in the service of something that has a really important purpose."
Jakubowska says she is tired of privacy and child protection being seen as an “either-or” choice.
It is a view echoed by Dorothy Hahn, 59, who coordinates a group of German survivors of childhood abuse campaigning against the EU proposals. Hahn fears that if the so-called chat control laws are approved, people will no longer feel safe sharing their stories or asking for help online.
"It would be like having a police officer standing next to you during a therapy session," she told DW by phone.
Technical difficulties
It's a concern that keeps coming up: legitimate or harmless communications, like parents sharing photos of their kids at the beach, could end up being flagged to law enforcement.
Experts are divided on whether the EU’s plan is technically feasible. Officials working on the latest draft stress that the same detection mechanisms that could be used to scan for child abuse are already being used to target malware.
"Technologically, the answers are there," Swedish center-left lawmaker Evin Incir told reporters at Monday's children's rights rally.
But cryptographer Bart Preneel of the Catholic University of Leuven is not convinced. He is one of some 800 scientists and researchers who signed an open letter earlier this month speaking out against the laws.
"It is already very difficult for humans to distinguish between CSAM and legitimate content," Preneel told DW over a video call. "We are very skeptical that AI can learn this. We believe there is no technology – and there will be no technology in the next 10 years – that can do that."
Preneel also warned that the technology platforms would use to scan files could leave users and authorities more vulnerable to hackers.
Legal vacuum looms
Children’s rights campaigner Fabiola Palomares is also fed up with framing “privacy versus child protection” as a binary choice.
"If we don't have technology that is secure enough, that is privacy-preserving, then let's focus on developing that technology rather than questioning the legal framework behind it," she told DW.
"Without legal incentives, companies will not invest in developing the technology and the capability they really need to actively seek out this material," she said.
“If we don’t pass a regulation that covers end-to-end encrypted environments, the burden will once again be put on the victim – put on the child – to be able to report something, rather than being on a platform that is facilitating that crime in some way,” she said.
Palomares also warned that the stalemate could create a legal vacuum for Europe.
For now, tech companies can choose to voluntarily search for and flag questionable content without falling foul of broader EU privacy rules, thanks to a temporary exemption. That exemption is due to expire next April, increasing the pressure on member states to decide.
What now for Europe?
EU member Denmark, which holds the bloc’s rotating presidency until 2026, has been in favor of stricter laws and now faces the challenge of mustering enough support among more skeptical states such as Germany.
Attempts to satisfy privacy concerns by restricting which content platforms must scan have so far failed to get Berlin on board, but Copenhagen may try again later this year.
Even if the laws eventually gain sufficient support, they will still take several years to be implemented – with parliamentary negotiations set to be the next step in the process.
Edited by: Rob Mudge