Pictures of abused children flood parts of the internet. The number of recorded depictions of sexual abuse in Germany more than doubled year-on-year in 2021, to nearly 40,000 cases. “Europe has now become a hub for the trade in depictions of abuse,” the federal government’s abuse commissioner, Kerstin Claus, told the German press agency dpa. Looking at the rise in cases, the question arises “whether we can still do anything to counter the gigantic quantities on offer on the internet”.
The European Commission intends to try: in the middle of the week it is expected to present a bill to combat depictions of sexual abuse on the internet. But to what extent does the good cause justify interfering in citizens’ private communication?
The current temporary solution expires after three years
Until December 2020, Facebook, Google and co. voluntarily scanned their users’ private messages for depictions of abuse. They searched for images already known from earlier investigations, each tagged with a kind of digital fingerprint, a so-called hash. Hits were reported to the US National Center for Missing and Exploited Children (NCMEC), where they were reviewed and, if necessary, forwarded to authorities such as Germany’s Federal Criminal Police Office (BKA). From the end of 2020, however, there was temporarily no legal basis for this in the EU. According to NCMEC, the number of tips dropped by 58 percent.
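The matching described here can be sketched in a few lines. This is a simplified illustration, not any provider’s actual implementation: the function name and the hash list are invented for the example, and real systems such as Microsoft’s PhotoDNA use perceptual hashes that tolerate re-encoding and resizing, whereas a cryptographic hash like SHA-256 only matches byte-identical files.

```python
import hashlib

# Hypothetical list of fingerprints of known abuse images.
# (Illustrative only; deployed systems use perceptual hashes,
# not cryptographic ones.)
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def scan_attachment(data: bytes) -> bool:
    """Return True if the attachment's fingerprint is on the known list."""
    return hashlib.sha256(data).hexdigest() in known_hashes

print(scan_attachment(b"known-image-bytes"))    # on the list -> would be reported
print(scan_attachment(b"holiday-photo-bytes"))  # unknown -> passes through
```

The key property is that only fingerprints of previously identified material are compared; the message content itself is not interpreted, which is why this approach cannot find new material or grooming.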
That is why the EU countries and the European Parliament agreed on a temporary solution just over a year ago, which expires after three years at the latest. Since then, the platforms have again been allowed to scan their users’ messages for hashes. The rules now also cover the reporting of so-called grooming, meaning adults approaching children on the internet. EU Home Affairs Commissioner Ylva Johansson wants to propose a permanent solution this week.
Companies could face an obligation to scan
The details of the proposal are still unclear, but Johansson signaled the direction long ago. She will propose a law that “obliges companies to identify, report and eliminate child sexual abuse,” she told Welt am Sonntag in January. Among other things, it remains open whether this obligation will be limited to known depictions. The detection of grooming may also become mandatory in one form or another. The Commission is also likely to propose setting up an EU center to combat child abuse. The EU countries and the European Parliament will then negotiate the proposals.
“Chat control would be mass surveillance without cause”
Civil rights activists are concerned. In March, 47 organisations sent an urgent letter of protest (PDF) to EU Commission President Ursula von der Leyen and Home Affairs Commissioner Johansson. Among the signatories was the association Digitale Gesellschaft, to which Tom Jennissen belongs. He warns that in future, every message sent via WhatsApp could be scanned by the companies. This is a “very massive and disproportionate interference in communication” that runs counter to the rule of law, he told dpa.
Jennissen fears that encrypted communication itself could be undermined on the basis of a blanket suspicion. Johansson, for her part, has already made clear what weighs more heavily for her: of course data protection and encryption are important, she told Welt am Sonntag. “But the focus must be on protecting children first and foremost.” Instead of a law that could well be overturned by a court, what is needed is more prevention and better resources for the authorities, Jennissen argues.
FDP MEP Moritz Körner likewise stresses that the fight against child pornography must not be misused as an excuse “to justify an unprecedented destruction of our privacy”. “Chat control would be mass surveillance without cause.” Körner is also calling for better resources for the police and the EU agency Europol, and for more cooperation between EU countries.
Proponents call for a comprehensive filtering obligation
The US child protection foundation Thorn, for example, is pushing for a comprehensive filtering obligation. Thorn develops its own filters that find not only known abuse material but also new material. The foundation is also working on a tool to detect grooming. “Companies must be legally authorized to use targeted digital technologies to stop the viral spread of child sexual abuse material on their platforms,” Thorn said.
Abuse Commissioner Claus welcomes the fact that the European Commission’s proposal would create a binding legal framework for exchange and cooperation between EU states. The sheer number of reported cases has had law enforcement agencies working at their breaking point for years. An EU center for combating child abuse could “pre-sort reports on child pornography, for example, and then distribute them to the relevant EU countries for prosecution”. “This would not only relieve the member states, it would also make the work more efficient, speed up prosecution and thus allow more cases to be successfully concluded in the future.”
Even the German Child Protection Association is critical of intervening in encrypted messages. “Encrypted communication hardly plays a role in the spread of depictions of abuse,” its federal board member Joachim Türk told dpa. “We therefore consider indiscriminate scans of encrypted communications to be disproportionate and ineffective.” What happens after the European Commission’s proposal will also depend on the German federal government. The SPD, Greens and FDP promise “a right to encryption” in their coalition agreement.