EU watchdog questions secrecy around lawmakers’ encryption-breaking CSAM scanning proposal
The European Commission has again been urged to more fully disclose its dealings with private technology companies and other stakeholders in relation to a controversial piece of tech policy that could lead to a law mandating the scanning of European Union citizens' private messages in a bid to detect child sexual abuse material (CSAM).
The issue is notable because concerns have been raised that lobbying by the tech industry influenced the Commission's drafting of the controversial CSAM-scanning proposal. Some of the withheld information relates to correspondence between the EU and private firms that could be potential suppliers of CSAM-scanning technology, meaning they stand to gain commercially from any pan-EU law mandating message scanning.
The preliminary finding of maladministration by the EU's ombudsman, Emily O'Reilly, was reached on Friday and published on her office's website yesterday. Back in January, the ombudsman came to a similar conclusion and invited the Commission to respond to her concerns. Her latest findings factor in the EU executive's responses and invite the Commission to reply to her recommendations with a "detailed opinion" by July 26, so the saga isn't over yet.
The draft CSAM-scanning legislation, meanwhile, remains on the table with EU co-legislators, despite a warning from the Council's own legal service that the proposed approach is unlawful. The European Data Protection Supervisor and civil society groups have also warned the proposal represents a tipping point for democratic rights in the EU. Back in October, lawmakers in the European Parliament who oppose the Commission's direction of travel proposed a substantially revised draft that aims to limit the scope of the scanning. But the ball is in the Council's court, as Member States' governments have yet to settle on their own negotiating position for the file.
In spite of growing alarm and opposition across a number of EU institutions, the Commission has continued to stand behind the controversial CSAM detection orders, ignoring warnings from critics that the law could force platforms to deploy client-side scanning, with dire implications for European web users' privacy and security.
An ongoing lack of transparency around the EU executive's decision-making process when it drafted the contentious legislation hardly helps, fueling concerns that self-interested commercial players may have had a role in shaping the original proposal.
Since December, the EU’s ombudsman has been considering a complaint by a journalist who sought access to documents pertaining to the CSAM regulation and the EU’s “associated decision-making process”.
After reviewing information the Commission withheld, along with its defence for the non-disclosure, the ombudsman remains largely unimpressed with the level of transparency on show.
The Commission released some data following the journalist’s request for public access but withheld 28 documents entirely and, in the case of a further five, partially redacted the information — citing a range of exemptions to deny disclosure, including public interest as regards public security; the need to protect personal data; the need to protect commercial interests; the need to protect legal advice; and the need to protect its decision-making.
According to information released by the ombudsman, five of the documents linked to the complaint pertain to "exchanges with interest representatives from the technology industry". It does not list which companies were corresponding with the Commission, but U.S.-based Thorn, a maker of AI-based child safety tech, was linked to lobbying on the file in an investigative report by Balkan Insight last September.
Other documents in the bundle that were either withheld or redacted by the Commission include drafts of the impact assessment it produced when preparing the legislation, and comments from its legal service.
When it comes to information pertaining to the EU's correspondence with tech companies, the ombudsman questions many of the Commission's justifications for withholding the data. In the case of one such document, for example, she finds that while the EU's decision to redact details of the information exchanged between law enforcement and a number of unnamed companies may be justified on public security grounds, there is no clear reason for it to withhold the names of the companies themselves.
“It is not readily clear how disclosure of the names of the companies concerned could possibly undermine public security, if the information exchanged between the companies and law enforcement has been redacted,” wrote the ombudsman.
In another instance, the ombudsman takes issue with the Commission's apparently selective release of information pertaining to input from tech industry reps, writing: "From the very general reasons for non-disclosure the Commission provided in its confirmatory decision, it is not clear why it considered the withheld 'preliminary options' to be more sensitive than those that it had decided to disclose to the complainant."
The ombudsman's conclusion at this point of the investigation repeats her earlier finding of maladministration against the Commission for its refusal to give "wide public access" to the 33 documents. In her recommendation, O'Reilly also writes: "The European Commission should re-consider its position on the access request with a view to providing significantly increased access, taking into account the Ombudsman's considerations shared in this recommendation."
The Commission was contacted about the ombudsman's latest findings on the complaint but had not provided a response by press time.