
Meta faces more questions in Europe about child safety risks on Instagram

Meta has received another formal request for information (RFI) from European Union regulators seeking more details of its response to child safety concerns on Instagram — including what it’s doing to tackle risks related to the sharing of self-generated child sexual abuse material (SG-CSAM) on the social network.

The request is being made under the bloc’s recently rebooted online rulebook, the Digital Services Act (DSA), which began applying to larger in-scope platforms (including Instagram) in late August.

The DSA puts obligations on Big Tech to tackle illegal content — including by having measures and protections in place to prevent misuse of their services. The regulation also has a strong focus on the protection of minors, so it’s not surprising that a number of the European Commission’s early RFIs concern child safety.

The latest Commission request to Meta comes hard on the heels of a report by the WSJ that suggests Instagram is struggling to clean up a CSAM problem the newspaper exposed this summer, when it reported that Instagram’s algorithms were connecting a web of accounts being used to make, buy and trade underage-sex content.

In June, following the WSJ’s exposé, the EU warned Meta it faces a risk of “heavy sanctions” if it doesn’t act quickly to tackle the child protection issues.

Now, months later, another report by the WSJ claims Meta has failed to rectify the issues identified — despite the company setting up a child safety task force to try to stop “its own systems from enabling and even promoting a vast network of pedophile accounts”, as the newspaper puts it.

“Five months later, tests conducted by the Journal as well as by the Canadian Centre for Child Protection show that Meta’s recommendation systems still promote such content [i.e. accounts dedicated to producing and sharing underage-sex content],” it reports. “The company has taken down hashtags related to pedophilia but its systems sometimes recommend new ones with minor variations. Even when Meta is alerted to problem accounts and user groups, it has been spotty in removing them.”

A spotty performance by Meta in tackling the sharing of illegal CSAM/SG-CSAM, and a failure to act effectively on associated child safety risks, could get very expensive for the company in the EU: The DSA empowers the Commission to issue fines of up to 6% of global annual turnover if it finds the regulation’s rules have been broken.

Already, just over a year ago, Meta was fined just under half a billion dollars after Instagram was found to have violated the bloc’s data protection rules for minors.

“The Commission is requesting Meta to provide additional information on the measures it has taken to comply with its obligations to assess risks and take effective mitigation measures linked to the protection of minors, including regarding the circulation of SG-CSAM on Instagram. Information is also requested about Instagram’s recommender system and amplification of potentially harmful content,” the EU wrote in a press release today, announcing its latest intel-gathering step on the platform.

As well as the possibility of financial sanctions, there could be reputational concerns for Meta if EU regulators are repeatedly seen questioning its approach to safeguarding minors.

This is the third RFI Meta has received since DSA compliance requirements began to apply to the company — and the second to focus on child safety on Instagram. (The EU has also asked Meta for more details of its handling of content risks related to the Israel-Hamas war, and about what it’s doing to ensure election security.)

So far the EU hasn’t announced any formal investigation proceedings under the DSA. But the early flurry of RFIs shows it’s busy making assessments that could lead to such a step — opening up the risk of penalties down the line should any breaches be confirmed.

Meta has been given a deadline of December 22 to provide the Commission with the latest tranche of requested child safety data. Failure to comply with RFIs — such as by sending incorrect, incomplete or misleading information in response to a request — can also attract DSA sanctions.

Meta was contacted for comment on the latest RFI.

