EU to Zuckerberg: Explain yourself over Instagram pedophile network
'Meta’s voluntary code on child protection seems not to work,' says EU Commissioner Thierry Breton.
EU Internal Market Commissioner Thierry Breton wants Meta CEO Mark Zuckerberg to explain himself and take “immediate” action over a large pedophile network on Instagram that was recently exposed.
Instagram has been letting a vast network of accounts promoting and purchasing child sexual abuse material flourish on its platform, according to investigations by the Wall Street Journal and researchers released on June 7. The social media platform lets users search for explicit hashtags, and offenders exploit its recommendation algorithms to promote illicit content.
“Meta’s voluntary code on child protection seems not to work,” Breton wrote Thursday on Twitter. “Mark Zuckerberg must now explain & take immediate action.”
Breton said he will discuss the issue with Zuckerberg at the Meta headquarters on June 23 during a trip to the U.S. The politician will travel later this month to see how social media companies including Twitter are preparing to comply with the EU’s flagship content moderation law, the Digital Services Act (DSA).
He said Meta will have to “demonstrate measures” to the European Commission after August 25, when the DSA starts applying to Big Tech platforms. Otherwise, the company could face sweeping fines of up to 6 percent of its global annual revenue. Under the DSA, platforms have to crack down on illegal content and ensure children are safe on their services. Companies also have to assess and limit how their platforms and algorithms contribute to major societal problems such as the dissemination of illegal content and risks to the protection of minors.
A Meta spokesperson said the company has set up an internal task force to investigate and “immediately address” the recent findings from the Wall Street Journal and researchers.
The company works “aggressively to fight” child exploitation and supports law enforcement in tracking down criminals, the spokesperson said. Meta dismantled 27 “abusive networks” between 2020 and 2022 and disabled over 490,000 accounts for violating its child safety policies in January 2023, they added.