The European Commission (EC) sent X a formal request for information on Wednesday, following the spread of disinformation related to violence in Israel that is potentially illegal under European Union law.
The Digital Services Act (DSA), which went into effect in July, is a foundation of the European Union's digital strategy — outlining what the commission calls “an unprecedented new standard” for holding large platforms accountable for disseminating disinformation and hate speech, among other things. The EC said its formal request for information in this case focuses on the spread of illegal content — including terrorist and violent content and hate speech — as well as disinformation.
European Commissioner Thierry Breton sent a letter signaling official action to X owner Elon Musk on Tuesday, warning that the DSA requires large online platforms like X to remove illegal content quickly.
"Given the urgency, I also expect you to be in contact with the relevant law enforcement authorities and Europol, and ensure that you respond promptly to their requests," Breton wrote.
He also urged Musk to “ensure a prompt, accurate and complete response to this request within the next 24 hours.”
Reports of violent and terrorizing content being spread across social media have captured the attention of U.S. leaders as well. House Energy and Commerce Committee Ranking Member Frank Pallone, Jr. (D-NJ) released a statement Thursday calling on X, Meta and YouTube to better enforce their terms of service.
“This threat must be taken seriously, right now, so that social media platforms aren’t used to spread or broadcast graphic acts of violence and terrorism,” Pallone’s statement said.
He called reports that Hamas has hacked into some victims’ social media accounts to spread terrorist content “particularly heinous, and platforms must act swiftly and decisively to stamp out these abuses.”
In one such instance, an Israeli girl discovered her grandmother had been slain after a militant broadcast the killing in a livestream from the grandmother’s Facebook account. Meta did not reply to a request for comment.
Pallone said the violent and false information saturating social media sites highlights the need for platforms to maintain a “robust and fully supported content moderation staff.”
Musk has reportedly made deep cuts to the platform’s trust and safety team, which had been charged with overseeing global content moderation.
X did not reply to a request for comment.
The EC press release issued Thursday noted that as a “very large online platform” — defined by the commission as platforms or online services with more than 45 million users a month in the EU — X is required to comply with DSA provisions, including by assessing and mitigating risks related to the dissemination of illegal content and disinformation.
The commission is now investigating X's compliance with the DSA, specifically regarding policies and practices around illegal content, how complaints are managed, and how risks are considered and responded to, the press release said.
The EC gave X a deadline of October 18 to provide information on questions “related to the activation and functioning of X's crisis response protocol.” Additional information is due by October 31.
The commission will assess X’s replies and could levy fines based on what it finds, the release said.
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.