The European Commission has begun formal proceedings investigating whether X, formerly known as Twitter, has violated European rules on illegal content, content manipulation and data transparency.
The commission said Monday that under the once-in-a-generation Digital Services Act (DSA), it will look into X’s activities on risk management; content moderation; so-called dark patterns in its user interface; advertising transparency; and data access for researchers.
The commission said it decided to open a formal probe based on the substance of a “risk assessment report” X submitted in September, as well as the company’s recent transparency reports and its replies to formal requests for information specifically focused on how it allegedly allowed illegal content about Hamas' terrorist attacks against Israel to spread.
The investigation will focus on X’s compliance with DSA rules requiring it to stop spreading illegal content in the European Union (EU), with special attention paid to X’s “risk assessment and mitigation measures.” The commission also will study the company’s “notice and action mechanisms” for tracking illegal content in the EU, with an eye on X’s content moderation resources, the press release said.
Illegal content varies from country to country in the EU, but at the top level it includes categories such as terrorist content, hate speech or child sexual abuse material.
X’s efforts to squash information manipulation on the platform, along with its Community Notes system in the EU, will be assessed to determine if they are effective, particularly in terms of how they and related policies impact “risks to civic discourse and electoral processes.”
The EU also will home in on whether X’s efforts to improve the transparency of its platform have been effective.
“The investigation concerns suspected shortcomings in giving researchers access to X's publicly accessible data … as well as shortcomings in X's ads repository,” the press release said.
The commission also said it will investigate X for a potentially “deceptive design of the user interface,” citing the checkmarks accompanying subscriptions.
When asked for comment, an X spokesperson pointed to a post on its safety team’s account, saying the company “remains committed to complying with the Digital Services Act and is cooperating with the regulatory process.”
“X is focused on creating a safe and inclusive environment for all users on our platform, while protecting freedom of expression, and we will continue to work tirelessly towards this goal,” the statement added.
“Today’s opening of formal proceedings against X makes it clear that, with the DSA, the time of big online platforms behaving like they are ‘too big to care’ has come to an end,” Thierry Breton, the Commissioner for Internal Market, the division bringing the case, said in a statement. “We will make full use of our toolbox to protect our citizens and democracies.”
In October, when the commission first warned X and other social media platforms that they were potentially violating the DSA, experts said the action was poorly executed and came too soon for a commission still setting itself up.
The DSA is seen as a watershed law that will attempt to restructure the internet. As the conflict between Israel and Hamas unfolded amid a troubling and vast amount of disinformation, some experts said Breton acted too quickly in an effort to show muscle and demonstrate his new powers.
While well-intentioned, Breton overreached given the brand-new DSA’s lack of structure, according to Rose Jackson, a disinformation scholar and co-author of a recent Atlantic Council Digital Forensic Research Lab (DFRLab) report.
The fledgling DSA will not take full effect until February 2024.
The conflict between Israel and Hamas is forcing “the European Union to try to figure out how to have a test run of the DSA before they're really ready,” Jackson said in an interview with Recorded Future News at the time.
Now that proceedings have formally opened in the case against X, the commission will continue collecting evidence and could pursue further enforcement actions, such as interim measures and non-compliance decisions, the press release said.
There is no legal deadline for the formal proceedings to conclude.
Suzanne Smalley is a reporter covering privacy, disinformation and cybersecurity policy for The Record. She was previously a cybersecurity reporter at CyberScoop and Reuters. Earlier in her career Suzanne covered the Boston Police Department for the Boston Globe and two presidential campaign cycles for Newsweek. She lives in Washington with her husband and three children.