The European Union is investigating the X social media app, formerly known as Twitter, over its content moderation practices amid the ongoing Israel–Hamas conflict.
Under existing EU laws, illegal online content spans several categories, including content that incites or otherwise contributes to terrorism, hate speech, and incitement to violence. The European Commission did not specify exactly what content appearing on X after the Hamas attacks might qualify as illegal, but images of the attacks are known to have been captured and shared on the platform.
The European Commission announced that its investigation will also delve into whether X's Community Notes feature has been an effective tool in combatting information manipulation on the platform.
Margrethe Vestager, the executive vice president of the European Commission for A Europe Fit for the Digital Age, said the commission has enough evidence "to formally open a proceeding against X."
In April, the European Commission named the X social media platform as one of 19 "very large online platforms" (VLOPs) required to comply with the new EU law, the Digital Services Act (DSA), and compliance requirements went into effect in August.
"The higher the risk large platforms pose to our society, the more specific the requirements of the Digital Services Act are. We take any breach of our rules very seriously," Ms. Vestager said.
The investigation marks the first time the European Commission has initiated investigative proceedings under the DSA, which was enacted in October 2022.
Musk Has Faced Past Warnings
The European Commission's decision to launch the investigation into X comes as the platform's owner, Elon Musk, has pushed back on the EU's content moderation requests. Following Mr. Musk's decision to pull X from the EU disinformation agreement, EU Internal Market Commissioner Thierry Breton wrote a warning post on X that the platform still must follow EU content moderation policies.
"You can run but you can’t hide," Mr. Breton's May 26 warning states. "Beyond voluntary commitments, fighting disinformation will be legal obligation under #DSA as of August 25. Our teams will be ready for enforcement."
On Oct. 10, Mr. Breton once again called on Mr. Musk to account for his platform's content moderation policies.
"You need to be very transparent and clear on what content is permitted under your terms and consistently and diligently enforce your own policies," Mr. Breton's Oct. 10 letter to Mr. Musk reads. "This is particularly relevant when it comes to violent and terrorist content that appears to circulate on your platform. Your latest changes in public interest policies that occurred over night left many European users uncertain."
X CEO Linda Yaccarino responded to Mr. Breton in an Oct. 11 letter, insisting X had taken action to stop illegal content from spreading on the platform. As of Oct. 14, Ms. Yaccarino said X had identified and suspended hundreds of Hamas-affiliated accounts, handled more than 80 law enforcement requests to remove content, and applied its Community Notes fact-checking feature to more than 700 unique posts and thousands of reposts related to the Oct. 7 attacks.
"X is committed to serving the public conversation, especially in critical moments like this and understands the importance of addressing any illegal content that may be disseminated through the platform," Ms. Yaccarino wrote in October. "There is no place on X for terrorist organizations or violent extremist groups and we continue to remove such accounts in real time, including proactive efforts."
