European Union Launches Formal Investigation Into Elon Musk's Platform X Over Spread of 'Illegal Content'

ICARO Media Group
19/12/2023 21h21

The European Union has opened a formal investigation into Elon Musk's social media platform X, examining its alleged role in spreading 'illegal content' related to the Israel-Hamas war. The investigation follows mounting criticism of X for hosting antisemitic material, which has cost the platform several prominent advertisers.

On Monday, the European Commission announced that it had opened formal proceedings to assess whether X may have violated the Digital Services Act (DSA) in several areas, including risk management, content moderation, dark patterns, advertising transparency, and data access for researchers.

The Digital Services Act broadly defines illegal content as any information that contravenes the laws of the European Union or of its member states. Regulators have not explicitly stated which laws X is believed to have breached, but they have singled out content related to the Israel-Hamas war as a key factor in the investigation.

This is not the first time Musk's X has faced criticism. In November, the White House admonished Musk for promoting "anti-semitic and racist hate" through his own X account. The platform says in its transparency report that it is committed to transparency and freedom of speech, but this laissez-faire approach to content moderation, combined with Musk's own inflammatory remarks, has prompted a significant exodus of advertisers, including Disney, IBM, and Apple.

While the United States has strong protections for freedom of speech, many European countries impose strict penalties on antisemitic speech. Germany's penal code, for instance, explicitly prohibits publicly denying the Holocaust and disseminating Nazi propaganda. Regulators investigating X may therefore find violations of German law in connection with the widespread amplification of messages that praise Hitler, feature swastikas, or contain other antisemitic content.

A recent investigation by Media Matters found that advertisements from Apple and IBM appeared alongside pro-Nazi messaging on X. Because X is designated a Very Large Online Platform (VLOP) under the DSA, European regulators will examine whether it has implemented reasonable measures to mitigate the amplification of illegal content. They will also scrutinize advertising on X, including which groups specific ads targeted. Musk has since filed a defamation lawsuit against Media Matters.

The investigation will also assess whether X has provided researchers with effective access to its platform data. Researchers and journalists have reported difficulty verifying statistics presented by X, such as the claim that only 2 of 500 million accounts saw antisemitic content next to Apple advertisements. The Center for Countering Digital Hate (CCDH), an organization that investigates hateful disinformation, asserts that X has failed to address the "unmistakable surge of extremism" on the platform.

Imran Ahmed, the founder of the CCDH, expressed concern over the situation, stating, "Since Elon Musk completed his takeover of Twitter - and especially since the October 7 atrocities carried out by Hamas in Israel - bad actors have been empowered and encouraged to spew antisemitism, lies, and hate with impunity." Musk sued the CCDH in August, accusing the organization of scaring off advertisers.

As the proceedings get underway, the European Union will determine whether X has fallen short of its obligations under the DSA. The outcome could have significant implications for the future of content moderation and accountability across online platforms.

