Elon Musk Under Fire for Reposting Manipulated Video Targeting Vice President Kamala Harris

ICARO Media Group
Politics
28/07/2024 21h27

In a controversial move, Elon Musk, the billionaire owner of the social media platform X, has come under scrutiny after reposting an edited campaign video featuring Vice President Kamala Harris. The video, which appears to have been digitally manipulated, changes the voice-over in a deceptive manner.

The altered video mimics Harris's voice but replaces her original words with statements claiming President Biden is senile, that she lacks the competence to run the country, and that she is the "ultimate diversity hire" as a woman and person of color. Images of former President Donald J. Trump and his running mate, Senator JD Vance of Ohio, have been removed, while images of President Biden have been added.

The reposted video on X does not contain a disclaimer, despite the original uploader @MrReaganUSA noting that it was a "parody." Musk, when reposting the video on his own account, made no such disclosure and simply commented, "This is amazing," accompanied by a laughing emoji. The post has since been viewed a staggering 98 million times, potentially conflicting with X's policies that prohibit the sharing of manipulated media that could mislead or confuse people.

Critics swiftly called attention to Musk's post, accusing it of violating X's policies on synthetic media and misleading identities. Others raised concerns about the platform tolerating such violations during an election year. Neither Musk nor the owner of the @MrReaganUSA account, who appears to be conservative podcast host Chris Kohls, has responded to the backlash.

The Harris campaign issued a statement condemning the reposted video, stating that the American people want the real freedom, opportunity, and security offered by Vice President Harris, not the "fake, manipulated lies" perpetuated by Musk and Trump.

The incident has reignited discussions on the dangers of deepfake videos. Pro-democracy groups have long warned about the negative impact of digitally manipulated content that spreads false information and potentially influences voter behavior. Instances like the use of AI-generated robocalls mimicking President Biden's voice during the New Hampshire primary have raised concerns about voter suppression and fraudulent misrepresentation.

The Federal Election Campaign Act, enacted in 1971, does not clearly address modern technologies such as artificial intelligence. Efforts have been made to amend the law to clarify its application to deceptive AI campaign advertisements, but opposition from the Republican National Committee, which cites First Amendment concerns, has stalled progress.

Social media platforms have implemented their own policies to address manipulated media. Meta, the parent company of Facebook and Instagram, requires clear labeling and contextual information for manipulated media. Google has also enacted a policy that mandates disclosure for videos made with altered or synthetic media, including generative AI. X's current policy, implemented in April 2023, defines misleading media as content that is significantly altered, manipulated, or fabricated, and it must be labeled or removed.

As the most-followed account on X, with 191 million followers, Musk carries considerable weight on the platform. This is not his first brush with election-year controversy: he endorsed Trump shortly after an assassination attempt on the presumptive Republican nominee at a campaign rally.

The reposted video has sparked debate among X users, with some calling for Community Notes to be added to Musk's post to warn the public that the AI-generated content could blur the line between reality and fabrication.

Ultimately, this incident highlights the urgent need for clearer regulations and guidelines regarding the use of deepfake technology in political campaigns, ensuring the preservation of the democratic process and protecting voters from deceptive content.

The views expressed in this article do not reflect the opinion of ICARO, or any of its affiliates.
