Consultant Faces Massive Fine and Criminal Charges for Impersonating President Biden in AI Robocall
ICARO Media Group
In a shocking turn of events, a Democratic political consultant is facing severe consequences for orchestrating a robocall that used artificial intelligence to impersonate President Biden. Steve Kramer, the mastermind behind the stunt, has been hit with a proposed $6 million fine from the Federal Communications Commission (FCC) and has been indicted on criminal charges in four New Hampshire counties.
The controversial robocall occurred on the eve of New Hampshire's January primary, targeting thousands of individuals. Recipients were greeted by an AI-generated voice that convincingly imitated President Biden, discouraging Democrats from casting their votes. Kramer has openly admitted to commissioning the fake call, leading to widespread outrage.
New Hampshire Attorney General John Formella, in a statement announcing the charges, expressed hope that their enforcement actions will serve as a deterrent to any potential election interference, be it through artificial intelligence or any other means.
The fabricated audio, commonly referred to as a deepfake, highlights growing concerns surrounding the rapid proliferation of generative artificial intelligence. Capable of producing realistic audio, video, images, and text, such technology has opened new avenues for fraud, scams, and manipulation.
FCC Chair Jessica Rosenworcel expressed her unease regarding the New Hampshire robocall, emphasizing how callers who sound familiar can easily deceive unsuspecting individuals. She stated, "This is unnerving because when a caller sounds like a politician you know, a celebrity you like, or a family member who is familiar, any one of us could be tricked into believing something that is not true with calls using AI technology."
In response to this incident, the FCC unequivocally ruled that using AI-generated voices in robocalls is illegal. The announcement of the proposed fine highlights that Kramer violated federal law by spoofing the number of a local political figure. Kramer will have the opportunity to respond and provide evidence before the FCC reaches a final decision.
The FCC also proposed a $2 million fine against Lingo Telecom, the company accused of transmitting the fraudulent robocall. Neither Kramer nor Lingo has responded to requests for comment thus far.
Kramer claims that he created the deepfake as a warning about the dangers posed by AI technology. Nevertheless, his actions have brought attention to the urgent need for regulations to curb the misuse of AI and ensure the integrity of elections.
As the case against Kramer unfolds, it serves as a grim reminder of the potential threats posed by the misuse of AI-generated content. With technology advancing at an unprecedented pace, lawmakers are faced with the daunting task of safeguarding the public from the dangers of AI manipulation.