Comedian George Carlin's Estate Settles Lawsuit Over AI Deepfake
ICARO Media Group
The estate of renowned comedian George Carlin has reached a settlement in a lawsuit against the creators of a comedy podcast that claimed to use artificial intelligence (AI) to mimic Carlin's voice. The case was one of the first in the United States to address the legality of deepfakes imitating a celebrity's likeness.
The podcast in question, Dudesy, was created by former Mad TV comedian Will Sasso and writer Chad Kultgen. As part of the settlement, the two have agreed to remove all versions of the podcast from the internet and to permanently refrain from using Carlin's voice, likeness, or image in any future content.
While the terms of the settlement were not disclosed by either side, both Carlin's family and an attorney for his estate have expressed satisfaction with the outcome. Kelly Carlin, the comedian's daughter, stated in a release, "I am pleased that this matter was resolved quickly and amicably, and I am grateful that the defendants acted responsibly by swiftly removing the video they made."
The lawsuit was initiated after the Dudesy podcast, which claimed to incorporate AI into its comedy routines, posted an hour-long special on YouTube titled "George Carlin: I'm Glad I'm Dead." Carlin's estate argued that this video violated his rights of publicity and copyright, referring to it as "a casual theft of a great American artist's work."
The special was introduced by an AI character named "Dudesy" who claimed to have watched Carlin's work and then created a stand-up set in the same style. Although it is unclear which parts of the fake Carlin set were AI-generated, Sasso's spokesperson Danielle Del asserted that the character was not AI-generated and that Kultgen wrote the entire fake Carlin special without relying on previous works.
Even though the podcast did not use Carlin's comedy to train an AI, an attorney for the estate argued that creating an impersonation using AI still violated Carlin's rights. The estate also underscored the potential harm of such deepfake videos: clips could be stripped of context and shared misleadingly online, deceiving viewers into believing they were authentic Carlin content.
The settlement serves as a reminder of the entertainment industry's delicate relationship with AI. The emergence of readily available generative AI tools has raised concerns among creators about unauthorized imitations of both living and deceased artists. Recent deepfakes of celebrities like Taylor Swift have intensified the calls for stricter regulations to prevent malicious or non-consensual use of this technology.
In a related development, over 200 musicians signed an open letter urging developers and tech companies to cease producing AI tools that could undermine the rights of artists and steal their likenesses. Additionally, several states, including Tennessee, have passed legislation to address the use of deepfake technology without an artist's consent.
While this case settled quickly, it sheds light on the potential for future legal disputes over whether AI-generated imitations can be considered permissible parodies under fair use. Some argue that there is a fundamental difference between humans impersonating public figures and generative AI tools creating similar impressions.
"There's a big difference between using an AI tool to impersonate someone and make it appear authentic, versus someone wearing a wig and a jacket," explained Josh Schiller, a partner at Boies, Schiller, Flexner and lawyer for Carlin's estate. "You know that person is not George Carlin."
As AI technology continues to advance, it is crucial to strike a balance between creative expression and protecting the rights and legacy of acclaimed artists like George Carlin. The resolution of this lawsuit marks an important milestone in navigating the evolving landscape of AI's relationship with the entertainment industry.