13b
Can deepfakes be understood beyond frameworks of disinformation and deceit? Amid growing concern over disinformation and fake news, deepfakes have come to occupy a politically significant place in 2024, especially because some 60 countries are holding elections this year. Among the many issues raised by deepfakes, such as disinformation, fraud, harassment and revenge porn, this note focuses on the relationship between death, populism and deepfakes in India and other parts of the world. Earlier this year, a video of the iconic Tamil politician M Karunanidhi (who passed away in 2018) appeared in election campaigning in Tamil Nadu, India, in which he praised the work of his son MK Stalin, the current Chief Minister of the state. As reported by Nilesh Christopher for Al Jazeera: “Karunanidhi’s last public interview was in 2016, before his voice turned coarse, and his body frail. Nayagam [owner of the tech firm producing these videos] used publicly available data of Karunanidhi to train a speech model and recreated the 1990s likeness of the leader when he was much younger. The script for the prerecorded AI speech, he said, was supplied by the local DMK [Karunanidhi’s party] cadre, and was vetted by party personnel.”
Last year, a similar revival of the dead took place through deepfake songs in the voice of the beloved Punjabi singer Sidhu Moose Wala, who was shot dead in an incident of gang violence in Punjab. Moose Wala’s murder at the age of 28 created an uproar in Punjab, and a bereaved fanbase created and circulated AI songs in his voice. His songs were often critiqued for their explicit display of weapons and violence, but also praised for their overtly anti-establishment tone. In addition to political concerns about disinformation, the technological resurrection of the voices of the dead has significant consequences for populist politics and celebrity culture in India and other parts of the world. As a particular case of synthetic media, the audio deepfakes in the cases mentioned above can be understood better through the lens of ‘affect’ rather than ‘truth’. The purpose in these cases is not to disinform the public, since people are aware of Moose Wala’s and Karunanidhi’s deaths. One could therefore argue that synthetic media are effective in such cases because of the mediatic shift in populism over the last decade or so. While twentieth-century populism, especially in relation to charged publics and crowds, was theorized as a physical phenomenon premised on the charisma of the leader, or on architectural and spatial atmospherics, the logics governing popular sovereignty changed significantly with social media and the proliferation of mobile phones in the last decade. Publics in Tamil Nadu and Punjab shared an affective relationship with the voices of both Karunanidhi and Moose Wala, and voice cloning became a means to continue that relationship in an environment that was always already mediatized.
While audio deepfakes as deceitful media could be interpreted as part of larger debates on post-truth and statistical/predictive datalogical operations, these instances of voice cloning in the necroworld present the conceptual register of affect as an equally significant domain of analysis for understanding populism and celebrity culture. In their work on the revivification of dead celebrities by the music industry, Jason Stanyek and Benjamin Piekut label this phenomenon an attribute of late capitalism: “What is ‘late’ about late capitalism is the new arrangements of interpenetration between worlds of living and dead, arrangements that might best be termed intermundane” (16). They present the resurrection of the dead in the music industry as a form of co-laboring with the living, producing a synergy between embodied and disembodied voices. The affective impact of audio deepfakes, however, emerges from resurrecting voices that were already consumed through mediatized disembodiment, an impact further amplified by the circulatory potential of social media platforms. Platforms of circulation are hence as central to the affective life of deepfakes as are voice cloning software and firms.
The fans and citizens in India are aware that they are watching or listening to speeches and music by celebrities and politicians who are already dead. Such phenomena cannot be understood through frameworks of truth or deceit; they rest on the continuity of a fan-celebrity or voter-politician relationship premised on the affective politics of mediatized personalities.
Experiment
Search for a recent audio deepfake that circulated on social media. Do a close listening of the deepfake and consider how it might have been produced. Trace social media posts, comments, news coverage and other related material you might find around it. Discuss your observations in class.
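As an optional aid to close listening, the minimal sketch below (an assumption of this note, not part of the original exercise) uses the Python libraries librosa and matplotlib to plot the spectrogram of a downloaded clip; the filename clip.wav is a placeholder for whatever audio you trace. Visual inspection of a spectrogram can sometimes surface artefacts of synthesis, though it is no substitute for careful listening and contextual research.

```python
# Minimal sketch (assumes Python with librosa and matplotlib installed).
# "clip.wav" is a placeholder filename for the audio deepfake you traced.
import numpy as np
import librosa
import librosa.display
import matplotlib.pyplot as plt

# Load the clip at its native sample rate.
y, sr = librosa.load("clip.wav", sr=None)

# Compute a magnitude spectrogram and convert it to decibels.
S_db = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)

# Plot the spectrogram for visual inspection alongside close listening.
plt.figure(figsize=(10, 4))
librosa.display.specshow(S_db, sr=sr, x_axis="time", y_axis="hz")
plt.colorbar(format="%+2.0f dB")
plt.title("Spectrogram of the traced audio clip")
plt.tight_layout()
plt.show()
```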
Resources
- Nilesh Christopher. “How AI is resurrecting dead Indian politicians as elections loom,” Al Jazeera, 12 February 2024.
- Simone Natale. Deceitful Media: Artificial Intelligence and Social Life After the Turing Test. Oxford University Press, 2021.
- Yashraj Sharma. “Fans use AI deepfakes to keep a slain Indian rapper’s voice alive,” Rest of World, 6 June 2023.
- Jason Stanyek and Benjamin Piekut. “Deadness: Technologies of the Intermundane,” TDR/The Drama Review 54, no. 1 (T205) (2010): 14-38.