Date: Wed, 19th Feb 2025
Time: 11am to 2pm
Venue: Monash University Museum of Art, Education Lab
PRESENTERS/FACILITATORS:
Machine Listening
WORKSHOP DESCRIPTION:
Ego4D is a massive-scale egocentric video dataset and benchmark suite released by Facebook AI in 2021 to advance the automation of ‘egocentric perception’: the ‘first person’ perspective of virtual reality, robotics, smart glasses, and the Metaverse. Like all perspectives, it has to be constructed. So Ego4D comprises 3,025 hours of daily-life activity video, captured by 855 ‘unique camera wearers’ across 9 countries and painstakingly annotated by low-paid ‘narrators’ for machinic analysis.
In this workshop, we will examine Ego4D through a curated selection of clips, read excerpts from the paper announcing it, and experiment with annotating ‘egocentric’ videos ourselves.
The discussion will depart from #C, a multichannel installation by Machine Listening that loops twenty 4-minute clips from Ego4D (80 minutes in all, roughly 0.04% of the dataset), each narrated by an ambiguously located synthetic voice and layered with audio and annotations from other videos in the dataset. None of this was ever meant for human eyes or ears: Ego4D is dataset cinema for an algorithmic audience. But #C does more than reveal what’s behind the curtain; it deliberately disorients. Video, audio, text, and voice are de-aligned, tweaked, and reconfigured, drawing out the dataset’s inherent weirdness, artifice, and voyeurism: a sense, perhaps, that perspective itself is being captured, mined, and commodified.
Advance Reading:
Ego4D: Around the World in 3,000 Hours of Egocentric Video (Grauman et al., CVPR 2022)
Presented as part of Recompositions and Image Economies.