ATLAS treats a live performance as a living organism — 1788 atoms suspended in a shared void, continuously reorganizing around the concepts you bring to them. The globe is not a visualization of data. It is the data, breathing.
Each set is decomposed across four layers of knowing: the sonic texture Gemini hears in the audio, the visual witness of 1fps video frames, the raw motion and bodily presence captured in native video clips, and the critical and cultural language written about the work. Curation happens at every layer simultaneously — in the tilt of a performer's body, the grain of a room, the way a critic reaches for metaphor. ATLAS makes those layers navigable as a single space.
When you enter a query, Gemini Embedding 2 Preview encodes the concept as a 768-dimensional vector and computes cosine similarity against every atom in real time. Matched atoms surface across all 16 molecules at once — you see not just where a concept lives, but whether the four kinds of knowing agree or diverge. A semantic mix then assembles the highest-matched video clips into a sequence, narrated by excerpts drawn from the text molecule of the same set — edited by meaning, not by time.
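The retrieval step above can be sketched in a few lines. This is a minimal illustration, not ATLAS's implementation: the toy random vectors stand in for real 768-dimensional Gemini embeddings, and the function and variable names (`cosine_scores`, `semantic_mix`, `clip_ids`) are hypothetical.

```python
import numpy as np

def cosine_scores(query_vec, atom_vecs):
    """Cosine similarity of one query vector against every atom embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    a = atom_vecs / np.linalg.norm(atom_vecs, axis=1, keepdims=True)
    return a @ q

def semantic_mix(query_vec, atom_vecs, clip_ids, k=5):
    """Rank atoms by similarity and return the top-k clips, best match first."""
    scores = cosine_scores(query_vec, atom_vecs)
    order = np.argsort(scores)[::-1][:k]
    return [clip_ids[i] for i in order], scores[order]

# Toy stand-ins: 1788 atoms, 768 dimensions, as described above.
rng = np.random.default_rng(0)
atoms = rng.normal(size=(1788, 768))
query = rng.normal(size=768)
clips, scores = semantic_mix(query, atoms, list(range(1788)), k=3)
```

Because every atom is scored against the same query, the ranking can be read per layer as well as globally, which is what lets the interface show where the four kinds of knowing agree or diverge.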
Built by Kiru Mehari.