
Cadence, an Immersive Media Exhibit
Role: Creative Technologist, Fabricator, Sound Designer
Inspired by avant-garde visions of interactive environments, I designed Cadence to dissolve the boundaries between spaces and their inhabitants.
In this living exhibit, your presence and movements coalesce with evolving visuals and soundscapes, creating a dynamic tapestry of emergent interaction.

Signal flow diagram

Concept sketch

Final setup
Goal
This was my first project in this domain, and my goal was to harness new technologies to create a playful interactive experience from conception to completion.

1. Sketching and vision

2. Learning the technologies

3. Prototyping and testing

4. Showcase
Unclear affordances and emergent systems
Testing showed that the control gestures weren't obvious, but I refrained from displaying instructions; this led users to interact with the space in raw and interesting ways.




The empty exhibit pulsed with a latent energy that awakened the moment someone stepped inside, prompting further exploration.

Activation on detection

Changing scenes based on player count (up to 6)
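
The switching logic itself can stay simple. As a hedged sketch (the 'bodies' channel and 'switch1' operator names are placeholders, not the exact network), a CHOP Execute DAT in TouchDesigner can watch the tracked-body count and drive a Switch TOP between scenes:

```python
# CHOP Execute DAT callback - a sketch, not the exact network.
# Assumes a CHOP channel named 'bodies' carrying the Kinect Azure
# tracked-body count, and a Switch TOP 'switch1' with one scene per input.

MAX_PLAYERS = 6

def onValueChange(channel, sampleIndex, val, prev):
    if channel.name != 'bodies':
        return
    # Input 0 is the idle "latent energy" state; inputs 1-6 map to crowd size.
    op('switch1').par.index = int(min(val, MAX_PLAYERS))
    return
```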
Ideation, trial, and error
Initially, I wanted the space to leave a dissolving trail of noise behind each participant, inviting them to explore themes of digital decay and revisitation.


Envisioned layout

Top-down concept
I tried using Stable Diffusion to generate the trail, but quickly realised that the laptop we were given couldn't handle running the model.

StreamDiffusion in TouchDesigner (GIF from a tutorial)
Bringing this exhibit to life meant tackling multiple smaller pieces, each of which I had to learn from scratch. Aditya handled the particle system while I shaped the overall vision: designing the soundscape, setting up skeleton tracking, crafting the projection surface, and defining the interactions.

Skeleton tracking

Projection Mapping

Particle generation and physics
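
To give a flavour of the particle approach, here is an illustrative NumPy sketch (not the actual TouchDesigner network, and every constant is invented): particles are seeded from noise and nudged by simple physics toward tracked skeleton positions.

```python
# Illustrative sketch of the particle idea, not the actual TouchDesigner
# network: noise-seeded particles pulled by simple physics toward
# tracked skeleton positions.
import numpy as np

rng = np.random.default_rng(0)
N = 2000
pos = rng.uniform(-1.0, 1.0, (N, 2))   # positions seeded from noise
vel = np.zeros((N, 2))

def step(pos, vel, joints, dt=1/60, pull=2.0, drag=0.98):
    """Advance one frame; `joints` is an (M, 2) array of tracked
    joint positions acting as attractors."""
    for j in joints:
        d = j - pos
        dist = np.linalg.norm(d, axis=1, keepdims=True) + 1e-4
        vel += pull * d / dist**2 * dt   # inverse-square attraction
    vel *= drag                          # damping keeps motion readable
    pos += vel * dt
    return pos, vel
```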

Soundscape designed with vertically layered elements
Interfacing with Ableton
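
The vertical layering works by fading stems in over one another rather than hard-switching. A minimal sketch of the idea, with made-up thresholds: a single 0-1 intensity value (e.g. crowd activity) maps to a gain per layer, which can then be sent to Ableton as CC values on track volumes.

```python
# Sketch of the vertical-layering idea; thresholds are invented.
# One 0-1 intensity value is mapped to a gain per stem, so layers
# fade in on top of each other instead of switching abruptly.
LAYER_THRESHOLDS = [0.0, 0.25, 0.5, 0.75]   # ambient, pads, percussion, lead

def layer_gains(intensity, fade=0.15):
    """Return a 0-1 gain for each layer given a 0-1 intensity."""
    return [max(0.0, min(1.0, (intensity - t) / fade))
            for t in LAYER_THRESHOLDS]
```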

Coding with my best friend
Testing
A proof of concept was presented to a panel from the performing arts center during an open house, doubling as a chance to test the interactions and fine-tune the experience in a real-world setting.



Participants recognized themselves in the particle system and noticed that their movements influenced both the particles and the soundscape. They were most thrilled by the bigger scene changes: the space reacting as people stepped in, and the visuals and soundscape shifting with the size of the crowd.
Fabrication and set-up
The files were set, the workflow was clear, and testing gave us a solid understanding of what to expect. It was finally time to assemble everything for the main event.

Ad-hoc creation of the projection surface

TouchDesigner master file

We were only given access to the venue 10 hours before the final showcase, so I got to work assembling the projection surface and projectors.


Taping projectors down to limit misalignment

"DON'T BLOCK THE PROJECTOR"
The Tech
TouchDesigner was the heart of the project. A Kinect Azure captured participants’ skeleton data, feeding it into TouchDesigner to render and manipulate particle systems, which were mapped onto the projection surface.
TouchDesigner also forwarded this data to Ableton as MIDI, shaping parameters within the vertically layered soundscape. In turn, Ableton sent MIDI note information back to TouchDesigner, dynamically altering particle behaviors based on the audio.
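
A rough sketch of how the two directions of that bridge can look inside TouchDesigner (the channel names 'p1/hand_l:ty' and 'abletonNote', the CC number, and the 'Turbulence' custom parameter are stand-ins, not the exact network):

```python
# CHOP Execute DAT sketch of the two-way bridge. Channel names, the CC
# number, and the 'Turbulence' custom parameter are assumptions, not
# the real network's names.

def onValueChange(channel, sampleIndex, val, prev):
    if channel.name == 'p1/hand_l:ty':
        # Skeleton data out: left-hand height (assumed normalized 0-1)
        # becomes CC 20 on channel 1, mapped to a macro inside Ableton.
        op('midiout1').sendControl(1, 20, int(val * 127))
    elif channel.name == 'abletonNote':
        # Notes back in (via a MIDI In CHOP, renamed here for clarity)
        # modulate particle turbulence, closing the audio-visual loop.
        op('particles').par.Turbulence = val / 127.0
    return
```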
Development

Learning the technologies
TouchDesigner, projection mapping
I wanted to do something with StreamDiffusion, but settled on a particle system generated from noise and driven by physics based on the participants' movements.


