TouchDesigner - JavaScript - p5.js - Hydra - Chataigne
A thunderstorm simulation that lets participants use artifacts to control the rain, wind, and lightning in the room.
As the Lead Programmer, I worked on transmitting data between Qualisys motion capture, Chataigne, TouchDesigner, Resolume, Ableton, and Unreal Engine to get real-time reactions.
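The exact wiring isn't detailed here, but toolchains like this are commonly glued together with OSC messages, which Chataigne is built to route. As a rough, hypothetical sketch only, here is the kind of message that might be forwarded from the tracking layer to TouchDesigner, written with the osc.js Node package; the addresses and ports are illustrative assumptions, not the project's actual configuration.

```javascript
// Hypothetical sketch of the kind of routing Chataigne performs:
// forwarding a tracked position to TouchDesigner over OSC.
// Assumes the "osc" npm package (osc.js); addresses/ports are made up.
const osc = require("osc");

const udpPort = new osc.UDPPort({
  localAddress: "0.0.0.0",
  localPort: 57121,           // where mocap-derived data arrives (assumed)
  remoteAddress: "127.0.0.1",
  remotePort: 7000            // TouchDesigner's OSC In port (assumed)
});

udpPort.on("ready", () => {
  // Forward a hypothetical artifact position as /artifact/1/pos x y z
  udpPort.send({
    address: "/artifact/1/pos",
    args: [
      { type: "f", value: 0.5 },
      { type: "f", value: 1.2 },
      { type: "f", value: 0.0 }
    ]
  });
});

udpPort.open();
```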
This project combines two prototypes that were developed for my Solo Studio Course at UT Austin. The initial prototypes were two looping graphics created with RayTK and a hand-tracking experiment using MediaPipe, which have now been combined into this interactive visual.
Programmed so the user can control the visuals with hand gestures.
By tracking my index fingertip and thumb, I calculate the distance between the two and use it as a trigger or as a continuous value to control different parameters.
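The project runs inside TouchDesigner, but the underlying calculation translates directly to MediaPipe's JavaScript Hands API. A minimal sketch of the pinch measurement, where triggerEvent and setParameter are placeholders for whatever the distance drives:

```javascript
// Minimal sketch of the pinch-distance idea using MediaPipe hand
// landmarks (21 points per hand; thumb tip = index 4, index fingertip = 8).
// The callback shape follows MediaPipe's JavaScript Hands API; the
// threshold and the two helpers are illustrative assumptions.
function onResults(results) {
  if (!results.multiHandLandmarks || results.multiHandLandmarks.length === 0) return;

  const hand = results.multiHandLandmarks[0];
  const thumbTip = hand[4];
  const indexTip = hand[8];

  // Euclidean distance between the two landmarks (normalized coordinates)
  const pinch = Math.hypot(
    indexTip.x - thumbTip.x,
    indexTip.y - thumbTip.y,
    indexTip.z - thumbTip.z
  );

  if (pinch < 0.05) {
    triggerEvent();        // hypothetical: fire an event when fingers touch
  }
  setParameter(pinch);     // hypothetical: use the raw distance as a value
}
```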
The kaleidoscopic pattern and animated effects are all powered by RayTK.
Song - Golden Age by Yuuki Hori
At first I wasn't aware of how to use individual data points across the hand, so I only used the spatial distance between the two hands as a control. I used MediaPipe's built-in function to display markers at the hand's landmark points.
I then figured out how to calculate the distance between any two specific points on the hand and used those measurements to drive the final visuals. I learned how to choose the specific points I wanted for each action.
This project tested my ability to use a collection of data points to trigger events and control parameters. I learned a lot about the logic behind these kinds of interactions and how to better organize and develop work within TouchDesigner.
With more time and computing power, I would have loved to add a few different visuals that you could switch between with different gestures.
At the University of Texas at Austin, the Arts and Entertainment Technologies classes worked together to create audio, video, and interactive installations for this event.
In my Advanced Creative Coding course, I developed a series of audio-reactive visuals using p5.js that I can adjust and control live with a MIDI controller and live coding in Hydra.
I began by coding four unique, yet cohesive, canvases that are all audio reactive and use FFT analysis to drive the ASCII art.
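A minimal sketch of one such canvas, assuming the p5.sound library; the character ramp and mapping choices are illustrative, not the exact performance code:

```javascript
// Rough sketch of an audio-reactive ASCII canvas using p5.sound's FFT.
let mic, fft;
const chars = " .:-=+*#%@";    // ramp from sparse to dense glyphs

function setup() {
  createCanvas(640, 480);
  mic = new p5.AudioIn();
  mic.start();
  fft = new p5.FFT();
  fft.setInput(mic);
  textFont("monospace");
  textSize(12);
}

function draw() {
  background(0);
  fill(255);
  const spectrum = fft.analyze();   // 1024 amplitude bins, values 0-255

  // One glyph per sampled frequency band: louder bins pick denser
  // characters and sit higher on the canvas.
  for (let i = 0; i < spectrum.length; i += 16) {
    const amp = spectrum[i];
    const c = chars[floor(map(amp, 0, 255, 0, chars.length - 1))];
    const x = map(i, 0, spectrum.length, 0, width);
    const y = map(amp, 0, 255, height, 0);
    text(c, x, y);
  }
}
```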
I brought those canvases into Ted Davis's P5LIVE and used Hydra to layer more effects on top of them, including oscillation, modulation, and pixelation.
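Feeding an external canvas into Hydra and chaining effects looks roughly like this; the transforms are standard hydra-synth calls, while the source selector and parameter values are placeholder choices:

```javascript
// Sketch of layering Hydra effects over an external canvas, as in P5LIVE.
s0.init({ src: document.querySelector("canvas") }); // assumed: the p5.js canvas

src(s0)
  .modulate(osc(6, 0.1), 0.2)   // warp the canvas with a slow oscillator
  .pixelate(128, 96)            // chunky pixelation pass
  .rotate(0.05)
  .out(o0);
```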
To give myself quicker control during the live performance, I mapped MIDI pads to the different canvases, allowing me to switch between them easily.
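A hypothetical sketch of that mapping using the Web MIDI API; the pad note numbers and the switchCanvas helper are assumptions that depend on the controller:

```javascript
// Map four MIDI pads to canvas presets via the Web MIDI API.
// Note numbers 36-39 and switchCanvas() are illustrative assumptions.
let currentCanvas = 0;

navigator.requestMIDIAccess().then((midi) => {
  for (const input of midi.inputs.values()) {
    input.onmidimessage = (msg) => {
      const [status, note, velocity] = msg.data;
      const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0;
      if (isNoteOn && note >= 36 && note <= 39) {
        currentCanvas = note - 36;    // pads 1-4 select canvases 0-3
        switchCanvas(currentCanvas);  // hypothetical scene-switch helper
      }
    };
  }
});
```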
At the Audio Pixel Collider event, I performed a 15-minute slot as the event began and started to fill up, live coding on two large laser projectors along the entry walls.
Motion graphic for the CRVO x Nerve Damage EP
I coded audio-reactive visuals that responded to the song, producing a unique texture that Sam Blumberg used to create the final graphics for the track. The code was written in p5.js with a Hydra overlay.
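Since this piece reacted to a fixed track rather than a live input, a minimal p5.sound sketch of that setup might look like the following, with a placeholder file name and a stand-in drawing:

```javascript
// Rough sketch of a texture that reacts to a loaded track, using p5.sound.
let song, amp;

function preload() {
  song = loadSound("track.mp3");   // hypothetical path to the song
}

function setup() {
  createCanvas(1280, 720);
  amp = new p5.Amplitude();        // listens to the master output
  song.loop();
}

function draw() {
  // Fade slowly so loud passages leave streaks in the texture
  background(0, 20);
  const level = amp.getLevel();    // 0.0 - 1.0 RMS volume
  noStroke();
  fill(255);
  const r = map(level, 0, 0.3, 10, height);
  circle(width / 2, height / 2, r);
}
```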