Interactive generative art is something I have always loved, especially since visiting large-scale exhibitions by studios like teamLab and Moment Factory. While the main purpose of computer graphics is to generate beautiful images, these visual experiences can be greatly elevated by integrating sensors and input systems that involve the audience in the art form. Recently, I have been playing around with TouchDesigner and the LeapMotion controller to create interactive art driven by hand motions. However, these experiences have so far been limited in scale. It is now time for me to take things to the next level, both in terms of the display output and the input system.
The theme of this project centers on two of my big interests: dancing and fashion. I recently discovered Deep House, a music genre that is both danced to and heavily used in the fashion industry, whether on runways or in stores like ZARA. For example, check out this song to get a sense of the vibe. My plan is to integrate these two elements into my work as I design the overall audience experience and the visuals.
The goal of this project is to create an audio-and-motion-reactive visualization of Deep House music. Specifically, there are a few core objectives that I would like to meet:
- The visuals react to a playlist of Deep House music coherently. The overall mood established should follow the beats and pace of the music.
- The project will be rendered in real-time at an interactive rate (minimum 30 fps).
- The audience can interact with the visualizer by moving their bodies and dancing to the music.
- The project can be scaled to a larger environment, ideally displayed with a projector and allowing multiple people to interact with it at the same time.
- The visuals must be polished, with fine-tuned color schemes that fit the overall theme of the project.
- BONUS: Install the project somewhere on campus for people to actually interact with, and record a video showing this.
*The above goals are dependent on my ability to source the hardware.
- TouchDesigner Artist: Bileam Tschepe - he has lots of cool patterns created in TouchDesigner that would go well with Deep House.
- Universe of Water Particles - teamLab - one of my favorite pieces from teamLab. I love the use of lines to form the waterfall. The color contrast between the waterfall and the flowers also works very well.
- TouchDesigner: Popping Dance w/ Particles - an example of the type of interactivity that's possible with TouchDesigner.
- TouchDesigner: Audio-Reactive Voronoi - the type of background visuals that would be good for this project. Nothing too complex.
- Taipei Fashion Week SS22 - Ultra Combos (Only available in Chinese) - love the aesthetics of the background for this fashion show.
This project will be implemented in TouchDesigner, which simplifies the technical implementation. The inputs to the system will be a depth-sensing camera such as the ZED Mini or Kinect, as well as the audio signal from the music. These signals will drive the various parameters of the visualization.
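To give a feel for how the audio input will drive parameters: the plan is to extract the low-frequency (kick/bass) energy of the track and smooth it into a 0..1 control signal. In TouchDesigner this would typically be an Audio Spectrum CHOP followed by a Lag CHOP; the standalone Python sketch below (numpy only, with an assumed sample rate and block size) illustrates the same mapping.

```python
import numpy as np

SAMPLE_RATE = 44100        # assumed audio sample rate
BLOCK_SIZE = 1024          # samples per analysis block
SMOOTHING = 0.9            # exponential smoothing, similar to a Lag CHOP

_smoothed = 0.0

def low_band_energy(block: np.ndarray) -> float:
    """Relative energy of the 20-150 Hz band (kick/bass range) in one audio block."""
    spectrum = np.abs(np.fft.rfft(block * np.hanning(len(block))))
    freqs = np.fft.rfftfreq(len(block), d=1.0 / SAMPLE_RATE)
    band = spectrum[(freqs >= 20) & (freqs <= 150)]
    return float(band.mean() / (spectrum.mean() + 1e-8))

def audio_to_parameter(block: np.ndarray) -> float:
    """Map one audio block to a smoothed 0..1 value that can drive a visual parameter."""
    global _smoothed
    energy = np.clip(low_band_energy(block), 0.0, 4.0) / 4.0
    _smoothed = SMOOTHING * _smoothed + (1.0 - SMOOTHING) * energy
    return _smoothed
```

The returned value could then be mapped to, say, the scale of a background pattern or the brightness of the particles.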
The visualizer can be broken down into multiple visual layers that are composited together. Specifically, the project includes four layers:
- Background Layer: this layer will be the background of the final render. It will include procedurally generated patterns that are relatively simple, so that they do not overpower the foreground elements. These patterns will be audio-reactive.
- Interaction Layer: this layer will contain procedural elements that are motion-reactive, making the piece interactive. For example, it can include a particle system that reacts to the motion of the audience. This should be the primary focus of the art for the audience.
- Reprojection Layer (Stretch Goal): this layer is optional and will only be implemented if time permits. It will contain a stylized reprojection of the actor, giving a clearer indication of where the actors are. This reprojection can also be audio-reactive.
- Post-processing Layer: this layer enhances the visuals by applying post-processing effects to the previous layers. It is crucial in achieving the desired look, feel, and mood.
Finally, the composited render will be displayed using a projector.
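As a rough sketch of how these layers stack up: the layers are blended additively (so the bright interaction elements stay readable over the dimmer background) and the post-processing pass is applied to the result. In TouchDesigner this would be a chain of Composite, Threshold, and Blur/Bloom TOPs; the numpy/OpenCV snippet below only illustrates the ordering, and the threshold and bloom strength are placeholder values.

```python
from typing import Optional

import cv2
import numpy as np

def composite_frame(background: np.ndarray,
                    interaction: np.ndarray,
                    reprojection: Optional[np.ndarray] = None,
                    bloom_strength: float = 0.6) -> np.ndarray:
    """Composite the visual layers (float32 RGB images in 0..1) and apply a simple bloom."""
    # Additive blend keeps the bright interaction layer readable over the background.
    frame = np.clip(background + interaction, 0.0, 1.0)

    # Stretch goal: overlay the stylized reprojection of the dancer on top.
    if reprojection is not None:
        frame = np.clip(frame + reprojection, 0.0, 1.0)

    # Post-processing layer: blur the bright areas and add them back (a cheap bloom).
    bright = np.where(frame > 0.7, frame, 0.0).astype(np.float32)
    glow = cv2.GaussianBlur(bright, (0, 0), sigmaX=9)
    return np.clip(frame + bloom_strength * glow, 0.0, 1.0)
```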
The project will explore many common procedural techniques, including but not limited to the following:
- Particle Simulation - particle systems will be used to add interactivity to the scene and/or as a decorative element. These systems will be driven by custom forces guided by noise functions and input signals. GPU-based particle systems in TouchDesigner will be used in order to meet the real-time requirement (see the sketch after this list).
- Procedural Patterns - procedural patterns will be generated using noise and toolbox functions along with basic geometric shapes. The idea is to generate a simple audio-reactive background that complements the main interactive layer.
- Optical Flow - optical flow is a common technique used in TouchDesigner with a camera to affect the image output. The motion between frames captured from the camera will be converted to velocity signals that can drive other parameters in the scene.
- Noise and Toolbox Functions will be used everywhere!
- Post-processing Techniques - bloom effect, blur, feedback, distortion and edge detection will be experimented with to enhance the visuals.
- Coloring - a specific color palette will be selected that best describes the theme of the project. This is all about making it look pretty!
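Here is the particle-simulation sketch referenced above. In TouchDesigner the actual system will be GPU-based, but this minimal numpy version shows the core update: particles pushed by a noise-like swirl field scaled by the audio signal, plus an external force (e.g. sampled from the optical-flow field). The sum-of-sines field is only a cheap stand-in for proper Perlin/curl noise, and the constants are illustrative.

```python
import numpy as np

NUM_PARTICLES = 2000
DT = 1.0 / 30.0            # time step matching the 30 fps real-time target

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, size=(NUM_PARTICLES, 2))   # normalized screen coordinates
vel = np.zeros((NUM_PARTICLES, 2))

def noise_force(p: np.ndarray, t: float) -> np.ndarray:
    """Cheap stand-in for curl/Perlin noise: a time-varying swirl built from sines."""
    x, y = p[:, 0], p[:, 1]
    fx = np.sin(4.0 * y + t) + 0.5 * np.sin(9.0 * x - 2.0 * t)
    fy = np.cos(4.0 * x - t) + 0.5 * np.cos(9.0 * y + 2.0 * t)
    return np.stack([fx, fy], axis=1)

def step(t: float, input_force: np.ndarray, audio_level: float) -> None:
    """Advance the particles one frame.

    input_force: per-particle (N, 2) force, e.g. sampled from the optical-flow field.
    audio_level: 0..1 signal from the audio analysis, scaling the noise force.
    """
    global pos, vel
    accel = audio_level * noise_force(pos, t) + input_force
    vel = 0.95 * vel + DT * accel      # damping keeps the motion smooth
    pos = (pos + DT * vel) % 1.0       # wrap around the screen
```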
- Establish the color tone
- Curate the playlist to go with the visualizer
- Source the required hardware
- Create a simple audio-reactive pattern for the background layer
- Create a simple motion-reactive visual using the webcam for the interaction layer
- Fine tune the shape for the background layer
- Connect the depth sensor to the interaction layer and finalize the experience
- Add in post-processing layer
(Will not be working from 11/23-11/28)
- If time permits, implement a simple reprojection layer
- Color grading and parameter tuning
- Install the project somewhere on campus
For Milestone 1, I experimented with various procedural patterns in order to find one that works best with the vibe of the music as the background layer. Below are some of the patterns that I was able to create following Bileam Tschepe's awesome tutorials. These patterns are currently all just black and white. The color and composition mode will be determined in the next milestone.
Tile Pattern | Texture Instancing
---|---
![]() | ![]()

Texture Feedback | Line Feedback
---|---
![]() | ![]()
Additionally, I have also tested out the optical flow tool inside TouchDesigner by feeding in a house dance video for testing. This particular video works well because the camera is static, and the framing of the actor is similar to how I would set up my camera sensor. Theoretically, I should be able to just swap the footage with a camera input later on and everything should work accordingly.
By processing the optical flow data with a feedback loop, we get a smoothly motion-blurred silhouette of the actor. The optical flow is then also used to drive a simple particle system, as shown below. The result is very promising: I may not need a depth-sensor camera at all and could drive everything with just a webcam.
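Roughly, the flow-plus-feedback step looks like the sketch below. Inside TouchDesigner it is just an Optical Flow TOP feeding a Feedback TOP; here I approximate it with OpenCV's Farneback optical flow and a decaying accumulation buffer, assuming `frames` is an iterable of webcam or video frames.

```python
import cv2
import numpy as np

def flow_trail(frames, decay: float = 0.92):
    """Yield a motion-blurred silhouette built from optical flow and a feedback loop."""
    prev = None
    trail = None
    for frame in frames:
        gray = frame if frame.ndim == 2 else cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                                0.5, 3, 15, 3, 5, 1.2, 0)
            speed = np.linalg.norm(flow, axis=2)      # per-pixel motion magnitude
            if trail is None:
                trail = np.zeros_like(speed)
            # Feedback loop: the old trail fades out, new motion is stamped on top.
            trail = np.maximum(trail * decay, speed)
            # Normalized silhouette; this same field can push the particles around.
            yield trail / (trail.max() + 1e-8)
        prev = gray
```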
Finally, if we composite one of the audio-reactive patterns with the interaction layer, we can get something interesting like this:
However, I'm still not happy with the overall look. I do have patterns I can work with, but none of them really speaks to the music. Therefore, I may take or discard elements from each pattern to create something more original and interesting. Moreover, I'm also not satisfied with the color tone. There are some interesting color palettes here, based on NY Fashion Week, that I will be exploring.
For Milestone 2, I did not get as far as I would have liked, but I was able to tweak some parameters of the procedural elements and make more of them audio-reactive. I changed the texture-instancing pattern to use a larger texture size so that more of the particle system remains visible once the procedural pattern is overlaid. The particles now spawn at the bottom, closer to the feet, since house dance focuses a lot on lower-body movement. I have brought the reprojection of the actor forward as a foreground element so it can be seen more clearly. Finally, I have also shifted the color tone cooler, although I'm still not 100% satisfied with it.
Here's a video of the current state:
milestone2_480p.mov
I was not able to test it with a depth sensor or projector, as I was not able to acquire them in time. However, I will be getting the ZED Mini this week and will rent a projector from the library to test the setup.
For the final version of the project, I separated the procedural patterns into individual scenes instead of compositing them all together. This makes the audience interactions more visible while reducing the overall complexity of the scene, which was previously quite overwhelming. The four scenes are: flakes, lines, tiles, and waterfall.
For the live installation, I set up the projector and ZED Mini on a desk and projected the visuals onto a plain wall in our lab. Since the space is limited, the projection is roughly 3.2 m x 1.8 m; ideally, it could be scaled up even further.