January 4, 2014

Ableton + Processing for real-time audio visualization (Part 1)

I’ve often felt underwhelmed by abstract or generic VJ material that borrows nothing from the music it supports and lends nothing back. It’s for this reason that I’ve worked over the last few years on a series of projects that create a tighter link between my music and the visuals the audience experiences.

As an example, see:

More Buttons No Problems – live @ gallery 263 from makingthenoise on Vimeo.

However, in many of these projects I’ve swung too far in the other direction: the visuals are so hand-crafted for a specific set or song that they can’t be re-used as my audio set evolves. What I’ve longed for is a suite of modular visualization elements that plug and play with my live set in Ableton. I want something generic enough to remain functional as my songs change over time, but coupled tightly enough to the music to demonstrate to the audience that the visuals are aware of its current state.

A simple example would be to ensure that all of my modular visuals can react to the downbeat of the song. Syncing visuals to a click track or BPM has been done by VJs for years and is one small step toward making visuals more reactive to audio. A more advanced example would be to ensure that all of my modular visuals know how to react to a “filter sweep” that occurs in any song. The filter sweep is a popular musical trope that adds or releases tension – why shouldn’t the visuals respond to that tension?
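To make that concrete, here’s a rough sketch of how one of these modular visuals might listen to Ableton. It assumes some OSC sender on the Ableton side (a Max for Live device, for example) pushing “/beat” messages on each downbeat and “/filter” messages carrying a 0.0–1.0 cutoff value to Processing on port 12000 – the addresses, port, and mappings are placeholders, not my actual setup, and the sketch uses the oscP5 library:

```
// Rough sketch of a beat- and filter-aware visual module.
// Assumes an OSC sender in Ableton (e.g. a Max for Live device) on port 12000
// sending "/beat" on each downbeat and "/filter" with a 0.0-1.0 cutoff value.
import oscP5.*;
import netP5.*;

OscP5 osc;
float beatFlash = 0;    // spikes on each downbeat, then decays
float filterAmount = 0; // follows the filter sweep

void setup() {
  size(640, 360);
  osc = new OscP5(this, 12000); // listen for OSC from Ableton
}

void draw() {
  // background brightness tracks the filter sweep's tension
  background(filterAmount * 255);

  // a circle that pulses on the downbeat and fades out
  fill(255, 80, 80);
  float d = 50 + beatFlash * 150;
  ellipse(width / 2, height / 2, d, d);
  beatFlash *= 0.9;
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/beat")) {
    beatFlash = 1.0;
  } else if (msg.checkAddrPattern("/filter")) {
    filterAmount = msg.get(0).floatValue();
  }
}
```

The point of the structure is that the module only needs to know about a small vocabulary of messages (downbeat, filter sweep), so the same sketch keeps working as the songs behind it change.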

Over the next couple of posts I’ll show examples of what I’m working on and how Ableton & Processing are working together to create this experience. Here’s a small teaser:

mtn visuals 1