I’ve been inspired lately by some videos I’ve watched of Making The Noise & Altitude Sickness explaining their live visual setups. Both of them use Processing, the popular creative coding environment, which can accept both MIDI and Open Sound Control (OSC) messages to control their visuals. So I decided to give making my own visuals a go.

The picture above contains screenshots of some of the visuals I’ve made. I wanted a visualisation that directly corresponded to button presses on the Launchpad, so I simply made a grid of squares that light up when they receive the right MIDI note-on messages. Some buttons also trigger colour changes or create fades, which can change the whole look of the visualisation when a new sample is triggered.
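
To give a rough idea of how that grid works, here’s a minimal sketch along the same lines using The MidiBus library. The device name, the 8x8 grid size and the row * 16 + column note layout of the original Launchpad are my assumptions here; the real sketches layer the colour changes and fades on top of this.

    import themidibus.*;

    MidiBus bus;
    int cols = 8, rows = 8;
    float[][] brightness = new float[cols][rows];   // one fading value per pad

    void setup() {
      size(640, 640);
      MidiBus.list();                                // print available MIDI devices to the console
      bus = new MidiBus(this, "Launchpad", "Launchpad"); // device name is a guess, pick yours from the list
      noStroke();
    }

    void draw() {
      background(0);
      float w = width / float(cols);
      float h = height / float(rows);
      for (int x = 0; x < cols; x++) {
        for (int y = 0; y < rows; y++) {
          fill(0, 200, 255, brightness[x][y]);       // square's alpha follows its brightness
          rect(x * w, y * h, w - 2, h - 2);
          brightness[x][y] *= 0.95;                  // fade back down each frame
        }
      }
    }

    // The MidiBus calls this for every incoming note-on
    void noteOn(int channel, int pitch, int velocity) {
      // assuming the original Launchpad's X-Y layout: note = row * 16 + column
      int x = pitch % 16;
      int y = pitch / 16;
      if (x < cols && y < rows) brightness[x][y] = 255;
    }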

I’ve also been using the LiveGrabber Max for Live devices to send volume data from Ableton Live to Processing via OSC. In the bottom-left screenshot, bass frequencies determine the x-axis position of a circle and line, mid frequencies control their y-axis position, and high frequencies control the colours of the circles and background.
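
Receiving that data in Processing only takes a few lines with the oscP5 library. The port number and address patterns below are assumptions (they depend on how the LiveGrabber sender devices are set up in Live), but the mapping is the same idea as in the screenshot: bass and mid drive positions, high drives colour.

    import oscP5.*;
    import netP5.*;

    OscP5 osc;
    float bass = 0, mid = 0, high = 0;   // latest band levels from Ableton, 0..1

    void setup() {
      size(800, 600);
      // 9000 is just a guess; match the port set on the sender device in Live
      osc = new OscP5(this, 9000);
    }

    void draw() {
      background(high * 255);                    // high band tints the background
      float x = map(bass, 0, 1, 0, width);
      float y = map(mid, 0, 1, height, 0);
      stroke(255);
      line(x, 0, x, height);                     // vertical line driven by bass
      line(0, y, width, y);                      // horizontal line driven by mids
      noStroke();
      fill(255 - high * 255, high * 255, 200);
      ellipse(x, y, 60, 60);
    }

    // oscP5 calls this for every incoming OSC message
    void oscEvent(OscMessage msg) {
      if (!msg.typetag().equals("f")) return;    // only handle single-float messages
      float v = msg.get(0).floatValue();
      // these address patterns are just what I'd name the sender tracks, not LiveGrabber defaults
      if (msg.checkAddrPattern("/Bass/volume")) bass = v;
      if (msg.checkAddrPattern("/Mid/volume"))  mid  = v;
      if (msg.checkAddrPattern("/High/volume")) high = v;
    }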

In the bottom-right screenshot I edited the incredible Schizzo sketch, which draws random cityscapes in real time, so that certain button presses determine the starting point at which the first building is drawn, as well as wiping the sketch clean and starting again.
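
This isn’t the actual Schizzo code, but the hook I added boils down to something like the sketch below: one (hypothetical) pad wipes the canvas, a row of pads picks where drawing starts, and a stand-in drawing routine takes the place of Schizzo’s buildings.

    import themidibus.*;

    MidiBus bus;
    float startX = 0;        // x position where the next block gets drawn
    boolean wipe = false;    // set by a pad press, handled on the next frame

    void setup() {
      size(800, 400);
      background(20);
      bus = new MidiBus(this, "Launchpad", "Launchpad");  // device name is a guess
      noStroke();
    }

    void draw() {
      if (wipe) { background(20); wipe = false; }
      // stand-in for Schizzo's city drawing: one random block per frame, left to right
      float w = random(20, 60);
      float h = random(30, height * 0.6);
      fill(random(100, 255));
      rect(startX, height - h, w, h);
      startX += w;
      if (startX > width) noLoop();              // stop once the skyline fills the screen
    }

    void noteOn(int channel, int pitch, int velocity) {
      if (pitch == 0) {                          // hypothetical "wipe and restart" pad
        wipe = true;
        startX = 0;
        loop();
      } else if (pitch >= 1 && pitch <= 7) {     // hypothetical pads choosing the start point
        startX = map(pitch, 1, 7, 0, width * 0.8);
        loop();
      }
    }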

The sketches I made at the top are purely another way of visualising what buttons I’m pressing during a song (useful in situations where not all of the audience can see my Launchpad), whereas the bottom two examples are more about creating a piece of art on the fly via the music. My thinking behind this is that when I start the piece there will be a blank canvas on the screen, but by the time I’ve finished the song it will be a mini work of art that is unique to the song that was just played.

Me in 20 years

^ computers kick ass