Here’s an instrument I designed as part of my final-year Interaction Design module.

It takes heart rate and skin conductivity data and uses it to create and manipulate sounds.

It’s been months of work, prototyping nearly every day, but it’s finally finished.

Unfortunately the official performance, which was being assessed as part of my coursework, didn’t go so well: a power surge shut off the USB ports on my laptop mid-performance. Possibly the most frustrated and angry I’ve been in as long as I can remember.

Many thanks go out to Adam Scott; his unique style of extended and experimental vocal techniques helped mould this instrument into what it is now, and I hope we can perform with it in public again in the future.

Many thanks also to the other guys in the Interaction Design class. We’ve worked our asses off with what little we got from a highly frustrating module. Even though it brought out the worst in us at times, it was everyone’s collective effort and determination that made going to the lab every day bearable.

Ableton Live Megaset Complete!

The picture only shows about a fifth of the whole set, which contains 10 songs all MIDI-mapped to my Launchpad in one set (thanks to Autonome) and around 1000 samples spread across 165 channels! Yip, 165 different tracks, and it all runs smoothly with no blips or dropouts (thanks to no 3rd-party VSTs, no MIDI tracks since every clip is an audio file, and only 2 effects plugins for the whole set on two return channels).

The whole set runs at around 20% CPU, meaning there’s plenty of processing power left to run visuals in Processing simultaneously.

I was using Kapture to store the states of the sends for different songs, but because it had to read 165 track names through Live’s slow API, it took around 20 minutes to load. No thanks. I’ll settle for slightly vaguer send levels so I can keep set loading times under 40 seconds.

As soon as I finished my blog post about Ableton & Lion compatibility, they announced the release of Live 8.2.5, featuring full Lion multicore processing goodness as well as improved MIDI sync.

I’ve also been trying very hard to program visuals in Apple’s Quartz Composer, but it’s just too frustrating and clumsy, so I’m going to stick with Processing. I still have to close one window and open another between every song to get different visuals, which is annoying, and I can’t for the life of me get my head around ‘Mother’, a mini-program aimed at VJing in Processing that promises quick and easy switching between sketches. My other alternative is to code something like the Mother environment myself: put each of my visual sketches into its own class, place all of those classes in one Processing sketch, and add a switching mechanism via key presses, for example.

But that’s harrrrrd to code. And all to save an audience from seeing my operating system’s desktop for a split second… not worth it!
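
(If I ever do bite the bullet, the switching mechanism could be as simple as the rough sketch below, where GridVisual and CircleVisual are just placeholders for my actual sketches and the number keys do the switching.)

```
// Rough sketch of the class-per-visual idea: each visual lives in its own
// class, and the number keys switch between them. GridVisual and CircleVisual
// are placeholders for my actual sketches.

Visual[] visuals;
int current = 0;

void setup() {
  size(800, 600);
  visuals = new Visual[] { new GridVisual(), new CircleVisual() };
}

void draw() {
  visuals[current].display();
}

void keyPressed() {
  // Keys 1-9 pick a visual and reset it so the next song starts on a blank canvas
  if (key >= '1' && key <= '9') {
    int i = key - '1';
    if (i < visuals.length) {
      current = i;
      visuals[current].reset();
    }
  }
}

// Every visual implements the same two methods
interface Visual {
  void display();
  void reset();
}

class GridVisual implements Visual {
  void display() { background(0); /* grid drawing goes here */ }
  void reset() { }
}

class CircleVisual implements Visual {
  void display() { background(255); /* circle drawing goes here */ }
  void reset() { }
}
```

No window juggling, just a key press between songs.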

Sketches

^ a sketch I’ve adapted that receives volume data from Live to determine the x-y positions of circles and changes their colour depending on button presses. Based on Caroline Kassimo-Zahnd’s ‘Simplicity 4’.

I’ve been inspired lately by some videos of Making The Noise & Altitude Sickness explaining their live visual setups. Both of them use the popular creative coding environment Processing, which can accept both MIDI and Open Sound Control (OSC) messages to control visuals. So I decided to give making my own visuals a go.

The picture above contains screenshots of some of the visuals I’ve made. I wanted a visualisation that directly corresponded to button presses on the Launchpad, so I simply made a grid of squares that light up when they receive the right MIDI note-on messages. Some buttons also trigger colour changes or fades, which can change the whole look of the visualisation when a new sample is triggered.
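
A minimal version of that grid looks something like the sketch below, using The MidiBus library. The MIDI device index and the note-to-grid mapping are the bits that depend on your setup; the original Launchpad lays its pads out as row * 16 + column.

```
// Minimal lit-grid sketch using The MidiBus library.
// Assumes the Launchpad is the first MIDI input and uses the classic
// Launchpad pad layout of pitch = row * 16 + column.

import themidibus.*;

MidiBus midi;
boolean[][] lit = new boolean[8][8];

void setup() {
  size(400, 400);
  MidiBus.list();                  // prints available devices to the console
  midi = new MidiBus(this, 0, -1); // first input, no output; adjust after checking the list
}

void draw() {
  background(0);
  float w = width / 8.0;
  float h = height / 8.0;
  for (int row = 0; row < 8; row++) {
    for (int col = 0; col < 8; col++) {
      fill(lit[row][col] ? color(0, 255, 0) : color(40));
      rect(col * w, row * h, w - 2, h - 2);
    }
  }
}

// The MidiBus calls these for every note-on / note-off message
void noteOn(int channel, int pitch, int velocity) {
  int row = pitch / 16;
  int col = pitch % 16;
  if (row < 8 && col < 8) lit[row][col] = true;
}

void noteOff(int channel, int pitch, int velocity) {
  int row = pitch / 16;
  int col = pitch % 16;
  if (row < 8 && col < 8) lit[row][col] = false;
}
```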

I’ve also been using the LiveGrabber Max for Live devices to send volume data from Ableton Live to Processing via OSC. In the bottom-left screenshot, bass frequencies determine the x-axis position of a circle and line, mid frequencies control the y-axis position, and high frequencies control the colours of the circles and background.
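
Stripped right down, the OSC side looks roughly like the sketch below using oscP5. The addresses (/bass, /mid, /high) and the 0 to 1 value range are assumptions; in practice they depend on how the LiveGrabber devices are named and configured in Live.

```
// Rough sketch of the OSC-driven circle and lines, assuming LiveGrabber is
// sending volume values on /bass, /mid and /high, normalised to 0..1.

import oscP5.*;

OscP5 osc;
float bass = 0, mid = 0, high = 0;

void setup() {
  size(800, 600);
  osc = new OscP5(this, 9000);   // port must match the LiveGrabber send port
}

void draw() {
  // Highs tint the background and circle, bass and mids place the circle
  background(high * 255, 0, 80);
  float x = map(bass, 0, 1, 0, width);
  float y = map(mid, 0, 1, height, 0);
  stroke(255);
  line(x, 0, x, height);
  line(0, y, width, y);
  noStroke();
  fill(255, high * 255, 120);
  ellipse(x, y, 60, 60);
}

// oscP5 calls this for every incoming OSC message
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/bass")) bass = msg.get(0).floatValue();
  else if (msg.checkAddrPattern("/mid")) mid = msg.get(0).floatValue();
  else if (msg.checkAddrPattern("/high")) high = msg.get(0).floatValue();
}
```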

In the bottom-right screenshot I’ve edited the incredible Schizzo sketch, which draws random cityscapes in real time, so that certain button presses determine the starting point at which the first building is drawn, as well as wipe the sketch clean and start again.
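
I can’t reproduce Schizzo’s own drawing code here, but the hook I mean works roughly like the sketch below: a stand-in draw() takes the place of the real cityscape code, startX is assumed to be the variable the drawing reads, and the pad pitch numbers are arbitrary.

```
// Hypothetical version of the Schizzo hook: one pad wipes the canvas, pads
// 1-8 choose where the next building starts. The rect() in draw() is only a
// stand-in for Schizzo's actual building drawing.

import themidibus.*;

MidiBus midi;
float startX = 0;

void setup() {
  size(800, 400);
  background(255);
  midi = new MidiBus(this, 0, -1);   // first MIDI input; adjust to the Launchpad
}

void draw() {
  fill(0);
  rect(startX, height - 150, 20, 150);   // stand-in "building" at startX
}

void noteOn(int channel, int pitch, int velocity) {
  if (pitch == 0) {
    background(255);                          // wipe the sketch clean and start again
    startX = 0;
  } else if (pitch >= 1 && pitch <= 8) {
    startX = map(pitch, 1, 8, 0, width - 20); // pads 1-8 set the starting point
  }
}
```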

The sketches I made at the top are purely another way of visualising which buttons I’m pressing during a song (useful when not all of the audience can see my Launchpad), whereas the bottom two examples are more about creating a piece of art on the fly via music. My thinking is that when I start the piece there will be a blank canvas on the screen, but when I’ve finished the song it will be a mini work of art unique to the song that was just played.

Me in 20 years

^ computers kick ass