I COULD go on for ages about the visualisation system I’ve been developing for roughly two years now; over time it has grown from something rather simple into a much more flexible tool, and it hasn’t always been an easy road. Chances are that story wouldn’t interest anyone but myself, so instead I decided to post a little bird’s-eye view of my performance rig.
Here goes, TADAA:
All audio and sequencing is handled by Live, via Max4Live devices sitting on individual tracks and sending UDP telegrams to ERG, a meta-effect I have written in C++. Different geometry configurations and transitions can be triggered in perfect sync with the audio, and a number of parameters can be modulated in real time. The VJ uses a 6-DOF controller to change the state of the system directly, with immediate, and often quite surprising, results, since at the heart of ERG lies a video feedback loop which can exhibit chaotic behaviour.

While ERG can operate on pure geometry and/or a webcam input, to make things useful for a wider range of occasions I have recently implemented texture sharing with an external application via the Spout library. In this case the external media server providing the video is Resolume, but it could just as well be a vvvv patch, Max/Jitter, Cinder code or any other software supported by Spout.