Tomorrow night I will be doing a performance at the launch event for the new issue of Yuck n Yum, a publication by a group of pretty hip artists in Dundee.
This show is going to be a bit of a departure for me, as I will be doing some new things that I have not yet tried in public! Hurray! Because of this, I thought it would be fun to give a sneak peek of just what exactly I will be doing. That’s what blogs are for, right?
This picture shows most of what I will be using, conveniently labeled for you (click to enlarge).
The equipment list is:
- 3 Wii remotes with Nunchuk attachments
- one-stringed guitar, fitted with ultrasonic and light sensors
- Arduino microcontroller
- mixer, preamps, etc.
I’ve set out a process for creating sound out of all of this mess, with video as the starting point (another first for me).
The first step, then, is the video. I made a patch in Jitter for controlling two video streams independently, and overlaying them. The two videos can be warped, stretched, colored, and mixed together in real time. The output of each manipulated video stream is analyzed and sent to a synthesizer, which creates sound based on the visuals. Thus, by manipulating the videos, which are in turn generating sound, the video manipulation becomes an “instrument” of sorts.
In addition, the audio of each video can also be used independently of the visuals and passed through various effects.
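To give a rough idea of the video-to-sound step, here's a minimal sketch: analyze each frame (here, just its average brightness) and map the result onto a synth parameter. The frame data and the frequency range are made up for illustration; the real Jitter patch does its analysis and synthesis internally.

```python
def average_brightness(frame):
    """Mean pixel value of a grayscale frame (values 0-255)."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def brightness_to_frequency(brightness, low_hz=110.0, high_hz=880.0):
    """Linearly map 0-255 brightness onto an oscillator frequency range."""
    return low_hz + (brightness / 255.0) * (high_hz - low_hz)

# A tiny 2x2 "frame" of mid-gray pixels:
frame = [[128, 128], [128, 128]]
freq = brightness_to_frequency(average_brightness(frame))
```

Warping or recoloring the video changes the brightness, which changes the frequency, so playing with the picture literally plays the sound.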
The guitar, meanwhile, is fitted with various sensors that are plugged into an Arduino. These sensors will control sampling and playback of the guitar signal, creating textures underneath all of the sound from the videos.
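As a sketch of what those sensor mappings might look like (the names and ranges here are my illustration, not the actual patch): the ultrasonic sensor gives a distance, the light sensor gives a raw reading, and each gets scaled into a useful musical range.

```python
def distance_to_rate(distance_cm, near=5.0, far=50.0):
    """Clamp an ultrasonic distance to [near, far] cm and map it
    onto a sample playback rate between 0.5x and 2.0x."""
    d = max(near, min(far, distance_cm))
    t = (d - near) / (far - near)
    return 0.5 + t * 1.5

def light_to_level(reading, max_reading=1023):
    """Map a raw 10-bit light-sensor reading onto a 0.0-1.0 gain."""
    return max(0, min(max_reading, reading)) / max_reading
```

Waving a hand over the sensors then shifts the texture underneath, no knobs required.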
Finally, the webcam on my laptop will also be activated, using live footage of myself to trigger more synthesis in Jitter.
Where do the Wii remotes fit into all of this? Well, rather than sitting in front of my laptop clicking through my Jitter patch, I will be using several Wii remotes to control it all, with the ultimate goal of being able to do the whole performance without touching the computer at all. All of the video manipulation, audio effects, guitar sampling and playback, and synthesis will be controlled with the Wii remotes. The rotation of each video, for example, will be controlled by twisting and turning my left hand, while the playback speed of each video will be mapped to the movements of my right hand. A Wii remote will be attached to the guitar, so the angle of the instrument will dictate the pitch of the sample playback.
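Roughly speaking, each of those gestures boils down to reading the remote's accelerometer, turning it into an angle, and scaling that angle onto a parameter. Here's a hedged sketch of the guitar-angle-to-pitch idea; the axis conventions and the semitone range are assumptions for illustration, not the real mapping.

```python
import math

def roll_from_accel(ax, az):
    """Roll angle in degrees, derived from the remote's x and z
    accelerometer readings (in g)."""
    return math.degrees(math.atan2(ax, az))

def roll_to_semitones(roll_deg, span=12):
    """Map -90..+90 degrees of roll onto -span..+span semitones
    of pitch shift for the sample playback."""
    clamped = max(-90.0, min(90.0, roll_deg))
    return clamped / 90.0 * span
```

Tilting the guitar up a full quarter turn would sweep the sample up an octave, and back down again as it levels out.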
Does that make sense? My goal was to combine video mixing, Wii remotes, and music. If you’re in the Dundee area you should definitely come along. If not, the show will hopefully be recorded and I will definitely post the video as soon as I can.