Interactive video journal, week 1: Kinect & Max/MSP
My latest undertaking in music video production is exploring what can be done by capturing footage with the Kinect camera and manipulating it in Max/MSP and RGBDVisualize. I’m developing a music video as a collaboration with my brother-in-law (a musician) and his girlfriend (a dancer). This week I got the initial setup running to use the Kinect with Max/MSP, created a basic patch that I’ll build more complex effects onto, and recorded a few video samples of the effects to show my collaborators. I also researched the setup for RGBDVisualize, a self-contained toolkit created by a group of artists that captures video and lets you manipulate it in 3D space. I got the software set up and started playing around with it. The system requires specific mounting hardware for the Kinect and a DSLR camera, which I’ve ordered from the developers; the mount arrives Tuesday, and then I’ll calibrate the cameras so I can begin capturing video. Ultimately I intend to capture video from both Max/MSP and RGBDVisualize and edit those clips into the final music video.
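The Max patches themselves are graphical, so I can’t paste them inline, but the core idea behind the depth effects is easy to sketch. The Kinect hands you a per-pixel depth map, and the usual first step is masking out everything outside a near/far band so only the dancer remains before any visual effect is applied. Here’s a rough Python sketch of that step; the depth values and array shape are made up for illustration, and the real patches do this on the Kinect depth matrix inside Max, not in Python.

```python
import numpy as np

def silhouette_mask(depth, near=500, far=1500):
    """Keep only pixels whose depth (in mm) falls between near and far,
    isolating a foreground subject (the dancer) from the background --
    the same idea the Kinect depth patches use before applying effects."""
    return (depth >= near) & (depth <= far)

# Fake 4x4 depth frame in millimetres: the "dancer" is the 1000-mm region.
depth = np.array([
    [3000, 3000, 3000, 3000],
    [3000, 1000, 1000, 3000],
    [3000, 1000, 1000, 3000],
    [3000, 3000, 3000, 3000],
])

mask = silhouette_mask(depth)
print(mask.sum())  # 4 pixels fall inside the near/far band
```

Everything downstream (edge detection, blobs, distortion) then only has to operate on the masked region, which is also why the depth camera makes this so much easier than trying to segment the dancer out of ordinary RGB footage.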
Included below are the Max patches I’ve been working with, along with screenshots and video samples of the different effects. Each video sample has a corresponding screenshot of its Max patch, as well as the patch itself. sample_blobs, sample_snake, and sample_distortion correspond to the Max patches of the same name, while the other screenshots and the longer dancing video come from the patch depth-freenect-canny. Test_2.mov is the best sample of dancing movement with effects applied from the freenect-canny patch.
This week I’m working on improving the quality of the video output, and I’ll be talking to Peter Elsea about processing optimization and my options with Cinder. I’m essentially figuring out how to enhance these effects and combine them into more interesting visuals. I want the color to change in some way that seems driven by the dancer’s movement, so I’m going to experiment with triggering different colorspaces at key moments in the song. Now that I have working patches, I’ll start working directly with the song, considering the choreography and how to make interesting things happen in line with the music rather than focusing solely on the coding.
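The colorspace-triggering idea boils down to a cue list: a set of timestamps in the song, each mapped to the effect or colorspace that should take over at that moment. A small Python sketch of the lookup logic (the cue times and effect names here are placeholders I invented for illustration, not the actual cues for the song):

```python
# Hypothetical cue list: (song time in seconds, effect/colorspace name).
# In the patch this would drive which colorspace conversion is active.
CUES = [(0.0, "grayscale"), (32.0, "invert"), (64.0, "hsv_rotate")]

def active_effect(t, cues=CUES):
    """Return the name of the most recent cue whose time <= t,
    i.e. the effect that should be running at song time t."""
    current = cues[0][1]
    for time, name in cues:
        if t >= time:
            current = name
        else:
            break
    return current

print(active_effect(10.0))  # grayscale
print(active_effect(40.0))  # invert
```

The nice thing about keeping it as a plain cue list is that the timing can be tuned against the choreography without touching the effects themselves.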