This video, via robotmakers, demonstrates using a Microsoft Kinect as a gestural controller for modular synthesizers.
Here are the technical details:
The V Motion Project is a striking multimedia project that combines the talents of musicians, dancers, programmers, designers and animators to create a ‘visual instrument’ that uses the Microsoft Kinect to capture movement and translate it into music and visuals.
Here’s what the developers have to say about the V Motion Project and the video above:
We created and designed the live visual spectacle with a music video being produced from the results. We wanted it to be clear that the technology was real and actually being played live. The interface plays a key role in illustrating the idea of the instrument, and we designed it to highlight the audio being controlled by the dancer. Design elements like real-time tracking and samples being drawn on-screen as they are played all add to the authenticity of the performance.
The visuals are all created live and the music video is essentially a real document of the night.
Kinect MIDI Controller is an open source project designed to let you use a Microsoft Kinect as a MIDI controller.
The project uses the Microsoft Kinect SDK to track skeletal data.
The Kinect SDK provides the logic to detect the X and Y coordinates of a user’s hands. The X and Y coordinates are then scaled and converted to MIDI messages, which are sent to the MIDI output port. The tool contains a .NET wrapper for the MIDI interfacing methods provided by the winmm.dll Win32 API.
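The scaling step described above is simple to sketch: clamp a tracked coordinate to a known range, rescale it to the 7-bit MIDI value range, and pack it into a Control Change message. The function names and the 0–1 normalized axis below are illustrative assumptions, not the project's actual code.

```python
def scale_to_midi(value, lo, hi):
    """Clamp value to [lo, hi] and rescale it to the 0-127 MIDI range."""
    value = max(lo, min(hi, value))
    return int(round((value - lo) / (hi - lo) * 127))

def cc_message(channel, controller, value):
    """Build a 3-byte MIDI Control Change message: status, controller, value."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

# Example: right hand at x = 0.25 on a 0..1 normalized axis,
# sent as CC 1 (mod wheel) on MIDI channel 1.
x = 0.25
msg = cc_message(0, 1, scale_to_midi(x, 0.0, 1.0))
```

The resulting bytes are what a wrapper around winmm.dll's `midiOutShortMsg` would hand to the MIDI output port.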
Details are available at the project site.
Developer/producer Chris Vik explains a new work, Carpe Zythum, in which he uses a Microsoft Kinect to conduct a MIDI performance:
I’ve created my own software, “Kinectar”, which allows the use of the Kinect to control MIDI devices, i.e., playing notes through simple gestures and motion.
The Melbourne Town Hall Organ got a refurb in the late 90s, adding the ability for MIDI messages to activate the notes… and so, this happened.
The Kinectar Performance Platform is a toolkit that allows you to use your Microsoft Kinect sensor as a fully-fledged MIDI controller.
This is an example performance of what you can do with Ryan Challinor’s Synapse for Kinect tools combined with Ableton Live and Quartz Composer.
Synapse is an app for Mac and Windows that allows you to use your Kinect to control Ableton Live, Quartz Composer, Max/MSP/Jitter, and any other application that can receive OSC events. It sends joint positions and hit events via OSC, and also sends the depth image into Quartz Composer. This allows you to use your whole body as an instrument.
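A receiving application typically routes Synapse's OSC messages through a small dispatch table: position addresses update a joint-state map, while hit addresses trigger events. The sketch below shows that routing shape in plain Python; the exact address strings are assumptions modeled on Synapse's conventions, so check its documentation for the real ones.

```python
# Minimal OSC-style router sketch (addresses are assumptions, not
# verified Synapse message names).

def make_router():
    state = {"joints": {}, "hits": []}

    def on_joint(address, args):
        # e.g. address "/righthand_pos_body" with (x, y, z) coordinates
        state["joints"][address] = args

    def on_hit(address, args):
        # e.g. address "/righthand" with a direction like ("up",)
        state["hits"].append((address, args))

    routes = {
        "/righthand_pos_body": on_joint,
        "/lefthand_pos_body": on_joint,
        "/righthand": on_hit,
        "/lefthand": on_hit,
    }

    def dispatch(address, *args):
        handler = routes.get(address)
        if handler:
            handler(address, args)
        return state

    return dispatch

dispatch = make_router()
dispatch("/righthand_pos_body", 0.1, 0.5, 1.2)  # position update
state = dispatch("/righthand", "up")            # hit event
```

In practice the `dispatch` callable would be registered with an OSC server library listening on Synapse's output port.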
This video demonstrates the 3D Vibes – an acoustic vibraphone extended by way of 3D gesture recognition using the Microsoft Kinect.
The setup routes the Kinect data stream through Max/MSP (Kinect datastream → OSC → MIDI), controlling filter parameters in Ableton Live to modulate the acoustic audio input from the vibraphone in real time.
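When a raw skeletal position drives a filter parameter directly, tracking jitter produces audible zipper noise, so a mapping like this usually smooths the stream first. Below is one common approach, an exponential moving average feeding a clamped CC mapping; the height range and smoothing factor are illustrative assumptions, not details of the 3D Vibes patch.

```python
def smoother(alpha=0.5):
    """Return a function that exponentially smooths successive readings."""
    last = None
    def step(value):
        nonlocal last
        last = value if last is None else alpha * value + (1 - alpha) * last
        return last
    return step

def to_cc_value(height, lo=0.5, hi=2.0):
    """Map a hand height in metres (lo..hi) onto the 0-127 CC range."""
    height = max(lo, min(hi, height))
    return int(round((height - lo) / (hi - lo) * 127))

# Example: three noisy hand-height readings become a stable cutoff stream.
smooth = smoother(alpha=0.5)
readings = [1.0, 1.4, 1.2]
cutoffs = [to_cc_value(smooth(r)) for r in readings]  # -> [42, 59, 59]
```

Max/MSP patches often express the same idea with a `slide` or line-smoothing object before the MIDI output stage.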
Developed by Gabrielle Odowichuk and Shawn Trail @ MISTIC.