Z Vector is a new visualization tool that lets you use depth sensors, like the Microsoft Kinect and PrimeSense Carmine, “to sample reality in real-time, visualize it the way you like and explore it in 3D with Full HD resolution.”
Designed as a live performance tool rather than a programming environment, Z Vector can animate nearly every variable within the software using its integrated sound analysis algorithms, synthetic rhythms, or externally triggered MIDI/OSC signals.
This video demonstrates Gestrument Kinect – an app that converts Kinect data to MIDI that can be used with Gestrument for iPad.
This video, via robotmakers, demonstrates using a Microsoft Kinect as a gestural controller for modular synthesizers.
The V Motion Project is a striking multimedia project that combines the talents of musicians, dancers, programmers, designers and animators to create a ‘visual instrument’ that uses the Microsoft Kinect to capture movement and translate it into music and visuals.
Here’s what the developers have to say about the V Motion Project and the video above:
We created and designed the live visual spectacle, with a music video being produced from the results. We wanted it to be clear that the technology was real and actually being played live. The interface plays a key role in illustrating the idea of the instrument, and we designed it to highlight the audio being controlled by the dancer. Design elements like real-time tracking and samples being drawn on as they are played all add to the authenticity of the performance.
The visuals are all created live and the music video is essentially a real document of the night.
Kinect MIDI Controller is an open source project designed to let you use a Microsoft Kinect as a MIDI controller.
The project uses the Microsoft Kinect SDK to track skeletal data.
The Kinect SDK provides the logic to detect the X and Y coordinates of a user's hands. These coordinates are then scaled, converted to MIDI messages, and sent to the MIDI output port. The tool includes a .NET wrapper for the MIDI interfacing methods provided by the winmm.dll Win32 API.
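The scale-and-convert step described above can be sketched in a few lines. This is an illustrative Python sketch, not the project's actual .NET code; the coordinate range, controller numbers, and function names are all assumptions:

```python
# Hypothetical sketch of the coordinate-to-MIDI scaling described above.
# The real tool is written in .NET and sends via winmm.dll; here we only
# build the raw MIDI message bytes.

def scale_to_midi(value, lo, hi):
    """Clamp a raw coordinate into [lo, hi] and scale it to MIDI's 0-127 range."""
    value = max(lo, min(hi, value))
    return int((value - lo) / (hi - lo) * 127)

def hand_to_cc_messages(x, y, channel=0, cc_x=1, cc_y=2):
    """Convert hand X/Y coordinates into two Control Change messages.

    Each message is (status, controller, value); status byte 0xB0 is
    Control Change on the given channel. Assumed coordinate range: -1.0..1.0.
    """
    status = 0xB0 | (channel & 0x0F)
    return [
        (status, cc_x, scale_to_midi(x, -1.0, 1.0)),  # X mapped to one controller
        (status, cc_y, scale_to_midi(y, -1.0, 1.0)),  # Y mapped to another
    ]

# A hand at the sensor's centre lands in the middle of the CC range:
print(hand_to_cc_messages(0.0, 0.0))  # → [(176, 1, 63), (176, 2, 63)]
```

A real implementation would pack each tuple into a single DWORD and pass it to `midiOutShortMsg`, but the scaling logic is the interesting part.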
Details are available at the project site.
Developer/producer Chris Vik explains a new work, Carpe Zythum, in which he uses a Microsoft Kinect to conduct a MIDI performance:
I’ve created my own software, “Kinectar”, which allows the use of the Kinect to control MIDI devices, i.e. playing notes through simple gestures and motion.
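One common way to turn a gesture into a playable note is to quantize hand position to a musical scale, so any motion stays in key. The sketch below is a guess at that general idea in Python; it is not Kinectar's actual logic, and the scale, base note, and function name are invented for illustration:

```python
# Illustrative gesture-to-note mapping: quantize a normalized hand
# height to notes of a C major scale. None of this is Kinectar's code.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

def height_to_note(hand_y, base_note=48, octaves=2):
    """Map a normalized hand height (0.0 = low, 1.0 = high) to a MIDI note.

    The height range is divided into len(C_MAJOR) * octaves steps, so the
    performer always lands on an in-scale pitch.
    """
    hand_y = max(0.0, min(1.0, hand_y))
    steps = len(C_MAJOR) * octaves
    index = min(int(hand_y * steps), steps - 1)
    octave, degree = divmod(index, len(C_MAJOR))
    return base_note + 12 * octave + C_MAJOR[degree]

print(height_to_note(0.0))  # → 48 (C3, the bottom of the range)
print(height_to_note(1.0))  # → 71 (B4, the top of the range)
```

The resulting note number would then be sent as a MIDI Note On, which is how a gesture could end up triggering pipes on an organ's MIDI interface.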
The Melbourne Town Hall Organ got a refurb in the late 90s, adding the ability for MIDI messages to activate the notes… and so, this happened.
The Kinectar Performance Platform is a toolkit that allows you to use your Microsoft Kinect sensor as a fully-fledged MIDI controller.