The V Motion Project is a striking multimedia project that combines the talents of musicians, dancers, programmers, designers and animators to create a ‘visual instrument’ that uses the Microsoft Kinect to capture movement and translate it into music and visuals.
Here’s what the developers have to say about the V Motion Project and the video above:
We created and designed the live visual spectacle with a music video being produced from the results. We wanted it to be clear that the technology was real and actually being played live. The interface plays a key role in illustrating the idea of the instrument, and we designed it to highlight the audio being controlled by the dancer. Design elements like real-time tracking and samples being drawn on as they are played all add to the authenticity of the performance.
The visuals are all created live and the music video is essentially a real document of the night.
Developers have announced the Leap – a new $70 motion sensor that they say is “two hundred times more accurate than any product currently on the market.”
Like the Microsoft Kinect, the Leap is designed to translate your gestures and movement into computer control. But the developers suggest that the Kinect is a toy, compared to the Leap:
This isn’t a game system that roughly maps your hand movements.
The Leap technology is 200 times more accurate than anything else on the market – at any price point. Just about the size of a flash drive, the Leap can distinguish your individual fingers and track your movements down to 1/100th of a millimeter.
Here’s a video introduction for the Leap:
Glidepro has released Ethero 2 – a gestural MIDI controller for iOS.
The traditional Theremin is usually played without any physical contact, with the player using a hand to control the pitch and volume of the sound. Ethero 2 works by using the camera of an iOS device to translate gestures into MIDI notes.
Ethero 2 doesn’t capture position, as some Kinect-based MIDI projects have done. Instead, it senses variations in the light hitting the camera sensor, so moving your hand or other object in front of the camera will vary the pitch of the notes.
You can use the MIDI output of Ethero 2 via the Camera Connection Kit, MIDI over WiFi, or virtual MIDI ports to control apps on the same device.
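The light-sensing approach described above can be sketched in a few lines: average the brightness of a camera frame, then map that value onto a MIDI note range. This is a hypothetical illustration of the general technique, not Ethero 2’s actual code – the note range and the simple mean-brightness measure are assumptions.

```python
def mean_brightness(frame):
    """Average brightness of a grayscale frame (rows of 0-255 pixel values),
    normalized to the range 0.0-1.0."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / (len(pixels) * 255.0)

def brightness_to_midi_note(brightness, low_note=48, high_note=84):
    """Map a 0.0-1.0 brightness reading onto a MIDI note number,
    so more light hitting the sensor means a higher pitch."""
    brightness = max(0.0, min(1.0, brightness))
    return low_note + round(brightness * (high_note - low_note))
```

In practice you would read frames from the camera API, compute the brightness each frame, and send a MIDI note-on whenever the mapped note changes.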
Developer/producer Chris Vik explains a new work, Carpe Zythum, in which he uses a Microsoft Kinect to conduct a MIDI performance:
I’ve created my own software, “Kinectar”, which allows the use of the Kinect to control MIDI devices, i.e. playing notes through simple gestures and motion.
The Melbourne Town Hall Organ got a refurb in the late ’90s, adding the ability for MIDI messages to activate the notes… and so, this happened.
The Kinectar Performance Platform is a toolkit that allows you to use your Microsoft Kinect sensor as a fully-fledged MIDI controller.
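The core idea of a Kinect-based MIDI controller – playing notes through simple gestures – can be sketched as mapping a tracked hand’s height onto notes in a scale, so raising the hand plays higher pitches. This is a hypothetical sketch of the general technique, not Kinectar’s implementation; the C-major scale, root note, and normalized 0.0–1.0 joint coordinate are all assumptions.

```python
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees as semitone offsets

def hand_y_to_note(y, root=60, octaves=2, scale=C_MAJOR):
    """Quantize a normalized hand height (0.0 = low, 1.0 = high) to a
    MIDI note in the given scale above the root (60 = middle C)."""
    y = max(0.0, min(1.0, y))
    steps = len(scale) * octaves
    i = min(int(y * steps), steps - 1)          # which scale step
    octave, degree = divmod(i, len(scale))      # split into octave + degree
    return root + 12 * octave + scale[degree]
```

Quantizing to a scale rather than mapping height to raw pitch is what makes gestural control musically forgiving: small tracking jitter lands on the same note instead of producing a wobble.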