The Computer Orchestra is a crowdsourcing platform, created by Simon de Diesbach, Jonas Lacôte and Laura Perrenoud, that allows users to create and conduct their own orchestra.
Users can choose to upload their own music or download samples to integrate into their formation. Once the ‘orchestra’ is configured, users can direct it with the movements of their body.
Here’s a demo of the Computer Orchestra:
This video demonstrates Gestrument Kinect – an app that converts Kinect data to MIDI that can be used with Gestrument for iPad.
This video, via robotmakers, demonstrates using a Microsoft Kinect as a gestural controller for modular synthesizers.
Here are the technical details:
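The full technical details aren’t reproduced here, but the heart of a Kinect-to-synth pipeline like this is mapping a tracked joint coordinate into a 7-bit MIDI controller value (which a MIDI-to-CV interface can then turn into a control voltage for the modular). A minimal sketch, assuming a hand Y coordinate already normalized to a known range (the function name and range are illustrative, not taken from the video):

```python
def hand_to_cc(y, y_min=0.0, y_max=1.0):
    """Map a normalized hand height to a 0-127 MIDI CC value.

    y is assumed to be a Kinect-tracked hand Y coordinate, already
    normalized to [y_min, y_max] (a hypothetical range, not the
    project's actual calibration).
    """
    # Clamp so out-of-range readings don't produce invalid CC values
    y = max(y_min, min(y_max, y))
    # Scale into the 7-bit MIDI controller range
    return round((y - y_min) / (y_max - y_min) * 127)

# Example readings: bottom, middle and top of the tracked range
print([hand_to_cc(v) for v in (0.0, 0.5, 1.0)])  # -> [0, 64, 127]
```

In a real setup this value would be sent as a MIDI Control Change message each frame, with some smoothing to keep the synth from stepping audibly between values.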
The V Motion Project is a striking multimedia project that combines the talents of musicians, dancers, programmers, designers and animators to create a ‘visual instrument’ that uses the Microsoft Kinect to capture movement and translate it into music and visuals.
Here’s what the developers have to say about the V Motion Project and the video above:
We created and designed the live visual spectacle with a music video being produced from the results. We wanted it to be clear that the technology was real and actually being played live. The interface plays a key role in illustrating the idea of the instrument and we designed it to highlight the audio being controlled by the dancer. Design elements like real time tracking and samples being drawn on as they are played all add to authenticity of the performance.
The visuals are all created live and the music video is essentially a real document of the night.
Developers have announced the Leap – a new $70 motion sensor that they say is “two hundred times more accurate than any product currently on the market.”
Like the Microsoft Kinect, the Leap is designed to translate your gestures and movement into computer control. But the developers suggest that the Kinect is a toy, compared to the Leap:
This isn’t a game system that roughly maps your hand movements.
The Leap technology is 200 times more accurate than anything else on the market — at any price point. Just about the size of a flash drive, the Leap can distinguish your individual fingers and track your movements down to 1/100th of a millimeter.
Here’s a video introduction for the Leap:
Glidepro has released Ethero 2 – a gestural MIDI controller for iOS.
The traditional Theremin is usually controlled without any physical contact, with the player using a hand to control the pitch and volume of the sound. Ethero 2 works similarly, using the camera of an iOS device to convert gestures to MIDI notes.
Ethero 2 doesn’t capture position, as some Kinect-based MIDI projects have done. Instead, it senses variations in the light hitting the camera sensor, so moving your hand or other object in front of the camera will vary the pitch of the notes.
You can use the MIDI output of Ethero 2 via the Camera Connection Kit, MIDI over WiFi, or virtual ports to control apps on the same device.
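Ethero 2’s actual mapping isn’t published, but the light-variation idea described above can be sketched simply: average the brightness of each camera frame, then bucket that average into a scale of MIDI notes. Everything here (the function, the pentatonic scale, the toy 2×2 “frames”) is an illustrative assumption:

```python
def brightness_to_note(frame, scale=(60, 62, 64, 67, 69)):
    """Pick a MIDI note from the average brightness of a camera frame.

    frame is a hypothetical grayscale image: rows of 0-255 pixel
    values. The scale defaults to a C major pentatonic around
    middle C, purely for illustration.
    """
    pixels = [p for row in frame for p in row]
    avg = sum(pixels) / len(pixels)       # mean brightness, 0..255
    index = int(avg / 256 * len(scale))   # bucket into the scale
    return scale[min(index, len(scale) - 1)]

dark = [[10, 20], [15, 25]]       # e.g. a hand covering the lens
light = [[240, 250], [245, 235]]  # lens uncovered
print(brightness_to_note(dark), brightness_to_note(light))  # -> 60 69
```

Moving a hand toward or away from the lens changes the average brightness frame by frame, which is enough to sweep through the scale without any positional tracking.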
Developer/producer Chris Vik explains a new work, Carpe Zythum, in which he uses a Microsoft Kinect to conduct a MIDI performance:
I’ve created my own software, “Kinectar”, which allows the use of the Kinect to control MIDI devices, i.e. playing notes through simple gestures and motion.
The Melbourne Town Hall Organ got a refurb in the late 90s, adding the ability for MIDI messages to activate the notes… and so, this happened.
The Kinectar Performance Platform is a toolkit that allows you to use your Microsoft Kinect sensor as a fully-fledged MIDI controller.
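Kinectar’s internals aren’t detailed above, but “playing notes through simple gestures” typically means watching a joint’s motion for a sharp movement and firing a note when it crosses a threshold. A minimal sketch of that idea, assuming a stream of hand depth (Z) readings in meters from a Kinect skeleton feed — the function, units and threshold are assumptions, not Kinectar’s actual logic:

```python
def detect_hits(z_positions, threshold=0.15):
    """Return the frame indices where a 'hit' gesture occurs.

    z_positions: successive hand Z coordinates (meters, hypothetical
    units). A frame-to-frame push toward the sensor larger than
    `threshold` counts as one hit; each hit would trigger a MIDI
    note-on in a real controller.
    """
    hits = []
    for i in range(1, len(z_positions)):
        if z_positions[i - 1] - z_positions[i] > threshold:
            hits.append(i)  # hand moved sharply toward the sensor
    return hits

# Two slow drifts and two sharp pushes toward the sensor
stream = [2.0, 1.98, 1.7, 1.68, 1.4, 1.39]
print(detect_hits(stream))  # -> [2, 4]
```

A velocity threshold like this is what separates deliberate “strikes” from the continuous drift of a hand held in the air, which is why gesture-to-note mappings feel playable rather than jittery.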
Yonac Software sent word to us of GhostGuitar – a new app for the iPad that takes the device into Kinect-style augmented reality territory.
We’ve previously featured a variety of Kinect music hacks that explore the ‘augmented reality’ potential of the device.
GhostGuitar is a music game for iOS that explores similar territory, creating a virtual ‘air guitar’ that can actually be played – at least at music game level.