Berlin-based Nagual Sounds has developed a new gestural musical instrument, Nagual Dance, based on the Microsoft Kinect.
The system is based on two main components:
- Control data – from sensors, cameras, files and all kinds of sources – is used to turn gestures and movement into sounds; and
- The Soundscape – an interactive piece of music that consists not of pre-recorded musical sequences, but of musical possibilities. The Soundscape determines how the incoming control data behaves musically.
In the example above, Nagual Dance uses a Kinect camera to control the Soundscape ‘Firedance’.
Here’s how it works:
The Kinect tracks your movements and sends them as data to a computer. The Nagual Dance software translates this data into music, in real time.
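The movement-to-sound step can be pictured as comparing successive skeleton frames and firing a sound whenever a tracked limb moves sharply enough. Here's a minimal sketch of that idea; the joint names, threshold, and trigger logic are illustrative assumptions, not Nagual Dance's actual implementation:

```python
# Hypothetical sketch of a Kinect-style gesture-to-trigger pipeline.
# The threshold and joint names are assumptions for illustration only.

TRIGGER_THRESHOLD = 0.4  # minimum per-frame movement (metres) to fire a sound


def frame_to_triggers(prev_frame, curr_frame):
    """Compare two skeleton frames and return the joints that moved
    fast enough to trigger their assigned sound element."""
    triggers = []
    for joint, (x, y, z) in curr_frame.items():
        px, py, pz = prev_frame[joint]
        # Euclidean distance the joint travelled between frames
        distance = ((x - px) ** 2 + (y - py) ** 2 + (z - pz) ** 2) ** 0.5
        if distance >= TRIGGER_THRESHOLD:
            triggers.append(joint)
    return triggers


# Two simulated skeleton frames: only the right hand moves sharply
prev = {"right_hand": (0.0, 1.0, 2.0), "left_foot": (0.2, 0.0, 2.0)}
curr = {"right_hand": (0.5, 1.3, 2.0), "left_foot": (0.21, 0.0, 2.0)}

print(frame_to_triggers(prev, curr))  # only right_hand crosses the threshold
```

A real system would run this at the camera's frame rate and route each trigger to the instrument assigned to that limb.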
The software works with one or two players. Specific sound elements (instruments) are assigned to certain limbs of your body. What you play with it is totally up to you.
The music is created in Soundscapes (interactive music pieces), a new music format. This Soundscape is called Firedance. The Soundscapes are not a set of pre-recorded loops, but are generative music rules that outline the musical possibilities of the composition.
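One way to picture "generative rules instead of loops" is that the Soundscape constrains what a gesture *can* play, so any movement lands on a musically valid note. This sketch quantises a hand height to a scale; the scale and mapping are assumptions for illustration, not the actual Firedance rule set:

```python
# Illustrative sketch: a Soundscape as a rule set rather than a loop.
# The scale choice and height mapping below are hypothetical.

A_MINOR_PENTATONIC = [57, 60, 62, 64, 67, 69, 72]  # MIDI note numbers


def gesture_to_note(hand_height, scale=A_MINOR_PENTATONIC):
    """Quantise a normalised hand height (0.0 low .. 1.0 high) to a
    note that is guaranteed to fit the Soundscape's scale."""
    hand_height = max(0.0, min(1.0, hand_height))
    index = round(hand_height * (len(scale) - 1))
    return scale[index]


print(gesture_to_note(0.0))  # lowest note of the scale (57)
print(gesture_to_note(1.0))  # highest note of the scale (72)
```

Because the output is always drawn from the composition's own rules, untrained dancers can't play a "wrong" note, which is the appeal of the format.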
In the video demo, the left player plays the drum elements.
He’s got Percussion on his feet, Kick + Snare on his right hand and Hi-Hat + Shaker on his left hand.
The right player plays the melodies. She’s got chords on her right foot, percussive synths on her left foot, the bass on her right hand and a Pad + Lead on her left hand.
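The limb-to-instrument assignments from the demo can be written out as a simple lookup table. This is a hypothetical representation for clarity, not the app's internal format:

```python
# Limb-to-instrument assignments from the two-player demo,
# as a plain lookup table (hypothetical representation).

MAPPING = {
    "player_left": {
        "feet": "percussion",
        "right_hand": "kick + snare",
        "left_hand": "hi-hat + shaker",
    },
    "player_right": {
        "right_foot": "chords",
        "left_foot": "percussive synths",
        "right_hand": "bass",
        "left_hand": "pad + lead",
    },
}


def instrument_for(player, limb):
    """Look up which sound element a given limb controls."""
    return MAPPING[player][limb]


print(instrument_for("player_right", "right_hand"))  # bass
```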
Nagual Dance is still in development, but Nagual Sounds’ Head of Design, Georgios Delkos, told us that the company is working on an IndieGoGo campaign to crowdfund the project.
One of the more interesting aspects of this project is the Soundscape format for generative compositions. We’ve followed up with Nagual Sounds to see if we can get more details.
More examples and details are available at the Nagual Sounds site.