Beatoven is a tangible music controller project that lets users ‘cook’ with beats, using objects as ‘ingredients’ to be mixed.
The purpose was to create a new instrument as nuanced, and as easy to hold, as the violin. At the same time, we wanted a modern instrument with electronic sounds as well as visual output. Design and usability issues were of equal importance.
The Sormina consists of eight keys that are rolled with the fingers of the left and right hands. Sensor data from the keys is transmitted wirelessly to the computer as MIDI controller messages, and the keys control the parameters of sound and video software created specifically for the Sormina.
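As a rough illustration of what those MIDI controller messages look like, here is a minimal sketch that decodes a three-byte Control Change message and normalizes its value for a software parameter. The function name and the normalization are assumptions for illustration, not part of the Sormina's actual software.

```python
# Hypothetical sketch: decoding a MIDI Control Change (CC) message like those
# a controller such as the Sormina's keys might send. A CC message is three
# bytes: status (0xB0 | channel), controller number, and value (0-127).

def decode_control_change(data: bytes):
    """Return (channel, controller, normalized 0..1 value) for a 3-byte CC message."""
    status, controller, value = data
    if status & 0xF0 != 0xB0:
        raise ValueError("not a Control Change message")
    channel = status & 0x0F
    return channel, controller, value / 127.0

# e.g. key 3 rolled roughly halfway, on MIDI channel 1:
channel, cc, amount = decode_control_change(bytes([0xB0, 3, 64]))
print(channel, cc, round(amount, 2))  # 0 3 0.5
```

The normalized 0..1 value can then be scaled to whatever range the receiving sound or video parameter expects.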
Here’s a short video that offers a demonstration of the Sormina:
This is a functional demo of the sensors of textile designer Lara Grant’s Ruffletron, a wearable musical interface.
The Ruffletron is a prototype of a wearable musical interface and an experiment in performative interaction. The project was created in collaboration with Cullen Miller.
This video, via Nay-Seven, is a demo of using a Rock Band 3 Fender Mustang as an alternative music controller:
Here’s a patch I’ve made in Usine to use the Mustang MIDI guitar from Rock Band 3. With this patch you can use open tunings, trigger effects or samples with the guitar’s buttons, and send the Y and Z accelerometer data to effects.
The patch will be available as an add-on on the website.
The Rock Band 3 Fender Mustang controller sells for around $125.
Features for musicians include:
- 17-fret touch-sensitive neck with 6 buttons per fret – for 102 active finger positions
- 6 low-latency strings for “authentic” note strumming
- Advanced tilt sensor
- Use as MIDI Guitar Controller when not playing Rock Band (compatible with most MIDI sequencers)
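The 102 active finger positions follow directly from the neck layout: 17 frets times 6 buttons per fret. A hypothetical sketch of how such a grid could map to MIDI note numbers, assuming standard guitar tuning (the actual firmware mapping is not documented here):

```python
# Sketch: mapping the Mustang's fret/string grid to MIDI notes, assuming
# standard guitar tuning. The function and layout are illustrative only.

OPEN_STRINGS = [40, 45, 50, 55, 59, 64]  # E2 A2 D3 G3 B3 E4 as MIDI note numbers

def fret_to_note(string: int, fret: int) -> int:
    """string 0-5 (low E to high E), fret 0 (open) through 17."""
    if not (0 <= string < 6 and 0 <= fret <= 17):
        raise ValueError("string or fret out of range")
    return OPEN_STRINGS[string] + fret  # each fret raises the pitch one semitone

print(fret_to_note(0, 0))   # 40: open low E
print(fret_to_note(5, 17))  # 81: high E string at the 17th fret
```

Counting only the fretted positions (frets 1 through 17) gives 17 × 6 = 102, matching the spec above.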
This video is a demonstration of Richard Hoadley’s Gaggle, an experimental generative sound interface.
Here’s what Hoadley had to say about the Gaggle Generative Sound Interface:
The Gaggle prototype has been imagined, designed and developed in order to experiment personally with such interfaces, and primarily with the link between sensors (in this case ultrasonic ‘pings’), a physical computing board (in this case an Arduino) and the SuperCollider audio language.
Gaggle provides an opportunity to investigate performance with the interface, including questions such as:
- Does the number of sensors affect the nature of the interface? Does increasing the number of sensors to a point where they are difficult to control consciously affect performativity?
- Does the relative position of the sensors affect the result? In particular, these ultrasound sensors can interfere with each other, especially when designing for movement such as that created by dancers.
- How does the type of movement to be used with the interface affect the use and design of the interface? For instance, in this case, how is the direction of the sensors affected and what difference does this make?
- Interplay between physical implementation and software algorithms: for instance, does the physical nature of the interface need to be reflected in its performance results? Of course, all the usual issues concerning algorithmic composition and structuring arise at this point.
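The sensor-to-sound chain Hoadley describes runs from ping sensors through an Arduino to SuperCollider. As a hypothetical sketch of the mapping step in the middle (Hoadley's actual code is not shown here, and all ranges are illustrative), this normalizes a raw ultrasonic echo time and maps it to a pitch:

```python
# Illustrative sketch: turning an ultrasonic sensor's echo time into a
# control value and then a frequency, the kind of mapping the Arduino or
# the host software might perform before the data reaches SuperCollider.

def echo_to_control(echo_us: float, min_us: float = 150.0, max_us: float = 18000.0) -> float:
    """Clamp an echo time (microseconds) to the usable range and normalize to 0..1."""
    clamped = max(min_us, min(max_us, echo_us))
    return (clamped - min_us) / (max_us - min_us)

def control_to_freq(x: float, low: float = 110.0, high: float = 880.0) -> float:
    """Map 0..1 exponentially across three octaves, 110 Hz to 880 Hz."""
    return low * (high / low) ** x

x = echo_to_control(9075.0)           # a hand roughly mid-range above the sensor
print(round(x, 2), round(control_to_freq(x), 1))  # 0.5 311.1
```

An exponential pitch mapping is the usual choice here, since equal physical movements then produce equal musical intervals.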
Pat Arneson’s VidiSynth is a DIY video controlled synthesizer.
The light sensors create interesting and complex sounds, based on the intensity of different areas of the screen. If the sensors are attached to an LCD screen you get relatively normal square-wave tones, but if you use a CRT screen (TV or monitor) you get ‘extra noisy and buzzy goodness’ because of the way the screen refreshes.
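The basic VidiSynth idea can be sketched in a few lines: the brightness under a sensor sets the frequency of a square-wave oscillator. The brightness-to-frequency mapping below is an assumption for illustration, not the circuit's actual behavior (the real device does this in analog hardware, where CRT refresh flicker modulates the tone).

```python
# Minimal sketch of the VidiSynth concept: brightness drives the pitch of a
# square wave. Mapping and ranges are illustrative assumptions.

def brightness_to_freq(b: float, low: float = 100.0, high: float = 2000.0) -> float:
    """Map a sensed brightness 0..1 linearly onto an audible frequency range."""
    return low + b * (high - low)

def square_wave(freq: float, sample_rate: int = 44100, n: int = 8):
    """First n samples of a square wave at the given frequency."""
    period = sample_rate / freq
    return [1.0 if (i % period) < period / 2 else -1.0 for i in range(n)]

f = brightness_to_freq(0.5)   # mid-grey screen area -> 1050.0 Hz
print(square_wave(f, n=6))
```

On an LCD the brightness, and hence the pitch, stays steady; a CRT's scanning refresh makes the sensed intensity flicker, which is one way to understand the ‘noisy and buzzy’ result.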
In the video, Paul Sobczak demonstrates his expanded version of the VidiSynth.
Details on the VidiSynth project are available here.