This is a little off topic, but it’s fascinating research that could have applications for music. Chris Harrison, a PhD student, is researching the use of sound and gestures to turn almost any surface into an input device.
Here’s the summary of his project:
Scratch Input: Creating Large, Inexpensive, Unpowered and Mobile Finger Input Surfaces
We present Scratch Input, an acoustic-based input technique that relies on the unique sound produced when a fingernail is dragged over the surface of a textured material, such as wood, fabric, or wall paint. We employ a simple sensor that can be easily coupled with existing surfaces, such as walls and tables, turning them into large, unpowered and ad hoc finger input surfaces. Our sensor is sufficiently small that it could be incorporated into a mobile device, allowing any suitable surface on which it rests to be appropriated as a gestural input surface. Several example applications were developed to demonstrate possible interactions. We conclude with a study that shows users can perform six Scratch Input gestures at about 90% accuracy with less than five minutes of training and on a wide variety of surfaces.
While this is still very experimental, it’s easy to imagine cheap microphone sensors like this turning everyday surfaces into giant controllers for electronic music.
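To make the core idea a bit more concrete: a fingernail scratch shows up in a microphone signal as a burst of high-frequency noise. The sketch below is not Harrison’s actual pipeline, just a toy illustration under simple assumptions (a first-difference high-pass filter and an energy threshold) of how scratch events could be picked out of an audio stream.

```python
def scratch_events(samples, rate, threshold=0.1, min_gap=0.05):
    """Report times (in seconds) where scratch-like noise bursts occur.

    Toy approach (an assumption, not the paper's method):
    - first difference as a crude high-pass filter, since scratches
      are broadband and high-frequency while hum and taps are not
    - short sliding-window energy envelope
    - threshold crossings, spaced at least min_gap seconds apart
    """
    # First difference suppresses low-frequency content.
    diff = [samples[i] - samples[i - 1] for i in range(1, len(samples))]

    win = max(1, int(0.01 * rate))  # 10 ms energy window
    events, last = [], -min_gap
    energy = 0.0
    for i, x in enumerate(diff):
        energy += x * x
        if i >= win:
            energy -= diff[i - win] ** 2  # slide the window forward
        t = i / rate
        if energy / win > threshold and t - last >= min_gap:
            events.append(round(t, 3))
            last = t
    return events


if __name__ == "__main__":
    # Synthetic test signal: 1 s of silence, a 0.1 s noisy "scratch",
    # then 1 s of silence.
    import random
    random.seed(0)
    rate = 8000
    quiet = [0.0] * rate
    burst = [random.uniform(-1.0, 1.0) for _ in range(rate // 10)]
    signal = quiet + burst + quiet
    print(scratch_events(signal, rate))  # event times near t = 1.0 s
```

A real system would need more than this, of course: the paper classifies six distinct gestures, which implies looking at the temporal and spectral shape of each burst rather than just its presence.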