Lessons In Digital Instrument Design From Star Trek

Organizers of the Audio Developer Conference – an annual event focusing on audio development technologies – shared this keynote presentation from ADC22, Where few designs have gone before: lessons in digital instrument design from Star Trek.

The presentation features Dr. Astrid Bin, a music technology researcher at Ableton and a founding developer of the Bela.io platform.

Bin offers an overview of the instruments of Star Trek and how they have been used as storytelling devices in each Star Trek series. Then Bin shares what she learned in recreating one of Star Trek’s fictional instruments as a digital musical instrument.


KEYNOTE: The Musical Instruments of Star Trek – Astrid Bin – ADC22

In the futuristic universe of Star Trek there are a lot of musical instruments, and many of them use far-future technology. The designers of these instruments never intended them to actually work, and were therefore led by their imaginations and not by the limitations of earthly technology – the opposite of instrument design today, which tends to be heavily influenced by the affordances of the technology we have to hand.

In this talk music technology researcher and theorist Astrid Bin explains how she explored this imagination-first process of instrument design by recreating an instrument, as faithfully as possible, from the show. Through the process – from discovering the instrument, to getting input from the show’s original production designer, to figuring out how to make the instrument’s behaviour true to the original intentions (but using primitive 21st century embedded sensors and computers) – she describes what she learned about designing real digital musical instruments through trying to recreate an imaginary one.


15 thoughts on “Lessons In Digital Instrument Design From Star Trek”

  1. TOS is the only Star Trek worth talking about. “The Way to Eden” was a great episode. Sevrin’s ears always gave me the willies.

      1. yeah, wesley crusher and tasha yar really hit it out of the park. heh. tng had waaaaay too many characters; most were bad. that transporter chief was so bad, they promoted him on DS9. borg were cool, and that one dyson sphere episode with scotty.

      2. It’s been a while since I last watched, but the only musical instruments I can remember from TNG were that “Wurlitzer” played by the four-armed lady (with singing by Worf son of Mogh – indicating that Bob Mogh was actually a Klingon, just like William Shakespeare), Riker’s trombone, and Picard’s recorder, all seemingly ancient acoustic instruments. Maybe EDM finally fell out of favor after TOS, and the Federation finally became truly civilized.

        1. yeah, pretty silly premise to me: let’s sit in one place and let everything go by us. it’ll be cheaper to produce, and all we need is a wormhole to introduce any soap opera plot we like. it was just Babylon 5 all over again.

          TOS is still the best. Tranya Captain! get to a dentist!

  2. I’m not sure exactly why, but I really liked this woman’s “TED Talk”. I’m sure that some of it has to do with her effervescent personality and her obvious love of what she is doing (or maybe it’s just that I enjoy watching somebody who gesticulates more than I do when giving a lecture 🙂 ). Anyway, I was fascinated with the train of thought that gave rise to the development of this instrument. Although there isn’t much about the prototype that I wouldn’t consider crude, I think that the idea behind it, and the technology she is exploiting in its development, is extraordinary. While it is relatively easy to strap some electrodes to a person’s scalp and extract some consistent control signals from the underlying EEG, converting “mood” signals from peripheral body parts has not been well explored. Personally, I’m anxiously awaiting Apple’s upcoming VR/AR headset that, apparently, will have sensors that will allow pupillometry (i.e., will be able to follow gaze direction, gaze vector acceleration, and changes in pupil diameter). Somebody important once said that the eyes are the window to the soul (or something like that). As far as electronic instrument control goes, I think that Apple’s upcoming technology may really represent the next step in mind/brain interfaces, with nary one scalp electrode.

    1. John, have you ever tried to *do* anything with EEG? I spent a year with a trained psychologist and professional equipment; it was a lot easier just to push a button to get a reaction you could rely on. it was about as predictable as a human brain is.

      and this was just trying to get a blip on a screen to move in one cardinal direction – insanely hard, and mostly random. about the only time it did what I wanted, I was on the verge of falling asleep. lol.

      1. Your problem may have been the “trained psychologist with professional equipment” 🙂 To answer your question, I have had a lot of success in controlling things using signals extracted from EEG since about 1978. Two of the members of the board of the small “electronics and computer interfacing” company I was vice president and head software engineer of were also members of my band at the time, and were also neuroscience grad students with me. One night, we thought it would be a great idea if we built a device that allowed members of the audience to exert control over some of my on-stage synths using their EEG. Thus was born the CrystaLogic “MindMeld” (seriously, that’s what we called it and it WAS inspired by Spock’s telepathy abilities). Unfortunately, we never actually brought it to market, but we managed to build one very successful prototype. Since we were nowhere near the level of digital hardware and processing systems we have today, I designed the whole thing as a completely analog device. One of our hardware design people was a linear electronics engineering genius. He came up with a 4-channel headset electrode amplifier that used precision op-amps encased in the headband and sent noise-free signals back to the control box. Based on the requirements of using only the power in the lowest predominant EEG bands (e.g., Delta (0.5–2.5 Hz), Theta (2.5–6.5 Hz), Lo-Alpha (7–10 Hz), and Hi-Alpha (10–12.5 Hz)), we developed band-pass filters that were giving us a slope of about 36 dB/oct. The outputs of the filters were sent into current-to-voltage converters tuned for the specific ranges of control (e.g., the Delta signals were scaled to fit the 1V/oct range corresponding to -C3 to -C2 on a Minimoog) and all derived control voltages were quantized to standard tuning. In theory, this should have allowed control of four synths, and even allowed the playing of chords.
To be honest, although the device worked quite well, getting my EEG (I was the only person who tried it because I had all of the synths) to output anything that resulted in anything “musical” required an immense amount of training. Trust me, this was nothing like playing a Theremin, but the feedback principle is similar.
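        The signal chain described above (band-pass filters → band power → scaled, quantized 1 V/oct control voltages) translates naturally into a digital sketch. Here is a minimal pure-Python version, assuming a DFT-based power estimate in place of the original analog filters; the band edges are the ones named in the comment, while the sample rate, scaling, and function names are purely illustrative:

        ```python
        import math

        # The four EEG bands named above (Hz); treated as half-open
        # intervals so no DFT bin is counted in two bands.
        BANDS = {
            "delta": (0.5, 2.5),
            "theta": (2.5, 6.5),
            "lo_alpha": (7.0, 10.0),
            "hi_alpha": (10.0, 12.5),
        }

        def band_power(samples, sample_rate, lo_hz, hi_hz):
            """Average power of the DFT bins whose frequency lies in [lo_hz, hi_hz)."""
            n = len(samples)
            total, count = 0.0, 0
            for k in range(1, n // 2):
                freq = k * sample_rate / n
                if lo_hz <= freq < hi_hz:
                    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
                    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
                    total += (re * re + im * im) / n
                    count += 1
            return total / count if count else 0.0

        def power_to_cv(power, full_scale, octave_range=1.0):
            """Map a band power onto a 1 V/oct control voltage, quantized
            to semitone steps (1/12 V), like the original quantized CVs."""
            frac = max(0.0, min(1.0, power / full_scale))
            return round(frac * octave_range * 12) / 12

        # One second of a 10 Hz test "alpha" tone at 128 samples/s:
        # its power should land in the hi-alpha band.
        sr = 128
        sig = [math.sin(2 * math.pi * 10 * i / sr) for i in range(sr)]
        powers = {name: band_power(sig, sr, lo, hi) for name, (lo, hi) in BANDS.items()}
        ```

        A real implementation would of course stream windows of samples and smooth the per-band power before converting it to a voltage, but the mapping itself is just this scale-then-quantize step.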

        Jump ahead about 42 years to an amazing little device I found on Amazon called the NeuroSky “MindWave” that I use almost exclusively in my EEG lab at the university. Basically, it’s a single-channel device with amazing accuracy (e.g., I’ve gotten much better and more reliable signals from it than I was used to getting from any one channel of a $250,000 48-channel “research-grade” EEG machine that Uncle Sam purchased for me). The amazing thing about it is that it can be controlled using OpenViBE (a brain-computer interface software package developed under French government funding). Since the NeuroSky drivers are available in OpenViBE, you can manipulate the one available EEG channel to do some pretty amazing things, because the software is built to control things (and synths are things). Also, because we now have MIDI, the filter tuning and all of the analog converters aren’t necessary, so it is pretty easy to recreate the MindMeld as a MIDI control device. So, the final answer is… yes, I’ve tried it and it works (but it still requires a lot of practice to get it to work in any useful way). However, controlling a synth with it isn’t any more complex than controlling a wheelchair or an electronic arm (which we have done in the lab).
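        Recreating the MindMeld as a MIDI control device, as described above, comes down to one scaling step: the MindWave-style reading (assumed here to be a 0–100 value, as NeuroSky’s eSense metrics are) becomes a 7-bit MIDI Control Change. A hypothetical sketch; the function name and parameters are illustrative and not part of any real NeuroSky or OpenViBE API:

        ```python
        def attention_to_cc(attention, cc_number=1, channel=0):
            """Scale a 0-100 attention reading to a 3-byte MIDI Control Change.

            Byte layout: status (0xB0 | channel), controller number, 7-bit value.
            """
            value = max(0, min(127, round(attention * 127 / 100)))
            return bytes([0xB0 | (channel & 0x0F), cc_number & 0x7F, value])
        ```

        In practice the BCI software would stream readings continuously and these three bytes would be written to a MIDI output port, with some smoothing so the synth parameter doesn’t jitter with every sample.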

  3. Finally someone made the presentation I’ve been waiting my whole life to hear!! Although Part Two needs to be the Instruments of the Jetsons. Star Wars? Not so much…
