Thought-Controlled Synthesis With A Bionic Arm

Synthesist Bertolt Meyer shared this video, which documents his quest to turn his prosthetic arm into a bionic modular synthesizer interface.

Meyer was born without a lower left arm and has a robotic prosthetic hand that’s controlled by signals generated by the muscles in his arm. While it’s great for many tasks, the prosthetic hand doesn’t offer enough fine control to accurately and quickly change settings on a synthesizer.

Meyer decided to see if he could come up with a better solution and, with the help of Chrisi from KOMA Elektronik, created a unique tool for thought-controlled synthesis – the SynLimb.

Here’s what he has to say about it:

Together with Chrisi from KOMA Elektronik and my husband Daniel, I am in the process of building a device (the “SynLimb”) that attaches to my arm prosthesis instead of the prosthetic hand.

The SynLimb converts the electrode signals that my prosthesis picks up from my residual limb into control voltages (CV) for controlling my modular synthesizer. The SynLimb thus allows me to plug my prosthesis directly into my synthesizer so that I can control its parameters with the signals from my body that normally control the hand. For me, this feels like controlling the synth with my thoughts.
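The core idea — translating a muscle (EMG) electrode signal from the residual limb into a control voltage — can be sketched as a simple mapping. The function below is a hypothetical illustration, not Meyer’s actual implementation; the calibration range and the 0–5 V CV output are assumptions for the sake of the example.

```python
# Hypothetical sketch of the SynLimb's core idea: mapping a rectified
# EMG envelope sample (the kind of signal that normally drives the
# prosthetic hand) to a control voltage for a modular synthesizer.
# The calibration range and 0-5 V CV span are illustrative assumptions.

def emg_to_cv(emg_sample, emg_min=0.0, emg_max=1.0, cv_range=5.0):
    """Map an EMG envelope sample to a CV value between 0 and cv_range volts."""
    # Clamp to the calibrated range, then scale linearly to the CV span.
    clamped = max(emg_min, min(emg_max, emg_sample))
    normalized = (clamped - emg_min) / (emg_max - emg_min)
    return normalized * cv_range

# A half-strength contraction lands in the middle of the CV range.
print(emg_to_cv(0.5))  # 2.5
```

In a real device this mapping would run continuously on the electrode stream, with the output driving a DAC wired to a CV jack; the linear scaling here is just the simplest plausible choice.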

Meyer’s prosthetic arm not only removes barriers to performing live, but also opens the possibility that he can explore new ways to directly connect with synthesizers and sound.

You can preview Meyer’s latest track below and hear more at his SoundCloud page:

14 thoughts on “Thought-Controlled Synthesis With A Bionic Arm”

  1. Thank you for sharing this, it’s an amazing idea and very inspirational. It also highlights the community feel to modular and how makers are still enthusiasts who love a challenge, do post updates as this develops.

  2. You could also check out the NIA headband interface that was made by OCZ, the computer RAM people. It can be obtained on eBay cheaply sometimes. The headband can translate alpha and beta brainwaves, and eye and scalp muscular movements, into variable-level command signals for the computer that can be programmed to control anything inside or connected to the computer. It takes a bit of practice and was derived from technology to aid paraplegic people. It can be used for games but also for applications like VSTs and DAWs. Windows XP and up.

    1. I did 12 months of neuro-feedback training. Just making a dot on a monitor move – not in any specific direction, mind you – in a repeatable way is both mystifying and difficult. I wish they would provide more information about the connection between the music and the project – people get the impression that the mind is doing this directly in detail. In practice, there’s more to it.

      But, yeah, a lot of respect for the rest of it – would like to see details of implementations to understand what’s going on.

      1. I suspect the closest understanding of neural feedback that most people have is from watching that episode of House M.D. with the physically comatose but mentally active bicycle accident victim. There is lots on PBS and in Internet videos, but regrettably more time is spent on music and cat videos.

  3. This post, video, and comments inspire me! I am an electrical engineer, programmer, and modular-synthesizer hobbyist. I was born with a partly paralyzed left hand. It has reduced sense of touch and the fingers don’t move independently. My right hand is normal. I am interested in alternate controllers and interfaces to help me and other people.


  4. Bertolt is a hero to tackle this. Nice work! It’s like the earliest Roland guitar synthesizer, grainy but fascinating. I can just feel the whole thing straining to one day reach v.3.0, where it’ll be much more sensitive and perhaps even multi-channel. Look Mum, a bionic synthesist! It seems like a natural match for a modular rig.
