Dan Lloyd, Brownell Professor of Philosophy at Trinity College, is using fMRI brain-scan data to synthesize “music”, and has found that different brains make very different melodies.
Lloyd created software that translates data from a brain scan into Musical Instrument Digital Interface (MIDI) data. This lets him use the information to control a synthesizer, resulting in brain-driven synthesis. According to Lloyd, this technology may give new insights into the differences and similarities between normal and dysfunctional brains.
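The article doesn’t describe how Lloyd’s software works internally, but the basic idea of sonifying a brain-scan time series can be sketched in a few lines. The example below is a hypothetical illustration, not Lloyd’s actual method: it linearly scales an activation time series into a pitch range and snaps each value to a pentatonic scale tone, producing MIDI note numbers (0–127) that a synthesizer could play.

```python
# Illustrative sketch only -- not Lloyd's software. The signal values,
# pitch range, and scale choice here are all hypothetical.

C_MAJOR_PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets within an octave

def activation_to_midi(series, low=48, high=84):
    """Map activation values into [low, high] and snap each to the
    nearest pentatonic scale degree, yielding MIDI note numbers."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1.0  # avoid division by zero on a flat signal
    notes = []
    for v in series:
        # Linear map of the raw value into the chosen pitch range.
        pitch = low + (v - lo) / span * (high - low)
        octave, step = divmod(round(pitch), 12)
        # Snap to the nearest scale degree so the output sounds melodic.
        step = min(C_MAJOR_PENTATONIC, key=lambda s: abs(s - step))
        notes.append(octave * 12 + step)
    return notes

# One hypothetical voxel's signal over five scan volumes:
melody = activation_to_midi([0.1, 0.5, 0.3, 0.9, 0.2])
```

Each note number could then be sent to a synthesizer via any MIDI library; quantizing to a scale is one plausible way to get the harmonizing melodies Lloyd describes rather than raw noise.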
“The sounds work in unison and create melodies,” he said. “Different parts harmonize with each other to make not just sounds, but music.”
Lloyd used this technology to compare brain scans from people with dementia and schizophrenia to healthy subjects and found a noticeable difference in the music they created.
For more videos on Lloyd’s research, check out his YouTube channel.
Do you think this qualifies as music, or is it just translating information from one time-based form to another? Leave a comment with your thoughts!