Buckwalter has been quadriplegic since a diving accident at the age of 16 left him paralyzed from the chest down. Six chips in his brain, made by Blackrock Neurotech, read activity from his neurons and decode his movement intent. They enable him to operate computers with his thoughts, regain feeling in fingers that had lost sensation, and, most recently, create music with his mind.
This technology, known as a brain-computer interface, or BCI, is being developed by Paradromics, Synchron, Elon Musk’s Neuralink, and others to restore communication and movement in people with severe motor disabilities. But Buckwalter’s experience shows that the technology can be used in ways that aren’t purely functional — for example, as an outlet for creative expression. Other BCI recipients are using their implants to create digital art with their thoughts. Works by BCI recipients Nathan Copeland, James Johnson, and Jan Scheuermann were featured in a 2023 gallery exhibition at the American Association for the Advancement of Science in Washington, DC.
Buckwalter is working with Caltech graduate student Sean Darcy, who has developed an algorithm that allows him to create musical notes on a computer with his thoughts. Buckwalter, a longtime musician with the Los Angeles-based punk rock band Siggy, uses some of the lab-generated vocals in a song called “Wirehead”, which is also the name of the band’s latest album, released March 15.
WIRED talked to Buckwalter about what it’s like to make music from your own mind. This interview has been edited for length and clarity.
WIRED: You have recently begun using your implants to produce musical notes. How did that come about?
Galen Buckwalter: Even before I got implanted, I saw this clip of mushrooms on YouTube, where if you put electrodes on a mushroom you get this biosonification. This will amplify the electrical activity going on in the mushroom, and you get these really cool sounds. I looked at it and thought, if a mushroom can chirp like that, I want to know what my brain sounds like. This was something that was on my agenda that I wanted to do with the Caltech team. From day one, I was talking to all the researchers about it, and this amazing graduate student, Sean Darcy, heard about it. He spent his time on weekends and nights with this software, converting my idea into the ability to manipulate tones.
So you are able to create musical notes just by thinking. How does that work?
Every neuron has a baseline firing rate. All of these neurons are active to some degree, but what we do is identify the neurons over which I have voluntary control. Each of my six implants has 64 independent recording channels, and we have a big screen displaying all 384 of them. So, if I think about wiggling my toe up and down, a bunch of channels will light up. There appears to be a directional set of neurons that respond to the extension and flexion of my toe.
What Sean’s software does is assign a tone to the baseline firing rate. If I activate that neuron, the pitch will go up, and if I suppress it, it will come back down. I think about moving my index finger, and then I think about moving my little finger, and I can do that for multiple channels that I have voluntary control over. Right now I can do two tones at once, but if you go above that it starts to feel like you’re rubbing your head and patting your stomach at the same time.
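The mapping Buckwalter describes — each channel sounds a base tone at its baseline firing rate, with pitch bending up when he activates the neurons and down when he suppresses them — can be sketched in a few lines. This is a minimal illustration under assumed parameters (the function name, the linear semitone scaling, and the example rates are all hypothetical), not Darcy’s actual algorithm.

```python
def rate_to_pitch_hz(firing_rate, baseline_rate, base_pitch_hz=440.0,
                     semitones_per_spike=0.5):
    """Map a channel's firing rate (spikes/s) to an audio pitch.

    At the baseline rate the channel sounds its base tone; firing above
    baseline raises the pitch, and suppression below baseline lowers it.
    The linear spikes-to-semitones scaling is an illustrative assumption.
    """
    deviation = firing_rate - baseline_rate           # spikes/s above or below baseline
    semitones = deviation * semitones_per_spike       # assumed linear scaling
    return base_pitch_hz * 2 ** (semitones / 12.0)    # equal-tempered pitch shift

# Two voluntarily controlled channels sounding at once -- the
# "two tones" limit Buckwalter mentions (example values only):
channels = [
    {"rate": 30.0, "baseline": 20.0, "base": 440.0},  # activated channel: pitch rises
    {"rate": 15.0, "baseline": 20.0, "base": 330.0},  # suppressed channel: pitch falls
]
for ch in channels:
    pitch = rate_to_pitch_hz(ch["rate"], ch["baseline"], ch["base"])
    print(f"{pitch:.1f} Hz")
```

At baseline the function returns the base tone unchanged; a channel firing 10 spikes/s above baseline would, under these assumed parameters, sound five semitones higher.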