I became obsessed with the relationship between the brain and the mind after suffering a series of concussions playing football and rugby in college. I felt my mind change for years afterward. I was studying computers at the time, and it felt as though I had damaged my hardware and my software was running differently. Over the following years, a close friend suffered a serious neck injury, and several friends and family members struggled with crippling mental health issues. All around me, people I loved dearly were being afflicted by ailments of the nervous system or the mind.
I was grappling with all of this while pursuing an MFA in Design and Technology at Parsons when a friend and fellow student showed me an open-source tutorial on how to build a low-cost, single-channel EEG system to detect brain activity. After a couple of long nights of hacking and tinkering, I saw my brainwaves dancing across the screen for the very first time. That moment changed my life. In that moment, I felt as though I might finally be able to help myself and the people I loved. I also realized that I couldn't do it alone. I needed help.
So in 2013, in Brooklyn, with some like-minded friends, I started OpenBCI, an open-source neurotechnology company. In the beginning, our goal was to build an inward-pointing telescope and to share the blueprints with the world so that anybody with a computer could begin peering into their own brain. At first, we were an EEG-only company. We sold brain sensors to measure brain activity. I thought that's what people wanted. But over time, we discovered people doing very strange things with our technology. Some people were connecting the equipment to the stomach to measure the neurons in the gut and study the gut-brain connection and the microbiome. Others were using the tools to build new muscle sensors and controllers for prosthetics and robotics. And some were designing new devices and peripheral add-ons that could be connected to the platform to measure new types of data that I had never heard of before.
What we learned from all of this is that the brain by itself is actually quite boring. Turns out brain data alone lacks context. And what we ultimately care about is not the brain, but the mind, consciousness, human cognition.
When we have things like EMG sensors to measure muscle activity, ECG sensors to measure heart activity, eye trackers and even environmental sensors to measure the world around us, all of this makes the brain data much more useful. But the organs around our body, our sensory receptors, are actually much easier to collect data from than the brain, and arguably much more important for determining the things that we care about: emotions, intentions and the mind overall.
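To make that idea concrete, here is a toy sketch of how a peripheral signal can disambiguate a brain signal. The feature names, thresholds and fusion rule are all illustrative assumptions, not OpenBCI's actual method:

```python
# A minimal sketch of why peripheral signals give brain data context.
# All features and thresholds here are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class BiosignalFeatures:
    eeg_alpha_power: float  # relative alpha-band power from EEG, 0..1
    heart_rate_bpm: float   # from an ECG or PPG sensor
    emg_activity: float     # normalized muscle tension, 0..1

def estimate_state(f: BiosignalFeatures) -> str:
    """Suppressed alpha alone is ambiguous (focus? stress? eyes open?),
    but combined with heart rate and muscle tension it becomes readable."""
    if f.eeg_alpha_power < 0.3 and f.heart_rate_bpm > 95 and f.emg_activity > 0.5:
        return "stressed"
    if f.eeg_alpha_power < 0.3 and f.heart_rate_bpm <= 95:
        return "focused"
    return "relaxed"
```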
Additionally, we realized that people weren't just interested in reading from the brain and the body. They were also interested in modulating the mind through various types of sensory stimulation. Things like light, sound, haptics and electricity. It's one thing to record the mind, it's another to modulate it. The idea of a combined system that can both read from and write to the brain or body is referred to as a closed-loop system or bidirectional human interface. This concept is truly profound, and it will define the next major revolution in computing technology.
When you have products that are designed not just for the average user but to actually adapt to their user, that's something truly special. When we know what the data of an emotion or a feeling looks like, and we know how to make that data go up or down, then using AI we can build constructive or destructive interference patterns to either amplify or suppress those emotions or feelings. In the very near future, we will have computers that we are resonantly and subconsciously connected to, enabling empathetic computing for the very first time.
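As a rough illustration of what such a closed loop looks like in code: the sensor read, the stimulus write and the control law below are hypothetical stand-ins, a sketch of the concept rather than any real OpenBCI pipeline:

```python
# A minimal closed-loop ("read + write") sketch with a plain
# proportional controller. read_metric() and set_stimulus() are
# hypothetical stand-ins for a sensor stream and a stimulation device.
import time

def closed_loop(read_metric, set_stimulus, target=0.5, gain=0.8, hz=10.0):
    """Nudge a measured state (e.g., a 0..1 relaxation score) toward
    `target` by modulating a stimulus: raising the stimulus amplifies
    the state, lowering it suppresses it."""
    stimulus = 0.0
    while True:
        error = target - read_metric()                  # "read" side of the loop
        stimulus = min(max(stimulus + gain * error, 0.0), 1.0)
        set_stimulus(stimulus)                          # "write" side of the loop
        time.sleep(1.0 / hz)
```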
In 2018, we put these learnings to work and began development of a new tool for cognitive exploration. We call it Galea, named after my friend Gael, who passed away from ALS in 2016. It's a multimodal bio-sensing headset, and it is absolutely packed with sensors. It can measure the user's heart, skin, muscles, eyes and brain, and it combines that capability with head-mounted displays or augmented and virtual reality headsets. Additionally, we're exploring the integration of non-invasive electrical neural stimulation as a feature. The Galea software suite can turn the raw sensor data into meaningful metrics. With some of the sensors, we're able to provide new forms of real-time interactivity and control. And with all of the sensors, we're able to make quantifiable inferences about high-level states of mind: things like stress, fatigue, cognitive workload and focus.
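One common way raw EEG becomes a focus-style metric is a band-power ratio. The sketch below uses the well-known beta/(alpha+theta) engagement heuristic from the research literature; it is an illustrative assumption, not necessarily what the Galea suite computes:

```python
# From raw EEG to a high-level metric: band power plus a heuristic ratio.
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, lo, hi):
    """Total spectral power of one EEG channel in the [lo, hi) Hz band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    mask = (freqs >= lo) & (freqs < hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])  # integrate the PSD

def engagement_index(eeg, fs=250):
    """beta / (alpha + theta): higher values track engagement/focus."""
    theta = band_power(eeg, fs, 4, 8)
    alpha = band_power(eeg, fs, 8, 13)
    beta = band_power(eeg, fs, 13, 30)
    return beta / (alpha + theta)
```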
In 2019, a legendary neurohacker by the name of Christian Bayerlein reached out to me. He was actually one of our very first Kickstarter backers, back when we got started. Christian was smart, happy-go-lucky and easygoing. And so I worked up the courage to ask him, "Hey, Christian, can we connect you to our sensors?"
At which point he said, "I thought you would never ask."
(Laughter)
So after 20 minutes, we had him rigged up to a bunch of electrodes, and we provided him with four new inputs to a computer: little digital buttons that he could control voluntarily. This essentially doubled his number of inputs to a computer. Years later, after many setbacks due to COVID, we flew to Germany to work with Christian in person, to implement the first prototype of what we're going to be demoing here today. Christian then spent months training with that prototype, sending his data from Germany across the Atlantic to us in Brooklyn and flying a virtual drone in our offices. The first thing that we did was scour Christian's body for residual motor function. We then connected electrodes to the four muscles that he had the most voluntary control over, and we turned those muscles into digital buttons. Next, we applied some smart filtering and signal processing to adapt those buttons into something more like sliders, or digital potentiometers. After that, we took those four sliders and mapped them to a new virtual joystick. Christian then combined that new joystick with the joystick that he uses with his lip to control his wheelchair, and with the two joysticks combined, Christian finally had control over all the manual controls of a drone.
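A minimal sketch of that pipeline might look like the following. The filter settings, calibration bounds and slider-to-axis pairing are illustrative assumptions, not the actual implementation used with Christian:

```python
# Sketch: EMG stream -> 0..1 "slider" -> two-axis virtual joystick.
import numpy as np
from scipy.signal import butter, lfilter

def emg_to_slider(emg, fs=250, cutoff_hz=5.0, cal_min=0.01, cal_max=0.5):
    """Raw EMG samples -> smooth 0..1 control values (a 'digital potentiometer')."""
    rectified = np.abs(emg)                             # full-wave rectification
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")
    envelope = lfilter(b, a, rectified)                 # low-pass -> amplitude envelope
    # Per-user calibration bounds map residual muscle strength onto 0..1.
    return np.clip((envelope - cal_min) / (cal_max - cal_min), 0.0, 1.0)

def sliders_to_joystick(left, right, up, down):
    """Latest value of four one-direction sliders -> one joystick in [-1, 1]^2."""
    x = float(np.clip(right - left, -1.0, 1.0))
    y = float(np.clip(up - down, -1.0, 1.0))
    return x, y
```

Pairing opposing muscles into a single differential axis is one natural design choice here: it gives a resting value of zero and proportional control in both directions.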
I’m going to stop talking about it, and we’re going to show you. Christian, welcome.
(Applause)
At this point, I'm going to ask everybody to turn off your Bluetooth and put your phones in airplane mode so that you don't get hit in the face with a drone.
(Laughter)
How are you feeling, Christian?
Christian Bayerlein: Yeah, let's do it.
Conor Russomanno: Awesome. This is a heads-up display that's showing all of Christian's biometric data, as well as some information about the drone. On the left here, we can see Christian's muscle data. Christian is now going to attempt to fly the drone. How are you feeling, Christian, feeling good?
CB: Yes.
CR: All right. Rock and roll. Let's take this up for a joyride. Whenever you're ready.
CB: I'm ready.
(Applause and cheers)
CR: All right, take her up. And now let's do something we probably shouldn't do and fly it over the audience.
(Laughter)
(Cheers and applause)
All right, actually, let's do this. I'm going to ask people in the audience to call out some commands. So how about you? Straight forward. Straight forward.
(Laughter)
All right. How about you?
Man: Up!
(Laughter)
CR: Not down. Oh, he's doing what he wants right now. Amazing.
(Cheers and applause)
All right, let's bring it back. And what I'm going to do right now is take control of the controller so that you guys know that there isn't someone backstage flying this drone.
All right, Christian, you're all right with that?
CB: Yeah.
CR: Unplug. Forward. And we're going to land this guy now.
CB: I think I was better than you.
(Laughter)
(Applause)
CR: Amazing.
(Applause)
Now I'm going to unplug it so it doesn't turn on on its own. Perfect.
Christian has repurposed dormant muscles from around his body for extended and augmented interactivity. We have turned those muscles into a generic controller that, in this case, we've mapped to a drone, but what's really cool is that the joystick can be applied to anything.
Another thing that's really cool is that even in individuals who are not living with motor disabilities, there exist dozens of dormant muscles around the body that we can tap into for augmented and expanded control interactivity.
And lastly, we're going to open-source all the code related to that virtual joystick, so that you can implement it and improve upon it.
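Because the joystick's output is just a pair of normalized axes, retargeting it to a new device is a small mapping step. The interface below is a hypothetical sketch, not the actual open-sourced code:

```python
# Retargeting the same normalized joystick axes to different devices.
from typing import Callable, Tuple

Axes = Tuple[float, float]  # (x, y), each in [-1, 1]

def make_mapper(scale_x: float, scale_y: float) -> Callable[[Axes], Axes]:
    """Rescale normalized joystick axes for a new target device."""
    return lambda axes: (axes[0] * scale_x, axes[1] * scale_y)

to_drone = make_mapper(90.0, 1.0)      # e.g., yaw in deg/s, throttle fraction
to_cursor = make_mapper(800.0, 600.0)  # e.g., cursor velocity in px/s

sample = (0.25, -0.5)                  # one joystick reading
print(to_drone(sample), to_cursor(sample))
```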
There are three things that have stood out to me from working on this project and many others over the years. One: we cannot conflate the brain with the mind. In order to understand emotions and intentions and the mind overall, we have to measure data from all over the body, not just the brain. Two: open-source technology access and literacy are one way that we can combat the potential ethical challenges we face in introducing neural technology to society. But that's not enough. We have to do much, much more than that. It is imperative that we set up guardrails and design the future that we want to live in. Three: it's the courage and resilience of trailblazers like Christian, who don't get bogged down by what they can't do, but instead strive to prove that the impossible is in fact possible.
(Applause)
And since none of this would have been possible without you, Christian, the stage is yours.
CB: Yeah, hi, everybody.
Audience: Hi.
CB: I'm excited to be here today. I was born with a genetic condition that affects my mobility and requires me to have assistance. Despite my disability, I'm a very happy and fulfilled person. What truly holds me back are not my physical limitations. It's rather the barriers in the environment.
I'm a tech nerd and political activist. I believe that technology can empower disabled people. It can help create a better, more inclusive and accessible world for everyone.
This demonstration is a perfect example. We saw what's possible when cutting-edge technology is combined with human curiosity and creativity. So let's build tools that empower people, applications that break down barriers and systems that unlock a world of possibilities. I think that's an idea worth spreading.
Thank you.
(Cheers and applause)