Nuro’s new brain-computer interface uses neurological signals to let incapacitated patients talk to doctors and family.
Sixteen years ago, Henry Evans was driving his kids to school when he suddenly felt sick. Evans, who was the chief financial officer of a Silicon Valley tech startup at the time, started slurring his words and began to have trouble seeing. When he returned home, he struggled to keep his balance. Hours later, struck by a rare, stroke-like brainstem disorder, he was in a coma; when he eventually came out of the coma, he was unable to move or speak. At first, he communicated by blinking “yes” or “no” as his wife recited letters to spell out words. Later, he looked at a board covered in letters while his wife tried to guess what he was saying.
On a weekday in early April, Evans — who has become an advocate for using advanced technology to help people in similar situations — became an early tester of a new way to communicate. A headband wrapped around his forehead, along with another sensor on the bridge of his nose, monitored electrical signals from his brain and eyes in real time. By fusing those signals together and looking for particular signatures in the data, the system allowed Evans to control and navigate through a tablet in front of him without touching it or speaking.
He could select prewritten messages, such as “I need water” or “I’m feeling cold,” or type via a visual keyboard on the screen. On another screen with various Alexa commands, he told Alexa to start playing music. In the background, the technology measured his focus, level of calm, and other factors that could be used to send alerts in the case of an emergency.
Called Nuos, the system was designed by a startup named Nuro. Cofounder and CEO Francois Gand, a serial tech entrepreneur, started exploring assistive technology after an accident and seven reconstructive surgeries on his right hand left him unable to hold a mouse for several years. He saw that existing solutions were limited. When a friend later became partially paralyzed in an accident on a mountain trail, Gand took it as further confirmation that a better solution was needed.
The Nuos system is particularly designed for people who are paralyzed and unable to talk, a situation that can happen after a stroke. In the most extreme cases, patients suffer from “locked-in syndrome,” unable to move anything other than their eyes. “At that point in time those people become entrapped,” says Gand. “They have a lot of difficulty to start interacting with the world, start interacting with their family members, [and] interacting with doctors, nurses, professional caregivers.”
This was Henry Evans’s situation at first, though he later regained some ability to move his head and some slight movement between his thumb and forefinger. As he and his wife slowly perfected the use of a “letter board,” they eventually became so adept that she could spell out words without holding up the board, just by watching his eyes move. But the process is laborious. Gand says that one doctor told him that some patients refuse to use this type of board. “They are so frustrated and so angry because they cannot express themselves,” he says. The inability to communicate can be a final straw for some patients. Some have gone to court to fight for the right to assisted suicide.
The new technology could help bring a small amount of control back to these patients. Because of ongoing patent applications, Gand won’t explain the details of how the system works. But by monitoring live EEG signals from the prefrontal cortex, along with electrical signals from the eyes, and then using algorithms to recognize specific patterns in the combined data, the system is able to give a patient control over a user interface.
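Nuro has not disclosed how its algorithms work, but the general idea described here — fusing an EEG stream with an eye-signal (EOG) stream and looking for a signature that appears in both — can be sketched in a few lines. The code below is purely illustrative: the function name, thresholds, and simulated signals are all hypothetical and are not Nuro’s method.

```python
import numpy as np

def detect_intent(eeg, eog, eeg_thresh=0.6, eog_thresh=0.8):
    """Toy fusion detector (hypothetical, not Nuro's algorithm):
    flag a 'select' event only when a sustained rise in the EEG
    channel coincides with a deliberate eye signal (e.g. a long
    blink) in the EOG channel."""
    # Smooth both channels with a short moving average to suppress noise.
    kernel = np.ones(5) / 5
    eeg_smooth = np.convolve(eeg, kernel, mode="same")
    eog_smooth = np.convolve(eog, kernel, mode="same")
    # A sample counts as intentional only when both signatures co-occur,
    # which helps reject accidental blinks or stray EEG fluctuations.
    return (eeg_smooth > eeg_thresh) & (eog_smooth > eog_thresh)

# Simulated one-second window at 100 Hz: background noise, with a
# deliberate gesture between samples 40 and 60.
rng = np.random.default_rng(0)
eeg = rng.normal(0.3, 0.05, 100)
eog = rng.normal(0.2, 0.05, 100)
eeg[40:60] += 0.5   # focus rises
eog[40:60] += 0.8   # long, deliberate blink
events = detect_intent(eeg, eog)
```

In a real system the detected event would then be mapped to a UI action, such as selecting the currently highlighted message on the tablet.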
Gand calls it a neural operating system. “Just like DOS worked with a keyboard, Windows with a mouse, iOS with touch, Nuos is another level of evolution where the human being is now able to communicate and compute using neurological signals,” he says. The system uses artificial intelligence to adapt to a patient. Someone who has just suffered a stroke, for example, would start with a simplified user interface that gradually becomes more advanced. It can be customized for various settings, from an ICU to someone’s home. The interface can allow someone to browse the internet, connect with external systems like robotics, and support a wide range of other uses.
A Stanford-led study in 2017 showed that it was possible for patients with paralysis to type via brain control, but that approach required a surgical implant rather than an external headband. Some other systems are at earlier stages. When Facebook announced in April 2017 that it wanted to allow people to type via a brain-machine interface, Gand says, Nuro had already had a functioning system for typing (and more) for nine months. The startup, which just completed four months at IndieBio, an accelerator in San Francisco, is now preparing for large-scale trials with health systems in the U.S. and Canada. While the technology can be used in other applications, the team is focused first on healthcare.
Evans, who is now able to type with a head-tracking device and can communicate fluently with his family using a letter board, sees clear value in the system for others. “It is safe to say that Nuos, with a little fairly trivial development, would have been useful for me before I developed head movement,” he said over email. “It is also safe to say that Nuos, together with the sensor, appears to be a breakthrough for those who can’t move their heads . . . it will allow you to control your environment and communicate complex thoughts (in addition to basic thoughts), and do so without much effort.”
Jane Evans, his wife, agrees. “Basically this opens the door for someone to talk,” she says. “Is it as fast as you and I talking? Of course not, it’s not even close. But it’s a door, a window, to the outside world. More important is, they’re hooked up and they’re doing it themselves. It’s about human dignity… there is something so powerful in that. You can’t take that away from a human being.”
ABOUT THE AUTHOR
Adele Peters is a staff writer at Fast Company who focuses on solutions to some of the world’s largest problems, from climate change to homelessness. Previously, she worked with GOOD, BioLite, and the Sustainable Products and Solutions program at UC Berkeley.
Source: Fast Company