Why type when you can just think about typing?
From the earliest days of punch cards, interacting with computers has always been a pain. Whether it’s a keyboard and mouse, joystick or controller, getting the thoughts out of our heads and into the machine requires numerous, unintuitive processes. But until we start implanting USB ports into our brains and downloading our thoughts directly, we’ll have to make do with the neural signal-detecting wristbands being developed by CTRL-Labs.
“When your brain wants to go and effect something in these virtual spaces, your brain has to send a signal to your muscle, which has to move your hand, which has to move the device, which has to get picked up by the system, and turned into some sort of action there,” Mike Astolfi, head of interactive experiences at CTRL-Labs, explained to Engadget. “And we think we can remove not only the mouse or the controller from that equation, but also, almost your hand from the equation.”
The as-yet-unnamed device is essentially an EMG wristband. It senses changes in electrical potential in the user's arm muscles, "the signal that your motor neurons are sending… the impulses that it's gonna send into the muscles in your arm that'll pull on the tendons that connect to your fingers," Astolfi said. This information is then fed into a machine learning algorithm that reconstructs what the hand is doing, whether it's typing, swiping or gesturing.
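CTRL-Labs hasn't published its actual model, but the pipeline the article describes (raw multi-channel muscle signals → summary features → a learned mapping to hand activity) can be illustrated with a toy sketch. Everything below is invented for illustration: the channel count, the RMS feature, the nearest-centroid classifier, and the simulated "pinch"/"fist" data are assumptions, not CTRL-Labs' method.

```python
import numpy as np

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per electrode channel, a standard
    summary feature for surface EMG windows."""
    return np.sqrt(np.mean(window ** 2, axis=1))

class NearestCentroidDecoder:
    """Maps a feature vector to a gesture label via the nearest class centroid."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array(
            [X[[i for i, lbl in enumerate(y) if lbl == label]].mean(axis=0)
             for label in self.labels_])
        return self

    def predict(self, x):
        dists = np.linalg.norm(self.centroids_ - x, axis=1)
        return self.labels_[int(np.argmin(dists))]

# Simulated data: 8 electrode channels, 200-sample windows. A "pinch"
# drives channels 0-3 harder; a "fist" drives channels 4-7 harder.
rng = np.random.default_rng(0)

def fake_window(gesture: str) -> np.ndarray:
    gain = np.ones(8)
    gain[:4] *= 3.0 if gesture == "pinch" else 1.0
    gain[4:] *= 3.0 if gesture == "fist" else 1.0
    return rng.normal(0.0, gain[:, None], size=(8, 200))

train = ([("pinch", fake_window("pinch")) for _ in range(20)]
         + [("fist", fake_window("fist")) for _ in range(20)])
X = np.array([rms_features(w) for _, w in train])
y = [g for g, _ in train]
decoder = NearestCentroidDecoder().fit(X, y)

print(decoder.predict(rms_features(fake_window("pinch"))))  # prints "pinch"
```

The per-channel RMS vector is what makes the two gestures separable here; a real decoder would use far richer features and a far larger model, but the shape of the problem — signals in, hand activity out — is the same.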
Measuring the electrical impulses through your arm, rather than through your scalp as traditional EEGs do, helps increase signal fidelity. "When you put electrodes on the head, you deal with all the other electrical signals that your brain is putting out. Static from consciousness and seeing, and getting sensations back from the body," Astolfi explained. "When you go down lower to an area like the arm, your body has already done all of the filtering for you." That is, the signals traveling through the arm are those signifying an intentional action, "so it actually gives a lot cleaner signal, and then a lot larger density of signal as we start to drill down into finer grain detecting of the neuron spikes."
With a cleaner signal, the system doesn't have to work as hard to interpret the user's intentions, which in turn shortens the learning curve for new users. "You can learn how to do this in 30 seconds to a minute," Astolfi said. Take virtual reality, for instance. Most current VR systems (Leap Motion notwithstanding) still rely on handheld controllers to replicate the user's hands in the virtual space. What's more, these controllers only offer between 3 and 6 degrees of freedom, compared to the human hand's 48.
In a VR application, “we’re working toward the ability for users to be able to walk up, put the band on, not have to do any training, and be able to roll right away,” Astolfi said. “They can start using it using sort of a generalized model.”
The wristband would enable users to leverage their hands as in-game controllers as well. “We have the ability to let the user actually customize the signal that they’re sending into the device,” he continued. “We call it adaptive learning. The idea would be that the device would learn whatever gesture the user’s doing, and use that to control something inside of the game.”
For example, one of CTRL-Labs' earliest demos uses your hands to aim and fire digital projectiles at a virtual target. "You can do whatever you want with your arm to generate that," he said. "As long as you're consistent, then the system will learn that." Since the algorithm learns from scratch, the user is able to program any movement or gesture that suits their needs or capabilities.
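The adaptive-learning idea — take whatever signal the user produces consistently and map it onto both a continuous control (aim) and a discrete trigger (fire) — can be sketched in a few lines. The class, the calibration routine, and the threshold below are all hypothetical, made up for this sketch rather than drawn from CTRL-Labs' software.

```python
class AdaptiveControl:
    """Toy mapping from a user's personal signal range to game inputs."""

    def __init__(self):
        self.baseline = 0.0
        self.peak = 1.0

    def calibrate(self, samples):
        """Learn the user's signal range from a short recording of
        whatever gesture they chose to make."""
        self.baseline = min(samples)
        self.peak = max(samples)

    def aim(self, value):
        """Continuous control: normalize the live signal into [0, 1]."""
        span = (self.peak - self.baseline) or 1.0
        return min(1.0, max(0.0, (value - self.baseline) / span))

    def fire(self, value, threshold=0.8):
        """Discrete control: trigger once the normalized signal crosses
        a threshold."""
        return self.aim(value) >= threshold

ctrl = AdaptiveControl()
ctrl.calibrate([0.2, 0.3, 1.4, 0.25, 1.5])  # user's own gesture range
print(round(ctrl.aim(0.85), 2))  # mid-range effort, roughly 0.5
print(ctrl.fire(1.45))           # hard effort -> True
```

Because calibration only learns the user's own minimum and maximum, it doesn't matter which muscle or movement produces the signal — which is exactly why such a scheme could work for users who can't make conventional gestures.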
Such a system could prove a boon for users with dexterity or mobility issues since, once the algorithm figures out which muscle signals translate into which onscreen actions, there's no need to hold a controller or even make noticeable hand or arm movements. During a 2017 demo for Wired, the team showed off the ability to type on a virtual keyboard while barely moving their fingers and to play an Asteroids clone without taking their palms off the tabletop.
“Because we’re actually looking at the muscle signals and not tracking that actual finger movement, you can start to abstract away from actually needing to move your fingers,” he explained. “So, depending on what you train into the system, you might be able to train a little muscle twitch, or actually just the initial motor neuron spike, without actually resulting in any physical movement of the finger.”
This system could eventually lead to more responsive prosthetics as well; initially, however, the company is focusing on three specific applications for the wristband: VR gaming, navigating 3D environments such as immersive AutoCAD or Autodesk models, and robotics.
"We see it as being a really good analog for use with robotics," Astolfi said. "So anything where you want to guide the movement of a real-world object, being able to use your arms [to] sort of guide that and having a meaningful manipulator on the end, your actual hand or, really, the signals that are driving that hand, can make that easier."
Unfortunately, the company does not yet have a set date to release the wristband, though it does hope to begin releasing its dev kit sometime next year. Whenever the technology does come of age, “we think that this has the potential to really become the dominant way that you interact with computers in the future,” Astolfi concluded. “We think this is gonna be such a big leap forward in the way that you interact with machines that people will eventually stop learning how to type.”
|ABOUT THE AUTHOR|
|Andrew Tarantola has lived in San Francisco since 1982 and has been writing clever things about technology since 2011. When not arguing the finer points of portable vaporizers and military defense systems with strangers on the internet, he enjoys tooling around his garden, knitting and binge watching anime. @terrortola|
|Demo D303. CTRL-Labs: Hand Activity Estimation and Real-time Control from Neuromuscular Signals. Edward F. Melcer, Michael T. Astolfi, Mason Remaley, Adam Berenzweig, Tudor Giurgica-Tiron. CHI '18: ACM CHI Conference on Human Factors in Computing Systems demos, ACM SIGCHI 2018, Montréal. YouTube, Apr 8, 2018|
|CTRL-Labs has developed algorithms for determination of hand movements and forces and real-time control from neuromuscular signals. This technology enables users to create their own control schemes at run-time – dynamically mapping neuromuscular activity to continuous (real-valued) and discrete (categorical/integer-valued) machine-input signals. To demonstrate the potential of this approach to enable novel interactions, we have built three example applications. One displays an ongoing visualization of the current posture/rotation of the hand and each finger as determined from neuromuscular signals. The other two showcase dynamic mapping of neuromuscular signals to continuous and discrete input controls for a two-player competitive target acquisition game and a single-player space shooter game.|
Wristband Lets the Brain Control a Computer with a Thought and a Twitch in Scientific American
How Your Brain May One Day Control Your Computer in National Geographic
Helping Hand in The New Yorker