Meet the guy with four arms, two of which someone else controls in VR

These robotic limbs could someday help people work together when they’re far apart.

Fusion – Full Body Surrogacy for Collaborative Communication. Yamen Saraiji led the development of a pair of robotic arms that can be worn like a backpack and controlled by a remote operator. Keio University Graduate School of Media Design, The University of Tokyo

Rachel Metz, MIT Technology Review August 6, 2018

Yamen Saraiji has four arms, and two of them are giving him a hug.

The limbs embracing Saraiji are long, lanky, and robotic, and they're connected to a backpack he's wearing. The arms are actually controlled remotely by another person wearing an Oculus Rift VR headset, which lets the operator see the world from Saraiji's perspective (cameras linked to the backpack provide the view) and wield handheld controllers to direct the robotic arms and attached hands.

After the hug, the robotic arms release Saraiji. Then the right hand gives him a high five, and Saraiji smiles.

Saraiji, an assistant professor at Tokyo-based Keio University’s Graduate School of Media Design, led the development of this robotic-arms-on-a-backpack project, called Fusion, to explore how people may be able to work together to control (or augment) one person’s body. Though some of the actions Saraiji shows me via video chat from his lab in Japan are silly, he thinks the device could be useful for things like physical therapy and instructing people from afar.

Besides hugging and high-fiving, the operator of the robotic arms and hands can pick things up or move the arms and hands of the human wearing the backpack. The mechanical hands can be removed and replaced with straps that go around the backpack-wearer's wrists, for truly remote-controlling their arms. The device, which Saraiji created with colleagues at Keio University and the University of Tokyo, will be shown off at the Siggraph computer graphics and tech interaction conference in Vancouver in August.

Fusion: Full Body Surrogacy for Collaborative Communication. Effective communication is a key factor in social and professional contexts that involve sharing the skills and actions of more than one person. This research proposes a novel system to enable full-body sharing over a remotely operated wearable system, allowing one person to dive into someone else's body.

"Fusion" enables body surrogacy by sharing the same point of view between two people: a surrogate and an operator. It extends the operator's limb mobility and actions using two robotic arms mounted on the surrogate's body. These arms can be used independently of the surrogate's arms in collaborative scenarios, or can be linked to the surrogate's arms for remote assisting and supporting scenarios.

Using Fusion, we realize three levels of bodily driven communication: Direct, Enforced, and Induced. We demonstrate through this system the possibilities of truly embodying and transferring our body actions from one person to another, realizing true body communication. MHD Yamen Saraiji, Tomoya Sasaki, Reo Matsumura, Kouta Minamizawa, Masahiko Inami. (Yamen Saraiji, YouTube, Aug 11, 2018)

There have been plenty of other efforts to create extra limbs that you can wear, and in fact this isn't Saraiji's first time making robotic limbs meant to attach to a human: he and most of the other Fusion researchers previously built a wearable set of arms and hands called MetaLimbs, which the wearer controlled with their feet.

MetaLimbs: Multiple Arms Interaction Metamorphism (2017). Inami Hiyama Laboratory (The University of Tokyo), published on YouTube May 24, 2017

Having the limbs controlled by someone else—someone who can be in another room or another country, and in VR to boot—is a little different, however. Saraiji says he wanted to see what would happen if someone else could, in a sense, dive into your body and take control.

The backpack includes a PC that streams data wirelessly between the robotic arm-wearer and the person controlling the limbs in VR. The PC also connects to a microcontroller, letting it know how to position the robotic arms and hands and how much torque to apply to the joints.
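The article doesn't describe Fusion's actual wire protocol, but the data path it outlines — a PC relaying joint positions and torque limits to a microcontroller — can be sketched as a simple command message. Everything here (field names, seven-joint layout, JSON encoding) is an illustrative assumption, not the team's real implementation:

```python
# Hypothetical sketch of a per-arm command the backpack PC might send
# to the microcontroller: target joint angles plus a torque cap per joint.
import json
from dataclasses import dataclass, asdict

NUM_JOINTS = 7  # each robotic arm has seven joints, per the article

@dataclass
class ArmCommand:
    arm: str                    # "left" or "right"
    joint_angles: list[float]   # target angle per joint, radians
    torque_limits: list[float]  # maximum torque per joint, N*m

    def to_wire(self) -> bytes:
        """Serialize the command for the wireless link."""
        return json.dumps(asdict(self)).encode()

# Example: hold the right arm at its zero pose with a modest torque cap.
cmd = ArmCommand("right", [0.0] * NUM_JOINTS, [2.0] * NUM_JOINTS)
packet = cmd.to_wire()
```

Capping torque at the joints, as the article mentions, matters for a device worn on a person's back: it bounds how hard the arms can push against the wearer.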

The robotic arms, each with seven joints, jut out of the backpack, along with a connected head, of sorts. The head has two cameras that show the remote operator, in VR, a live feed of everything the backpack-wearer is seeing. When the operator moves their head in VR, sensors track that motion and cause the robotic head to move in response (it can turn left or right, tilt up and down, and pivot from side to side, Saraiji says).
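The head-tracking behavior described above — mapping the operator's tracked headset motion onto the robotic head's three movements — can be sketched as a clamped angle mapping. The joint limits below are invented for illustration; the article gives no real specs:

```python
# Hedged sketch: map VR headset orientation onto the robotic head's three
# described motions (turn left/right = yaw, tilt up/down = pitch,
# pivot side to side = roll), clamped to assumed joint limits in degrees.
def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

HEAD_LIMITS = {"yaw": 90.0, "pitch": 45.0, "roll": 30.0}  # illustrative only

def head_targets(yaw: float, pitch: float, roll: float) -> dict[str, float]:
    """Convert tracked headset angles into robotic-head joint targets."""
    return {
        axis: clamp(angle, -HEAD_LIMITS[axis], HEAD_LIMITS[axis])
        for axis, angle in (("yaw", yaw), ("pitch", pitch), ("roll", roll))
    }
```

Clamping keeps an over-rotating operator from driving the mechanical head past its range, while small motions pass through unchanged.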

The wearable system is powered by a battery that lasts about an hour and a half. It’s pretty heavy, weighing in at nearly 21 pounds.

“Of course, it’s still a prototype,” Saraiji points out.

While I'm talking to him, Saraiji puts on the backpack and enlists a graduate student to wear the VR headset and help demonstrate how it works. I call out a few commands, such as asking the robot-limb operator to pick something up. At first, he fumbles with a squeaky yellow toy with cartoon eyes, then manages to grab it and hand it to Saraiji; one of the robot hands then takes the toy back and returns it to Saraiji again. At one point, Saraiji walks behind the student operating the arms in VR, so the operator can tap himself on the shoulder with one of the robot's fingers and give himself an abbreviated neck rub.

Different buttons on the Oculus Rift controllers enable different finger functions: the operator can move the pinky, ring, and middle finger of each robotic hand simultaneously with a single button, while the thumb and index finger each have their own controls.
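That control grouping — one button curling the pinky, ring, and middle fingers together, with the thumb and index finger controlled individually — amounts to a small button-to-finger map. The button names here are assumptions, not the actual Oculus Rift bindings the team used:

```python
# Illustrative sketch of the described finger grouping. Button names
# ("grip_button", etc.) are hypothetical placeholders.
FINGER_GROUPS = {
    "grip_button": ["pinky", "ring", "middle"],  # one button, three fingers
    "thumb_button": ["thumb"],
    "trigger": ["index"],
}

def finger_curl(pressed: set[str]) -> dict[str, bool]:
    """Return which robotic fingers curl for a set of pressed controls."""
    state = {f: False for group in FINGER_GROUPS.values() for f in group}
    for button in pressed:
        for finger in FINGER_GROUPS.get(button, ()):
            state[finger] = True
    return state
```

Grouping the three outer fingers under one control trades dexterity for simplicity: most grasps curl those fingers together anyway, and it keeps the mapping within what two handheld controllers can express.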

Hermano Igo Krebs, a principal research scientist at MIT who has spent decades studying rehabilitation robotics, doesn’t think the project would be practical for rehab. But he can imagine it being helpful in a lot of different situations—to assist an astronaut in outer space, for instance, or a paramedic with an unfamiliar medical procedure.

Saraiji says that he’d like to turn the project into an actual product, and he and his collaborators are in the process of pitching it to a Tokyo-based startup accelerator.

Rachel Metz
As MIT Technology Review’s senior editor for mobile, I cover a wide variety of startups and write gadget reviews out of our San Francisco office. I’m curious about tech innovation, and I’m always on the lookout for the next big thing. Before arriving at MIT Technology Review in early 2012, I spent five years as a technology reporter at the Associated Press, covering companies including Apple, Amazon, and eBay, and penning reviews.

Source MIT Technology Review

Also see
MetaLimbs 2.0: Exploring Telepresence in ACM Siggraph Blog
Life with four arms looks pretty great in Fast Company
Here’s That Extra Pair of Robot Arms You’ve Always Wanted in IEEE Spectrum
Fusion: A Collaborative Robotic Telepresence Parasite That Lives on Your Back in IEEE Spectrum
The power of robotic arms gives this guy multiple VR-controlled limbs to work with in Designboom
These Doctor Octopus-style robot arms are controlled with your feet in The Verge
