I Controlled a Wheelchair With My Mind (Well, I Think I Did)

I recently tried a user interface that did not involve clicks or taps on a screen or any words spoken to a voice assistant. Instead, I had an electrode-studded headset strapped to my head.

As part of its Upgrade 2025 conference in San Francisco, NTT showed off some of its recent research into brain-machine interfaces, how AI can make them work better, and, ultimately, how they might bring a degree of mobility to paralyzed people.

The first step was being fitted for a Wearable Sensing DSI-24 headset by NTT researchers Takuya Kanda, Takashi Isezaki, and Kengo Okitsu. Inside, the dry EEG electrodes tickled at first. As the team clamped it down and attached other sensors to my ears, my noggin felt more like a scientific payload. But at least this exercise didn’t involve having any data ports drilled into my skull, a reassuring contrast to Elon Musk’s Neuralink.

An array of dry electrodes in a Wearable Sensing DSI-24 headset. (Credit: Rob Pegoraro)

As the NTT researchers wrote in a 2024 research paper, the idea here is to lean on machine learning to adapt to frequent variations in brain waves and yield more accurate responses. 
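NTT hasn't shared the code behind this demo, but the basic motor-imagery recipe it rests on is well documented: imagining a left- or right-hand movement weakens the mu rhythm (roughly 8-12 Hz) over the opposite side of the motor cortex, and a classifier trained on a short calibration session maps that asymmetry to a steering command. Here is a minimal, hypothetical sketch in Python; the synthetic signals, channel pair, and function names are my own illustration, not NTT's pipeline:

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
FS = 256           # sample rate in Hz (hypothetical)
MU_BAND = (8, 12)  # mu rhythm; it weakens over the motor cortex
                   # opposite the imagined hand (event-related desynchronization)

def mu_power(epoch, fs=FS, band=MU_BAND):
    """Mean mu-band power per channel for one (n_channels, n_samples) epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean(axis=1)

def fake_epoch(side):
    """Synthetic two-channel (C3, C4) epoch: imagining the LEFT hand
    suppresses mu power over the RIGHT hemisphere (C4), and vice versa."""
    t = np.arange(2 * FS) / FS
    mu = np.sin(2 * np.pi * 10 * t)  # a 10 Hz rhythm inside the mu band
    gain = np.array([1.0, 0.3]) if side == "left" else np.array([0.3, 1.0])
    return gain[:, None] * mu + 0.5 * rng.standard_normal((2, t.size))

# Calibration: a handful of labeled "imagine clenching left/right fist" trials.
labels = ["left", "right"] * 20
X = np.array([mu_power(fake_epoch(s)) for s in labels])
y = np.array([0 if s == "left" else 1 for s in labels])
clf = LinearDiscriminantAnalysis().fit(X, y)

# Online phase: classify a fresh epoch and steer accordingly.
command = "turn_left" if clf.predict([mu_power(fake_epoch("left"))])[0] == 0 else "turn_right"
print(command)  # -> turn_left (most of the time; EEG is noisy)
```

The per-user calibration step is presumably also where the adaptation the NTT researchers describe would slot in, retraining or reweighting the classifier as a session's brainwaves drift.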

The training phase of this demonstration had me guide an onscreen robot through a short series of hallways. The researchers instructed me to think about clenching my left or right fist to make the robot go left or right.

At first, the apparent lag between my actions and the robot avatar confused me enough that I started clenching the wrong fist to compensate, leaving my onscreen self spinning around haplessly as if it were the creature with mobility impediments.

Then it was time to sit down in a Whill powered wheelchair and try to steer that without touching or speaking to any controls. I thought about clenching my right hand but then couldn’t resist physically turning my head in that direction, which felt like cheating even as the chair obediently steered the same way. The researchers, however, seemed to think I did well.

NTT aims to use AI to yield more reliable readouts of brainwave activity. (Credit: Rob Pegoraro)

My own acquaintance with brain-machine interfaces is so limited that I don’t have a good feel for when they read my brainwaves properly. The only other time I can remember trying one, at CES 2013, the feedback loop was obvious. In that case, I donned a set of brainwave cat ears and thought of things that made me happy. That caused the ears to start to move, which made me laugh, causing the ears to wiggle a lot more. The NTT experience felt a lot more subtle.


So, I stood back and watched other conference attendees go through the same demos. I noticed that the wheelchair demo seemed to work equally well whether people visibly clenched their fists or kept their hands on their legs, and decided I had to give it another try.

This time, the researchers tightened the electrodes further, in part because the top electrode among the dozen-plus inside the headset seemed to have trouble picking up brainwaves. But with everything painfully in place, I had better luck controlling the onscreen robot now that I knew to factor in about a second of lag in that interface.

Then I got back into the chair and resolved to keep my upper body still and my eyes closed; I thought about clenching my left hand while trying not to move any muscles, felt the chair turn left, and then heard the researchers calling out an error.

On a second try, I thought about tightening my right hand, the chair turned left, and the researchers called out another error. Finally, I tried lightly tensing my left hand, the chair obediently turned left, and nobody reported any errors. 

The long-term reward for this research is not to give tech journalists weird anecdotes to share but to develop machines that can assist those with mobility issues. That’s an important enough goal that multiple teams of researchers are pursuing it. 

In 2022, for example, a team at the University of Texas at Austin demonstrated how their own system enabled people with tetraplegia—“the inability to move their arms and legs due to spinal injuries”—to operate a powered wheelchair “in a cluttered, natural environment to varying degrees of success.” I hope at least some of this work pays off.

(Disclosure: NTT covered airfare and lodging for journalists and analysts invited to this event, myself included.)

About Rob Pegoraro

Contributor

Rob Pegoraro writes about interesting problems and possibilities in computers, gadgets, apps, services, telecom, and other things that beep or blink. He’s covered such developments as the evolution of the cell phone from 1G to 5G, the fall and rise of Apple, Google’s growth from obscure Yahoo rival to verb status, and the transformation of social media from CompuServe forums to Facebook’s billions of users. Pegoraro has met most of the founders of the internet and once received a single-word email reply from Steve Jobs.

