Neuroprosthetics: the mind is the pilot

Brain Machine Interface in action at the Rolex Learning Center

Piloting a wheelchair using only one’s mind: José del R. Millán’s laboratory at the EPFL Neuroprosthetic Center is accomplishing this seemingly impossible task. To make the brain-machine interface more usable, the team is developing an artificial-intelligence system called “shared control”.

Michele Tavella, an assistant in Millán’s laboratory at EPFL in Switzerland, sits in a wheelchair that he controls with his mind alone. While the chair slowly moves around a room, avoiding obstacles along the way, Tavella remains eerily still, deep in concentration. His thoughts activate specific brain patterns that are recorded by electroencephalography (EEG) through a helmet fitted with electrodes. A computer then interprets these patterns and transmits the corresponding command to the chair.

“When I want to turn left, I imagine moving my left hand,” says Tavella. “And this is very natural and very quick; I can send a command in about a second.”
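
To make that pipeline concrete, here is a minimal Python sketch of the decoding step: a short window of EEG is reduced to band-power features and classified into one of the three commands. The mu-band feature, the 512 Hz sampling rate, and the scikit-learn-style classifier are illustrative assumptions, not details reported here.

    import numpy as np

    COMMANDS = {0: "left", 1: "right", 2: "forward"}  # ternary command set

    def band_power(eeg_window, fs=512, band=(8.0, 12.0)):
        # Mean mu-band (8-12 Hz) power per channel, a feature commonly
        # used for motor imagery; eeg_window is (samples, channels).
        spectrum = np.abs(np.fft.rfft(eeg_window, axis=0)) ** 2
        freqs = np.fft.rfftfreq(eeg_window.shape[0], d=1.0 / fs)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return spectrum[mask].mean(axis=0)

    def decode_command(eeg_window, classifier):
        # Classify one ~1 s EEG window into left/right/forward.
        features = band_power(eeg_window)
        label = classifier.predict(features.reshape(1, -1))[0]
        return COMMANDS[int(label)]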

It took Tavella only a couple of hours for his brain to adapt to the system, and the system in turn adapts to the particularities of the brain controlling it: a process of mutual apprenticeship between human and machine.
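
That mutual adaptation can be pictured as a simple online learning loop: whenever the decoded command differs from what the user intended, the classifier’s parameters are nudged toward the user. The perceptron-style update below is only a minimal illustration of the idea, not the laboratory’s actual learning rule.

    import numpy as np

    # One weight vector per command; 16 is an arbitrary feature dimension.
    weights = {c: np.zeros(16) for c in ("left", "right", "forward")}

    def adapt(weights, features, intended, predicted, lr=0.01):
        # Perceptron-style update: pull the intended class toward the
        # observed features, push the mistaken class away.
        if predicted != intended:
            weights[intended] += lr * features
            weights[predicted] -= lr * features
        return weights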

Artificial intelligence helping the brain

The current brain-machine interface allows comfortable piloting of the wheelchair, but it remains rudimentary. To compensate for the simple but effective ternary input (left/right/forward), and to take some of the pressure off the user, the chair relies on an artificial-intelligence system called “shared control”: two small cameras, one on each side of the chair, feed image-processing software that detects and steers around obstacles.
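
A minimal way to picture shared control is a function that blends the user’s coarse command with the obstacles seen by the cameras to produce safe motion. The Python sketch below is an illustrative assumption about how such blending could work; the distances, gains, and repulsion heuristic are invented for the example.

    def shared_control(user_command, obstacles):
        # user_command: "left", "right" or "forward" (the ternary input).
        # obstacles: list of (angle_rad, distance_m) pairs produced by
        # the image-processing software; positive angles are to the left.
        heading = {"left": 0.6, "right": -0.6, "forward": 0.0}[user_command]
        speed = 0.4  # m/s, a cautious cruising speed

        # Nearby obstacles push the heading away and slow the chair down,
        # relieving the user of low-level steering.
        for angle, distance in obstacles:
            if distance < 1.0:  # only react inside one meter
                push = 0.5 * (1.0 - distance)
                heading += -push if angle > 0 else push
                speed = min(speed, 0.3 * distance)
        return heading, speed  # steering angle (rad) and forward speed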

Tom Carlson, from Imperial College London, has recently joined the Neuroprosthetic Center to improve the “shared control” piloting. “We are looking to improve the safety and precision of the system,” explains the young research associate.

The system requires advanced artificial intelligence, since it must distinguish between different types of objects: furniture, people, and doorways. Carlson explains that “if it is a cabinet, the chair should be directed around it. But if it is a desk, the chair will have to recognize it and approach it appropriately.”
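
In code, that object-aware behavior might reduce to a policy mapping a recognized object class to a maneuver. The classes paraphrase Carlson’s cabinet-versus-desk example; the speeds and the person-handling case are illustrative assumptions.

    def plan_for_object(obj_class, distance_m, user_command):
        # Map a recognized object class to a high-level maneuver.
        # Returns (maneuver, target_speed_m_s).
        if obj_class == "cabinet":
            return "steer_around", 0.3          # plain obstacle: go around
        if obj_class == "desk":
            # Possibly a destination: slow down and line up with the edge.
            return "dock", min(0.2, 0.1 * distance_m)
        if obj_class == "person":
            return "yield", 0.1                 # people get extra clearance
        return user_command, 0.4                # nothing special: carry on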

In the future, the system will be able to interpret the user’s higher-level intentions. “We are trying to analyze different brain patterns, such as error-related potentials that may help to disambiguate the intentions of the user,” says Carlson. “Does the user want to avoid the desk or is it his, and should the chair pull up to it so he can work?”
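
One way such error-related potentials could be used is as a veto signal: if the chair commits to one interpretation and the EEG then shows an error response, the planner switches to the alternative. The detector output and threshold below are stand-ins, sketched only to make the idea concrete.

    def resolve_intent(planned, errp_probability, threshold=0.7):
        # planned: the maneuver the chair has started ("steer_around"
        # or "dock"); errp_probability: output of an ErrP detector run
        # on the EEG just after the maneuver begins.
        alternatives = {"steer_around": "dock", "dock": "steer_around"}
        if errp_probability > threshold and planned in alternatives:
            return alternatives[planned]  # the brain flagged an error
        return planned                    # no error signal: carry on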

Human and artificial intelligence are intricately intertwined, and this complex interaction calls for expertise from several fields: electronics, neuroscience, and computer programming. The Defitech Foundation Chair in Non-Invasive Brain-Machine Interface at EPFL is currently working on other brain-machine projects, including a remote-controlled robot and software that lets the user write emails and surf the internet.


Author: Lionel Pousaz

Source: EPFL