“Humans are not disabled, humans are not broken. Our built environment, our technology is disabled. We need not accept our limitations.”
These were the parting words from bionics designer Hugh Herr at a TED talk he gave last year. Herr had both his legs amputated in 1982 after he suffered tissue damage from frostbite while on a mountain climbing expedition.
Since the accident he has devoted his time to seeking out new ways to overcome physical disability.
Bionics, as Herr puts it, is the “interplay between biology and design”. For persons with physical disabilities, it can be approached in many ways: prosthetics, electro-mechanics, neuro-implants, exoskeletons, etc.
All have their advantages and shortcomings, although some, like traditional prosthetics, are less advantageous than others.
Thought control
A recently completed multi-year research project from Ecole Polytechnique Fédérale de Lausanne (EPFL) in Switzerland involved a brain-machine approach that allows paralysed people to remotely control a robot from home using their thoughts.
The researchers at the Defitech Foundation Chair in Brain-Machine Interface (CNBI), headed by José del R Millán, tested 19 people – from Italy, Germany and Switzerland – with a 100 per cent success rate. For weeks, the subjects (some of whom were physically disabled, some were not) put on an electrode-studded hat capable of analysing brain signals. They then gave the robot, located at EPFL, instructions to move around, transmitting their commands in real time over the internet from their home countries.
The robot had a video camera, screen and wheels and displayed the face of the remote pilot via Skype.
Not only that, but the designers incorporated another technology into the robot: alongside its brain-machine interface (BMI), it was able to avoid obstacles by itself, even without instruction.
The technology behind both thought-controlled and intuitive robotics is not so new. Putting them together in a real-world application with potential benefits for persons with disabilities is what makes this interesting.
"What they have managed to do at EFPL is novel in that it combines neural control instruments with autonomous devices," says Conor McGinn, assistant professor at the Department of Mechanical and Manufacturing Engineering in Trinity College Dublin. "They have taken something that can ascertain some basic signal commands (go straight, stop, go right, etc) from the brain using a somewhat off-the-shelf neural device and integrated it with an off-the-shelf mobile robotic system. The robot has some on-board intelligence, especially in its ability to detect obstacles in its path and avoid them.
“It’s about time some new breakthroughs were made. The wheelchair came out in 1910. One hundred years on and people with disabilities have gone without any serious innovation for too long.”
Non-invasive thought-controlled robotics, while impressive, is relatively basic. “The subject wears an electrode-covered hat which can crudely read brain signals,” says McGinn. “The electrodes can read signals but they’re very noisy and only allow for a very basic level of transmission. The brain is very complex. Without physically tunnelling into it, there’s only so much you can control.”
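How such a system might combine a handful of crudely decoded commands with the robot’s own obstacle avoidance can be sketched in a few lines of Python. Everything below – the command names, the sensor readings and the half-metre threshold – is invented for illustration; it is not the EPFL team’s actual software.

```python
# A toy illustration of "shared control": a crudely decoded brain command
# is combined with the robot's own obstacle avoidance, which can override it.
# All names, thresholds and sensor values are invented for illustration only.

OBSTACLE_THRESHOLD_M = 0.5  # robot takes over if an obstacle is closer than this


def shared_control(brain_command, front_m, left_m, right_m):
    """Blend a decoded command with simple on-board obstacle avoidance.

    brain_command: one of 'forward', 'left', 'right', 'stop'
                   (the kind of basic signal an EEG cap can distinguish)
    front_m, left_m, right_m: simulated range-sensor distances in metres
    Returns the motion command the robot will actually execute.
    """
    if brain_command == 'stop':
        return 'stop'                      # the pilot's stop always wins

    if front_m < OBSTACLE_THRESHOLD_M:
        # Path ahead is blocked: the robot's autonomy overrides the pilot
        # and steers toward the more open side.
        return 'left' if left_m > right_m else 'right'

    if brain_command == 'left' and left_m < OBSTACLE_THRESHOLD_M:
        return 'forward'                   # refuse a turn into an obstacle
    if brain_command == 'right' and right_m < OBSTACLE_THRESHOLD_M:
        return 'forward'

    return brain_command                   # otherwise, do what the pilot asked


if __name__ == '__main__':
    # Pilot thinks "forward", but a wall is 0.3 m ahead and the right is open.
    print(shared_control('forward', front_m=0.3, left_m=0.4, right_m=2.0))  # -> 'right'
```

The point of the sketch is the division of labour: the pilot supplies coarse intent, while the robot’s on-board intelligence handles the low-level business of not driving into walls.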
Invasive options
Other, more invasive, experiments have been performed with even greater results. Two years ago a woman with spinocerebellar degeneration, paralysed from the neck down, had electrodes implanted into the surface of her brain, allowing researchers to record the responses of neurons. Researchers from Johns Hopkins University in Maryland implanted two electrodes along her motor strip: one was centred in the area where she would imagine using her hand, the second where she imagined using her shoulder. “The woman was able to control a robotic arm to feed herself a bar of chocolate,” says McGinn. “Attaining fine control like this is technically extremely difficult and very impressive, but also very invasive.”
There are three different types of electrodes used to read brain signals for BMI control – dry, wet and implants.
Dry electrodes are the ones used in the research from EPFL and are the least effective. Wet electrodes are so called because the sensors are filled with a conductive gel; they are more responsive but require a rather more sophisticated cap. “Implants, like the ones used on the woman with spinocerebellar degeneration, are by far the most effective but not very practical,” says McGinn. “Holes must be drilled into a subject’s skull where little sensors can be stuck onto the brain. The more invasive the approach, the better sensor readings you’ll get, but there are high risks and even higher costs, not to mention the complications associated with getting FDA approval.”
Even if it were cheaper and easier to carry out, not every person with a disability would necessarily agree to high-risk invasive surgery, regardless of the results. Anything non-invasive, like the findings from EPFL in Switzerland, is still welcome even if the outcomes are somewhat limited.
Their results bring to an end a seven-year EU-funded project entitled Tools for Brain-Computer Interaction (TOBI). Whether such robots become a practical reality for people suffering from physical disabilities remains to be seen. According to the research leader, Prof Millán: “For this to happen, insurance companies will have to help finance these technologies.”
Personal service robots
Closer to home, Conor McGinn and his team at Trinity have been working on the design and control of personal service robots.
In 2014, he designed and helped build a prototype robot for Cork teenager Joanne O’Riordan, who was born without limbs due to a rare congenital disorder known as total amelia syndrome.
“The key difference between the EPFL research and ours is that the robot is not controlled directly by the brain but instead through conventional measures such as voice, gesture or smartphone,” he says. “Since our first Robbie the Robot prototype, we have developed another significantly more advanced model and are in the process of making a third. I am confident that we will be ready to demonstrate autonomous functionality in a real home environment before the end of 2015.”
Neuro therapeutics
Tomas Ward is a senior lecturer at NUI Maynooth’s Department of Electronic Engineering. He is also VP for engineering and chief operating officer at a San Diego-based start-up that is developing cloud-based brain computer interfaces. He believes BMI technology is already leagues ahead of the thought-controlled robotics being demonstrated in Switzerland.
“We’re developing a Siri for brain waves,” he says. “We can decode brain waves over the cloud. By wearing an electroencephalogram (EEG) cap with some sensors we can transmit info to a cell phone showing awareness of one’s inner state – things like attention, levels of alertness, even one’s emotional state.”
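What “decoding brain waves” can mean in practice is easier to see with a toy example. The Python sketch below computes a crude alertness proxy from the relative power of beta and alpha rhythms in a short window of (simulated) EEG, then packages it as the kind of compact message a phone app might display. The simulated signal, the band-power ratio and the message fields are all illustrative assumptions, not the start-up’s actual pipeline.

```python
# A loose sketch of turning raw EEG into a simple "inner state" estimate.
# The simulated signal, the beta/alpha band-power ratio and the message
# format are illustrative assumptions, not any company's actual pipeline.
import json
import numpy as np

FS = 256  # sampling rate in Hz (typical for consumer EEG headsets)


def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()


def alertness_index(eeg_window, fs=FS):
    """Crude alertness proxy: beta (13-30 Hz) power relative to alpha (8-12 Hz).

    Strong alpha relative to beta is loosely associated with relaxed states;
    this ratio is only a toy indicator, not a validated measure.
    """
    alpha = band_power(eeg_window, fs, 8, 12)
    beta = band_power(eeg_window, fs, 13, 30)
    return beta / (alpha + beta)


if __name__ == '__main__':
    # Simulate two seconds of one EEG channel: a strong 10 Hz alpha rhythm
    # plus noise, standing in for data streamed from an electrode cap.
    t = np.arange(0, 2.0, 1.0 / FS)
    eeg = 20 * np.sin(2 * np.pi * 10 * t) + np.random.randn(len(t))

    # The kind of compact message a cloud service might push to a phone app.
    message = {"metric": "alertness", "value": round(float(alertness_index(eeg)), 3)}
    print(json.dumps(message))
```

Real systems use many channels, careful artefact rejection and trained models rather than a single band-power ratio, but the basic shape – signal in, state estimate out – is the same.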
Ward’s research has focused on how brain-machine interfaces might assist stroke victims. “We can use this technology to assist people with physical disabilities as a means of altering plasticity in the brain,” he says. “It’s what’s known as neuro therapeutics. If the brain can experience its own behaviour, we can drive changes in how the brain works. BMI is just a means for seeing the brain’s activity in real time and that activity can be mapped onto whatever it is you’re trying to do.”
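The closed loop Ward describes – the brain observing its own activity so that feedback can drive change – reduces, at its simplest, to a measure-map-display cycle. In the sketch below the “decoded activity” is just random numbers standing in for a real decoder, and the feedback is a text bar; it is meant only to show the shape of the loop, not any clinical system.

```python
# A minimal sketch of a neurofeedback loop: measure a brain-derived signal,
# map it onto feedback the user can perceive, and repeat in (near) real time.
# The "decoded activity" here is random numbers standing in for a real BMI
# decoder; actual neuro-therapeutic systems are far more sophisticated.
import random
import time


def decode_motor_activity():
    """Stand-in for a BMI decoder returning an activity strength in [0, 1]."""
    return random.random()


def show_feedback(strength, width=30):
    """Map the decoded strength onto something the user can see: a text bar."""
    filled = int(strength * width)
    print('[' + '#' * filled + '-' * (width - filled) + f'] {strength:.2f}')


if __name__ == '__main__':
    for _ in range(10):           # ten iterations of the feedback loop
        show_feedback(decode_motor_activity())
        time.sleep(0.2)           # roughly five updates per second
```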