New research with an artificial robotic digit has yielded surprising results – in just a few days, people using the thumb were able to operate it naturally to perform complex tasks like building towers from wooden blocks, or stirring their coffee while holding the cup.
Not only that, but neural scans showed that the presence of the ‘third thumb’ had actually changed what was going on in the brain, even after the extra appendage was taken off: having a robo-thumb attached for a few days shifted the brain’s representation of the wearers’ flesh-and-blood fingers.
Understanding what’s going on here is crucial for improving our bodies’ relationships with tools, robotic devices, and prosthetics. While these methods of augmentation can be incredibly useful, we need to understand their impact on our brains.
Dani Clode with the robotic thumb. (UCL)
“Our study shows that people can quickly learn to control an augmentation device and use it for their benefit, without overthinking,” says designer and research technician Dani Clode from University College London (UCL) in the UK, who made the Third Thumb.
“We saw that while using the Third Thumb, people changed their natural hand movements, and they also reported that the robotic thumb felt like part of their own body.”
The 3D-printed thumb offers two degrees of movement, and is controlled wirelessly by pressure from the big toes. The study recruited 20 people willing to take on an extra thumb, who were asked to wear it for six hours a day for five days – both in preset training tasks and in the course of their everyday lives.
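To give a sense of how a two-sensor control scheme like this might work, here is a minimal sketch in Python. The sensor ranges, the linear mapping, and the function names are illustrative assumptions only – this is not the actual Third Thumb firmware.

```python
# A minimal sketch (not the actual Third Thumb firmware) of how two toe-pressure
# readings could drive a two-degree-of-movement extra thumb. Sensor ranges and
# the linear mapping are assumptions for illustration.

def pressure_to_angle(pressure, min_p=0.0, max_p=1.0, min_deg=0.0, max_deg=90.0):
    """Linearly map a normalized pressure reading onto an actuator angle."""
    p = max(min_p, min(max_p, pressure))            # clamp noisy readings
    return min_deg + (p - min_p) / (max_p - min_p) * (max_deg - min_deg)

def control_step(left_toe_pressure, right_toe_pressure):
    """One control cycle: each toe drives one axis of the thumb."""
    flexion = pressure_to_angle(left_toe_pressure)      # e.g. flexion/extension
    adduction = pressure_to_angle(right_toe_pressure)   # e.g. adduction/abduction
    return {"flexion_deg": flexion, "adduction_deg": adduction}

# Example: moderate pressure on one toe, firmer pressure on the other
print(control_step(0.4, 0.7))   # {'flexion_deg': 36.0, 'adduction_deg': 63.0}
```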
These routines focused on motor control, coordination and dexterity, and were designed to teach the wearers how to use the Third Thumb intuitively.
The volunteers managed to use the thumb even when distracted or blindfolded, and reported a strong sense of embodiment.
Some of the participants also underwent fMRI scans before and after the Third Thumb experiments, moving their fingers one by one as their brains were scanned – although for safety reasons the extra thumb wasn’t worn in the scanner. Each participant’s other hand, which hadn’t been wearing the Third Thumb, served as a control.
The scans showed that the brain’s representation of individual fingers on the hand with the extra thumb had become less distinct – the respective areas of neural activity in the sensorimotor cortex (where sensory and motor information is handled) had started to blur into each other.
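As a rough illustration of what ‘less distinct’ means in practice, the separability of finger representations can be summarized by correlating each pair of fingers’ activity patterns across voxels – the higher the average correlation, the more the patterns overlap. The toy sketch below demonstrates that idea on simulated data; it is a simplified illustration of the concept, not the paper’s actual analysis pipeline.

```python
# Toy illustration: quantify how distinct finger representations are by
# correlating each pair of fingers' activity patterns across voxels.
# Higher average correlation = more overlap (less distinct).

import numpy as np

def mean_between_finger_correlation(patterns):
    """patterns: array of shape (n_fingers, n_voxels) of activity estimates."""
    n_fingers = patterns.shape[0]
    correlations = []
    for i in range(n_fingers):
        for j in range(i + 1, n_fingers):
            r = np.corrcoef(patterns[i], patterns[j])[0, 1]
            correlations.append(r)
    return float(np.mean(correlations))

# Simulated data: 5 fingers, 100 voxels each
rng = np.random.default_rng(0)
before = rng.normal(size=(5, 100))            # distinct, uncorrelated patterns
shared = rng.normal(size=100)
after = 0.6 * before + 0.4 * shared           # patterns pulled toward a shared component

print(mean_between_finger_correlation(before))   # near 0: fingers well separated
print(mean_between_finger_correlation(after))    # higher: representations have blurred
```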
“Evolution hasn’t prepared us to use an extra body part, and we have found that to extend our abilities in new and unexpected ways, the brain will need to adapt the representation of the biological body,” says neuroscientist Tamar Makin, from UCL.
While you might feel like a tennis racket or a screwdriver becomes part of your body after a while, research looking at brain scans hasn’t exactly backed that up so far – as far as the brain is concerned, these extra appendages are considered separate, and it can tell the difference between a hand and a tool.
Having something physically attached to your body and working in tandem with it is different though. The results the researchers got here are closer to what we see in expert piano players – long-term training leads to changes in the brain’s representation of the fingers, with less of a distinction between them.
That’s a relationship that will need to be investigated further as we develop more advanced robotics and prosthetics to augment what our natural bodies can do – there’s a lot of potential here, but more research is needed into the sorts of shifts such augmentation might cause in our brains.
“Body augmentation could one day be valuable to society in numerous ways, such as enabling a surgeon to get by without an assistant, or a factory worker to work more efficiently,” says neuroscientist Paulina Kieliba, also from UCL.
“This line of work could revolutionize the concept of prosthetics, and it could help someone who permanently or temporarily can only use one hand, to do everything with that hand. But to get there, we need to continue researching the complicated, interdisciplinary questions of how these devices interact with our brains.”
The research has been published in Science Robotics.