By combining machine learning with liquid metal sensors, researchers were able to discern the texture of surfaces touched by prototype prosthetic hands.
Controlling objects with our hands isn’t as straightforward as it might seem. Think about the process of cracking an egg and what goes into it. Too much pressure, and the egg smashes in your hand. Too little, and all that wrist action causes it to fly across the kitchen.
With more than 3,000 touch receptors per fingertip, we're constantly assessing friction and pressure to gauge how much effort is required from us.
We rely heavily on this information from our fingertips when manipulating an object, which is why haptic (touch) feedback has been such an area of interest in prosthetic development.
Now, in new research, scientists have combined advanced liquid metal materials with machine learning to recreate this sense of touch in prosthetic limbs.
These researchers from Florida Atlantic University’s College of Engineering and Computer Science and other collaborators are the first to incorporate stretchable tactile sensors using liquid metal on the fingertips of a prosthetic hand.
“The loss of an upper limb can be a daunting challenge for an individual who is trying to seamlessly engage in regular activities. Although advances in prosthetic limbs have been beneficial and allow amputees to better perform their daily duties, they do not provide them with sensory information such as touch,” said Prof Stella Batalama, dean of the College of Engineering and Computer Science.
“They also don’t enable them to control the prosthetic limb naturally with their minds. With this latest technology from our research team, we are one step closer to providing people all over the world with a more natural prosthetic device that can ‘feel’ and respond to its environment.”
Encapsulated within silicone-based elastomers, the technology provides advantages over other sensors, including its high conductivity, compliance, flexibility and stretchability.
The researchers combined this material with a prosthetic limb and with four different types of machine learning to test its ability to distinguish between complex surfaces.
These AI systems not only discerned texture information from individual fingertips, but also integrated that information across the hand, producing a more detailed overall picture of the surface from a hand-level perspective.
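The hand-level integration described above can be sketched as a simple fusion of per-fingertip classifier outputs. This is a minimal illustration only: the function name, texture labels, and majority-vote rule are assumptions for the sketch, not the study's actual models or pipeline.

```python
from collections import Counter

def fuse_fingertip_predictions(per_finger_preds):
    """Combine individual fingertip texture labels into a single
    hand-level estimate by majority vote (illustrative rule only)."""
    votes = Counter(per_finger_preds)
    label, _ = votes.most_common(1)[0]
    return label

# Example: four fingertips each classify the touched surface;
# the hand-level estimate is the most common label.
preds = ["sandpaper", "sandpaper", "felt", "sandpaper"]
print(fuse_fingertip_predictions(preds))  # sandpaper
```

In practice, fusion could instead weight each fingertip by its classifier's confidence; the point is simply that combining several local readings yields a more robust surface estimate than any single fingertip alone.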
The researchers also highlighted the advantages of AI in translating information from prosthetic limbs to human sensations.
They write that fundamental differences between robot sensors and human mechanoreceptors suggest that an intermediary AI classification step could improve perception through a haptic display.
This could be achieved by mapping the artificial tactile sensations in a way that is more fitting to human senses, such as those in this study.
“Significant research has been done on tactile sensors for artificial hands, but there is still a need for advances in lightweight, low-cost, robust multimodal tactile sensors,” said Erik Engeberg, PhD, senior author of the paper.
“We believe that these tactile details could be useful in the future to afford a more realistic experience for prosthetic hand users through an advanced haptic display, which could enrich the amputee-prosthesis interface and prevent amputees from abandoning their prosthetic hand.”