An electronic skin which can learn from feeling 'pain' could help create a new generation of smart robots, according to researchers at the University of Glasgow.
Engineers developed artificial skin with a new type of processing system based on synaptic transistors, which mimics the brain's neural pathways in order to learn.
A robot hand which uses the smart skin shows a remarkable ability to learn to react to external stimuli.
A new paper in the journal Science Robotics reveals how researchers made the breakthrough.
Scientists have been working for decades to build artificial skin with touch sensitivity, including spreading an array of contact or pressure sensors across the surface to allow it to detect when it comes into contact with an object.
Data from the sensors is then sent to a computer to be processed and interpreted.
The sensors typically produce a large volume of data which can take time to be properly processed and responded to, introducing delays which could reduce the skin's potential effectiveness in real-world tasks.
The new skin draws inspiration from how the human peripheral nervous system interprets signals in order to reduce latency and power consumption.
As soon as human skin receives an input, the peripheral nervous system begins processing it at the point of contact, reducing it to only the vital information before it is sent to the brain.
The reduction of sensory data allows efficient use of communication channels needed to send the data to the brain, which then responds almost immediately for the body to react appropriately.
To build the electronic skin, researchers printed a grid of 168 synaptic transistors made from zinc-oxide nanowires directly onto a flexible plastic surface, then connected them to the skin sensor over the palm of a fully-articulated, human-shaped robot hand.
The sensor registers a change in its electrical resistance to mimic the way sensory neurons work in the human body.
In earlier generations of electronic skin, that input data would be sent to a computer to be processed.
Instead, a circuit built into the skin acts as an artificial synapse, reducing the input down into a simple spike of voltage whose frequency varies according to the level of pressure applied to the skin, speeding up the process of reaction.
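The pressure-to-frequency encoding described above can be illustrated with a minimal sketch. The linear mapping and the numeric ranges here are assumptions for illustration only; the actual transistor response reported in the paper may be nonlinear.

```python
def pressure_to_spike_rate(pressure, max_pressure=10.0, max_rate_hz=100.0):
    """Map a pressure reading (arbitrary units) to a spike frequency in Hz.

    A linear mapping with clamping is assumed here; the real synaptic
    transistor circuit encodes pressure in hardware, not software.
    """
    clamped = max(0.0, min(pressure, max_pressure))
    return max_rate_hz * clamped / max_pressure
```

Harder presses thus produce faster spike trains, which is the single quantity downstream circuitry needs to read.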
The team used the varying output of that voltage spike to teach the skin appropriate responses to simulated pain, which would trigger the robot hand to react.
By setting a threshold of input voltage to cause a reaction, the team could make the robot hand recoil from a sharp jab in the centre of its palm.
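The threshold-triggered recoil can be sketched in the same way. The threshold value below is illustrative and not taken from the paper, which sets its threshold on input voltage in hardware.

```python
def should_recoil(spike_rate_hz, threshold_hz=75.0):
    """Trigger a recoil when the spike frequency crosses a set threshold.

    The 75 Hz figure is an assumed placeholder; the paper's system
    thresholds the voltage-spike signal directly in the skin circuit.
    """
    return spike_rate_hz >= threshold_hz

print(should_recoil(20.0))  # gentle press, below threshold → False
print(should_recoil(90.0))  # sharp jab, above threshold → True
```

In effect, everything below the threshold is filtered out at the skin, so only "painful" events ever demand a response.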
The development is the latest breakthrough by the university’s Bendable Electronics and Sensing Technologies (BEST) Group, led by Professor Ravinder Dahiya.
Professor Dahiya, of the James Watt School of Engineering, said: "We all learn early on in our lives to respond appropriately to unexpected stimuli like pain in order to prevent us from hurting ourselves again.
"We believe that this is a real step forward in our work towards creating large-scale neuromorphic printed electronic skin capable of responding appropriately to stimuli."