
Can Robots Feel Pain? The Science Behind Artificial Skin


Neurons firing electrical impulses to trigger a reflex are arguably part of what makes us human. Now, however, robots can react to external stimuli much like you and me. Recent scientific breakthroughs have demonstrated robots that respond the way humans do, sensing pain from external stimuli through machine learning. By combining biomaterials that mimic human flesh with sensors that act as the robot's brain, scientists have birthed a new generation of bots: soft robots and electronic skin (e-skin).


ROBOTIC FLESH

When we think of robots, we often picture machines made from hard metal and polymers connected by copper wires. Scientists at the Wei Gao Lab at the California Institute of Technology have challenged this stereotype by developing a robotic hand that mimics the feel of soft human flesh. Using an inkjet printer, they printed a gelatinous hydrogel embedded with sensors that allow the robot to feel its surroundings.


The sensors are printed onto the hydrogel much like ink onto a piece of paper. Built on a base of silver nanowires, carbon and polyimide, with sensing nanomaterials embedded within, they detect substances that pose a potential danger to human health. In an article written about this discovery, Gao describes the idea behind printing the materials.


“Inkjet printing has this cartridge that ejects droplets, and those droplets are an ink solution, but they could be a solution that we develop instead of regular ink.” Using a printer to produce the humanoid robotic hand allows for a fast, low-cost method of mass production, making it efficient to design and integrate new sensors that detect chemicals such as TNT and pathogens. Yet while real progress has been made in creating materials that give robots the soft flesh of human skin, the most notable advances are in each robot's software.


HOW ARE ROBOTS BEHAVING LIKE HUMANS?

Humans are driven by a system of reinforcement and punishment; it is simply how the human brain learns what to do and what not to do. Robots are now able to do the same.


For a machine to learn, it needs an input and an output: data must be processed before the machine can make a decision. Traditional methods of developing e-skin rely on pressure sensors whose readings are sent to a central computer for processing. This not only slows reaction time but also limits the machine's ability to make judgments on the spot.
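To make the idea concrete, here is a minimal sketch of reinforcement-style learning in Python. The action names, learning rate and feedback values are illustrative assumptions rather than details from any lab's system; the point is simply that rewarded actions gain value and punished ones lose it.

```python
# A minimal sketch of reinforcement-style learning (illustrative only;
# the actions, learning rate and feedback values are invented).
LEARNING_RATE = 0.1
values = {"touch_hot": 0.0, "avoid_hot": 0.0}  # hypothetical actions

def update(action: str, feedback: float) -> None:
    """Nudge the action's value toward the reward (+1) or punishment (-1)."""
    values[action] += LEARNING_RATE * (feedback - values[action])

# The robot is punished for touching a hot surface and rewarded for avoiding it.
for _ in range(20):
    update("touch_hot", -1.0)   # punishment
    update("avoid_hot", +1.0)   # reward

print(max(values, key=values.get))  # prints "avoid_hot", the learned preference
```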


Another team, the Bendable Electronics and Sensing Technologies (BEST) group at the University of Glasgow, led by Professor Ravinder Dahiya, developed a robotic hand that allows the robot to get in touch with its feelings.


To bring the robotic hand to life, they mimicked the human peripheral nervous system (PNS) with 168 synaptic transistors made from zinc oxide nanowires, printed onto flexible plastic and connected to an array of sensors and transmitters on the palm of the robotic hand.


Have you ever wondered why, when we touch something hot, we immediately pull our hand away? In humans, skin contact triggers the PNS to begin processing information, filtering out unnecessary data and sending only relevant signals to the brain.


Eliminating redundant sensory data makes the body's communication channels more efficient, and this PNS-style processing is exactly what the BEST team implemented in their robot's software. When the robotic sensors come into contact with an external stimulus, they respond with a change in electrical resistance: a light touch registers a small change, while a hard touch generates a larger one.
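As a rough illustration, the Python sketch below shows how such a resistance change might be classified. The baseline resistance, noise floor and thresholds are made-up values, not numbers from the published work; the noise floor plays the role of the PNS filtering out irrelevant signals.

```python
# A minimal sketch (assumed values, not the BEST team's code) of mapping a
# change in electrical resistance to a touch category, with a noise floor
# acting like the PNS filter that discards irrelevant signals.
BASELINE_RESISTANCE = 1000.0  # ohms; hypothetical resting value
NOISE_FLOOR = 5.0             # ohms; changes below this are ignored

def classify_touch(measured_resistance: float) -> str:
    """Return a rough touch category from the size of the resistance change."""
    delta = abs(measured_resistance - BASELINE_RESISTANCE)
    if delta < NOISE_FLOOR:
        return "no touch"     # filtered out, nothing sent onward
    elif delta < 100.0:
        return "light touch"  # small change in resistance
    else:
        return "hard touch"   # large change in resistance

print(classify_touch(1003.0))  # no touch
print(classify_touch(1040.0))  # light touch
print(classify_touch(1250.0))  # hard touch
```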


The robot's ability to sense pain comes from a circuit connected to the surface of its skin, which allows associative learning to take place. Long-term memory is stored by generating output voltage spikes, whose frequency depends on the pressure being applied to the skin, simulating pain. The level of pain is detected by setting a voltage threshold, allowing the hand to react rapidly when a sharp object touches the skin of the bot.
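The same idea can be sketched in Python with invented numbers (the spike rates and pain threshold below are assumptions for illustration, not values from the published work): pressure is encoded as a spike frequency, and crossing a threshold triggers an immediate withdrawal.

```python
# A minimal sketch of spike-based pain detection (all numbers invented):
# pressure is encoded as a voltage-spike frequency, and a frequency above
# the pain threshold triggers a reflex-like withdrawal.
SPIKES_PER_UNIT_PRESSURE = 2.0  # Hz per pressure unit; hypothetical
PAIN_THRESHOLD_HZ = 50.0        # spike frequency treated as "painful"

def spike_frequency(pressure: float) -> float:
    """Encode applied pressure as a voltage-spike frequency in Hz."""
    return pressure * SPIKES_PER_UNIT_PRESSURE

def react(pressure: float) -> str:
    if spike_frequency(pressure) >= PAIN_THRESHOLD_HZ:
        return "withdraw hand"  # rapid reaction to a sharp, painful contact
    return "keep contact"       # harmless touch, no reflex

print(react(10.0))  # gentle press -> keep contact
print(react(40.0))  # sharp object -> withdraw hand
```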


By replicating how the human nervous system processes information, robots can now learn to move away from situations of distress, behaving the same way a human would.


FUTURE IMPLICATIONS OF SENSING ROBOTS

Even though the idea of robots feeling pain might be eerie to some, we have officially tapped into the world of human-like robots. Arguably, this opens up the potential to improve many aspects of healthcare and manufacturing.


We have yet to perfect prosthetic limbs. With this breakthrough in robotic feeling, however, prosthetics manufactured to sense pain could offer a new perspective to those who rely on them.


In an article written about the BEST team's discovery, Professor Dahiya noted how impactful the work could be in the future.


“In the future, this research could be the basis for a more advanced electronic skin which enables robots capable of exploring and interacting with the world in new ways, or building prosthetic limbs which are capable of near-human levels of touch sensitivity.”


Robotic feeling also opens new methods of testing, such as examining how a human-like system reacts to new pathogens and harmful organisms, which may revolutionise drug discovery and the way scientists view pathogens and their side effects. It likewise enables the testing of goods such as explosives and other objects too dangerous to test on humans, which was one of the objectives of the Wei Gao Lab.


Does this mean robots are becoming more human-like? We are still far from creating robots that exactly mimic human systems; even though life-sized robots that look and act like humans exist, we have yet to build a replica that breathes. Soft robots and e-skin technologies are still in their early stages, and they have yet to reinvent testing in manufacturing and healthcare with the goal of improving human health and preventing injury.




