Linking sense of touch to facial movement inches robots toward ‘feeling’ pain

If robots can experience pain themselves, they might understand human pain better, too


Touch sensors help Affetto, a robot built to resemble a child’s head, detect a signal that would cause a human pain.

Osaka University

SEATTLE — A robot with a sense of touch may one day “feel” pain, both its own physical pain and empathy for the pain of its human companions. Such touchy-feely robots are still far off, but advances in robotic touch-sensing are bringing that possibility closer to reality.

Sensors embedded in soft, artificial skin that can detect both a gentle touch and a painful thump have been hooked up to a robot that can then signal emotions, Minoru Asada reported February 15 at the annual meeting of the American Association for the Advancement of Science. This artificial “pain nervous system,” as Asada calls it, may be a small building block for a machine that could ultimately experience pain (in a robotic sort of way). Such a feeling might also allow a robot to “empathize” with a human companion’s suffering.

Asada, an engineer at Osaka University in Japan, and his colleagues have designed touch sensors that reliably pick up a range of touches. In a robot system named Affetto, an unsettlingly realistic-looking child’s head, these touch and pain signals can be converted to emotional facial expressions (SN: 7/2/19).

A touch-sensitive, soft material, as opposed to a rigid metal surface, allows richer interactions between machine and world, says neuroscientist Kingson Man of the University of Southern California in Los Angeles. Artificial skin “allows the possibility of engagement in versatile and truly intelligent ways.”

Such a system, Asada says, might ultimately lead to robots that can recognize the pain of others, a valuable skill for robots designed to help care for elderly people, for instance.

But there is an important distinction between a robot that responds in a predictable way to a painful thump and a robot that’s capable of approximating an internal feeling, says Antonio Damasio, a neuroscientist also at the University of Southern California. In a recent article, he and Man argue that such an artificial sense of feeling might arise if robots were programmed to experience something akin to a mental state such as pain (SN: 11/10/19).

A robot with tactile sensors that can detect touch and pain is “along the lines of having a robot, for example, that smiles when you talk to it,” Damasio says. “It’s a device for communication of the machine to a human.” While that’s an interesting development, “it’s not the same thing” as a robot designed to compute some sort of internal experience, he says.

Laura Sanders is the neuroscience writer. She holds a Ph.D. in molecular biology from the University of Southern California.