Robots capable of displaying human emotion have always been a staple of science fiction. However, Japanese researchers are now studying real human facial expressions to bring these stories closer to reality.
In a recent study published in the Mechanical Engineering Journal, a research team led by Osaka University has started mapping out the complexities of human facial movements. By attaching 125 tracking markers to a person’s face, the researchers closely examined 44 distinct facial actions, such as blinking or raising the corner of the mouth.
Each facial expression involves various local deformations as muscles stretch and compress the skin. Even the simplest motions can be surprisingly intricate. Our faces consist of different tissues beneath the skin, including muscle fibers and adipose (fatty) tissue, all working together to convey our emotions. This level of detail makes facial expressions subtle and nuanced, and therefore challenging to replicate artificially. Until now, researchers have relied on simpler measurements: overall face shape and the motion of a few selected points on the skin before and after a movement.
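To make the idea of local deformation concrete, the sketch below shows one simple way marker data like this could be analyzed: comparing the distance between neighboring markers at rest and during an action yields a strain value for each pair, with positive values indicating stretched skin and negative values compressed skin. The marker positions and pairings here are invented for illustration and do not come from the study itself.

```python
import numpy as np

def pairwise_strain(rest, deformed, pairs):
    """Engineering strain for each marker pair.

    rest, deformed: (N, 3) arrays of 3D marker positions (mm)
    pairs: list of (i, j) index tuples for neighboring markers
    Positive strain = skin stretched; negative = compressed.
    """
    strains = []
    for i, j in pairs:
        l0 = np.linalg.norm(rest[i] - rest[j])          # rest length
        l1 = np.linalg.norm(deformed[i] - deformed[j])  # deformed length
        strains.append((l1 - l0) / l0)
    return np.array(strains)

# Toy data: three hypothetical markers near the corner of the mouth.
rest = np.array([[0.0, 0.0, 0.0],
                 [10.0, 0.0, 0.0],
                 [5.0, 8.0, 0.0]])
# Made-up positions during a "raise the corner of the mouth" action.
deformed = np.array([[0.0, 0.0, 0.0],
                     [11.5, 1.0, 0.5],
                     [5.0, 6.5, 0.2]])

pairs = [(0, 1), (0, 2), (1, 2)]
print(pairwise_strain(rest, deformed, pairs))
# Pair 0-1 stretches (+), pairs 0-2 and 1-2 compress (-):
# the same simple action produces both stretch and compression nearby.
```

Even this toy example shows how a single facial action can stretch the skin in one region while compressing it in another, which is the kind of fine-grained pattern the 125-marker measurements capture.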
“Our faces are so familiar to us that we often overlook the fine details,” explains Hisashi Ishihara, lead author of the study. “However, from an engineering perspective, they are incredible information display devices. By observing facial expressions, we can discern when a smile hides sadness or determine if someone is tired or nervous.”
The information gathered from this study can aid researchers working with artificial faces, whether digital or physical android robots. Precise measurement of human faces, capturing how the facial structure stretches and compresses, will allow these artificial expressions to appear more accurate and natural.
“The facial structure beneath our skin is complex,” says Akihiro Nakatani, the senior author. “The deformation analysis in this study can explain how sophisticated expressions, which involve both stretched and compressed skin, can result from seemingly simple facial actions.”
This research also has applications beyond robotics, such as improved facial recognition and medical diagnosis; at present, detecting abnormalities in facial movement relies largely on doctors’ intuition.
Although the study has so far examined the face of only one individual, the researchers hope to build on their work to gain a deeper understanding of human facial motions. This research could not only help robots recognize and convey emotions but also enhance facial movements in computer graphics, such as those used in movies and video games, helping to avoid the unsettling “uncanny valley” effect.