Cornell researchers have developed a robotic feeding system that uses computer vision, machine learning, and multimodal sensing to safely feed individuals with severe mobility limitations, such as those with spinal cord injuries, cerebral palsy, and multiple sclerosis.
“Feeding individuals with severe mobility limitations with a robot is challenging, as many cannot lean forward and require food to be placed directly inside their mouths,” stated Tapomayukh “Tapo” Bhattacharjee, assistant professor of computer science at the Cornell Ann S. Bowers College of Computing and Information Science and the senior developer of the system. “The challenge becomes even greater when feeding individuals with additional complex medical conditions.”
A paper on the system, “Feel the Bite: Robot-Assisted Inside-Mouth Bite Transfer using Robust Mouth Perception and Physical Interaction-Aware Control,” was presented at the Human-Robot Interaction (HRI) conference in Boulder, Colorado, where it received a Best Paper Honorable Mention. A demo of the research team’s broader robotic feeding system also received a Best Demo Award.
Bhattacharjee, a leader in assistive robotics, and his EmPRISE Lab have dedicated years to teaching machines how humans feed themselves. Teaching a machine this complex process, from identifying food items on a plate to transferring them inside a care recipient’s mouth, is a significant challenge.
“The last 5 centimeters, from the utensil to inside the mouth, is extremely challenging,” stated Bhattacharjee.
Some care recipients may have very limited mouth openings, measuring less than 2 centimeters, while others experience involuntary muscle spasms that can occur unexpectedly. Moreover, some can only bite food at specific locations inside their mouth, which they indicate by pushing the utensil using their tongue.
“Current technology only looks at a person’s face once and assumes they will remain still, which is often not the case and can be very limiting for care recipients,” said Rajat Kumar Jenamani, the paper’s lead author and a doctoral student in computer science.
To address these challenges, researchers equipped their robot with two critical features: real-time mouth tracking that adjusts to users’ movements, and a dynamic response mechanism that enables the robot to detect the nature of physical interactions as they occur, and react appropriately. This enables the system to differentiate between sudden spasms, intentional bites, and user attempts to manipulate the utensil inside their mouth.
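To give a sense of how such a response mechanism might distinguish contact types, the sketch below classifies a force reading at the utensil into a spasm, an intentional bite, or tongue guidance. All signal names and thresholds here are illustrative assumptions for exposition, not the authors’ actual method.

```python
# Hypothetical sketch: labeling physical interactions sensed at the utensil.
# Thresholds and features are illustrative assumptions, not the paper's values.
from dataclasses import dataclass

@dataclass
class UtensilReading:
    force_n: float        # magnitude of force on the utensil, in newtons
    lateral_ratio: float  # share of force acting sideways vs. along the utensil
    duration_s: float     # how long the contact has persisted

def classify_interaction(r: UtensilReading) -> str:
    """Label a contact event as a spasm, tongue guidance, a bite, or no contact."""
    if r.force_n > 8.0 and r.duration_s < 0.2:
        return "spasm"            # sudden, strong, brief -> retract for safety
    if r.lateral_ratio > 0.6:
        return "tongue_guidance"  # sideways push -> reposition inside the mouth
    if r.force_n > 1.5:
        return "bite"             # sustained axial force -> release the food
    return "no_contact"
```

In a real controller each label would map to a different reaction, such as retracting on a spasm or holding steady during a bite; the point of the sketch is only that distinct force signatures can be separated in real time.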
The robotic system successfully fed 13 individuals with diverse medical conditions in a user study conducted at three locations. Users found the robot to be safe and comfortable.
“This is one of the most extensive real-world evaluations of any autonomous robot-assisted feeding system with end-users,” Bhattacharjee noted.
The team’s robot is a multi-jointed arm holding a custom-built utensil that can sense the forces being applied on it. The mouth tracking method combines data from two cameras positioned above and below the utensil to precisely detect the mouth. The physical interaction-aware response mechanism uses visual and force sensing to perceive how users are interacting with the robot.
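One plausible way to combine two camera views into a single mouth estimate is a confidence-weighted average, so that when the utensil or food occludes one camera, tracking falls back to the other. The function below is a minimal sketch under that assumption; the paper’s actual fusion method may differ.

```python
# Illustrative sketch: fusing mouth-center estimates from two cameras
# (one above and one below the utensil). Confidence weighting is an
# assumption for illustration, not the authors' confirmed approach.
import numpy as np

def fuse_mouth_estimates(pos_top: np.ndarray, conf_top: float,
                         pos_bottom: np.ndarray, conf_bottom: float) -> np.ndarray:
    """Confidence-weighted average of two 3-D mouth-center estimates.

    If one camera's detection confidence drops to zero (e.g., its view is
    occluded by the utensil), the fused estimate falls back entirely to
    the other camera, so tracking continues through occlusion.
    """
    total = conf_top + conf_bottom
    if total == 0:
        raise ValueError("mouth not detected by either camera")
    return (conf_top * pos_top + conf_bottom * pos_bottom) / total
```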
“We’re empowering individuals to control a 20-pound robot with just their tongue,” said Jenamani.
The user studies also revealed the robot’s emotional impact on care recipients and their caregivers. In one session, a daughter with schizencephaly quadriplegia successfully fed herself using the system, prompting an emotional response from her parents.
While further work is needed to explore the system’s long-term usability, the promising results suggest potential to enhance care recipients’ independence and quality of life.
“It’s amazing and very fulfilling,” said Bhattacharjee.
Co-authors of the paper include Daniel Stabile, M.S. ’23; Ziang Liu, a doctoral student in computer science; Abrar Anwar of the University of Southern California; and Katherine Dimitropoulou of Columbia University.
This research was primarily funded by the National Science Foundation.