UW News


The mechanics of eating are more complex than they might appear. For about a decade, researchers at the University of Washington have been working to build a robot that can help feed people who can't eat on their own.

The researchers' first breakthrough, back when the lab was at Carnegie Mellon University, was getting a robotic arm to use a fork to feed someone a marshmallow. Since then, the robot has graduated from feeding users fruit salads to full meals composed of nearly anything that can be picked up with a fork. Researchers also investigated .

Until recently, this work had mostly been evaluated in the lab. But last year, researchers deployed the assistive feeding arm in a pair of studies outside the lab. In the first, six users with motor impairments used the robot to feed themselves a meal in a UW cafeteria, an office or a conference room. In the second study, one of those users, Ko, a community researcher and co-author on the research, used the system at home for five days, having it feed him ten meals.

The team presented its research March 5 at the ACM/IEEE International Conference on Human-Robot Interaction in Melbourne.

"Our past studies have been in the lab because, if you want to evaluate specific system components in isolation, you need to control all other aspects of the meal," said lead author Nanavati, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. "But that doesn't capture the diverse meal contexts that exist outside the lab. At the end of the day, the goal is to enable people to feed themselves in real environments, so we should also evaluate the system in those environments."

The system, which the researchers dubbed ADA (Assistive Dexterous Arm), consists of a robotic arm that can be affixed to something nearby, such as a power wheelchair or hospital table. The user specifies which bite they want through a web app, and the system then feeds the user that bite autonomously (though users can stop the arm with a "kill button"). The arm has a force sensor and a camera to distinguish between foods and to guide each bite to the user's mouth.
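The interaction flow described above (user picks a bite, the arm feeds autonomously, and a "kill button" can stop it at any point) can be sketched as a simple control loop. All names, steps and structure below are illustrative assumptions for the sketch, not the actual ADA codebase.

```python
import threading

# Hypothetical sketch of an assistive-feeding control loop.
# All class names, phases and return values are illustrative
# assumptions, not ADA's real implementation.

class FeedingArm:
    def __init__(self):
        # The "kill button" in the web app could set a flag like this one.
        self.kill_switch = threading.Event()

    def acquire_bite(self, food_item):
        # In the real system, a camera distinguishes foods and a force
        # sensor helps confirm the fork has picked up the bite.
        return f"bite of {food_item}"

    def feed(self, food_item):
        """Autonomously deliver the chosen bite, aborting immediately
        if the user presses the kill button."""
        bite = self.acquire_bite(food_item)
        for phase in ("move_to_plate", "skewer", "move_to_mouth"):
            if self.kill_switch.is_set():
                return "stopped"  # user pressed the kill button
            # ...motion commands for each phase would run here...
        return f"delivered {bite}"

arm = FeedingArm()
print(arm.feed("melon"))   # normal autonomous feeding
arm.kill_switch.set()
print(arm.feed("grape"))   # aborted by the kill button
```

Checking the kill switch between motion phases, rather than only once at the start, is what lets the user halt the arm mid-motion rather than mid-meal.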

In both studies, users successfully fed themselves their meals. In the first study, the robot acquired entrees with around 80% accuracy, which users in a previous study had identified as the threshold for success. In the second study, the varied circumstances and environments of the home (Ko could be eating while watching TV in low light, or while working in bed) hindered the system's default functionality. But the researchers designed the system to be customizable, so Ko was able to adjust the robot's behavior and still feed himself every meal.

The team plans to continue improving the system for effectiveness and customizability.

"It was a really important step to take the robot out of the lab," Ko said. "You eat in different environments, and there are little variables that you don't think about. If the robot is too heavy, it might tilt a table. Or if the lighting isn't good, the facial recognition could struggle, but lighting is something you really don't think about when you're eating."

Additional co-authors include , , and , all doctoral students in the Allen School; , a lecturer in the Allen School; , the late president of the Tyler Schrenk Foundation and a community researcher; , an occupational therapy clinical research lead at Hello Robot; , a high school intern in the Allen School while completing this research; , , , , and , all undergraduate students in the Allen School while doing this research; , a research scientist assistant in the Allen School while completing this research; and , a research scientist in the Allen School while completing this research. and , both professors in the Allen School, are senior authors. This research was funded in part by a UW CREATE student grant; a UW Allen School Postdoc Research award; the National Science Foundation; the DARPA RACER program; the National Institute of Biomedical Imaging and Bioengineering; and the Office of Naval Research.

For more information, contact Nanavati at amaln@cs.washington.edu.