Researchers from Cornell University have taken a significant step forward in the field of assistive technology and robotics. They have successfully developed a robotic feeding system that employs computer vision, machine learning, and multimodal sensing to safely feed individuals with severe mobility impairments. The system caters specifically to people with spinal cord injuries, cerebral palsy, and multiple sclerosis.
The project, daunting because of the intricate physical demands of feeding, was led by Professor Tapomayukh "Tapo" Bhattacharjee of the Cornell Ann S. Bowers College of Computing and Information Science. As Bhattacharjee explains, "The challenge intensifies when feeding individuals with additional complex medical conditions."
To address these challenges, the researchers built a robot with two key capabilities: real-time mouth tracking that adjusts to users' movements, and a dynamic response mechanism that distinguishes between sudden spasms, intentional bites, and attempts to manipulate the utensil inside the mouth.
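The article does not describe how the robot makes this distinction, but the idea can be sketched, purely as an illustration, as a rule over in-mouth force-sensor readings. Everything below (the `ForceEvent` type, the thresholds, and the labels) is hypothetical and invented for this sketch; a real system would learn such boundaries from multimodal sensor data rather than hard-code them.

```python
from dataclasses import dataclass

@dataclass
class ForceEvent:
    """One contact event between the utensil and the user's mouth."""
    peak_force_n: float   # peak contact force, in newtons
    duration_s: float     # how long the contact lasted, in seconds

def classify_event(event: ForceEvent,
                   spasm_force: float = 8.0,
                   bite_force: float = 2.0) -> str:
    """Label a contact event using illustrative, invented thresholds."""
    if event.peak_force_n >= spasm_force and event.duration_s < 0.3:
        return "sudden spasm"          # brief, high-force jolt
    if event.peak_force_n >= bite_force:
        return "intentional bite"      # sustained, moderate force
    return "in-mouth manipulation"     # light, exploratory contact
```

In this toy version, a short spike of high force reads as a spasm (so the robot could retract safely), while sustained moderate force reads as a deliberate bite; the actual system reportedly fuses multiple sensing modalities to make this call.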
The team tested the prototype with 13 participants with a range of medical conditions, at three sites: the EmPRISE Lab on Cornell's Ithaca campus, a medical center in New York City, and a care recipient's home in Connecticut.
Beyond increasing users' independence, the trials had a profound emotional impact. The parents of a girl with schizencephaly quadriplegia, a rare birth defect, watched their daughter feed herself with the robotic system and celebrated her newfound independence.
The project, designed to extend the boundaries of what AI can accomplish, posed a significant challenge. Still, the team's robot, a multi-jointed arm holding a custom-built utensil, demonstrates that technology can fill crucial caregiving gaps. As Bhattacharjee put it, "It's amazing and very fulfilling."
Despite the success of the initial trials, the team acknowledges that further research is needed into the system's long-term usability. Even so, the promising results highlight the system's potential to significantly improve quality of life for people with mobility limitations.
Co-authors of the project paper were Daniel Stabile, M.S. '23, Ziang Liu, a doctoral student in the field of computer science, Abrar Anwar of the University of Southern California, and Katherine Dimitropoulou of Columbia University. The project primarily received funding from the National Science Foundation.
Disclaimer: The above article was written with the assistance of AI. The original sources can be found on ScienceDaily.