Harnessing the Power of AI to Train Robotic Dogs

Researchers from around the globe are joining forces to redefine the future of pet ownership with advancements in artificial intelligence (AI) and edge computing technology, a convergent field known as edge intelligence.

This innovative project, backed by a one-year seed grant from the Institute for Future Technologies (IFT), is a partnership between the New Jersey Institute of Technology (NJIT) and Ben-Gurion University of the Negev (BGU). It aims to breathe life into mechanical dogs, enabling advanced, tailor-made interactions between dog and owner based on the owner's traits and emotions.

At the helm of the research at NJIT's Ying Wu College of Computing is Assistant Professor Kasthuri Jayarajah. Her ongoing work involves designing socially assistive behaviors for her Unitree Go2 robotic dog so that it adjusts its actions based on the characteristics of the human it interacts with.

The prime objective is to animate the robotic dog using wearable sensory devices, enabling it to sense and respond to stable personality traits of its human counterpart, such as introversion, as well as transitory states, such as pain and comfort levels.

This innovation has far-reaching potential, particularly in personal and healthcare settings, where it could combat loneliness in the elderly and assist in therapy and rehabilitation. Jayarajah's preliminary work on enabling robotic dogs to comprehend and react to a partner's gestural cues will be showcased at the International Conference on Intelligent Robots and Systems (IROS) later this year.

The project also features Shelly Levy-Tzedek, Associate Professor in the Department of Physical Therapy at BGU, as a co-principal investigator. Levy-Tzedek, a well-known researcher and leader in rehabilitation robotics, primarily studies the impact of age and disease on bodily control.

The researchers underscore the increasing accessibility and potential applicability of wearable devices. Commodity devices such as earphones can be repurposed to detect subtle signals such as brain activity and micro-expressions. The project aims to combine these wearables with traditional robot sensors, such as cameras and microphones, offering a comprehensive, passive insight into user attributes.
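To make the idea of passive, multi-sensor insight concrete, here is a minimal late-fusion sketch. The signal names, value ranges, and weights are all illustrative assumptions for this article, not the project's actual pipeline: it simply combines a normalized wearable-derived signal with two robot-derived signals into one discomfort estimate.

```python
# Hypothetical late-fusion sketch: combine a wearable-derived signal with
# robot sensor features into a single passive estimate of user discomfort.
# Signal names and weights are illustrative assumptions, not the project's
# actual method.

def fuse_discomfort_estimate(earbud_hr_norm, robot_voice_stress,
                             robot_face_tension, weights=(0.5, 0.3, 0.2)):
    """Weighted late fusion of normalized [0, 1] signals.

    Returns a score in [0, 1]; higher means the user appears less comfortable.
    """
    signals = (earbud_hr_norm, robot_voice_stress, robot_face_tension)
    return sum(w * s for w, s in zip(weights, signals))

# Example: elevated heart rate, fairly calm voice, relaxed facial expression.
score = fuse_discomfort_estimate(0.8, 0.2, 0.1)
print(round(score, 2))  # 0.48
```

A weighted average is the simplest possible fusion rule; the article's mention of deep-learning-based architectures suggests the real system would learn such a mapping from data rather than fix the weights by hand.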

Jayarajah states that while the conception of socially assistive robots is intriguing, their long-term sustainability is hampered by issues of cost and size. She raises doubts about the readiness of robots like the Unitree Go2 for extensive AI workloads, given their limited processing power, sparse memory, and restricted battery life.

The initial phases of the project build on established sensor fusion techniques while exploring carefully designed deep-learning architectures. The latter would help develop user-friendly, affordable wearable sensors that gauge user attributes and adapt the robot's motion commands accordingly.
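The adaptation step described above can be sketched as a simple controller that maps an estimated user state to adjusted motion parameters. This is a toy illustration under assumed parameter names and scaling factors, not the project's actual control logic:

```python
# Illustrative sketch (not the project's actual controller): adapt a robotic
# dog's motion command from a fused discomfort estimate in [0, 1].

def adapt_motion_command(base_speed_mps, discomfort):
    """Slow the gait and increase follow distance as estimated discomfort rises.

    The 60% maximum slowdown and 0.5-2.0 m distance range are assumed values
    chosen for illustration.
    """
    discomfort = min(max(discomfort, 0.0), 1.0)        # clamp to valid range
    speed = base_speed_mps * (1.0 - 0.6 * discomfort)  # slow down up to 60%
    follow_distance_m = 0.5 + 1.5 * discomfort         # back off up to 2.0 m
    return {"speed_mps": round(speed, 3),
            "follow_distance_m": round(follow_distance_m, 3)}

# A moderately uncomfortable user: the dog walks slower and keeps its distance.
cmd = adapt_motion_command(1.0, 0.5)
print(cmd)  # {'speed_mps': 0.7, 'follow_distance_m': 1.25}
```

In a learned system, this hand-written mapping would be replaced by a model trained end to end, but the interface, user-state estimate in, motion parameters out, stays the same.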

Disclaimer: The above article was written with the assistance of an AI. The original sources can be found on ScienceDaily.