Thanks to a team of engineers at the University of California San Diego, humanoid robots have taken one more "step" toward human-like functionality. The research involves training robots to learn and replicate expressive movements, from simple dance routines to gestures such as waving, high-fiving, and hugging, all while maintaining balance across a range of terrains.
The robot's increased expressiveness and agility open new possibilities for human-robot interaction in a variety of settings, including factory assembly lines, hospitals, homes, and even hazardous spaces such as laboratories and disaster sites.
The project is led by Xiaolong Wang, a professor in the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering, who hopes that making robots more expressive will help advance human-robot coexistence. "Through expressive and more human-like body motions, we aim to build trust and showcase the potential for robots to co-exist in harmony with humans," said Wang.
On the technical side, this expressiveness is made possible by training the robot on a diverse range of human body motions, which lets it generalize to new motions and mimic them with relative ease, much like a dance student quickly picking up new routines and gestures.
To achieve this, the team drew on motion capture data and dance videos and split the training between the upper and lower body: the upper body learns to mimic the reference motions, while the legs learn to walk steadily and keep the robot balanced across different terrains.
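For readers curious how such a split objective could be expressed, here is a minimal, purely illustrative Python sketch assuming two hypothetical reward terms, one for upper-body imitation and one for lower-body locomotion and balance. The function names, weights, and terms are placeholders for illustration, not the team's actual reward design.

```python
import numpy as np

def imitation_reward(upper_joint_angles, reference_joint_angles):
    """Hypothetical upper-body term: reward staying close to the reference
    motion frame (e.g. taken from motion capture or a dance video)."""
    error = np.sum((upper_joint_angles - reference_joint_angles) ** 2)
    return np.exp(-error)

def locomotion_reward(base_velocity, commanded_velocity, base_height, nominal_height):
    """Hypothetical lower-body term: reward tracking a commanded walking
    velocity while keeping the torso near its nominal height (a rough
    stand-in for staying balanced)."""
    tracking = np.exp(-np.sum((base_velocity - commanded_velocity) ** 2))
    height_penalty = (base_height - nominal_height) ** 2
    return tracking - 0.5 * height_penalty

def total_reward(upper_joint_angles, reference_joint_angles,
                 base_velocity, commanded_velocity,
                 base_height, nominal_height,
                 w_imitate=1.0, w_walk=1.0):
    """Illustrative weighted sum of the two objectives."""
    return (w_imitate * imitation_reward(upper_joint_angles, reference_joint_angles)
            + w_walk * locomotion_reward(base_velocity, commanded_velocity,
                                         base_height, nominal_height))
```

In a typical reinforcement-learning setup, a combined term like this would be evaluated at every simulation step and accumulated over an episode.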
Both the upper and lower body are controlled by a single, unified policy. This keeps the two halves coordinated, enabling the robot to perform complex movements with its upper body while walking confidently across different surfaces.
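To make the idea of a single policy concrete, the sketch below shows one network that reads the full robot state together with the upper-body reference motion and outputs targets for every joint at once, so gestures and balance are decided jointly. The framework (PyTorch), dimensions, and architecture here are assumptions for illustration, not the researchers' implementation.

```python
import torch
import torch.nn as nn

class UnifiedHumanoidPolicy(nn.Module):
    """Illustrative unified policy: a single network maps the full
    observation (proprioception plus reference-motion features) to target
    positions for every joint, upper and lower body together.
    All dimensions are placeholders."""

    def __init__(self, obs_dim=90, num_joints=19, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ELU(),
            nn.Linear(hidden, hidden), nn.ELU(),
            nn.Linear(hidden, num_joints),  # one action head for the whole body
        )

    def forward(self, observation):
        return self.net(observation)

# Usage: a batch of observations in, whole-body joint targets out.
policy = UnifiedHumanoidPolicy()
obs = torch.randn(1, 90)        # placeholder observation vector
joint_targets = policy(obs)     # shape: (1, 19)
```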
The robot started out as a virtual humanoid in simulation; its control policy was then transferred to a real robot, which demonstrated the ability to execute both learned and new movements under real-world conditions.
Currently, a human operator directs the robot's movements with a game controller. However, the engineering team envisions a future model equipped with a camera that can perform tasks and navigate environments on its own.
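As a rough picture of what controller-based teleoperation might involve, the hypothetical mapping below turns stick input into a walking command and a button press into a gesture selection; the article does not describe the actual interface.

```python
def gamepad_to_command(left_stick_x, left_stick_y, gesture_button,
                       max_forward_speed=1.0, max_turn_rate=1.0):
    """Hypothetical mapping from game-controller input to high-level commands:
    the left stick sets walking speed and turning rate, and a button selects
    which expressive upper-body motion to play back."""
    return {
        "forward_velocity": left_stick_y * max_forward_speed,
        "turn_rate": left_stick_x * max_turn_rate,
        "upper_body_motion": gesture_button,  # e.g. 0 = wave, 1 = high-five
    }

# Example: push the stick forward and trigger a wave gesture.
print(gamepad_to_command(0.0, 0.8, 0))
```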
The team's immediate focus is on refining the humanoid's design so it can handle more intricate and fine-grained tasks. "By extending the capabilities of the upper body, we can expand the range of motions and gestures the robot can perform," concluded Wang.
Disclaimer: The above article was written with the assistance of AI. The original sources can be found on ScienceDaily.