Advances in intelligent robotics have given robots the ability to recognize objects precisely using tactile and visual data. Tactile information gathered by sensors, when coupled with machine learning algorithms, enables robots to identify objects they have interacted with before.
Despite these advancements, robots can still be confused by objects that are similar in shape or size, or that they have never encountered. Other factors, such as background noise or the same object appearing in varied shapes and sizes, further complicate robot perception.
To address these difficulties, researchers at Tsinghua University have made a breakthrough in the robotic recognition of common yet intricate items. Drawing inspiration from human thermal sensing, the touch-based ability to perceive temperature differences and infer what a material is made of, they set out to replicate this capability to make object recognition in robots more robust and precise.
"We aim to use spatiotemporal tactile sensing during hand grasping to extend the robotic function and ability to perceive multiple attributes of the grasped object - including thermal conductivity, thermal diffusivity, surface roughness, contact pressure, and temperature," elucidates author Rong Zhu.
The researchers developed a layered sensor featuring material detection at the surface, pressure sensitivity at the bottom, and a porous middle layer sensitive to thermal variations. This sensor was paired with an efficient cascade classification algorithm that sequentially rules out object types from easy to hard, beginning with straightforward categories like empty cartons before progressing to complex items like orange peels or cloth scraps.
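The paper's code is not reproduced here; the following is a minimal sketch of how such an easy-to-hard cascade might be structured, under the assumption that each stage either confidently assigns a label or defers to the next stage. All class names, feature fields, and thresholds are hypothetical, not the authors' actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Hypothetical tactile feature vector; field names mirror the attributes
# mentioned in the article but the values and units are illustrative only.
@dataclass
class TactileFeatures:
    thermal_conductivity: float
    thermal_diffusivity: float
    surface_roughness: float
    contact_pressure: float
    temperature: float

# One stage of the cascade: returns a label if it can decide confidently,
# otherwise None to pass the sample on to a harder, more discriminative stage.
Stage = Callable[[TactileFeatures], Optional[str]]

def classify(features: TactileFeatures, stages: List[Stage]) -> str:
    """Run stages in order, from the easiest category to the hardest."""
    for stage in stages:
        label = stage(features)
        if label is not None:
            return label
    return "unknown"

# Illustrative stages with made-up thresholds: rigid, low-conductivity items
# such as empty cartons are separated first; later stages handle trickier
# categories such as organic waste.
def carton_stage(f: TactileFeatures) -> Optional[str]:
    if f.thermal_conductivity < 0.1 and f.contact_pressure < 0.5:
        return "empty carton"
    return None

def organic_stage(f: TactileFeatures) -> Optional[str]:
    if f.thermal_conductivity > 0.3 and f.surface_roughness > 0.6:
        return "orange peel"
    return None

label = classify(
    TactileFeatures(0.05, 0.02, 0.3, 0.2, 24.0),
    stages=[carton_stage, organic_stage],
)
print(label)  # -> "empty carton"
```

The design choice behind such a cascade is that cheap, high-confidence checks handle the common easy cases early, so the more expensive or ambiguous comparisons only run on the samples that actually need them.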
The team tested their method by building an intelligent robotic tactile system for garbage sorting. The robot successfully identified common waste items such as empty cartons, bread scraps, plastic bags, bottles, napkins, sponges, orange peels, and expired drugs, and sorted them into the corresponding containers. The system achieved a classification accuracy of 98.85%, demonstrating its potential to significantly reduce manual labor and broaden the range of applications for smart technologies.
Future work will aim to further enhance robots' autonomy and embodied intelligence. "In addition, by combining this sensor with brain-computer interface technology, tactile information collected by this sensor could be converted into neural signals acceptable to the human brain, potentially restoring tactile perception capabilities for people with hand disabilities," shares Zhu.