Imagine the convenience of AI-equipped robots that can interact with and interpret the world around them not only through vision but also through 'touch'. That is precisely what researchers at Duke University are developing: a technology that lets robots identify materials, understand shapes, and recognize objects by 'listening' to vibrations, much as human hands do. These abilities aim to significantly enhance a robot's sensory capabilities.
This innovative system, called SonicSense, was presented at the Conference on Robot Learning (CoRL 2024) in Munich, Germany. It gives robots the ability to interact with their environment in a manner previously thought to be exclusively human.
"Presently, most robots primarily depend on visual information to make sense of the world," said Jiaxun Liu, the leading author of the research paper and a first-year Ph.D. student in Boyuan Chen's laboratory. "Our goal was to design a system that could work with a variety of objects found on a daily basis, substantially enhancing robots' ability to 'feel' and understand their surroundings."
SonicSense is built around a robotic hand with four fingers, each outfitted with a contact microphone embedded in the fingertip. These microphones record the vibrations produced when the robot handles an object while blocking out ambient noise, enabling the robot to identify the object's material and structure.
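To make this concrete, here is a minimal Python sketch of what capturing and cleaning such fingertip signals might look like. This is our illustration, not the team's published code: the channel count matches the four fingertips, but the sample rate, filter order, and cutoff frequencies are assumptions chosen for the example.

```python
# Illustrative sketch (not the authors' code): band-pass filtering four
# channels of contact-microphone audio, one channel per fingertip.
import numpy as np
from scipy.signal import butter, sosfiltfilt

SAMPLE_RATE = 48_000   # Hz, assumed for this example
NUM_FINGERS = 4        # one contact microphone per fingertip

def bandpass(signal, low_hz=20.0, high_hz=8_000.0, fs=SAMPLE_RATE):
    """Keep the band where contact vibrations plausibly carry information."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def clean_tap(raw: np.ndarray) -> np.ndarray:
    """raw: (NUM_FINGERS, n_samples) array of fingertip recordings."""
    return np.stack([bandpass(raw[ch]) for ch in range(NUM_FINGERS)])
```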
From these vibration signals, SonicSense extracts frequency features and applies recent advances in AI to identify an object's shape and material. If the robot encounters an object not in its database, it may take up to 20 interactions to identify it; objects already logged in the database can be recognized in as few as four.
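The sketch below illustrates that recognition loop in spirit, not the published SonicSense pipeline: the binned log-magnitude spectrum, the nearest-neighbor lookup, and the four-vote stopping rule are all assumptions made for the example, loosely mirroring the reported four-to-twenty interaction budget.

```python
# Illustrative recognition loop: FFT-based frequency features plus a
# nearest-neighbor lookup, with votes accumulated across interactions.
import numpy as np

def frequency_features(signal: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Summarize one recording as a coarse log-magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    # Average the spectrum into a fixed number of bins so recordings of
    # different lengths yield comparable feature vectors.
    edges = np.linspace(0, len(spectrum), n_bins + 1, dtype=int)
    binned = np.array([spectrum[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])
    return np.log1p(binned)

def identify(taps, database, max_interactions=20):
    """Vote over successive taps; known objects converge in a few taps."""
    votes = {}
    for i, tap in enumerate(taps[:max_interactions], start=1):
        feat = frequency_features(tap)
        # Nearest stored object by Euclidean distance in feature space.
        label = min(database, key=lambda k: np.linalg.norm(database[k] - feat))
        votes[label] = votes.get(label, 0) + 1
        best, count = max(votes.items(), key=lambda kv: kv[1])
        if count >= 4:  # echoes "as few as four interactions" for known objects
            return best, i
    return None, max_interactions  # unrecognized within the interaction budget
```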
Several features set SonicSense apart from its predecessors. It uses four fingertip sensors instead of one, and it pairs contact microphones with sophisticated AI algorithms that filter out ambient noise. Together, these let SonicSense identify complex objects, demonstrating its utility in dynamic, unstructured environments. What's more, at a production cost of just over $200, SonicSense is a cost-effective solution.
The SonicSense team continues to refine the system, for example by extending its recognition capabilities to multiple objects at once. Further design work could also give SonicSense-equipped robots dexterous manipulation skills, allowing these autonomous agents to carry out tasks that require a nuanced sense of touch.
Supported by the Army Research Laboratory's STRONG program and DARPA's FoundSci and TIAMAT programs, SonicSense points toward a future in which robots explore the world around them much as humans do, a leap toward human-like adaptability in AI technology.
Disclaimer: The above article was written with the assistance of AI. The original sources can be found on ScienceDaily.