Robot Hand With Sensitive Touch Can Grasp Fragile Objects Like Raspberries & Chips



A recent robotic hand developed at the University of Texas at Austin demonstrates a level of tactile sensitivity previously unseen in robotics, capable of grasping objects as delicate as a potato chip or a raspberry without causing damage. This breakthrough, dubbed Fragile Object Grasping with Tactile Sensing (FORTE), combines advanced tactile sensing with soft robotics and promises to improve robot performance in industries requiring a gentle touch, such as healthcare and manufacturing.

The Challenge of Delicate Manipulation

Although robots excel at repetitive tasks and large movements, they often struggle with the fine motor skills and delicate touch required for handling fragile objects. “Right now, robotics is starting to be able to do large motions around the house, but struggles with really fine and delicate movements,” explains Siqi Shang, lead author of a research paper published in IEEE Robotics and Automation Letters and a doctoral student at UT Austin’s Cockrell School of Engineering. “Robots can fold a shirt but may struggle to carefully pick up your glasses or unpack fruit from your groceries. We believe sensing signals will give robots a sense of touch to handle these objects carefully.”

Inspired by Nature: The Fin-Ray Effect

The FORTE technology’s design draws inspiration from the fin-ray effect, a structural principle found in fish fins. The robotic fingers are created using advanced 3D-printing techniques and incorporate hollow internal air channels that function as tactile sensors. As the fingers grasp an object, these air channels deform, changing the air pressure inside them. Small, commercially available sensors detect these pressure changes, providing the robot with real-time force feedback and alerting it to any slippage.
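The pressure-based feedback loop described above can be sketched in a few lines of code. This is a minimal illustration, not the authors' published algorithm: it assumes a hypothetical gripper that reports channel pressure readings, and treats a sustained relative pressure drop during a hold as a sign that the object is slipping.

```python
# Illustrative sketch only (not FORTE's actual controller): a gripper with
# hollow air channels reports pressure samples; a sustained pressure drop
# while holding an object suggests contact is being lost, i.e. a slip.

def detect_slip(pressures, window=3, drop_threshold=0.05):
    """Flag a slip when pressure falls by more than `drop_threshold`
    (as a fraction of the starting value) across `window` samples."""
    for i in range(len(pressures) - window):
        start, end = pressures[i], pressures[i + window]
        if start > 0 and (start - end) / start > drop_threshold:
            return True  # signal the controller to tighten its grip
    return False

# Steady hold: pressure fluctuates slightly but does not fall -> no slip
print(detect_slip([1.00, 1.01, 0.99, 1.00, 1.01]))  # False
# Slipping object: pressure decays as the channels relax -> slip
print(detect_slip([1.00, 0.98, 0.93, 0.88, 0.84]))  # True
```

A real controller would run this on a fast loop over raw sensor data; the speed of that loop is exactly what the researchers highlight as operating "closer to the timescales of human hand sensors."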

Exceptional Performance and Precision

Researchers tested the robotic hand on 31 different objects, ranging from fragile items like raspberries and potato chips to slippery objects like jam jars and billiard balls, and everyday items like soup cans and apples. In single-trial grasping experiments, the system achieved a 91.9% success rate, significantly outperforming traditional grippers that rely solely on visual feedback. The system accurately identified 93% of slip events with 100% precision, ensuring the robot adjusts its grip only when necessary to prevent damage.
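The distinction between the two slip-detection figures is worth unpacking: 93% is a recall-like rate (how many real slips were caught), while 100% precision means no false alarms. A short worked example with hypothetical counts (the article reports rates, not raw event counts) shows how the two are computed:

```python
# Hypothetical counts chosen to match the reported rates; the actual
# number of slip events in the study is not given in this article.
true_positives = 93   # real slip events the system flagged
false_negatives = 7   # real slip events it missed
false_positives = 0   # spurious slip alarms on stable grasps

recall = true_positives / (true_positives + false_negatives)
precision = true_positives / (true_positives + false_positives)
print(f"recall={recall:.2f}, precision={precision:.2f}")
# recall=0.93, precision=1.00
```

Perfect precision is what lets the robot "adjust its grip only when necessary": every alarm corresponds to a genuine slip, so the gripper never squeezes a fragile object in response to a false alert.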

“Humans pick up objects with just the right amount of force; too much and you’ll crush it, but too little and it’ll slip out of your hand,” says Lillian Chin, assistant professor of electrical and computer engineering at UT Austin. “Most current force sensors aren’t fast or accurate enough to provide that Goldilocks level of detail. In particular, our sensors operate closer to the timescales of human hand sensors.”

Durability and Customization

Beyond its speed and accuracy, the FORTE system offers increased durability and customization options. Because the sensors are 3D-printed, they can be easily adapted to various shapes and sizes. The slip-sensing capability is a key differentiator, as few existing robotic gripping technologies include this feature, and those that do often lack FORTE’s speed and responsiveness.

Potential Applications Across Industries

The FORTE technology has the potential to transform several industries:

  • Food Processing: More sensitive machinery could reduce waste and improve efficiency when handling delicate produce and baked goods.
  • Healthcare: Robots could precisely handle medical instruments and fragile biological samples.
  • Manufacturing: The technology could be used to manipulate delicate components, such as electronics or glassware.

Open-Source Approach and Future Development

To accelerate innovation in the field, the researchers have made the hardware designs and algorithms for FORTE publicly available. Ongoing research focuses on improving the sensors’ resistance to temperature changes and enhancing the robot’s ability to catch slipping objects.

This research was supported by the Texas Robotics Industrial Affiliate Program, the National Science Foundation, the Office of Naval Research, the DARPA TIAMAT program, and South Korea’s Institute of Information & Communications Technology Planning & Evaluation.

Source: University of Texas at Austin News
