oai:arXiv.org:2406.09332
Computer Science
2024
19.06.2024
This paper introduces RoTipBot, a novel robotic system for handling thin, flexible objects.
Unlike previous works, which are limited to singulating such objects using suction cups or soft grippers, RoTipBot can grasp and count multiple layers simultaneously, emulating human handling in various environments.
Specifically, we develop a novel vision-based tactile sensor named RoTip that can rotate and sense contact information around its tip.
Equipped with two RoTip sensors, RoTipBot feeds multiple layers of thin, flexible objects into the centre between its fingers, enabling effective grasping and counting.
RoTip's tactile sensing ensures that both fingers maintain good contact with the object, and an adjustment approach allows the gripper to adapt to changes in the object.
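As an illustration only (the abstract does not detail the adjustment approach), the following minimal Python sketch shows how a tactile contact-maintenance loop for a two-finger gripper might look. The `TactileReading` structure, the `adjust_gripper` function, and all thresholds are hypothetical placeholders, not the RoTip interface or the authors' method.

```python
# Hypothetical sketch: a simple contact-maintenance loop for a two-finger
# gripper with tactile fingertips. Sensor/actuator interfaces are
# placeholders; the paper's actual adjustment approach is not reproduced here.

from dataclasses import dataclass


@dataclass
class TactileReading:
    """Simplified per-finger tactile summary (assumed, not the RoTip API)."""
    contact_area: float   # fraction of the tip in contact, 0.0 to 1.0
    shear_offset: float   # estimated tangential slip, arbitrary units


def adjust_gripper(left: TactileReading,
                   right: TactileReading,
                   gap_mm: float,
                   min_area: float = 0.2,
                   step_mm: float = 0.5) -> float:
    """Return a new finger gap intended to keep both tips in good contact.

    If either tip reports too little contact, close the fingers slightly;
    if both report noticeable slip, open slightly to avoid over-squeezing.
    """
    if left.contact_area < min_area or right.contact_area < min_area:
        return max(gap_mm - step_mm, 0.0)          # tighten grip
    if abs(left.shear_offset) > 1.0 and abs(right.shear_offset) > 1.0:
        return gap_mm + step_mm                    # relieve excess pressure
    return gap_mm                                  # contact is adequate


if __name__ == "__main__":
    # Toy feedback loop with synthetic readings.
    gap = 10.0
    for area, slip in [(0.1, 0.0), (0.3, 0.2), (0.5, 1.5)]:
        left = right = TactileReading(area, slip)
        gap = adjust_gripper(left, right, gap)
        print(f"contact={area:.1f} slip={slip:.1f} -> gap={gap:.1f} mm")
```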
Extensive experiments demonstrate the efficacy of the RoTip sensor and the RoTipBot approach.
The results show that RoTipBot not only achieves a higher success rate but also grasps and counts multiple layers simultaneously -- capabilities not possible with previous methods.
Furthermore, RoTipBot operates up to three times faster than state-of-the-art methods.
The success of RoTipBot paves the way for future research in object manipulation using mobilised tactile sensors.
All the materials used in this paper are available at \url{https://sites.google.com/view/rotipbot}.
Comment: 20 pages, 21 figures
Jiang, Jiaqi; Zhang, Xuyang; Gomes, Daniel Fernandes; Do, Thanh-Toan; Luo, Shan, 2024, RoTipBot: Robotic Handling of Thin and Flexible Objects using Rotatable Tactile Sensors