Enhancement of real-time grasp detection by cascaded deep convolutional neural networks

Yaoqing Weng, Ying Sun, Du Jiang, Bo Tao, Juntong Yun, Ying Liu, Dalin Zhou

Research output: Contribution to journal › Article › peer-review


Abstract

Robot grasping is a major focus of robotics research. In relatively fixed industrial settings, robots can perform grasping tasks efficiently and reliably over long periods. In unstructured environments, however, objects are diverse, placed in random poses, and often stacked and occluded by one another, which makes it difficult for the robot to recognize the target to be grasped and complicates the grasping strategy. We therefore propose an accurate, real-time robot grasp detection method based on convolutional neural networks. A cascaded two-stage convolutional neural network model that estimates grasp position and attitude from coarse to fine was established. An R-FCN model extracts and screens candidate grasp regions and provides a rough angle estimate; to address the insufficient pose-detection accuracy of previous methods, an Angle-Net model is then proposed to finely estimate the grasp angle. Tests on the Cornell dataset and online robot experiments show that the method quickly computes the optimal grasp point and posture for irregular objects of arbitrary pose and varying shape, improving both detection accuracy and real-time performance over previous methods.
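The coarse-to-fine cascade described in the abstract can be sketched as two stages: a first stage that screens candidate grasp regions and quantizes the angle into rough bins, and a second stage that regresses a fine angle within the chosen bin. The sketch below is purely illustrative; the names `coarse_stage`, `fine_stage`, and `COARSE_BINS` are assumptions standing in for the R-FCN and Angle-Net models, which are not specified in this record.

```python
# Hypothetical sketch of a coarse-to-fine grasp-angle cascade.
# All names and bin counts here are illustrative, not from the paper.

COARSE_BINS = 4                     # stage 1 quantizes angle into rough bins
BIN_WIDTH = 180.0 / COARSE_BINS     # grasp angles are symmetric over 180 deg

def coarse_stage(candidates):
    """Stand-in for the R-FCN stage: screen candidate grasp regions,
    keep the highest-scoring one, and coarsely quantize its angle.
    Each candidate is a (score, angle_in_degrees) pair."""
    best_idx = max(range(len(candidates)), key=lambda i: candidates[i][0])
    _, angle = candidates[best_idx]
    coarse_bin = int(angle // BIN_WIDTH)
    return best_idx, coarse_bin

def fine_stage(coarse_bin, residual):
    """Stand-in for Angle-Net: refine the grasp angle inside the coarse
    bin; `residual` plays the role of the network's fine prediction."""
    return coarse_bin * BIN_WIDTH + residual

# Toy candidates: three regions with detection scores and rough angles.
candidates = [(0.2, 10.0), (0.9, 100.0), (0.5, 60.0)]
idx, cbin = coarse_stage(candidates)
print(idx, cbin, fine_stage(cbin, 12.5))  # 1 2 102.5
```

The design point this illustrates is that the fine stage only has to resolve an angle within one bin (here 45 degrees) rather than over the full 180-degree range, which is one common motivation for cascading a classifier ahead of a regressor.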
Original language: English
Journal: Concurrency and Computation: Practice and Experience
Early online date: 1 Sept 2020
DOIs
Publication status: Early online - 1 Sept 2020

