SURF based 3D object recognition for robot hand grasping

Bibliographic Details
Published in: Pertanika Journal of Science and Technology
Main Authors: Remeli N.H.; Shauri R.L.A.; Yahaya F.H.; Salleh N.M.; Nasir K.; Yassin A.I.M.
Format: Article
Language: English
Published: Universiti Putra Malaysia Press 2017
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85049132223&partnerID=40&md5=86fad1f2a2e9bdcbfab22f9992d7a113
Description
Summary: This paper proposes a 3D object recognition method based on 3D SURF and the derivation of the robot space transformations. In a previous work, a three-fingered robot hand had been developed for grasping tasks. The reference position of the robot hand was programmed based on predetermined values for grasping two different shapes of object. The work showed successful grasping, but it could not generate the reference position on its own since no external sensor was used; hence, the system was not fully automated. Later, a 2D Speeded-Up Robust Features (SURF) and 3D point cloud algorithm were applied to calculate the object's 3D position, where the result showed that the method was capable of recognising the object but unable to calculate its 3D position. Thus, the present study developed 3D SURF by combining recognised images based on 2D SURF with a triangulation method. The identified object grasping points are then converted to robot space using the robot's transformation equation, which is derived from the dimensions between the robot and the camera in the workplace. The results supported the capability of the SURF algorithm, which recognised the target without fail across nine random images but produced errors in the 3D position. Meanwhile, the transformation was successful: the calculated object positions tended towards the directions of the actual measured positions relative to robot coordinates. However, a maximum error of 3.90 cm was observed, attributable to inaccuracy in SURF detection and human error during manual measurement, which can be addressed by improving the SURF algorithm in future work. © 2017 Universiti Putra Malaysia Press.
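The abstract describes two technical steps: recovering a 3D point for a matched SURF keypoint via triangulation, and converting that camera-frame point into robot coordinates with a transformation equation. The paper's own equations are not reproduced in this record, so the following is only a minimal sketch of those two steps under standard assumptions (a rectified stereo pair and a rigid camera-to-robot transform); all function names, camera parameters, and numeric values are illustrative, not taken from the paper.

```python
import numpy as np

def triangulate(p_left, p_right, focal_px, baseline_m, cx, cy):
    """Recover a camera-frame 3D point from a keypoint matched in a
    rectified stereo pair, using the standard disparity relations
    z = f*b/d, x = (u-cx)*z/f, y = (v-cy)*z/f."""
    disparity = p_left[0] - p_right[0]          # horizontal pixel shift
    z = focal_px * baseline_m / disparity       # depth along optical axis
    x = (p_left[0] - cx) * z / focal_px
    y = (p_left[1] - cy) * z / focal_px
    return np.array([x, y, z])

def camera_to_robot(p_cam, rotation, translation):
    """Map a camera-frame point into robot coordinates with a
    rigid-body transform: p_robot = R @ p_cam + t."""
    return rotation @ np.asarray(p_cam) + np.asarray(translation)

# Illustrative numbers: a SURF keypoint matched at (420, 240) in the left
# image and (380, 240) in the right, 700 px focal length, 10 cm baseline,
# principal point (320, 240).
p_cam = triangulate((420, 240), (380, 240), 700.0, 0.10, 320, 240)

# Hypothetical mounting: camera axes aligned with the robot base frame,
# offset 0.5 m along the robot's z-axis.
p_robot = camera_to_robot(p_cam, np.eye(3), [0.0, 0.0, 0.5])
```

In practice the rotation and translation would come from the workplace dimensions between robot and camera, as the paper derives them; manual measurement of those dimensions is one of the error sources the abstract mentions.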
ISSN: 0128-7680