Multiple Object Segmentation for Pick-and-Place Applications / Piccinini, Paolo; Prati, Andrea; Cucchiara, Rita. - ELECTRONIC. - (2009), pp. 361-366. (Paper presented at the IAPR Conference on Machine Vision Applications MVA2009, held in Yokohama, Japan, May 20-22, 2009).
Multiple Object Segmentation for Pick-and-Place Applications
Piccinini, Paolo; Prati, Andrea; Cucchiara, Rita
2009
Abstract
This paper presents a novel approach for detecting multiple instances of the same object for pick-and-place automation. The working conditions are very challenging, with complex objects arranged at random in the scene and heavily occluded. The approach exploits SIFT to obtain a set of correspondences between the object model and the current image. To segment the multiple instances of the object, the correspondences are clustered among the objects using a voting scheme that determines the best estimate of each object's center through mean shift. This procedure is compared, in terms of accuracy, with existing homography-based solutions that use RANSAC to eliminate outliers in the homography estimation.
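The core idea described in the abstract, in which each model-to-image correspondence casts a vote for a candidate object center and mean shift finds one mode per instance, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the vote points are synthetic stand-ins for centers predicted from SIFT matches, and the flat-kernel mean shift and the bandwidth/merge parameters are assumptions chosen for the example.

```python
# Sketch (not the paper's code): cluster candidate object-center votes with
# mean shift. In the described method, each SIFT match would predict a center;
# here the votes are synthetic points around two hypothetical instance centers.
import numpy as np

def mean_shift(points, bandwidth=15.0, iters=50, tol=1e-3):
    """Shift every vote toward the local density mode (flat kernel)."""
    modes = points.astype(float).copy()
    for _ in range(iters):
        shifted = np.empty_like(modes)
        for i, p in enumerate(modes):
            d = np.linalg.norm(points - p, axis=1)
            nbrs = points[d < bandwidth]      # votes inside the window
            shifted[i] = nbrs.mean(axis=0)    # move toward their centroid
        done = np.linalg.norm(shifted - modes) < tol
        modes = shifted
        if done:
            break
    return modes

def cluster_modes(modes, merge_dist=5.0):
    """Merge converged points that landed on the same density mode."""
    centers = []
    for m in modes:
        if all(np.linalg.norm(m - c) >= merge_dist for c in centers):
            centers.append(m)
    return np.array(centers)

rng = np.random.default_rng(0)
# Synthetic votes from two overlapping instances, centers ~(100,100), ~(220,140).
votes = np.vstack([
    rng.normal((100, 100), 4, size=(40, 2)),
    rng.normal((220, 140), 4, size=(40, 2)),
])
centers = cluster_modes(mean_shift(votes))
print(np.round(centers))  # one detected center per object instance
```

In the actual pipeline, once each vote is assigned to the nearest recovered center, the correspondences are partitioned per instance, which is what segments the multiple, mutually occluding objects.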