
Real-time 3D features reconstruction through monocular vision

LEALI, Francesco; PELLICCIARI, Marcello
2010

Abstract

A fast and interactive implementation for camera pose registration and 3D point reconstruction over a physical surface is described in this paper. The method (called SRE, Smart Reverse Engineering) extracts a point cloud and the camera's spatial trajectory from a continuous image stream provided by a single camera moving around a real object. The per-frame procedure follows three steps: camera calibration, camera registration, and bundle adjustment with 3D point calculation. The camera calibration task was performed with a traditional approach based on a 2D structured pattern, while the Optical Flow approach and the Lucas-Kanade algorithm were adopted for feature detection and tracking. The camera registration problem was then solved through the definition of the Essential Matrix. Finally, a fast Bundle Adjustment was performed with the Levenberg-Marquardt algorithm to achieve the best trade-off between the 3D structure and the camera variations. An experimental validation was carried out on a PC with a commercial webcam to verify the precision and speed of the 3D data reconstruction. Practical tests also helped to tune several optimization parameters used to improve the efficiency of the most CPU-intensive algorithms, such as Optical Flow and Bundle Adjustment. The method showed robust results in 3D reconstruction and very good performance in real-time applications.
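To make the per-frame procedure concrete, the sketch below runs one iteration of a comparable pipeline with OpenCV's off-the-shelf routines (Lucas-Kanade tracking, Essential Matrix pose recovery, linear triangulation). It is an illustrative approximation under the assumption of a pre-calibrated camera, not the authors' SRE implementation; all function and variable names are hypothetical.

```python
# Hypothetical sketch of one iteration of a Lucas-Kanade / Essential Matrix /
# triangulation pipeline, using standard OpenCV routines (not the authors' SRE code).
# Assumes the intrinsic matrix K was obtained beforehand from a 2D structured
# pattern (e.g. chessboard images and cv2.calibrateCamera).
import cv2
import numpy as np

def process_frame_pair(prev_gray, gray, K):
    """Track features between two consecutive grayscale frames, estimate the
    relative camera pose and triangulate 3D points (up to scale)."""
    # 1. Feature detection on the previous frame, Lucas-Kanade tracking into the new one.
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=7)
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None,
                                                   winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    p1 = prev_pts[ok].reshape(-1, 2)
    p2 = next_pts[ok].reshape(-1, 2)

    # 2. Camera registration from the Essential Matrix (RANSAC rejects outlier tracks).
    E, inliers = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, pose_mask = cv2.recoverPose(E, p1, p2, K, mask=inliers)

    # 3. Linear triangulation of the surviving correspondences.
    keep = pose_mask.ravel() > 0
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    X_h = cv2.triangulatePoints(P1, P2, p1[keep].T, p2[keep].T)
    points_3d = (X_h[:3] / X_h[3]).T

    # A final Bundle Adjustment (e.g. Levenberg-Marquardt on the reprojection error,
    # as described in the abstract) would jointly refine R, t and points_3d; omitted here.
    return R, t, points_3d
```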
2010
Volume 4 / Issue 2
pp. 103-112
Real-time 3D features reconstruction through monocular vision / A., Liverani; Leali, Francesco; Pellicciari, Marcello. - In: INTERNATIONAL JOURNAL ON INTERACTIVE DESIGN AND MANUFACTURING. - ISSN 1955-2513. - STAMPA. - Volume 4 / Issue 2:(2010), pp. 103-112. [10.1007/s12008-010-0093-5]
A., Liverani; Leali, Francesco; Pellicciari, Marcello
Files in this record:
Real-time_3D_features_reconstruction.pdf
  Type: author's revised version accepted for publication
  Format: Adobe PDF
  Size: 524.99 kB
  Access: restricted

Use this identifier to cite or link to this document: https://hdl.handle.net/11380/641368
Citations
  • PubMed Central: not available
  • Scopus: 18
  • Web of Science: not available