Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/19665
Title: Multi-view 3D data acquisition using a single uncoded light pattern
Authors: Cristina, Stefania
Camilleri, Kenneth P.
Galea, Thomas
Keywords: Computer vision
Robotics
Three-dimensional imaging
Issue Date: 2011
Publisher: SciTePress
Citation: Cristina, S., Camilleri, K. P., & Galea, T. (2011). Multi-view 3D data acquisition using a single uncoded light pattern. 8th International Conference on Informatics in Control, Automation and Robotics, ICINCO 2011, Noordwijkerhout. 317-320.
Abstract: This research concerns the acquisition of 3-dimensional data from images for the purpose of modeling a person's head. This paper proposes an approach for acquiring the 3-dimensional reconstruction using a multiple stereo camera vision platform and a combination of passive and active lighting techniques. The proposed one-shot active lighting method projects a single, binary dot pattern, making the method suitable for reconstructing dynamic scenes. Contrary to conventional spatial neighborhood coding techniques, this approach matches corresponding spots between image pairs by exploiting solely the redundant data available in the multiple camera images. This produces an initial, sparse reconstruction, which is then used to guide a passive lighting technique to obtain a dense 3-dimensional representation of the object of interest. The results obtained reveal the robustness of the projected pattern and the spot matching algorithm, and a decrease in the number of false matches in the 3-dimensional dense reconstructions, particularly in smooth and textureless regions of the human face.
Description: This work is part of the project '3D-Head' funded by the Malta Council for Science and Technology under Research Grant No. RTDI-2004-034.
URI: https://www.um.edu.mt/library/oar/handle/123456789/19665
ISBN: 9789898425751
Appears in Collections:Scholarly Works - FacEngSCE
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.