Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/106707
Title: Scene segmentation and understanding for context-free point clouds
Other Titles: Pacific Graphics Short Papers
Authors: Spina, Sandro
Debattista, Kurt
Bugeja, Keith
Chalmers, Alan
Keywords: Computer graphics
Image segmentation
Point clouds
Issue Date: 2014
Publisher: The Eurographics Association
Citation: Spina, S., Debattista, K., Bugeja, K., & Chalmers, A. (2014). Scene segmentation and understanding for context-free point clouds. In J. Keyser, Y. J. Kim, & P. Wonka (Eds.), Pacific Graphics Short Papers (pp. 7-12). The Eurographics Association.
Abstract: The continuous development of commodity hardware for capturing the surface structure of objects is quickly making point cloud data ubiquitous. Scene understanding methods address the problem of determining the objects present in a point cloud which, depending on sensor capabilities and object occlusions, is typically noisy and incomplete. In this paper, we propose a novel technique that enables the automatic identification of semantically meaningful structures within point clouds acquired with different sensors on a variety of scenes. A representation model, the structure graph, whose nodes represent planar surface segments, is computed over these point clouds to support the identification task. To accommodate more complex objects (e.g. chair, couch, cabinet, table), a training process determines and concisely describes, within each object’s structure graph, its important shape characteristics. Results on a variety of point clouds show that our method can quickly discern certain object types.
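To make the abstract's central idea concrete, the sketch below shows how a structure graph of the kind described might be assembled: planar segments are peeled off one at a time (here with a basic RANSAC plane fit, standing in for whatever segmentation the paper actually uses), and segments that come into physical contact become graph edges. Everything in the sketch, including the function names, thresholds, and adjacency test, is an illustrative assumption, not the authors' implementation.

```python
import numpy as np
import networkx as nx


def fit_plane_ransac(points, n_iters=200, threshold=0.02, rng=None):
    """Fit a single plane n.x + d = 0 to `points` with a basic RANSAC loop."""
    rng = rng if rng is not None else np.random.default_rng(0)
    best_mask, best_plane = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                  # degenerate (near-collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        mask = np.abs(points @ normal + d) < threshold
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_plane = mask, (normal, d)
    return best_plane, best_mask


def build_structure_graph(points, min_segment=500, adjacency_dist=0.05):
    """Greedily extract planar segments, then link segments whose point sets
    come within `adjacency_dist` of each other. Each node carries the plane
    parameters and segment size; edges encode adjacency."""
    graph, remaining, segments = nx.Graph(), points, []
    while len(remaining) >= min_segment:
        plane, mask = fit_plane_ransac(remaining)
        if plane is None or mask.sum() < min_segment:
            break                        # no sufficiently large plane left
        normal, d = plane
        graph.add_node(len(segments), normal=normal, offset=d,
                       size=int(mask.sum()))
        segments.append(remaining[mask])
        remaining = remaining[~mask]
    # Adjacency test on a subsample of each segment, to keep the pairwise
    # distance computation cheap.
    rng = np.random.default_rng(0)
    samples = [s[rng.choice(len(s), size=min(len(s), 200), replace=False)]
               for s in segments]
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            dists = np.linalg.norm(
                samples[i][:, None, :] - samples[j][None, :, :], axis=2)
            if dists.min() < adjacency_dist:
                graph.add_edge(i, j)
    return graph
```

Identifying an object would then amount to comparing such a graph against per-object templates on node attributes (plane orientation, relative segment size) and edge structure, which is roughly the role the abstract assigns to the training process.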
URI: https://www.um.edu.mt/library/oar/handle/123456789/106707
Appears in Collections:Scholarly Works - FacICTCS

Files in This Item:
File: Scene_segmentation_and_understanding_for_context_free_point_clouds_2014.pdf (Restricted Access)
Size: 3.71 MB
Format: Adobe PDF

Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.