Please use this identifier to cite or link to this item: https://www.um.edu.mt/library/oar/handle/123456789/16681
Title: Fast disparity estimation for multi-view plus depth video coding
Authors: Micallef, Brian W.
Debono, Carl James
Farrugia, Reuben A.
Keywords: Video compression
Three-dimensional display systems
Rate distortion theory
Telecommunication -- Standards
Issue Date: 2011
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Micallef, B. W., Debono, C. J., & Farrugia, R. A. (2011). Fast disparity estimation for multi-view plus depth video coding. Visual Communications and Image Processing (VCIP), Tainan, 1-4.
Abstract: Disparity estimation is used in Multi-view Video Coding (MVC) to remove the inter-view redundancies present in both color and depth multi-view video sequences. The standard H.264/MVC achieves high compression efficiency by deriving the optimal disparity vector through an exhaustive evaluation of the Rate-Distortion cost function over all possible search points, which makes disparity estimation computationally expensive. This paper proposes an efficient technique that exploits both the multi-view and the epipolar geometries to determine the optimal search area, reducing the number of search points and hence the computations. Simulation results show that this technique saves up to 95% of the computational cost of disparity estimation, with negligible loss in coding efficiency for both color and depth multi-view video coding.
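The search-area reduction described in the abstract can be illustrated with a toy sketch. The code below is not the authors' implementation: it assumes rectified views (so epipolar lines are horizontal), uses a simplified SAD-plus-lambda-times-bits Rate-Distortion model, and the helper names (mv_bits, rd_cost, disparity_search) and the rate model are hypothetical.

import numpy as np

def mv_bits(dx, dy):
    # Crude rate model: bits grow with the log of the vector magnitude
    # (a stand-in for the real entropy-coding cost in H.264/MVC).
    return int(np.ceil(np.log2(abs(dx) + 1) + np.log2(abs(dy) + 1))) + 2

def rd_cost(cur, ref, x, y, dx, dy, lam, bsize=16):
    # Lagrangian cost J(d) = SAD(d) + lambda * R(d) for one candidate vector.
    a = cur[y:y + bsize, x:x + bsize].astype(np.int32)
    b = ref[y + dy:y + dy + bsize, x + dx:x + dx + bsize].astype(np.int32)
    return np.abs(a - b).sum() + lam * mv_bits(dx, dy)

def disparity_search(cur, ref, x, y, lam, srange=32, band=None, bsize=16):
    # Exhaustive search when band is None; otherwise restrict the vertical
    # offsets to a narrow strip of half-height `band` around the (horizontal)
    # epipolar line, a rectified-camera approximation of the paper's idea.
    h, w = ref.shape
    dys = range(-srange, srange + 1) if band is None else range(-band, band + 1)
    best_v, best_j = None, float("inf")
    for dy in dys:
        for dx in range(-srange, srange + 1):
            if 0 <= y + dy and y + dy + bsize <= h and 0 <= x + dx and x + dx + bsize <= w:
                j = rd_cost(cur, ref, x, y, dx, dy, lam, bsize)
                if j < best_j:
                    best_v, best_j = (dx, dy), j
    return best_v, best_j

if __name__ == "__main__":
    gen = np.random.default_rng(0)
    right = gen.integers(0, 256, (128, 128), dtype=np.uint8)
    left = np.roll(right, 5, axis=1)  # synthetic views: 5-pixel horizontal disparity
    print(disparity_search(left, right, 48, 48, lam=10.0, band=2))  # ~((-5, 0), ...)

With srange=32, a full search evaluates 65 x 65 = 4225 candidates per block, while a band of half-height 2 evaluates 5 x 65 = 325, roughly 92% fewer, which is the order of computational saving the abstract reports.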
Description: This research work is partially funded by STEPS-Malta and partially by the EU–ESF 1.25.
URI: https://www.um.edu.mt/library/oar/handle/123456789/16681
Appears in Collections:Scholarly Works - FacICTCCE

Files in This Item:
File: OA Conference paper - Fast disparity estimation for Multi-view plus depth video coding.2-5.pdf
Description: Fast disparity estimation for multi-view plus depth video coding
Size: 327.48 kB
Format: Adobe PDF
Access: Restricted Access


Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.