• DocumentCode
    579349
  • Title
    User assisted stereo image segmentation
  • Author
    Tasli, H. Emrah; Alatan, A. Aydin

  • Author_Institution
    Dept. of Electr. & Electron. Eng., Middle East Tech. Univ., Ankara, Turkey
  • fYear
    2012
  • fDate
    15-17 Oct. 2012
  • Firstpage
    1
  • Lastpage
    4
  • Abstract
    The wide availability of stereoscopic 3D displays has created a considerable market for content producers and encouraged researchers to develop methods for altering and processing such content. This study concentrates on user assisted image segmentation and proposes a method that extends previous monoscopic image segmentation techniques to stereoscopic footage with minimal effort. User assistance is required to mark representative object and background regions of an image. An MRF based energy minimization technique is utilized in which user inputs are applied to only one of the stereoscopic pairs. A key contribution of the proposed study is the elimination of dense disparity estimation through a sparse feature matching approach. Segmentation results are evaluated with objective metrics on a ground truth stereo segmentation dataset; competitive results are obtained with minimal user interaction, even without dense disparity estimation.
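    The MRF based energy minimization mentioned in the abstract amounts to a binary (object/background) labeling problem. The snippet below is a minimal illustrative sketch, not the paper's method: it optimizes a unary-plus-Potts energy with iterated conditional modes (ICM) rather than the graph cuts typically used for such models, and the intensity-based unary costs are entirely hypothetical.

    ```python
    import numpy as np

    def icm_segment(unary_obj, unary_bg, beta=1.0, iters=10):
        """Binary MRF segmentation by iterated conditional modes (ICM).

        unary_obj / unary_bg : per-pixel costs of assigning a pixel the
        object (1) or background (0) label; beta weighs a Potts smoothness
        term that penalizes 4-neighbors with differing labels.
        """
        labels = (unary_obj < unary_bg).astype(int)  # greedy initialization
        h, w = labels.shape
        for _ in range(iters):
            for y in range(h):
                for x in range(w):
                    # Collect the 4-connected neighbor labels.
                    nbrs = []
                    if y > 0:     nbrs.append(labels[y - 1, x])
                    if y < h - 1: nbrs.append(labels[y + 1, x])
                    if x > 0:     nbrs.append(labels[y, x - 1])
                    if x < w - 1: nbrs.append(labels[y, x + 1])
                    # Local energy for each label: unary + Potts pairwise cost.
                    cost1 = unary_obj[y, x] + beta * sum(n != 1 for n in nbrs)
                    cost0 = unary_bg[y, x] + beta * sum(n != 0 for n in nbrs)
                    labels[y, x] = 1 if cost1 < cost0 else 0
        return labels

    # Toy example: a bright square on a dark background; the unary costs
    # are derived directly from intensity (a hypothetical model).
    img = np.zeros((8, 8))
    img[2:6, 2:6] = 1.0
    unary_obj = 1.0 - img   # cheap to label bright pixels "object"
    unary_bg = img          # cheap to label dark pixels "background"
    seg = icm_segment(unary_obj, unary_bg, beta=0.5)
    ```

    ICM is a simple coordinate-descent optimizer that only reaches a local minimum; graph-cut solvers find the global optimum for this class of binary submodular energies, which is why they are the standard choice in interactive segmentation work.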
  • Keywords
    Markov processes; image matching; image segmentation; minimisation; stereo image processing; MRF; Markov random field; dense disparity estimation; energy minimization technique; monoscopic image segmentation; objective metrics; sparse feature matching; stereoscopic 3D display; user assisted stereo image segmentation; user interaction; Computer vision; Conferences; Estimation; Image segmentation; Minimization; Stereo image processing
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON), 2012
  • Conference_Location
    Zurich
  • ISSN
    2161-2021
  • Print_ISBN
    978-1-4673-4904-8
  • Electronic_ISBN
    2161-2021
  • Type
    conf
  • DOI
    10.1109/3DTV.2012.6365447
  • Filename
    6365447