  • DocumentCode
    254333
  • Title
    Visual Tracking Using Pertinent Patch Selection and Masking
  • Author
    Dae-Youn Lee; Jae-Young Sim; Chang-Su Kim
  • Author_Institution
    Sch. of Electr. Eng., Korea Univ., Seoul, South Korea
  • fYear
    2014
  • fDate
    23-28 June 2014
  • Firstpage
    3486
  • Lastpage
    3493
  • Abstract
    A novel visual tracking algorithm using patch-based appearance models is proposed in this paper. We first divide the bounding box of a target object into multiple patches and then select only pertinent patches, which occur repeatedly near the center of the bounding box, to construct the foreground appearance model. We also divide the input image into non-overlapping blocks, construct a background model at each block location, and integrate these background models for tracking. Using the appearance models, we obtain an accurate foreground probability map. Finally, we estimate the optimal object position by maximizing the likelihood, which is obtained by convolving the foreground probability map with the pertinence mask. Experimental results demonstrate that the proposed algorithm outperforms state-of-the-art tracking algorithms significantly in terms of center position errors and success rates.
  • Keywords
    object tracking; probability; block location; bounding box; center position errors; foreground appearance model; foreground probability map; likelihood maximization; nonoverlapping blocks; optimal object position estimation; patch-based appearance models; pertinent patch masking; pertinent patch selection; success rates; visual tracking; Algorithm design and analysis; Color; Computational modeling; Feature extraction; Histograms; Image color analysis; Target tracking;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
  • Conference_Location
    Columbus, OH
  • Type
    conf
  • DOI
    10.1109/CVPR.2014.446
  • Filename
    6909841
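
  • Editor's sketch
    The localization step summarized in the Abstract above — convolving the foreground probability map with the pertinence mask and taking the position of maximum likelihood — can be illustrated with a short sketch. This is not the authors' implementation; the function name, the NumPy/SciPy usage, and the assumption that the pertinence mask is a binary array over the bounding box are illustrative only.

    import numpy as np
    from scipy.signal import fftconvolve

    def localize_target(foreground_prob, pertinence_mask):
        """Sketch of the localization step: slide the pertinence mask over
        the per-pixel foreground probability map and return the position
        with the highest accumulated likelihood.

        foreground_prob : 2-D float array, foreground probability per pixel.
        pertinence_mask : 2-D {0, 1} array marking pertinent patches inside
                          the target bounding box (assumed layout).
        """
        # Flipping the mask makes fftconvolve compute cross-correlation, i.e.
        # the mask is slid over the map without mirroring (identical to plain
        # convolution when the mask is symmetric); mode="same" keeps the
        # response aligned with the image grid.
        likelihood = fftconvolve(foreground_prob,
                                 pertinence_mask[::-1, ::-1], mode="same")
        # The estimated object position is the argmax of the likelihood map.
        row, col = np.unravel_index(np.argmax(likelihood), likelihood.shape)
        return row, col

    if __name__ == "__main__":
        # Toy check: a bright foreground blob and an all-ones pertinence mask;
        # the estimate should land near the centre of the blob.
        prob = np.zeros((120, 160))
        prob[40:60, 70:90] = 0.9
        mask = np.ones((20, 20))
        print(localize_target(prob, mask))  # a position near the blob centre, about (50, 80)

    An FFT-based convolution keeps this search cheap over the whole frame; the paper's actual likelihood additionally depends on the patch-based foreground model and per-block background models summarized in the Abstract.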