• DocumentCode
    38626
  • Title
    Touch Saliency: Characteristics and Prediction
  • Author
    Bingbing Ni ; Mengdi Xu ; Troy V. Nguyen ; Meng Wang ; Congyan Lang ; Zhongyang Huang ; Shuicheng Yan
  • Author_Institution
    Adv. Digital Sci. Center, Singapore, Singapore
  • Volume
    16
  • Issue
    6
  • fYear
    2014
  • fDate
    Oct. 2014
  • Firstpage
    1779
  • Lastpage
    1791
  • Abstract
    In this work, we propose an alternative ground truth to the eye-fixation map for visual attention study, called touch saliency. Because it can be collected directly from the recorded data of users' everyday browsing behavior on widely used touch-screen smartphones, touch saliency data are easy to obtain. Owing to the limited screen size, smartphone users typically pan and zoom images while browsing, fixing the region of interest on the screen. Our study is two-fold. First, we collect these touch-screen fixation maps (termed touch saliency) and study their characteristics through comprehensive comparisons with their counterpart, eye-fixation maps (i.e., visual saliency). The comparisons show that touch saliency is highly correlated with eye fixations on the same stimuli, which indicates its utility for data collection in visual attention study. Second, building on this consistency between touch saliency and visual saliency, we propose a unified prediction model for both visual and touch saliency detection. The model uses middle-level object category features extracted from pre-segmented image superpixels as input to the recently proposed multitask sparsity pursuit (MTSP) framework for saliency prediction. Extensive evaluations show that the proposed middle-level category features considerably improve saliency prediction performance with both touch saliency and visual saliency as ground truth.
  • Keywords
    feature extraction; smart phones; user interfaces; MTSP framework; eye fixation map; image browsing; image superpixel; middle-level object category features; multitask sparsity pursuit framework; saliency prediction performance; smart phone devices; touch saliency; touch screen fixation maps; unified saliency prediction model; user daily browsing behavior; visual attention study; Computational modeling; Data collection; Educational institutions; Feature extraction; Predictive models; Smart phones; Visualization; Fixations; middle-level object category features; touch saliency; visual saliency;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Multimedia
  • Publisher
    IEEE
  • ISSN
    1520-9210
  • Type
    jour
  • DOI
    10.1109/TMM.2014.2329275
  • Filename
    6826510
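
The abstract describes a per-superpixel prediction pipeline: segment each image into superpixels, extract middle-level features per superpixel, and regress per-superpixel saliency against fixation-map ground truth via the MTSP framework. The sketch below is only an illustration of that general pipeline under simplifying assumptions, not the authors' implementation: SLIC superpixels stand in for the pre-segmentation, simple color/position statistics stand in for the middle-level object-category features, and ridge regression stands in for the multitask sparsity pursuit step. All function names and parameters here (train_predictor, predict_saliency, n_segments, etc.) are hypothetical.

```python
# Illustrative per-superpixel saliency-prediction pipeline (simplified stand-in,
# not the MTSP method from the paper).
import numpy as np
from skimage.segmentation import slic
from sklearn.linear_model import Ridge


def superpixel_features(image, labels):
    """Mean color and normalized centroid per superpixel
    (placeholder for the paper's middle-level category features)."""
    n = labels.max() + 1
    feats = np.zeros((n, 5))
    h, w = labels.shape
    ys, xs = np.mgrid[0:h, 0:w]
    for k in range(n):
        mask = labels == k
        feats[k, :3] = image[mask].mean(axis=0)   # mean RGB
        feats[k, 3] = ys[mask].mean() / h         # normalized centroid y
        feats[k, 4] = xs[mask].mean() / w         # normalized centroid x
    return feats


def superpixel_targets(saliency_map, labels):
    """Average ground-truth saliency (eye- or touch-fixation density) per superpixel."""
    n = labels.max() + 1
    return np.array([saliency_map[labels == k].mean() for k in range(n)])


def train_predictor(images, saliency_maps, n_segments=200):
    """Fit one regressor on superpixel features pooled over the training images."""
    X, y = [], []
    for img, sal in zip(images, saliency_maps):
        labels = slic(img, n_segments=n_segments, compactness=10, start_label=0)
        X.append(superpixel_features(img, labels))
        y.append(superpixel_targets(sal, labels))
    model = Ridge(alpha=1.0)  # stand-in for the MTSP step
    model.fit(np.vstack(X), np.concatenate(y))
    return model


def predict_saliency(model, image, n_segments=200):
    """Predict a dense saliency map by assigning each superpixel its predicted score."""
    labels = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    scores = model.predict(superpixel_features(image, labels))
    return scores[labels]
```

In this simplified form the same trained model can be applied regardless of whether the training saliency maps came from eye fixations or touch interactions, which mirrors the unified treatment of visual and touch saliency described in the abstract.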