DocumentCode :
3752144
Title :
A near-duplicate video retrieval method based on Zernike moments
Author :
Tang-You Chang;Shen-Chuan Tai;Guo-Shiang Lin
Author_Institution :
Institute of Computer and Communication Engineering, National Cheng Kung University
Year :
2015
Firstpage :
860
Lastpage :
864
Abstract :
In this paper, a near-duplicate video retrieval method based on invariant features is proposed. After shot change detection, Zernike moments are extracted from each key-frame of a video as invariant features. Key-frame similarity is obtained by computing the difference between the Zernike moments of key-frames in the query and test videos. To achieve near-duplicate video retrieval, each key-frame is treated as an individual sensor, so evaluating all key-frames amounts to combining multiple sensors. The per-key-frame results are fused to improve retrieval performance. Experimental results show that the proposed method not only finds relevant videos effectively but also resists common modifications such as re-scaling and logo insertion.
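The retrieval pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: each key-frame is assumed to be represented by a precomputed Zernike-moment feature vector, similarity is taken as an inverse L2-distance (the paper only says "difference of Zernike moments"), and fusion is done by averaging each query key-frame's best match, one of many possible sensor-fusion rules.

```python
import numpy as np

def keyframe_similarity(q, t):
    # Similarity between two key-frames from the L2 distance of their
    # Zernike-moment vectors; 1.0 means identical moment vectors.
    return 1.0 / (1.0 + np.linalg.norm(q - t))

def video_score(query_frames, test_frames):
    # Each query key-frame acts as one "sensor": find its best-matching
    # key-frame in the test video, then fuse the per-frame results by
    # averaging them into a single video-level score.
    per_frame = [max(keyframe_similarity(q, t) for t in test_frames)
                 for q in query_frames]
    return sum(per_frame) / len(per_frame)

# Toy usage with hypothetical 2-D "moment" vectors: a lightly modified
# copy of the query should score higher than an unrelated video.
query = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
near_dup = [np.array([1.0, 0.05]), np.array([0.05, 1.0])]
unrelated = [np.array([5.0, 5.0])]
```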
Keywords :
"Feature extraction","Computational complexity","Information filters","Electronic mail","YouTube","Principal component analysis"
Publisher :
ieee
Conference_Titel :
Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2015 Asia-Pacific
Type :
conf
DOI :
10.1109/APSIPA.2015.7415393
Filename :
7415393