DocumentCode
2931345
Title
User generated video annotation using Geo-tagged image databases
Author
Abdollahian, Golnaz ; Delp, Edward J.
Author_Institution
Video & Image Process. Lab. (VIPER), Purdue Univ., West Lafayette, IN, USA
fYear
2009
fDate
June 28 - July 3, 2009
Firstpage
610
Lastpage
613
Abstract
In this paper we propose a system that annotates a user generated video based on its associated location metadata by exploiting user-tagged image databases. An example of such a database is a photo sharing Web site such as Flickr, where users upload their images and annotate them with various tags. The goal is to find the tags that have a high probability of being relevant to the video without performing any complex object or action recognition on the video sequence. A video is first segmented into camera views and a set of keyframes is selected to represent the video. We describe the concept of a camera view as the basic element of user generated videos, which has special properties suited to the video annotation application. The keyframes are used to retrieve the most relevant images in the database. A "tag processing" step is then used to tag the video.
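For illustration only, the following is a minimal Python sketch of the pipeline the abstract outlines: segment the video into camera views, select keyframes, retrieve visually similar geo-tagged images, and aggregate their user tags. It is not the authors' implementation; every function name, the placeholder logic, and the frequency-based tag scoring are assumptions standing in for the methods described in the paper.

# Hypothetical sketch of the annotation pipeline described in the abstract.
# The segmentation, keyframe selection, and retrieval steps are stubs;
# only the tag-aggregation step is worked out, and its frequency-based
# scoring is an assumption, not the paper's "tag processing" method.

from collections import Counter
from typing import Dict, List, Tuple


def segment_into_camera_views(frames: List) -> List[List]:
    """Split the frame sequence wherever the camera view changes.
    Placeholder: treats the whole clip as a single camera view."""
    return [frames]


def select_keyframes(view: List) -> List:
    """Pick representative frames for one camera view.
    Placeholder: takes the middle frame of the view."""
    return [view[len(view) // 2]] if view else []


def retrieve_similar_images(keyframe, database: List[Dict]) -> List[Dict]:
    """Return geo-tagged database images ranked by visual similarity
    to the keyframe. Placeholder: returns the database unchanged."""
    return database


def tag_processing(retrieved: List[Dict], top_k: int = 5) -> List[Tuple[str, int]]:
    """Aggregate user tags from the retrieved images and keep the most
    frequent ones as candidate annotations for the video."""
    counts = Counter(tag for image in retrieved for tag in image["tags"])
    return counts.most_common(top_k)


def annotate_video(frames: List, database: List[Dict]) -> List[Tuple[str, int]]:
    """Run the full (stubbed) pipeline: views -> keyframes -> retrieval -> tags."""
    retrieved: List[Dict] = []
    for view in segment_into_camera_views(frames):
        for keyframe in select_keyframes(view):
            retrieved.extend(retrieve_similar_images(keyframe, database))
    return tag_processing(retrieved)


if __name__ == "__main__":
    # Toy geo-tagged "database" standing in for Flickr-style user images.
    db = [
        {"tags": ["eiffel tower", "paris", "night"]},
        {"tags": ["paris", "seine", "bridge"]},
        {"tags": ["eiffel tower", "paris"]},
    ]
    print(annotate_video(frames=list(range(30)), database=db))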
Keywords
image representation; image segmentation; image sequences; meta data; probability; video retrieval; video signal processing; visual databases; Flickr; action recognition; camera view; geo-tagged image database; location metadata; object recognition; photo sharing Web site; probability; user generated video annotation; video representation; video retrieval; video segmentation; video sequence; Cameras; Data engineering; Image databases; Image processing; Image segmentation; Laboratories; Motion analysis; Supervised learning; Video sequences; Video sharing; Video annotation; geo-location; tagging; user generated tags; user generated video;
fLanguage
English
Publisher
ieee
Conference_Title
2009 IEEE International Conference on Multimedia and Expo (ICME 2009)
Conference_Location
New York, NY
ISSN
1945-7871
Print_ISBN
978-1-4244-4290-4
Electronic_ISBN
1945-7871
Type
conf
DOI
10.1109/ICME.2009.5202570
Filename
5202570
Link To Document