Title :
An unsupervised approach for comparing styles of illustrations
Author :
Furuya, Takahiko ; Kuriyama, Shigeru ; Ohbuchi, Ryutarou
Author_Institution :
Grad. Sch. of Med. & Eng., Univ. of Yamanashi, Kofu, Japan
Abstract :
In creating web pages, books, or presentation slides, consistent use of tasteful visual style(s) is quite important. In this paper, we consider the problem of style-based comparison and retrieval of illustrations. In their pioneering work, Garces et al. [2] proposed an algorithm for comparing illustrative style. Their algorithm uses supervised learning that relies on stylistic labels present in a training dataset. In reality, obtaining such labels is quite difficult. In this paper, we propose an unsupervised approach to achieve accurate and efficient stylistic comparison among illustrations. The proposed algorithm combines heterogeneous local visual features extracted densely from an illustration. These features are aggregated into a single feature vector per illustration, which is then refined by distance metric learning based on unsupervised dimension reduction for saliency and compactness. Experimental evaluation using multiple benchmark datasets indicates that the proposed method outperforms existing approaches.
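The pipeline described in the abstract (dense local feature extraction, per-illustration aggregation, then unsupervised dimension reduction for metric comparison) can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual method: the patch-flattening features, mean pooling, and PCA-style projection are stand-ins for the paper's heterogeneous local features, aggregation, and unsupervised distance metric learning.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_local_features(image, patch=8, step=4):
    # Densely sample overlapping patches and flatten each into a local
    # feature vector (a stand-in for the paper's heterogeneous features).
    h, w = image.shape
    feats = [image[y:y + patch, x:x + patch].ravel()
             for y in range(0, h - patch + 1, step)
             for x in range(0, w - patch + 1, step)]
    return np.array(feats)

def aggregate(feats):
    # Aggregate the set of local features into one vector per illustration
    # (mean pooling stands in for the paper's aggregation step).
    return feats.mean(axis=0)

def pca_projection(X, dim):
    # Unsupervised dimension reduction: principal axes of the feature
    # vectors, computed via SVD of the centered data matrix.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:dim]

# Toy "illustrations": random grayscale images (real input would be
# rendered illustration images).
images = [rng.random((32, 32)) for _ in range(10)]
vectors = np.array([aggregate(dense_local_features(im)) for im in images])

# Project into a compact subspace and compare styles by distance there.
P = pca_projection(vectors, dim=4)
reduced = (vectors - vectors.mean(axis=0)) @ P.T
d = np.linalg.norm(reduced[0] - reduced[1])
print(reduced.shape)  # (10, 4)
```

Stylistically similar illustrations would map to nearby points in the reduced space, so retrieval reduces to nearest-neighbor search under the learned (here, PCA-induced) metric.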
Keywords :
feature extraction; image retrieval; unsupervised learning; vectors; distance metric learning; feature vector; heterogeneous local visual feature extraction; illustration retrieval; illustration styles; illustrative style; style-based comparison; stylistic labels; supervised learning; unsupervised approach; unsupervised dimension reduction; visual styles; Accuracy; Feature extraction; Histograms; Image color analysis; Measurement; Quantization (signal); Training; Local image feature; illustration style feature; illustration style tag; unsupervised distance metric learning;
Conference_Titel :
2015 13th International Workshop on Content-Based Multimedia Indexing (CBMI)
DOI :
10.1109/CBMI.2015.7153615