Title :
Supervised Learning of Quantizer Codebooks by Information Loss Minimization
Author :
Lazebnik, Svetlana ; Raginsky, Maxim
Author_Institution :
Dept. of Comput. Sci., Univ. of North Carolina at Chapel Hill, Chapel Hill, NC
Date :
1 July 2009
Abstract :
This paper proposes a technique for jointly quantizing continuous features and the posterior distributions of their class labels. The quantizer is learned by minimizing empirical information loss, so that the quantizer index of a given feature vector approximates a sufficient statistic for its class label. Informally, the quantized representation retains as much information as possible for classifying the feature vector correctly. We derive an alternating minimization procedure for simultaneously learning codebooks in the Euclidean feature space and in the simplex of posterior class distributions. The resulting quantizer can be used to encode unlabeled points outside the training set and to predict their posterior class distributions, and it has an elegant interpretation in terms of lossless source coding. The proposed method is validated on synthetic and real data sets and is applied to two diverse problems: learning discriminative visual vocabularies for bag-of-features image classification, and image segmentation.
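The alternating minimization described in the abstract can be illustrated with a Lloyd-style sketch. This is not the paper's exact objective or initialization; it assumes a combined distortion that trades squared Euclidean error in feature space against a KL term on the posterior simplex, with a hypothetical weight `lam`:

```python
import numpy as np

def kl(P, q, eps=1e-12):
    """Row-wise KL divergence KL(p_i || q) for a matrix of posteriors P."""
    return np.sum(P * (np.log(P + eps) - np.log(q + eps)), axis=1)

def fit_joint_codebook(X, P, K, lam=1.0, iters=50):
    """Alternating minimization sketch: learn paired codebooks
    (mu_k in feature space, q_k in the probability simplex) that
    minimize sum_i ||x_i - mu_a(i)||^2 + lam * KL(p_i || q_a(i))."""
    # Deterministic farthest-point initialization (an assumption,
    # not the paper's initialization scheme).
    idx = [0]
    for _ in range(1, K):
        d = np.min([np.sum((X - X[i]) ** 2, axis=1) for i in idx], axis=0)
        idx.append(int(d.argmax()))
    mu, q = X[idx].copy(), P[idx].copy()
    for _ in range(iters):
        # Assignment step: each point goes to the codeword with the
        # smallest combined Euclidean + KL distortion.
        cost = np.stack([np.sum((X - mu[k]) ** 2, axis=1) + lam * kl(P, q[k])
                         for k in range(K)], axis=1)
        a = cost.argmin(axis=1)
        # Update step: cell centroids. The minimizer of sum_i KL(p_i || q)
        # over q is the arithmetic mean of the assigned posteriors.
        for k in range(K):
            mask = a == k
            if mask.any():
                mu[k] = X[mask].mean(axis=0)
                q[k] = P[mask].mean(axis=0)
    return mu, q, a
```

Both steps decrease the same joint objective, so the procedure converges to a local minimum, mirroring the structure (though not the exact information-loss objective) of the algorithm in the paper.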
Keywords :
image classification; image segmentation; learning (artificial intelligence); Euclidean feature space; bag-of-features image classification; feature vector; information loss minimization; quantizer codebooks; supervised learning; pattern recognition; clustering; computer vision; information theory; quantization; scene analysis; segmentation; Algorithms; Artificial Intelligence; Data Compression; Numerical Analysis, Computer-Assisted; Pattern Recognition, Automated; Reproducibility of Results; Sensitivity and Specificity; Signal Processing, Computer-Assisted
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
DOI :
10.1109/TPAMI.2008.138