Title :
Multi-task Learning for Bayesian Matrix Factorization
Author_Institution :
Siemens Corp. Res., Princeton, NJ, USA
Abstract :
Data sparsity is a major challenge for collaborative filtering, and it is especially severe for newly created datasets, which have even fewer ratings. By sharing knowledge among different datasets, multi-task learning is a promising technique for addressing this issue. Most prior methods directly share objects (users or items) across datasets; however, object identities and correspondences may not be known in many cases. We extend previous work on Bayesian matrix factorization with a Dirichlet process mixture into a multi-task learning approach that shares latent parameters among tasks. Our method does not require object identities and is therefore more widely applicable. The proposed model is fully nonparametric in that the dimension of the latent feature vectors is determined automatically. Inference is performed with a variational Bayesian algorithm, which is much faster than the Gibbs sampling used by most related Bayesian methods.
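The core idea of sharing latent parameters across tasks (rather than sharing users or items directly) can be illustrated with a toy sketch. The code below is not the paper's model: it uses simple gradient updates with each task's factors regularized toward shared means `mu_U`, `mu_V` (a crude stand-in for the shared latent parameters; the paper instead uses a Dirichlet process mixture with variational inference). All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def mtl_mf(ratings_list, k=2, lam=0.1, tau=1.0, lr=0.01, iters=50, seed=0):
    """Toy multi-task matrix factorization.

    Each task t factorizes its rating matrix R_t ~ U_t @ V_t.T.
    Knowledge is shared by pulling every task's factors toward
    shared means mu_U, mu_V -- no user/item correspondence needed.
    (Illustrative stand-in for the paper's shared latent parameters.)
    """
    rng = np.random.default_rng(seed)
    Us, Vs = [], []
    for R in ratings_list:
        n, m = R.shape
        Us.append(0.1 * rng.standard_normal((n, k)))
        Vs.append(0.1 * rng.standard_normal((m, k)))
    mu_U = np.zeros(k)
    mu_V = np.zeros(k)
    for _ in range(iters):
        for t, R in enumerate(ratings_list):
            mask = ~np.isnan(R)          # observed entries only
            U, V = Us[t], Vs[t]
            E = np.where(mask, R - U @ V.T, 0.0)
            # gradient step: data fit + ridge + pull toward shared means
            U += lr * (E @ V - lam * U - tau * (U - mu_U))
            V += lr * (E.T @ U - lam * V - tau * (V - mu_V))
        # knowledge-sharing step: update the shared latent means
        mu_U = np.mean(np.vstack(Us), axis=0)
        mu_V = np.mean(np.vstack(Vs), axis=0)
    return Us, Vs

# Two tiny tasks with missing ratings (NaN); no shared user/item IDs.
R1 = np.array([[5.0, 4.0, np.nan], [np.nan, 3.0, 2.0]])
R2 = np.array([[4.0, np.nan], [1.0, 2.0]])
Us, Vs = mtl_mf([R1, R2])
pred1 = Us[0] @ Vs[0].T   # dense prediction for task 1
```

The point of the sketch is only the coupling structure: the tasks never exchange rows or columns, only the shared prior parameters, which is what makes the approach applicable when object identities are unknown.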
Keywords :
belief networks; collaborative filtering; knowledge representation; learning (artificial intelligence); matrix decomposition; sampling methods; Bayesian matrix factorization; Bayesian methods; Dirichlet process; Gibbs sampling; data sparsity; knowledge sharing; latent feature vectors; multitask learning; variational Bayesian algorithm; covariance matrix; motion pictures; principal component analysis; training; vectors; co-clustering; matrix factorization
Conference_Titel :
2011 IEEE 11th International Conference on Data Mining (ICDM)
Conference_Location :
Vancouver, BC, Canada
Print_ISBN :
978-1-4577-2075-8
DOI :
10.1109/ICDM.2011.107