Title :
Learning in Gibbsian fields: how accurate and how fast can it be?
Author :
Zhu, Song Chun ; Liu, Xiuwen
Author_Institution :
Dept. of Statistics & Computer Science, University of California, Los Angeles, CA, USA
Date :
7/1/2002
Abstract :
Gibbsian fields, or Markov random fields, are widely used in Bayesian image analysis, but learning Gibbs models is computationally expensive. The computational complexity is especially pronounced in the recent minimax entropy (FRAME) models, which use large neighborhoods and hundreds of parameters. In this paper, we present a common framework for learning Gibbs models. We identify two key factors that determine the accuracy and speed of learning Gibbs models: the efficiency of the likelihood functions and the variance in approximating partition functions by Monte Carlo integration. We propose three new algorithms. In particular, we are interested in a maximum satellite likelihood estimator, which makes use of a set of precomputed Gibbs models, called "satellites," to approximate likelihood functions. This algorithm can approximately estimate the minimax entropy model for textures in seconds on an HP workstation. The performance of the various learning algorithms is compared in our experiments.
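The satellite idea rests on estimating ratios of partition functions by Monte Carlo integration: samples drawn from a nearby precomputed reference model are reweighted to estimate the normalizer of the target model. A minimal sketch of that ratio estimator, using a toy one-dimensional Ising chain as the Gibbs model (the model, the `theta` values, and the sample size are illustrative assumptions, not taken from the paper):

```python
import itertools
import math
import random

def energy(x, theta):
    # Toy Ising-chain energy: theta weights agreement of neighboring spins.
    return -theta * sum(1 if x[i] == x[i + 1] else -1 for i in range(len(x) - 1))

def partition(theta, n):
    # Exact partition function Z(theta) by enumeration (feasible only for tiny n).
    return sum(math.exp(-energy(x, theta))
               for x in itertools.product([-1, 1], repeat=n))

def ratio_estimate(theta, theta0, n, num_samples=5000, seed=0):
    # Importance-sampling estimate of Z(theta) / Z(theta0) using samples from
    # a precomputed reference ("satellite") model at theta0:
    #   Z(theta)/Z(theta0) = E_{theta0}[ exp(E(x; theta0) - E(x; theta)) ]
    rng = random.Random(seed)
    states = list(itertools.product([-1, 1], repeat=n))
    weights = [math.exp(-energy(x, theta0)) for x in states]
    samples = rng.choices(states, weights=weights, k=num_samples)
    return sum(math.exp(energy(x, theta0) - energy(x, theta))
               for x in samples) / num_samples

n = 8
theta, theta0 = 0.4, 0.3
exact = partition(theta, n) / partition(theta0, n)
approx = ratio_estimate(theta, theta0, n)
```

The variance of such an estimator grows as the reference model moves away from the target, which is why the accuracy and speed trade-off discussed in the abstract hinges on how the satellites are chosen.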
Keywords :
Bayes methods; Markov processes; Markov random fields; Monte Carlo methods; Monte Carlo integration; computational complexity; computational modeling; entropy; image processing; image texture analysis; learning (artificial intelligence); learning algorithms; likelihood function approximation; maximum satellite likelihood estimator; minimax techniques; minimax entropy models; Bayesian image analysis; FRAME models; Gibbsian fields; partitioning algorithms; satellites;
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
DOI :
10.1109/TPAMI.2002.1017626