Title :
Diffusion network architectures for implementation of Gibbs samplers with applications to assignment problems
Author :
Ting, Pei-Yih ; Iltis, Ronald A.
Author_Institution :
Center for Information Processing Research, University of California, Santa Barbara, CA, USA
Date :
1 July 1994
Abstract :
In this paper, analog circuit designs for fully parallel implementation of Gibbs samplers are presented. The Gibbs sampler for a discrete solution space (or Boltzmann machine) can be used to solve both deterministic and probabilistic assignment (association) problems. The primary drawback to the use of a Boltzmann machine for optimization is its computational complexity, since updating of the neurons is typically performed sequentially. We first consider the diffusion equation emulation of a Boltzmann machine introduced by Roysam and Miller (1991), which employs a parallel network of nonlinear amplifiers. It is shown that an analog circuit implementation of this diffusion equation requires a complex neural structure incorporating matched nonlinear feedback amplifiers and current multipliers. We then introduce a simpler implementation of the Boltzmann machine, based on a “constant gradient” diffusion equation, which eliminates the need for a matched feedback amplifier. The performance of the Roysam and Miller network and the new constant gradient (CG) network is compared using simulations in the multiple-neuron case and by integration of the Chapman-Kolmogorov equation for a single neuron. Based on the simulation results, heuristic criteria for setting the diffusion equation boundaries and the neuron sigmoidal gain are obtained. The final CG analog circuit is suitable for VLSI implementation and hence may offer rapid convergence.
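Example :
For illustration, the following is a minimal sketch, not taken from the paper, of the sequential Gibbs sampler (Boltzmann machine) update whose serial complexity motivates the analog designs above. The quadratic energy, 0/1 neuron convention, weight matrix W, bias vector b, and temperature T are standard assumptions rather than the authors' notation; the diffusion networks studied in the paper replace this neuron-by-neuron loop with continuous parallel analog dynamics whose stationary distribution approximates the same Gibbs distribution.

    import numpy as np

    def gibbs_sweep(s, W, b, T=1.0, rng=None):
        # One sequential sweep of Gibbs sampling over binary neurons s in {0, 1}^n
        # for the energy E(s) = -0.5 * s @ W @ s - b @ s (W symmetric, zero diagonal).
        # Each neuron is resampled from its conditional distribution in turn;
        # this one-at-a-time update is the serial bottleneck noted in the abstract.
        rng = np.random.default_rng() if rng is None else rng
        for i in range(len(s)):
            h = W[i] @ s + b[i]                  # local field: energy gap for s_i = 1 vs 0
            p_on = 1.0 / (1.0 + np.exp(-h / T))  # sigmoid; 1/T acts as the neuron gain
            s[i] = 1.0 if rng.random() < p_on else 0.0
        return s

    # Usage: anneal a small random network toward low-energy (high-probability) states.
    rng = np.random.default_rng(0)
    n = 8
    W = rng.normal(size=(n, n))
    W = 0.5 * (W + W.T)
    np.fill_diagonal(W, 0.0)
    b = rng.normal(size=n)
    s = rng.integers(0, 2, size=n).astype(float)
    for T in np.linspace(2.0, 0.1, 200):         # illustrative annealing schedule
        s = gibbs_sweep(s, W, b, T=T, rng=rng)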
Keywords :
Boltzmann machines; analogue processing circuits; computational complexity; neural chips; nonlinear network synthesis; Boltzmann machine; Chapman-Kolmogorov equation integration; Gibbs samplers; VLSI implementation; analog circuit designs; association problems; constant gradient diffusion equation; deterministic assignment; diffusion equation emulation; diffusion network architectures; discrete solution space; heuristic criteria; neuron sigmoidal gain; optimization; probabilistic assignment; rapid convergence; sequential neuron updating; Analog circuits; Analog computers; Character generation; Circuit simulation; Computational complexity; Computer architecture; Concurrent computing; Feedback amplifiers; Neurons; Nonlinear equations
Journal_Title :
IEEE Transactions on Neural Networks