Title :
Maximized mutual information using macrocanonical probability distributions
Author_Institution :
Appl. Phys. Lab., Johns Hopkins Univ., Laurel, MD, USA
Abstract :
A maximum-entropy formulation leads to a neural network that is factorable, in both form and function, into individual neurons corresponding to the Hopfield neural model. A maximized mutual-information criterion dictates the optimal learning methodology using locally available information.
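As an illustrative sketch only: the abstract's notion of learning from "locally available information" is the defining property of Hebbian (outer-product) learning in the classic Hopfield model, where each weight update depends only on the activities of the two neurons it connects. The snippet below shows that standard construction; it does not reproduce the paper's macrocanonical or mutual-information derivation, and all function names are the author's own.

```python
import numpy as np

def train_hebbian(patterns):
    """Build a symmetric Hopfield weight matrix from +/-1 patterns
    via Hebb's rule; w_ij depends only on the local pair (s_i, s_j)."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)      # local outer-product update
    np.fill_diagonal(W, 0)       # no self-connections
    return W / n

def recall(W, state, steps=10):
    """Deterministic asynchronous updates toward a stored attractor."""
    s = state.copy()
    for _ in range(steps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

if __name__ == "__main__":
    pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
    W = train_hebbian(pattern[None, :])
    noisy = pattern.copy()
    noisy[0] = -noisy[0]         # corrupt one bit
    print(np.array_equal(recall(W, noisy), pattern))  # → True
```

Because the Hebbian rule is purely local, each synapse can be trained without global knowledge of the network, which is the property the abstract's learning criterion is said to optimize.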
Keywords :
Hopfield neural nets; learning (artificial intelligence); maximum entropy methods; probability; statistical analysis; Hopfield neural model; locally available information; macrocanonical probability distributions; maximized mutual information; maximum entropy formulation; neural network; optimal learning methodology; Biological system modeling; Biology computing; Degradation; Entropy; Equations; Mutual information; Neural networks; Neurons; Physics; Sampling methods;
Conference_Title :
Proceedings of the 1994 IEEE-IMS Workshop on Information Theory and Statistics
Conference_Location :
Alexandria, VA
Print_ISBN :
0-7803-2761-6
DOI :
10.1109/WITS.1994.513892