DocumentCode :
2344750
Title :
Maximized mutual information using macrocanonical probability distributions
Author :
Fry, Robert L.
Author_Institution :
Appl. Phys. Lab., Johns Hopkins Univ., Laurel, MD, USA
fYear :
1994
fDate :
27-29 Oct 1994
Firstpage :
63
Abstract :
A maximum entropy formulation leads to a neural network that is factorable, in both form and function, into individual neurons corresponding to the Hopfield neural model. A maximized mutual information criterion then dictates the optimal learning methodology using locally available information.
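As a hedged illustration of the abstract's claim (not drawn from the paper itself; a pairwise form of the constraints is assumed for concreteness): a maximum entropy distribution over binary neuron states subject to first- and second-order moment constraints takes the macrocanonical (Boltzmann) form, and its single-neuron conditionals reduce to the logistic update associated with the Hopfield model,
\[
p(\mathbf{x}) = \frac{1}{Z(\boldsymbol{\lambda})}\exp\!\Big(\sum_i \lambda_i x_i + \sum_{i<j} \lambda_{ij}\, x_i x_j\Big),
\qquad
p(x_k = 1 \mid \mathbf{x}_{\setminus k}) = \sigma\!\Big(\lambda_k + \sum_{j\neq k}\lambda_{kj}\, x_j\Big),
\]
where \(\sigma(u) = 1/(1+e^{-u})\) and \(\mathbf{x}\in\{0,1\}^N\); the factorization into per-neuron conditionals is what makes the network decomposable neuron by neuron.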
Keywords :
Hopfield neural nets; learning (artificial intelligence); maximum entropy methods; probability; statistical analysis; Hopfield neural model; locally available information; macrocanonical probability distributions; maximized mutual information; maximum entropy formulation; neural network; optimal learning methodology; Biological system modeling; Biology computing; Degradation; Entropy; Equations; Mutual information; Neural networks; Neurons; Physics; Sampling methods;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Proceedings of the 1994 IEEE-IMS Workshop on Information Theory and Statistics
Conference_Location :
Alexandria, VA
Print_ISBN :
0-7803-2761-6
Type :
conf
DOI :
10.1109/WITS.1994.513892
Filename :
513892