DocumentCode :
3410417
Title :
Coding theory and regularization
Author :
Connor, Jerome T. ; Atlas, Les E.
Author_Institution :
Bellcore, Morristown, NJ, USA
fYear :
1993
fDate :
1993
Firstpage :
158
Lastpage :
167
Abstract :
This paper uses two principles, the robust encoding of residuals and the efficient coding of parameters, to obtain a new learning rule for neural networks. In particular, it examines how different coding techniques give rise to different learning rules. The storage space requirements of parameters and residuals are considered. A "group regularizer" is derived from encoding the parameters as a whole group rather than individually.
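The abstract does not give the regularizer's exact form. As a rough illustration only, the sketch below assumes a group penalty of the familiar group-norm type (penalizing the Euclidean norm of each weight group rather than each weight individually), which captures the idea of coding parameters as a whole group; it is not the paper's derived learning rule, and the names group_regularizer and weight_groups are hypothetical.

```python
import numpy as np

def group_regularizer(weight_groups, lam=1e-3):
    """Illustrative group penalty: the sum of Euclidean norms of whole
    weight groups, scaled by lam.

    Assumption for illustration only; this is not the paper's exact
    coding-derived rule, it merely shows regularizing parameters as a
    group rather than individually.
    """
    return lam * sum(np.linalg.norm(w) for w in weight_groups)

# Hypothetical usage: treat the hidden and output weight matrices of a
# small network as two groups; the penalty would be added to the
# data-fit term (the cost of encoding the residuals).
rng = np.random.default_rng(0)
hidden_w = rng.standard_normal((8, 4))
output_w = rng.standard_normal((4, 1))
penalty = group_regularizer([hidden_w, output_w], lam=1e-3)
print(f"group penalty: {penalty:.4f}")
```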
Keywords :
data compression; encoding; learning (artificial intelligence); neural nets; coding techniques; efficient coding of parameters; group regularizer; learning rule; model selection; neural networks; robust encoding of residuals; storage space requirements; Codes; Encoding; Neural networks; Predictive models; Robustness; Statistics; Training data
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Data Compression Conference, 1993. DCC '93.
Conference_Location :
Snowbird, UT
Print_ISBN :
0-8186-3392-1
Type :
conf
DOI :
10.1109/DCC.1993.253134
Filename :
253134