Title of article :
Sparse coding for layered neural networks
Author/Authors :
Katsuki Katayama, Yasuo Sakata, Tsuyoshi Horiguchi
Issue Information :
Journal with serial issue number, year 2002
Pages :
15
From page :
532
To page :
546
Abstract :
We investigate the storage capacity of two types of fully connected layered neural networks with sparse coding when binary patterns are embedded into the networks by a Hebbian learning rule. The first is a layered network in which the transfer function of the even layers differs from that of the odd layers. The second is a layered network with intra-layer connections, in which the transfer function of the inter-layer connections differs from that of the intra-layer connections, and the neurons are updated alternately through the inter-layer and intra-layer connections. We derive recursion relations for the order parameters by means of the signal-to-noise ratio method, and then apply the self-control threshold method proposed by Dominguez and Bollé to both layered networks with monotonic transfer functions. We find that the critical storage capacity α_C is about 0.11/(a ln a^{-1}) for a ≪ 1 for both layered networks, where a is the neuronal activity. It turns out that the basin of attraction is enlarged for both layered networks when the self-control threshold method is applied.
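Illustrative sketch :
As an informal aside, the following is a minimal sketch of the general setup the abstract describes: sparse binary patterns stored with covariance-type Hebbian couplings between consecutive layers, retrieved with a self-control threshold. It is not the paper's implementation; the threshold prefactor sqrt(2 ln(1/a)), the crude noise estimate, and all names and parameter values (N, a, P, L, xi, J) are assumptions made for illustration.
```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000  # neurons per layer
a = 0.05  # neuronal activity (fraction of active units per pattern)
P = 50    # number of stored patterns (load alpha = P / N)
L = 10    # number of layers to iterate

# Sparse binary patterns, one independent pattern per layer and per
# stored sequence mu -- an assumed setup, for illustration only.
xi = (rng.random((P, L + 1, N)) < a).astype(float)

# Covariance (Hebbian) couplings between consecutive layers:
# J[t] maps the activity of layer t to the local fields of layer t+1.
J = [(xi[:, t + 1, :].T - a) @ (xi[:, t, :] - a) / (a * (1 - a) * N)
     for t in range(L)]

def self_control_threshold(sigma, a):
    # One common self-control prescription in the sparse-coding
    # literature: threshold proportional to the crosstalk noise scale
    # sigma, with prefactor sqrt(2 ln(1/a)). This exact form is an
    # assumption here, not the paper's prescription.
    return np.sqrt(2.0 * np.log(1.0 / a)) * sigma

# Start from a noisy version of pattern mu = 0 at the input layer.
s = xi[0, 0].copy()
flip = rng.random(N) < 0.05
s[flip] = 1.0 - s[flip]

for t in range(L):
    h = J[t] @ s
    sigma = np.sqrt(max(h.var(), 1e-12))  # crude noise estimate
    theta = self_control_threshold(sigma, a)
    s = (h > theta).astype(float)
    # Overlap with the embedded pattern at layer t+1
    m = ((xi[0, t + 1] - a) * s).sum() / (a * (1 - a) * N)
    print(f"layer {t + 1}: overlap m = {m:.3f}, activity = {s.mean():.3f}")
```
At small activity a, the threshold keeps the network's mean activity near a, which is the mechanism the self-control method uses to enlarge the basin of attraction.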
Journal title :
Physica A: Statistical Mechanics and its Applications
Serial Year :
2002
Record number :
867846