Title of article :
Vapnik-Chervonenkis dimension of recurrent neural networks
Author/Authors :
Pascal Koiran, Eduardo D. Sontag
Issue Information :
Journal with serial issue number, year 1998
Pages :
17
From page :
63
To page :
79
Abstract :
Most of the work on the Vapnik-Chervonenkis dimension of neural networks has focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, polynomial, piecewise-polynomial and sigmoidal functions. The bounds depend on two independent parameters: the number w of weights in the network, and the length k of the input sequence. In contrast, for feedforward networks, VC dimension bounds can be expressed as a function of w only. An important difference between recurrent and feedforward nets is that a fixed recurrent net can receive inputs of arbitrary length. Therefore we are particularly interested in the case k ≫ w. Ignoring multiplicative constants, the main results say roughly the following:
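To see why both parameters enter the bounds, here is a minimal sketch of the standard unrolling argument (assumed here; this record does not spell it out). A recurrent network with w weights, run on an input sequence of length k, computes the same map as a feedforward network of depth on the order of k in which the same w parameters are reused at every layer:

$$
x_{t+1} = \sigma\!\left(A x_t + B u_t\right), \qquad t = 1, \dots, k,
$$

where $u_1, \dots, u_k$ is the input sequence, $x_t$ is the network state, $\sigma$ is the activation function, and the entries of $A$ and $B$ comprise the w weights. Feedforward VC dimension bounds applied to this unrolled network, whose size grows linearly with k, therefore yield bounds that depend on both w and k rather than on w alone.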
Journal title :
Discrete Applied Mathematics
Serial Year :
1998
Record number :
884774