• DocumentCode
    2697989
  • Title
    Scaling of back-propagation training time to large dimensions
  • Author
    Wilensky, Gregg D.; Neuhaus, Joseph A.
  • fYear
    1990
  • fDate
    17-21 June 1990
  • Firstpage
    239
  • Abstract
    The training time for the back-propagation neural network algorithm is studied as a function of input dimension for the problem of discriminating between two overlapping multidimensional Gaussian distributions. This problem is simple enough (it is linearly separable for distributions which are not centered at the same point) to allow an analytic determination of the expected performance, yet it is realistic in the sense that many real-world problems have distributions of discriminants which are approximately Gaussian. The simulations are carried out for input dimensions ranging from 1 to 1000 and show that, for large enough N, the training time scales linearly with input dimension, N, when a constant error criterion is used to determine when to terminate training. The slope of this linear dependence is a function of the error criterion and the ratio of the standard deviation to the separation of the two Gaussian distributions. The smaller the separation, the longer the required training time. For each input dimension, a full statistical treatment was implemented by training the network 400 times, with a different random initialization of weights and biases each time. These results provide insight into the ultimate limitations of a straightforward implementation of back-propagation.
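    The experiment described in the abstract can be sketched in a few lines of NumPy. Since the problem is linearly separable, a single sigmoid unit trained by gradient descent stands in here for a full back-propagation network; the stopping rule below counts misclassified training samples rather than the paper's output-error criterion, and all dimensions, sample counts, and learning parameters are illustrative assumptions, not values taken from the paper.

    ```python
    import numpy as np

    def epochs_to_criterion(n_dim, separation=4.0, sigma=1.0, n_samples=200,
                            lr=0.1, criterion=0.05, max_epochs=5000, seed=0):
        """Train a single sigmoid unit by gradient descent on samples from two
        overlapping N-dimensional Gaussians and return the number of epochs
        needed to bring the training misclassification rate below `criterion`.

        All parameter names and default values are illustrative choices,
        not taken from the paper."""
        rng = np.random.default_rng(seed)
        # Class 0 centered at the origin; class 1 offset along the first axis.
        mean1 = np.zeros(n_dim)
        mean1[0] = separation
        x = np.vstack([rng.normal(0.0, sigma, (n_samples, n_dim)),
                       rng.normal(mean1, sigma, (n_samples, n_dim))])
        y = np.concatenate([np.zeros(n_samples), np.ones(n_samples)])

        w = rng.normal(0.0, 0.1, n_dim)  # random initialization, echoing the
        b = 0.0                          # paper's repeated random restarts
        for epoch in range(1, max_epochs + 1):
            out = 1.0 / (1.0 + np.exp(-(x @ w + b)))   # sigmoid activation
            if np.mean((out > 0.5) != y) < criterion:  # fixed error criterion
                return epoch
            # Gradient of the squared output error through the sigmoid.
            grad = (out - y) * out * (1.0 - out)
            w -= lr * (x.T @ grad) / len(y)
            b -= lr * grad.mean()
        return max_epochs

    if __name__ == "__main__":
        # Averaging such runs over many seeds, as the paper does over 400
        # initializations, would trace out the training-time-vs-N curve.
        for n in (10, 100):
            print(n, epochs_to_criterion(n))
    ```

    Repeating this for each input dimension with many random seeds gives the statistical treatment the abstract describes; the slope of epochs versus N would then depend on `criterion` and on `sigma / separation`.
    
    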
  • Keywords
    learning systems; neural nets; back-propagation training time; error criterion; large dimensions; neural network algorithm; overlapping multidimensional Gaussian distributions; simulations
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1990 IJCNN International Joint Conference on Neural Networks
  • Conference_Location
    San Diego, CA, USA
  • Type
    conf
  • DOI
    10.1109/IJCNN.1990.137851
  • Filename
    5726809