  • DocumentCode
    328291
  • Title
    Fast training of backpropagation networks employing threshold logic transform
  • Author
    Tateishi, Masahiko ; Tamura, Shin'ichi ; Matsumoto, Muneaki ; Akita, Shigeyuki
  • Author_Institution
    Res. Lab., Nippondenso Co. Ltd., Aichi, Japan
  • Volume
    1
  • fYear
    1993
  • fDate
    25-29 Oct. 1993
  • Firstpage
    577
  • Abstract
    A backpropagation network model employing the threshold logic transform (TLT) for faster training is proposed. TLT, a simple mathematical transformation, is inserted between the input layer and the hidden layer to speed up the extraction of complex features of the input. Experiments on three classification tasks show that networks with TLT converge 5 to 20 times faster than conventional networks, and that the rate of convergence rises from 33.3% with the conventional method to 99.3% with TLT. Furthermore, analyses of the weighted sums of inputs reveal that the hidden units of the proposed networks effectively extract global features of the input. (See the illustrative sketch after this record.)
  • Keywords
    backpropagation; convergence of numerical methods; feature extraction; neural nets; threshold logic; transforms; backpropagation networks; convergence; faster training; hidden layer; input layer; threshold logic transform; Computer networks; Data mining; Laboratories; Logic; Multi-layer neural network; Network topology; Polynomials; Transfer functions
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
  • Print_ISBN
    0-7803-1421-2
  • Type
    conf
  • DOI
    10.1109/IJCNN.1993.713981
  • Filename
    713981
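
The record above does not reproduce the paper's definition of the threshold logic transform (TLT). What follows is a minimal sketch only, assuming that TLT expands each scalar input into the binary outputs of threshold units at a set of fixed levels before the result reaches the hidden layer; the function name, the threshold levels, and the expansion scheme are illustrative assumptions, not the authors' method.

    import numpy as np

    def threshold_logic_transform(x, thresholds):
        # Hypothetical sketch, not the paper's TLT: expand each scalar
        # input into binary features by comparing it against a set of
        # fixed threshold levels before feeding the hidden layer.
        x = np.asarray(x, dtype=float)           # shape (n_inputs,)
        t = np.asarray(thresholds, dtype=float)  # shape (n_thresholds,)
        # One binary feature per (input, threshold) pair.
        return (x[:, None] >= t[None, :]).astype(float).ravel()

    # Example: 3 inputs against 4 assumed threshold levels -> 12 features
    # that would replace the raw inputs as the hidden layer's input.
    features = threshold_logic_transform([0.2, 0.5, 0.9],
                                         thresholds=[0.25, 0.5, 0.75, 1.0])
    print(features)  # [0. 0. 0. 0. 1. 1. 0. 0. 1. 1. 1. 0.]

Under this assumption, each hidden unit sees a piecewise-constant encoding of the input rather than the raw values, which is one plausible way such a transform could help hidden units extract global input features faster, as the abstract reports.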