• DocumentCode
    274192
  • Title
    Image processing with optimum neural networks
  • Author
    Bichsel, M.
  • Author_Institution
    Paul Scherrer Inst. Zurich, Lab. RCA Ltd., Switzerland
  • fYear
    1989
  • fDate
    16-18 Oct 1989
  • Firstpage
    374
  • Lastpage
    377
  • Abstract
    Although neurons in intermediate layers (hidden units) are very important components of most neural networks, there is a lack of general rules specifying how many hidden layers, and how many hidden units per layer, should be used to achieve optimum performance of a network. This lack of rules has its roots in the difficulty of judging the performance of the hidden units. The most widely used training algorithm, error back-propagation, allows the performance of hidden units to be measured only indirectly, by propagating their output to the output units and then propagating the deviation from a desired output back to the hidden units under examination. To overcome these disadvantages, a performance measure based on information theory is developed which allows every group of units (in particular, groups of hidden units) to be judged. This performance measure enables a network to be trained layer by layer without referring to particular connections in deeper layers.
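    The abstract does not give the exact form of the paper's performance measure. As a rough illustration only, one natural information-theoretic score for a group of hidden units is the mutual information between the group's (discretized) outputs and the target labels, which can be estimated from sample counts; the following NumPy sketch rests on that assumption, not on the paper's actual definition:

    ```python
    import numpy as np

    def layer_information(hidden, labels, n_bins=4):
        """Estimate the mutual information I(H; Y), in bits, between a group
        of hidden-unit outputs and the target labels.

        Illustrative stand-in for the paper's (unspecified) measure: a group
        of units that preserves more label information scores higher, so a
        layer could be trained to maximize this without reference to
        connections in deeper layers.
        """
        # Discretize each unit's real-valued output into n_bins levels.
        edges = np.linspace(hidden.min(), hidden.max(), n_bins + 1)[1:-1]
        codes = np.digitize(hidden, edges)            # shape (samples, units)
        # One joint symbol per sample for the whole group of units.
        states = [tuple(row) for row in codes]

        def entropy(symbols):
            _, counts = np.unique(symbols, return_counts=True, axis=0)
            p = counts / counts.sum()
            return float(-np.sum(p * np.log2(p)))

        joint = [s + (y,) for s, y in zip(states, labels)]
        # I(H; Y) = H(H) + H(Y) - H(H, Y)
        return entropy(states) + entropy(labels) - entropy(joint)
    ```

    With this estimator, a group whose outputs perfectly separate two equiprobable classes yields 1 bit, while a group whose outputs are independent of the labels yields 0 bits.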
  • Keywords
    information theory; learning systems; neural nets; picture processing; error back-propagation; hidden units; image processing; intermediate layer neurons; optimum neural networks; performance measure; training algorithm
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Titel
    First IEE International Conference on Artificial Neural Networks, 1989 (Conf. Publ. No. 313)
  • Conference_Location
    London
  • Type
    conf
  • Filename
    51996