  • DocumentCode
    285252
  • Title
    A two-layer perceptron for nearest neighbor classifier optimization
  • Author
    Yan, Hong
  • Author_Institution
    Sch. of Electr. Eng., Sydney Univ., NSW, Australia
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    585
  • Abstract
    The performance of a nearest-neighbor classifier is degraded if only a small number of training samples are used as prototypes. An algorithm is presented for modifying the prototypes so that the classification rate can be increased. This algorithm makes use of a two-layer perceptron with one second-order input. Each hidden node of the perceptron represents a prototype, and the weights of the connections between a hidden node and the input nodes are initially set equal to the feature values of the corresponding prototype. The weights are then changed using a gradient-based algorithm to generate a new prototype. The algorithm has been tested with good results.
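    The abstract's scheme (prototypes as hidden nodes, weights initialized to prototype feature values, gradient-based updates) can be sketched as follows. This is a hypothetical reconstruction, not the paper's exact formulation: the squared distance stands in for the second-order input, and a softmax over negative squared distances provides a differentiable classification loss to drive the gradient updates; the function name `optimize_prototypes` and all parameter names are illustrative.

    ```python
    import numpy as np

    def optimize_prototypes(X, y, protos, proto_labels, lr=0.1, epochs=50):
        """Tune nearest-neighbor prototypes by gradient descent.

        Hypothetical sketch: prototypes are initialized from training
        samples, squared distances act as the second-order term, and a
        softmax over negative squared distances gives a differentiable
        class probability whose negative log-likelihood is minimized.
        Assumes every class in `y` has at least one prototype.
        """
        P = np.asarray(protos, dtype=float).copy()
        proto_labels = np.asarray(proto_labels)
        for _ in range(epochs):
            for x, t in zip(X, y):
                d2 = ((P - x) ** 2).sum(axis=1)   # squared distances (second-order input)
                z = -d2
                a = np.exp(z - z.max())
                a /= a.sum()                      # softmax activation over prototypes
                mask = proto_labels == t
                p_t = a[mask].sum()               # probability assigned to the true class
                g = a.copy()                      # dL/dz_k for L = -log p_t
                g[mask] -= a[mask] / p_t
                # chain rule: dz_k/dP_k = 2 * (x - P_k)
                P -= lr * g[:, None] * 2.0 * (x - P)
        return P
    ```

    After optimization, classification proceeds as ordinary 1-NN against the updated prototypes; the gradient pulls same-class prototypes toward training samples and pushes other-class prototypes away, in the spirit of learning vector quantization.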
  • Keywords
    feedforward neural nets; optimisation; pattern recognition; classification rate; gradient-based algorithm; hidden node; nearest neighbor classifier optimization; performance; training samples; two-layer perceptron; Australia; Computer errors; Degradation; Multilayer perceptrons; Nearest neighbor searches; Probability distribution; Prototypes; Robustness; Testing; Zinc;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227111
  • Filename
    227111