• DocumentCode
    2770177
  • Title
    Supervised Information Maximization by Weighted Distance
  • Author
    Kamimura, Ryotaro
  • Author_Institution
    Tokai Univ., Hiratsuka
  • fYear
    2006
  • fDate
    0-0 0
  • Firstpage
    1790
  • Lastpage
    1796
  • Abstract
    In this paper, we propose a method to extend information-theoretic competitive learning to supervised competitive learning. We have previously shown that information maximization corresponds to competition among neurons. However, information maximization by itself cannot specify which neurons should be the winners, so teacher information cannot be incorporated directly. To deal with teacher information, we use a weighted distance between input patterns and connection weights: even when the distance between an input pattern and a neuron's connection weights is not small, it is made smaller by a parameter that takes the teacher information into account. With this weighted distance, teacher information is incorporated naturally, and unsupervised competitive learning is extended to supervised information-theoretic competitive learning.
  • Keywords
    unsupervised learning; information-theoretic competitive learning; supervised competitive learning; supervised information maximization; unsupervised competitive learning; weighted distance; Entropy; Euclidean distance; Information science; Information technology; Information theory; Laboratories; Mutual information; Neural networks; Neurons; Supervised learning; Gaussian function; competition; entropy maximization; guide; mutual information maximization; weighted distance;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 2006. IJCNN '06. International Joint Conference on
  • Conference_Location
    Vancouver, BC
  • Print_ISBN
    0-7803-9490-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2006.246896
  • Filename
    1716326
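
The abstract above describes guiding competitive learning with a weighted distance: a neuron's distance to an input pattern is shrunk by a parameter when the teacher information favours that neuron, so mutual-information maximization can be steered toward the desired winners. Below is a minimal, hypothetical Python sketch of that idea; the Gaussian activation, the shrinking factor beta, and the names X, W, labels and neuron_labels are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def competitive_activations(X, W, labels, neuron_labels, sigma=1.0, beta=0.5):
    # Hypothetical sketch: Gaussian competitive activations with a
    # teacher-weighted distance.  Shrinking the distance for neurons
    # assigned to the pattern's class is an assumption about the method.
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=-1)   # squared distances, shape (S, M)
    match = labels[:, None] == neuron_labels[None, :]          # does neuron's class match the label?
    gamma = np.where(match, beta, 1.0)                         # beta < 1 shrinks matching distances
    act = np.exp(-gamma * d2 / (2.0 * sigma ** 2))             # weighted Gaussian activations
    return act / act.sum(axis=1, keepdims=True)                # firing probabilities p(j | s)

def mutual_information(p_js):
    # Mutual information between input patterns and competitive neurons,
    # the quantity maximized in information-theoretic competitive learning.
    p_j = p_js.mean(axis=0)                                    # average firing rates p(j)
    eps = 1e-12
    return float(np.mean(np.sum(p_js * np.log((p_js + eps) / (p_j + eps)), axis=1)))

# Illustrative usage with random data (all values are made up):
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))               # 20 input patterns, 3 features
labels = rng.integers(0, 2, size=20)       # teacher labels for the patterns
W = rng.normal(size=(4, 3))                # 4 competitive neurons
neuron_labels = np.array([0, 0, 1, 1])     # class each neuron is assigned to
p = competitive_activations(X, W, labels, neuron_labels)
print(mutual_information(p))

With beta = 1 the teacher weighting disappears and the sketch reduces to ordinary unsupervised information-theoretic competitive learning; smaller beta pulls the class-matching neurons toward winning, which is how the weighted distance incorporates teacher information in this reading of the abstract.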