  • DocumentCode
    3249563
  • Title
    Maximum likelihood neural networks for adaptive classification
  • Author
    Perlovsky, L.I.; McManus
  • Author_Institution
    Nichols Res. Corp., Wakefield, MA, USA
  • fYear
    1989
  • fDate
    0-0 1989
  • Abstract
    Summary form only given, as follows. A maximum likelihood neural network has been designed for problems that require adaptive estimation of metrics in classification spaces. Examples of such problems are the XOR problem and most classification problems with multiple classes and complicated classifier boundaries. The metric estimation achieves flexible classifier boundary shapes using a simple architecture without hidden layers. This neural network learns much more efficiently than other neural networks or classification algorithms, and it approaches the theoretical bounds on adaptive efficiency given by the Cramér-Rao theorem. It also provides for optimal fusion of all available information, such as a priori and real-time information coming from a variety of sensors of the same or different types, and it uses fuzzy classification variables for efficient use of incomplete or erroneous data, including numeric and symbolic data.
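
    A minimal sketch of the idea stated in the abstract, not the authors' network: maximum-likelihood classification with an adaptively estimated metric, illustrated on the XOR problem. Each class is modeled here as a Gaussian mixture whose means and covariances (the learned "metric") are fit by maximum likelihood, and a point is assigned to the class under which it is most likely. The use of scikit-learn's GaussianMixture and the noise level are assumptions for illustration; the paper's actual architecture and learning rules are not specified in this record.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # XOR training data: class 0 = {(0,0),(1,1)}, class 1 = {(0,1),(1,0)},
    # with small Gaussian jitter so per-class covariances are well defined.
    rng = np.random.default_rng(0)

    def jitter(points, n=50):
        base = np.repeat(np.asarray(points, dtype=float), n, axis=0)
        return base + 0.05 * rng.standard_normal(base.shape)

    X0 = jitter([(0, 0), (1, 1)])
    X1 = jitter([(0, 1), (1, 0)])

    # One mixture per class: two Gaussian components each, full covariance
    # matrices estimated by maximum likelihood (EM).
    models = [
        GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
        for X in (X0, X1)
    ]

    def classify(points):
        # log p(x | class) under each class model; pick the maximum-likelihood class.
        log_liks = np.stack([m.score_samples(points) for m in models], axis=1)
        return log_liks.argmax(axis=1)

    print(classify(np.array([[0, 0], [1, 1], [0, 1], [1, 0]])))  # expected: [0 0 1 1]

    No hidden layers are involved: the flexible (here, quadratic) boundaries come entirely from the estimated class-conditional covariances, which mirrors the abstract's claim about metric estimation with a simple architecture.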
  • Keywords
    adaptive systems; learning systems; neural networks; pattern recognition; Cramér-Rao theorem; XOR problem; adaptive classification; adaptive efficiency; adaptive estimation; flexible classifier boundary shapes; fuzzy classification variables; maximum likelihood neural network; metric estimation; numeric data; symbolic data
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1989
  • Conference_Location
    Washington, DC, USA
  • Type
    conf
  • DOI
    10.1109/IJCNN.1989.118393
  • Filename
    118393