DocumentCode
1509971
Title
Learning capacity and sample complexity on expert networks
Author
Fu, LiMin
Author_Institution
Dept. of Comput. & Inf. Sci., Florida Univ., Gainesville, FL, USA
Volume
7
Issue
6
fYear
1996
fDate
1 November 1996
Firstpage
1517
Lastpage
1520
Abstract
A major development in knowledge-based neural networks is the integration of symbolic expert rule-based knowledge into neural networks, resulting in so-called rule-based neural (or connectionist) networks. An expert network here refers to a particular construct in which the uncertainty management model of symbolic expert systems is mapped into the activation function of the neural network. This paper addresses a yet-to-be-answered question: why can expert networks generalize more effectively from a finite number of training instances than multilayered perceptrons? It formally shows that expert networks reduce generalization dimensionality and require relatively small sample sizes for correct generalization.
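To make the abstract's central idea concrete, the sketch below shows what it can mean to map an expert system's uncertainty-management model into a unit's activation function. This is a hypothetical illustration, not the paper's construction: it assumes an EMYCIN-style certainty-factor (CF) combination rule, with `cf_combine` and `expert_unit` as invented names.

```python
# Hedged sketch of an "expert network" unit: weighted evidence is folded
# through a certainty-factor combination rule instead of the usual
# weighted-sum-plus-sigmoid of a multilayer perceptron.
# The EMYCIN-style CF rule here is an assumption for illustration only.

def cf_combine(x: float, y: float) -> float:
    """Combine two certainty factors in [-1, 1] (EMYCIN-style rule)."""
    if x >= 0 and y >= 0:
        return x + y - x * y
    if x <= 0 and y <= 0:
        return x + y + x * y
    denom = 1 - min(abs(x), abs(y))
    return (x + y) / denom if denom != 0 else 0.0


def expert_unit(inputs, weights):
    """Activation of one unit: fold weighted inputs through cf_combine."""
    cf = 0.0
    for a, w in zip(inputs, weights):
        # Clip each piece of weighted evidence into the CF range [-1, 1].
        cf = cf_combine(cf, max(-1.0, min(1.0, a * w)))
    return cf
```

For example, two pieces of positive evidence with weighted strength 0.5 each combine to 0.5 + 0.5 - 0.25 = 0.75 rather than summing to 1.0; the combination rule, not a squashing function, bounds the activation.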
Keywords
generalisation (artificial intelligence); knowledge based systems; learning (artificial intelligence); neural nets; symbol manipulation; uncertainty handling; activation function; generalization dimensionality; knowledge-based neural networks; learning capacity; neural networks; rule-based neural networks; sample complexity; symbolic expert rule-based knowledge; uncertainty management model; Artificial intelligence; Artificial neural networks; Expert systems; Management training; Multi-layer neural network; Multilayer perceptrons; Network topology; Neural networks; Problem-solving; Uncertainty;
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/72.548180
Filename
548180