Title :
Learning capacity and sample complexity on expert networks
Author_Institution :
Dept. of Comput. & Inf. Sci., Florida Univ., Gainesville, FL, USA
Date :
11/1/1996
Abstract :
A major development in knowledge-based neural networks is the integration of symbolic, expert rule-based knowledge into neural networks, resulting in so-called rule-based neural (or connectionist) networks. An expert network here refers to a particular construct in which the uncertainty management model of symbolic expert systems is mapped into the activation function of the neural network. This paper addresses a yet-to-be-answered question: why can expert networks generalize more effectively from a finite number of training instances than multilayer perceptrons? It formally shows that expert networks reduce generalization dimensionality and require relatively small sample sizes for correct generalization.
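To make the construct in the abstract concrete, the sketch below models an expert-network node whose activation folds incoming evidence with a MYCIN-style certainty-factor combination instead of a weighted sum passed through a sigmoid. This is an illustrative assumption, not the paper's exact formulation: the names `combine_cf` and `expert_node` are hypothetical, and the paper's specific uncertainty management model is not reproduced here.

```python
# Hypothetical sketch of an "expert network" node (assumption: MYCIN-style
# certainty factors in [-1, 1] as the uncertainty management model).
# Each incoming rule contributes cf = weight * input; the node's activation
# combines contributions with the CF combination rule rather than a sigmoid.

def combine_cf(a: float, b: float) -> float:
    """MYCIN certainty-factor combination for a, b in [-1, 1]."""
    if a >= 0 and b >= 0:
        return a + b - a * b
    if a <= 0 and b <= 0:
        return a + b + a * b
    return (a + b) / (1 - min(abs(a), abs(b)))

def expert_node(inputs: list[float], weights: list[float]) -> float:
    """Node activation: fold weighted evidence with combine_cf."""
    cf = 0.0
    for x, w in zip(inputs, weights):
        cf = combine_cf(cf, x * w)
    return cf
```

Because the combination rule is monotone and saturating on [-1, 1], such a node realizes a far more constrained function class than an unconstrained perceptron unit, which is one intuition behind the reduced generalization dimensionality the abstract claims.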
Keywords :
generalisation (artificial intelligence); knowledge based systems; learning (artificial intelligence); neural nets; symbol manipulation; uncertainty handling; activation function; generalization dimensionality; knowledge-based neural networks; learning capacity; neural networks; rule-based neural networks; sample complexity; symbolic expert rule-based knowledge; uncertainty management model; Artificial intelligence; Artificial neural networks; Expert systems; Management training; Multi-layer neural network; Multilayer perceptrons; Network topology; Neural networks; Problem-solving; Uncertainty;
Journal_Title :
Neural Networks, IEEE Transactions on