Title :
Taking laws out of trained neural networks
Author :
Jaroslaw Majewski; Ryszard Wojtyna
Author_Institution :
University of Technology and Life Sciences, Faculty of Telecommunication &
Abstract :
In this paper, the problem of discovering numeric laws governing a trained neural network is considered. We propose new multilayer perceptrons implementing fractional rational functions, i.e. functions expressed as a ratio of two polynomials of any order, with a given number of components in the numerator and denominator. Our networks can be utilized not only to implement such functions; they can also be used to extract knowledge embedded in the trained network. This extraction is performed during the training process. The extracted laws underlying the network operation are expressed in symbolic, fractional-rational-function form, and our networks provide information about the function parameters. The extraction ability results from applying proper activation functions in the different perceptron layers, i.e. functions of the exp(.), ln(.), (.)^(-1) and/or (.)^2 types. Both theoretical considerations and simulation results are presented to illustrate the properties of our networks.
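To make the mechanism concrete, here is a minimal numerical sketch (not the authors' exact architecture; all weights and the example function are assumed for illustration): for positive inputs, a hidden unit computing exp(w * ln(x)) = x^w realizes a monomial, a linear output layer sums monomials into a polynomial, and two such branches combined through a (.)^(-1) activation yield a ratio of polynomials. Reading off the exponents and coefficients of a trained network of this form gives the symbolic rational function directly.

```python
import numpy as np

def poly_branch(x, exponents, coeffs):
    """Polynomial sum_i coeffs[i] * x**exponents[i], built from ln/exp units.

    Each hidden unit computes exp(w * ln(x)) = x**w (valid for x > 0);
    the linear output weights `coeffs` then sum the monomials.
    """
    monomials = np.exp(np.outer(np.log(x), exponents))  # column i is x**exponents[i]
    return monomials @ coeffs

def rational_net(x, num_exp, num_coef, den_exp, den_coef):
    """Two polynomial branches combined via a (.)^(-1) activation: P(x)/Q(x)."""
    p = poly_branch(x, num_exp, num_coef)  # numerator polynomial P(x)
    q = poly_branch(x, den_exp, den_coef)  # denominator polynomial Q(x)
    return p * q**-1                       # (.)^(-1) activation on the denominator

# Hypothetical "trained" weights encoding f(x) = (1 + 2x) / (1 + x^2):
x = np.array([0.5, 1.0, 2.0])
y = rational_net(x,
                 num_exp=np.array([0.0, 1.0]), num_coef=np.array([1.0, 2.0]),
                 den_exp=np.array([0.0, 2.0]), den_coef=np.array([1.0, 1.0]))
print(y)  # matches (1 + 2*x) / (1 + x**2) at the sample points
```

In this toy form the exponent weights are fixed; in a trainable version they would be learned parameters, and inspecting them after training is what yields the extracted law in symbolic form.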
Keywords :
Polynomials; Mathematical model; Artificial neural networks; Knowledge engineering; Training; Neurons
Conference_Title :
Signal Processing Algorithms, Architectures, Arrangements, and Applications Conference Proceedings (SPA), 2010
Print_ISBN :
978-1-4577-1485-6