Title :
An empirical measure of element contribution in neural networks
Author :
Mak, Brenda ; Blanning, Robert W.
Author_Institution :
Dept. of Comput. Sci. & Inf. Syst., Hong Kong Univ., Hong Kong
Date :
11/1/1998 12:00:00 AM
Abstract :
A frequent complaint about neural net models is that they fail to explain their results in any useful way. The problem is not a lack of information, but an abundance of information that is difficult to interpret. Once trained, a neural net will provide a predicted output for a posited input, and it can provide additional information in the form of interelement connection strengths. This latter information is of little use to analysts and managers who wish to interpret the results they have been given. We develop a measure of the relative importance of the various input elements and hidden layer elements, and we use this measure to interpret the contribution of these components to the outputs of the neural net.
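To make the idea concrete, the following is a minimal sketch of a weight-based contribution estimate in the spirit the abstract describes — it is not the authors' actual measure, and all names (`input_contributions`, `w_ih`, `w_ho`) are hypothetical. It follows a Garson-style scheme for a single-hidden-layer network: each input's share of a hidden unit's incoming weight magnitude is scaled by that hidden unit's outgoing weight to the output, then normalized so the contributions sum to one.

```python
# Illustrative sketch only (not the paper's exact measure): a Garson-style,
# weight-magnitude estimate of input-element contribution for a network with
# one hidden layer and a single output.
# w_ih[i][j] = weight from input i to hidden unit j (assumed layout)
# w_ho[j]    = weight from hidden unit j to the output

def input_contributions(w_ih, w_ho):
    """Return each input element's relative contribution (sums to 1)."""
    n_inputs = len(w_ih)
    n_hidden = len(w_ho)
    raw = [0.0] * n_inputs
    for j in range(n_hidden):
        # Share of hidden unit j's incoming weight mass owned by each
        # input, scaled by the magnitude of j's weight to the output.
        col_sum = sum(abs(w_ih[i][j]) for i in range(n_inputs))
        if col_sum == 0:
            continue
        for i in range(n_inputs):
            raw[i] += (abs(w_ih[i][j]) / col_sum) * abs(w_ho[j])
    total = sum(raw)
    return [r / total for r in raw] if total else raw

# Toy example: two inputs, two hidden units.
w_ih = [[0.9, 0.1],   # input 0
        [0.1, 0.3]]   # input 1
w_ho = [0.8, 0.2]
print(input_contributions(w_ih, w_ho))  # → [0.77, 0.23]
```

Because only weight magnitudes are used, this kind of measure can be read off a trained net without rerunning it on data, which is part of its appeal for post hoc interpretation.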
Keywords :
explanation; learning (artificial intelligence); neural nets; clustering methods; element contribution measurement; hidden layer elements; input elements; interelement connection strengths; neural networks; predicted output; Business communication; Clustering methods; Information analysis; Intelligent networks; Intelligent structures; Intelligent systems; Mathematical model; Mathematical programming; Neural networks; Weight measurement
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
DOI :
10.1109/5326.725342