DocumentCode
329121
Title
Nonparametric density estimation by a self-consistent neural network
Author
Rogers, George W.; Szu, Harold H.; Priebe, Carey E.; Solka, Jeffrey L.
Author_Institution
Naval Surface Warfare Center, Dahlgren, VA, USA
Volume
2
fYear
1993
fDate
25-29 Oct. 1993
Firstpage
2001
Abstract
An improvement to the classic adaptive kernel estimator is made by incorporating first-order dynamics in a neural network framework, yielding a fully self-consistent probability density function (pdf) estimate. The dynamics give rise to nonlinear interactions between the kernel parameters; this contrasts with the adaptive kernel estimator, which is a simple three-step procedure. Adaptive kernel estimates achieve asymptotic convergence rates of O(h^4) when the errors in the pilot estimate can be ignored, compared with standard kernel estimators, which converge as O(h^2). Being fully self-consistent, this approach also approaches the theoretical O(h^4) convergence rate while providing smoother estimates of the distribution tails than the adaptive kernel estimator. A one-dimensional application to the estimation of a log-normal distribution is included as an example.
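For context, the "simple three-step procedure" the abstract refers to is the classic Abramson-style adaptive kernel estimator: a fixed-bandwidth pilot estimate, local bandwidth factors derived from the pilot, and a final estimate with per-point bandwidths. The sketch below illustrates that baseline procedure only, not the paper's self-consistent neural-network method; the function name, the Silverman rule-of-thumb pilot bandwidth, and the sensitivity exponent `alpha = 0.5` are illustrative assumptions.

```python
import numpy as np

def adaptive_kernel_pdf(data, x, alpha=0.5):
    """Three-step adaptive kernel density estimate with Gaussian kernels.

    Illustrative sketch of the classic (non-self-consistent) procedure:
    pilot estimate -> local bandwidth factors -> adaptive estimate.
    """
    data = np.asarray(data, dtype=float)
    n = len(data)

    # Step 1: pilot estimate at the data points, using a fixed
    # rule-of-thumb bandwidth (Silverman's rule, an assumption here).
    h = 1.06 * np.std(data) * n ** (-1.0 / 5.0)
    pilot = np.array([
        np.mean(np.exp(-0.5 * ((xi - data) / h) ** 2)) / (h * np.sqrt(2 * np.pi))
        for xi in data
    ])

    # Step 2: local bandwidth factors lambda_i = (pilot_i / g)^(-alpha),
    # where g is the geometric mean of the pilot values. Low-density
    # points (tails) get wider kernels, which smooths the tails.
    g = np.exp(np.mean(np.log(pilot)))
    lam = (pilot / g) ** (-alpha)

    # Step 3: final estimate with per-point bandwidths h * lambda_i.
    hs = h * lam
    return np.array([
        np.mean(np.exp(-0.5 * ((xi - data) / hs) ** 2) / (hs * np.sqrt(2 * np.pi)))
        for xi in x
    ])
```

The paper's contribution, as described, replaces step 3's one-shot update with first-order dynamics so that the kernel parameters and the density estimate are determined self-consistently rather than from a single pilot pass.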
Keywords
adaptive estimation; neural nets; nonparametric statistics; probability; asymptotic convergence rates; classic adaptive kernel estimator; first order dynamics; log-normal distribution; nonlinear interactions; nonparametric density estimation; self-consistent neural network; self-consistent probability density function estimate; Bandwidth; Convergence; Data analysis; Kernel; Log-normal distribution; Neural networks; Nonlinear equations; Power system modeling; Probability density function; Probability distribution;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN
0-7803-1421-2
Type
conf
DOI
10.1109/IJCNN.1993.717050
Filename
717050
Link To Document