• DocumentCode
    303193
  • Title
    Estimating the multivariate conditional density using relatively sparse training data pairs
  • Author
    Davis, Daniel T.; Hwang, Jenq-Neng

  • Author_Institution
    Dept. of Electr. Eng., Washington Univ., Seattle, WA, USA
  • Volume
    1
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    31
  • Abstract
    Unlike most nonlinear system identification tasks, which focus on determining the unknown system functions, parametrically or nonparametrically, to be used in future prediction of the output values given specific inputs, many statistical signal processing applications require estimation of the corresponding conditional density function so that a probabilistic decision can be made. In this paper, we propose a new method to estimate the multivariate conditional density f(m|x), a density over the output space m conditioned on any given input x. In particular, we are interested in cases where the available training data points are relatively sparse in the x space. We start from a priori considerations and establish certain desirable characteristics of kernel functions for conditional density estimation. We find that Gaussian kernels with expanding covariances, expanding as we move away from the data point of the kernel, satisfy these a priori considerations. We combine these expanding Gaussian kernels (EGK) according to Bayesian techniques. We compare the EGK with standard Gaussian kernel (SGK) methods and find that the EGK avoids multimodality, has diminishing confidence levels farther from the training points, performs better asymptotically, and performs better with respect to the Kullback-Leibler criterion.
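    The abstract describes the approach only at a high level. Below is a minimal, hypothetical Python sketch of the general idea of conditional density estimation with distance-expanding Gaussian kernels: each training pair (x_i, m_i) contributes a Gaussian in the output space centred at m_i whose covariance grows with the distance from the query input x to x_i. The function and parameter names (egk_conditional_density, base_sigma, growth) and the simple distance-based weighting are assumptions for illustration, not the authors' formulation or their Bayesian combination rule.

```python
# Illustrative sketch (not the paper's exact method): conditional density
# estimation with Gaussian kernels whose covariance expands with the
# distance between the query input x and each training input x_i.
import numpy as np
from scipy.stats import multivariate_normal


def egk_conditional_density(m, x, X_train, M_train, base_sigma=0.5, growth=1.0):
    """Estimate f(m | x) as a mixture of Gaussians over the output space.

    Each training pair (x_i, m_i) contributes a Gaussian centred at m_i whose
    covariance grows with ||x - x_i||, so the estimate spreads out (confidence
    diminishes) far from the training points.  base_sigma and growth are
    hypothetical tuning parameters, not quantities from the paper.
    """
    m = np.atleast_1d(np.asarray(m, dtype=float))
    x = np.atleast_1d(np.asarray(x, dtype=float))
    weights, components = [], []
    for x_i, m_i in zip(np.atleast_2d(X_train), np.atleast_2d(M_train)):
        d = np.linalg.norm(x - x_i)
        sigma = base_sigma + growth * d                  # kernel width expands with distance
        cov = (sigma ** 2) * np.eye(m.size)
        components.append(multivariate_normal.pdf(m, mean=m_i, cov=cov))
        weights.append(np.exp(-0.5 * (d / base_sigma) ** 2))  # nearer pairs count more
    weights = np.asarray(weights)
    weights /= weights.sum() + 1e-300                    # guard against all-zero weights
    return float(np.dot(weights, components))


# Example: sparse 2-D inputs, scalar outputs.
rng = np.random.default_rng(0)
X_train = rng.uniform(-1.0, 1.0, size=(20, 2))
M_train = np.sin(X_train.sum(axis=1, keepdims=True))
print(egk_conditional_density(m=[0.1], x=[0.2, -0.3], X_train=X_train, M_train=M_train))
```

    The paper's actual Bayesian combination of the expanding kernels and its asymptotic comparison against standard Gaussian kernels are given in the full text (see the DOI below).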
  • Keywords
    Bayes methods; identification; nonlinear systems; signal processing; statistical analysis; Bayesian techniques; conditional density estimation; expanding Gaussian kernels; multivariate conditional density estimation; nonlinear system identification; relatively sparse training data pairs; statistical signal processing; Bandwidth; Bayesian methods; Density functional theory; Information analysis; Information processing; Kernel; Laboratories; Nonlinear systems; Signal processing; Training data
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1996. IEEE International Conference on
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.548862
  • Filename
    548862