Title :
Error bounds for functional approximation and estimation using mixtures of experts
Author :
Zeevi, Assaf J. ; Meir, Ron ; Maiorov, Vitaly
Author_Institution :
Inf. Syst. Lab., Stanford Univ., CA, USA
Date :
5/1/1998
Abstract :
We examine some mathematical aspects of learning unknown mappings with the mixture of experts model (MEM). Specifically, we observe that the MEM is at least as powerful as a class of neural networks, in a sense that will be made precise. Upper bounds on the approximation error are established for a wide class of target functions. The general theorem states that ||f - f_n||_p ≤ c n^{-r/d} for f ∈ W_p^r(L) (a Sobolev class over [-1,1]^d), where f_n belongs to an n-dimensional manifold of normalized ridge functions. The same bound holds for the MEM as a special case of the above. The stochastic error, in the context of learning from independent and identically distributed (i.i.d.) examples, is also examined. An asymptotic analysis establishes the limiting behavior of this error in terms of certain pseudo-information matrices. These results substantiate the intuition behind the MEM and motivate applications.
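To make the model concrete, the following is a minimal sketch of the prediction rule of a standard softmax-gated mixture of experts, f_n(x) = Σ_i g_i(x) h_i(x), where the g_i are gating probabilities and the h_i are ridge-type expert outputs. The function name, the choice of tanh experts, and the parameter shapes are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def mem_predict(x, gate_w, gate_b, expert_w, expert_b):
    """Softmax-gated mixture-of-experts prediction (illustrative sketch).

    x        : input vector, shape (d,)
    gate_w   : gating weights, shape (n, d);  gate_b   : gating biases, shape (n,)
    expert_w : expert weights, shape (n, d);  expert_b : expert biases, shape (n,)
    Returns the scalar f_n(x) = sum_i g_i(x) * h_i(x).
    """
    # Gating network: softmax over n logits (stabilized by subtracting the max).
    logits = gate_w @ x + gate_b
    g = np.exp(logits - logits.max())
    g /= g.sum()                      # gating probabilities, sum to 1
    # Experts: ridge functions h_i(x) = tanh(w_i . x + b_i) (assumed form).
    h = np.tanh(expert_w @ x + expert_b)
    # Convex combination of expert outputs.
    return float(g @ h)
```

Because the output is a convex combination of tanh values, the prediction always lies in [-1, 1], consistent with the normalized-ridge-function manifold discussed in the abstract.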
Keywords :
cooperative systems; error analysis; function approximation; information theory; learning (artificial intelligence); neural nets; parameter estimation; statistical analysis; stochastic processes; Sobolev class; applications; approximation error; asymptotic analysis; error bounds; estimation; estimation error; functional approximation; learning unknown mappings; limiting behavior; mathematical aspects; mixture of experts model; multivariate regression; n-dimensional manifold; normalized ridge functions; pseudo-information matrices; stochastic error; target functions; upper bounds; mathematical model; neural networks; time series analysis; vectors
Journal_Title :
IEEE Transactions on Information Theory