Title :
ELITE: Ensemble of Optimal Input-Pruned Neural Networks Using TRUST-TECH
Author :
Wang, Bin ; Chiang, Hsiao-Dong
Author_Institution :
Sch. of Electr. & Comput. Eng., Cornell Univ., Ithaca, NY, USA
Abstract :
The ensemble of optimal input-pruned neural networks using TRUST-TECH (ELITE) method is developed for constructing high-quality ensembles through an optimal linear combination of accurate and diverse neural networks. The optimization problems in the proposed methodology are solved by a global optimization method called TRansformation Under Stability-reTaining Equilibrium Characterization (TRUST-TECH), whose main feature is the capability of identifying multiple local optimal solutions in a deterministic, systematic, and tier-by-tier manner. ELITE creates a diverse population by applying a feature selection procedure to the different local optimal neural networks obtained by the tier-1 TRUST-TECH search. In addition, the capability of each input-pruned network is fully exploited through TRUST-TECH-based optimal training. Finally, finding the optimal linear combination weights for an ensemble is modeled as a nonlinear programming problem and solved using TRUST-TECH and the interior point method, so that the issue of non-convexity is effectively handled. Extensive numerical experiments have been carried out for pattern classification on synthetic and benchmark datasets. Numerical results show that ELITE consistently outperforms existing methods on the benchmark datasets and is promising for constructing high-quality neural network ensembles.
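The final step described above, choosing the linear combination weights for the ensemble by constrained nonlinear optimization, can be illustrated with a minimal sketch. This is not the authors' implementation: the simplex constraint (non-negative weights summing to one), the mean-squared-error objective, and the use of SciPy's "trust-constr" solver as a stand-in interior-point-style method are all assumptions for illustration.

import numpy as np
from scipy.optimize import minimize, LinearConstraint, Bounds

def combine_weights(preds, targets, seed_weights):
    """Hypothetical sketch of the ensemble-weight optimization step.

    preds: (n_nets, n_samples) validation outputs of the member networks.
    targets: (n_samples,) desired outputs.
    seed_weights: (n_nets,) initial guess, e.g., one of several starting
    points (non-convexity is mitigated by comparing multiple starts).
    """
    n_nets = preds.shape[0]

    def ensemble_mse(w):
        # Squared error of the weighted ensemble output (assumed objective).
        return np.mean((w @ preds - targets) ** 2)

    res = minimize(
        ensemble_mse,
        seed_weights,
        method="trust-constr",      # interior-point/trust-region style solver
        bounds=Bounds(0.0, 1.0),    # assumed non-negativity of the weights
        constraints=[LinearConstraint(np.ones((1, n_nets)), 1.0, 1.0)],  # weights sum to 1
    )
    return res.x

Running this from several seed weight vectors and keeping the lowest-error solution mimics, in spirit, using multiple local optima rather than a single convex solve.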
Keywords :
feature extraction; learning (artificial intelligence); neural nets; nonlinear programming; pattern classification; ELITE; TRUST-TECH; feature selection; global optimization method; interior point method; neural network ensemble; nonlinear programming; optimal input-pruned neural network; optimal linear combination; pattern classification; transformation under stability-retaining equilibrium characterization; Accuracy; Artificial neural networks; Benchmark testing; Diversity reception; Optimization; Systematics; Training; Feature selection; global optimization; neural network ensemble; optimal linear combination; transformation under stability-retaining equilibrium characterization (TRUST-TECH); Algorithms; Artificial Intelligence; Computer Simulation; Linear Models; Neural Networks (Computer); Neurons; Nonlinear Dynamics; Software; Software Design;
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2010.2087354