Title :
Balancing ensemble learning through error shift
Author_Institution :
Sch. of Comput. Sci. & Eng., Univ. of Aizu, Aizu-Wakamatsu, Japan
Abstract :
In neural network learning, it has often been observed that some data are learned extremely well while others are barely learned. Such unbalanced learning often leads to learned neural networks or neural network ensembles that are too strongly biased toward the well-learned data. A stronger bias can contribute to larger variance and poorer generalization on unseen data. It is necessary to prevent a learned model from being strongly biased, especially when the model has unnecessarily large complexity for the application. This paper shows how balanced ensemble learning can guide learning toward being less biased through error shift and create weak learners in an ensemble.
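The abstract does not give the exact error-shift formulation, so the following is only a minimal conceptual sketch of the idea it describes: the per-example error is shifted so that examples already fit within a margin contribute no gradient, pushing subsequent training toward the poorly-learned examples and keeping each ensemble member weak. The function names, the linear learners, and the margin value are illustrative assumptions, not the paper's method.

import numpy as np

def shifted_residual(y_pred, y_true, margin=0.25):
    """Residual 'shifted' so well-fit examples contribute nothing.

    Examples whose absolute error is already below `margin` are treated as
    learned and produce zero error/gradient (margin value is an assumption).
    """
    r = y_pred - y_true
    return np.sign(r) * np.maximum(np.abs(r) - margin, 0.0)

def train_balanced_ensemble(X, y, n_learners=5, epochs=200, lr=0.1, margin=0.25):
    """Train a small ensemble of linear 'weak' learners with the shifted error.

    Each learner is updated against the ensemble's shifted residual, so no
    learner keeps pushing on points the ensemble already fits well.
    """
    rng = np.random.default_rng(0)
    n, d = X.shape
    weights = [rng.normal(scale=0.01, size=d) for _ in range(n_learners)]
    for _ in range(epochs):
        for w in weights:
            # ensemble output is the simple average of all learners
            ensemble_pred = np.mean([X @ v for v in weights], axis=0)
            g = shifted_residual(ensemble_pred, y, margin)
            # gradient of 0.5 * shifted residual^2 w.r.t. this learner's weights
            w -= lr * (X.T @ g) / (n * n_learners)
    return weights

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
    learners = train_balanced_ensemble(X, y)
    pred = np.mean([X @ w for w in learners], axis=0)
    print("fraction of examples fit within the margin:",
          np.mean(np.abs(pred - y) < 0.25))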
Keywords :
computational complexity; generalisation (artificial intelligence); learning (artificial intelligence); mean square error methods; neural nets; ensemble learning balancing; error shift; generalization; neural network ensembles; neural network learning; unbalanced learning; Correlation; Diseases; Error analysis; Heart; Neural networks; Training;
Conference_Title :
2011 Fourth International Workshop on Advanced Computational Intelligence (IWACI)
Conference_Location :
Wuhan
Print_ISBN :
978-1-61284-374-2
DOI :
10.1109/IWACI.2011.6160030