Title :
Improved Neural Based Writer Adaptation for On-Line Recognition Systems
Author :
Haddad, Lobna; Hamdani, Tarek M.; Alimi, Adel M.
Author_Institution :
Nat. Eng. Sch. of Sfax (ENIS), Univ. of Sfax, Sfax, Tunisia
Abstract :
The adaptation module is a Radial Basis Function Neural Network (RBF-NN) that can be connected to the output of any recognition system; its aim is to examine the output of the writer-independent system and produce a corrected output vector closer to the desired response. The proposed adaptation module is built with an incremental training procedure named the GA-AM algorithm (Growing-Adjustment Adaptation Module). Two adaptation strategies are applied: growing and adjustment. The growing criteria are based on estimating the significance of the new input and the significance of the nearest unit relative to that input. The adjustment updates the parameters of two specific units (the nearest unit and the desired-class contributor) using standard LMS gradient descent to decrease the error each time no new unit is allocated. The new training algorithm is evaluated through the adaptation of two handwriting recognition systems. The results, reported in terms of cumulative error, show that the GA-AM algorithm decreases the classification error and rapidly adapts the recognition system to a specific user's handwriting. A performance comparison of the GA-AM training algorithm with two other adaptation strategies, based on four writer-dependent datasets, is presented.
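The growing/adjustment loop described in the abstract can be sketched as follows. This is a minimal illustrative reconstruction, not the authors' implementation: the novelty-distance and error thresholds, the shared RBF width, and the way the "desired contributor" unit is chosen are all assumptions, since the paper's exact significance criteria are not reproduced here.

```python
import numpy as np

class GAAdaptationModule:
    """Hypothetical sketch of a growing RBF adaptation layer (GA-AM style).

    It receives the writer-independent classifier's output vector and adds
    RBF-weighted corrections pushing it toward the desired class vector.
    """

    def __init__(self, novelty_dist=0.5, err_thresh=0.1, lr=0.1, width=0.5):
        self.novelty_dist = novelty_dist  # growing distance threshold (assumption)
        self.err_thresh = err_thresh      # growing error threshold (assumption)
        self.lr = lr                      # LMS learning rate
        self.width = width                # shared RBF width (assumption)
        self.centers = []                 # RBF unit centers
        self.weights = []                 # output correction weight per unit

    def _phi(self, x, c):
        # Gaussian radial basis activation of unit centered at c.
        return np.exp(-np.sum((x - c) ** 2) / (2.0 * self.width ** 2))

    def predict(self, x):
        # Pass the classifier output through, plus the units' corrections.
        y = np.array(x, dtype=float)
        for c, w in zip(self.centers, self.weights):
            y += self._phi(x, c) * w
        return y

    def train_step(self, x, target):
        x = np.asarray(x, dtype=float)
        target = np.asarray(target, dtype=float)
        if not self.centers:
            # First sample: allocate an initial unit that cancels the error.
            self.centers.append(x.copy())
            self.weights.append(target - x)
            return
        dists = [np.linalg.norm(x - c) for c in self.centers]
        nearest = int(np.argmin(dists))
        err = target - self.predict(x)
        if dists[nearest] > self.novelty_dist and np.linalg.norm(err) > self.err_thresh:
            # Growing: the input is far from every existing unit and the
            # residual error is significant, so allocate a new unit.
            self.centers.append(x.copy())
            self.weights.append(err)
        else:
            # Adjustment: LMS update of the nearest unit and of the unit
            # contributing most to the desired class (they may coincide).
            desired = int(np.argmax(target))
            contrib = [self._phi(x, c) * w[desired]
                       for c, w in zip(self.centers, self.weights)]
            best = int(np.argmax(contrib))
            for i in {nearest, best}:
                phi = self._phi(x, self.centers[i])
                self.weights[i] += self.lr * phi * err
                self.centers[i] += self.lr * phi * (x - self.centers[i])
```

Used incrementally per writer sample, the module leaves the underlying writer-independent recognizer untouched and only learns an output-space correction, which is what lets it plug into "any recognition system" as the abstract claims.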
Keywords :
gradient methods; handwriting recognition; learning (artificial intelligence); least mean squares methods; radial basis function networks; GA-AM training algorithm; RBF-NN; cumulative error; desired contributor parameter; growing criteria; growing-adjustment adaptation module; handwriting recognition systems; incremental training; nearest contributor parameter; neural-based writer adaptation module; online handwriting recognition systems; output vector; radial basis function neural network; recognition system output; standard LMS gradient descent; writer-dependent datasets; writer-independent system output; Accuracy; Approximation methods; Handwriting recognition; Neurons; Standards; Training; Vectors; Incremental learning of RBF-NN; Pattern Recognition; Writer Adaptation;
Conference_Title :
2013 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
Conference_Location :
Manchester
DOI :
10.1109/SMC.2013.204