Title :
Automated evaluation of syntax error recovery
Author :
de Jonge, M.; Visser, Eelco
Author_Institution :
Delft Univ. of Technol., Delft, Netherlands
Abstract :
Evaluation of parse error recovery techniques is an open problem: the community lacks objective standards and methods for measuring the quality of recovery results. This paper proposes an automated technique for recovery evaluation that addresses two main problems in this area. First, a representative test set is generated by a mutation-based fuzzing technique that applies knowledge about common syntax errors. Second, the quality of the recovery results is automatically measured using an oracle-based evaluation technique. We evaluate the validity of our approach by comparing results obtained by automated evaluation with results obtained by manual inspection. The evaluation shows a clear correspondence between our quality metric and human judgement.
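Note: the following is a minimal sketch, not the authors' implementation. It illustrates the two ideas named in the abstract: seeding a correct program with a mutation modelled on a common syntax error, and scoring a recovery result against an oracle derived from the original, correct input. All mutation rules, function names, and the similarity measure are illustrative assumptions.

import random
import difflib

# Mutation rules modelled on common editing errors (assumed examples):
# a dropped closing brace, a dropped statement terminator, an
# incomplete construct appended at the end of a line.
MUTATIONS = [
    lambda line: line.replace("}", "", 1),
    lambda line: line.replace(";", "", 1),
    lambda line: line + " if (",
]

def mutate(program: str, seed: int = 0) -> str:
    """Apply one random mutation to one random line of a correct program."""
    rng = random.Random(seed)
    lines = program.splitlines()
    i = rng.randrange(len(lines))
    lines[i] = rng.choice(MUTATIONS)(lines[i])
    return "\n".join(lines)

def recovery_quality(oracle_ast: str, recovered_ast: str) -> float:
    """Score recovery as the similarity between the recovered parse result
    and the oracle result obtained from the original, correct program.
    A ratio of 1.0 means the recovery reproduced the intended structure."""
    return difflib.SequenceMatcher(None, oracle_ast, recovered_ast).ratio()

if __name__ == "__main__":
    correct = "class C {\n  void m() {\n    x = 1;\n  }\n}"
    broken = mutate(correct, seed=42)
    print(broken)
    # In a real setup, both arguments would be (pretty-printed) ASTs
    # produced by the parser under test; plain text is a stand-in here.
    print(recovery_quality(correct, broken))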
Keywords :
computational linguistics; system recovery; automated evaluation; human judgement; mutation-based fuzzing technique; oracle-based evaluation technique; parse error recovery techniques; quality metric; recovery evaluation; representative test set; syntax error recovery; Error Recovery; Evaluation; IDE; Parsing; Test Generation
Conference_Title :
Proceedings of the 27th IEEE/ACM International Conference on Automated Software Engineering (ASE 2012)
Print_ISBN :
978-1-4503-1204-2
DOI :
10.1145/2351676.2351736