• DocumentCode
    495195
  • Title

    A Novel Technique for Automated Linguistic Quality Assessment of Students' Essays Using Automatic Summarizers

  • Author

    Latif, Seemab; McGee Wood, M.

  • Author_Institution
    Sch. of Comput. Sci., Univ. of Manchester, Manchester, UK
  • Volume
    5
  • fYear
    2009
  • fDate
    March 31, 2009 - April 2, 2009
  • Firstpage
    144
  • Lastpage
    148
  • Abstract
    In this paper, experiments address the calculation of inter-annotator inconsistency in content selection for both manual and automatic summarization of sample TOEFL essays. A new finding is that the linguistic quality of the source essay correlates strongly with the degree of disagreement among human assessors as to what should be included in a summary. This leads to a fully automated essay evaluation technique based on the degree of disagreement among automatic summarizers. ROUGE evaluation is used to measure the degree of inconsistency among the participants (human summarizers and automatic summarizers). This automated essay evaluation technique is potentially an important contribution with wider significance.
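    As a rough illustration of the evaluation idea described in the abstract, the sketch below scores an essay by how much a set of automatic summarizers disagree about its content, using mean pairwise ROUGE-1-style overlap. The helper names (rouge1_f1, disagreement_score) and the simplified unigram ROUGE are assumptions for illustration only, not the authors' implementation or the official ROUGE toolkit.

    # Hypothetical sketch: higher disagreement among summaries of the same essay
    # is taken as a signal of lower linguistic quality of the source essay.
    from itertools import combinations
    from collections import Counter

    def rouge1_f1(candidate: str, reference: str) -> float:
        """Unigram-overlap F1 between two summaries (a simplified ROUGE-1)."""
        cand = Counter(candidate.lower().split())
        ref = Counter(reference.lower().split())
        overlap = sum((cand & ref).values())
        if overlap == 0:
            return 0.0
        precision = overlap / sum(cand.values())
        recall = overlap / sum(ref.values())
        return 2 * precision * recall / (precision + recall)

    def disagreement_score(summaries: list[str]) -> float:
        """Mean pairwise (1 - ROUGE-1 F1) over all summarizer pairs.

        Higher values mean the summarizers selected different content,
        which the paper links to lower essay quality.
        """
        pairs = list(combinations(summaries, 2))
        return sum(1.0 - rouge1_f1(a, b) for a, b in pairs) / len(pairs)

    # Example: summaries of one essay from three (hypothetical) summarizers.
    summaries = [
        "The essay argues that travel broadens cultural understanding.",
        "Travel broadens understanding of other cultures, the author claims.",
        "The writer discusses holidays and lists several destinations.",
    ]
    print(f"disagreement = {disagreement_score(summaries):.3f}")

    In practice, any standard ROUGE implementation could replace the unigram helper; the essential step is aggregating pairwise summary similarity into a single disagreement measure per essay.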
  • Keywords
    document handling; linguistics; text analysis; ROUGE evaluation; TOEFL essays; automated linguistic quality assessment; automatic summarizers; fully automated essay evaluation technique; human summarizers; inconsistency degree; Computer science; Humans; Manuals; Particle measurements; Quality assessment; Automatic Summarization; ROUGE Evaluation; Summarization Evaluation;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Computer Science and Information Engineering, 2009 WRI World Congress on
  • Conference_Location
    Los Angeles, CA
  • Print_ISBN
    978-0-7695-3507-4
  • Type

    conf

  • DOI
    10.1109/CSIE.2009.777
  • Filename
    5170514