• DocumentCode
    635253
  • Title
    Hunting for smells in natural language tests
  • Author
    Hauptmann, Benedikt; Junker, Maximilian; Eder, Sebastian; Heinemann, Lars; Vaas, Rudolf; Braun, Peter
  • Author_Institution
    Tech. Univ. München, Munich, Germany
  • fYear
    2013
  • fDate
    18-26 May 2013
  • Firstpage
    1217
  • Lastpage
    1220
  • Abstract
    Tests are central artifacts of software systems and play a crucial role in software quality. In system testing, a large share of test execution is performed manually using test cases written in natural language. However, these test cases are often poorly written, without best practices in mind. This leads to tests that are hard to maintain, hard to understand, and inefficient to execute. For source code and unit tests, so-called code smells and test smells have been established as indicators of poorly written code. We apply the idea of smells to natural language tests by defining a set of common Natural Language Test Smells (NLTS). Furthermore, we report on an empirical study analyzing the extent of these smells in more than 2800 tests of seven industrial test suites.
  • Keywords
    natural language processing; program testing; software quality; NLTS; code smells; industrial test suites; natural language test smells; software systems; source code; test smells; unit tests; Cloning; Maintenance engineering; Manuals; Measurement; Natural languages; Quality assessment; Testing; natural language; system testing
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2013 35th International Conference on Software Engineering (ICSE)
  • Conference_Location
    San Francisco, CA
  • Print_ISBN
    978-1-4673-3073-2
  • Type
    conf
  • DOI
    10.1109/ICSE.2013.6606682
  • Filename
    6606682