• DocumentCode
    1605331
  • Title
    Making System User Interactive Tests Repeatable: When and What Should We Control?

  • Author
    Zebao Gao; Yalan Liang; Myra B. Cohen; Atif M. Memon; Zhen Wang
  • Author_Institution
    Dept. of Comput. Sci., Univ. of Maryland, College Park, MD, USA
  • Volume
    1
  • fYear
    2015
  • Firstpage
    55
  • Lastpage
    65
  • Abstract
    System testing and invariant detection are usually conducted from the user interface perspective when the goal is to evaluate the behavior of an application as a whole. A large number of tools and techniques have been developed to generate and automate this process, many of which have been evaluated in the literature or internally within companies. Typical metrics for determining the effectiveness of these techniques include code coverage and fault detection; both, however, assume determinism in the resulting outputs. In this paper we examine the extent to which a common set of factors, such as the system platform, Java version, application starting state, and tool harness configuration, impacts these metrics. We examine three layers of testing outputs: the code layer, the behavioral (or invariant) layer, and the external (or user interaction) layer. In a study using five open source applications across three operating system platforms, manipulating several factors, we observe as many as 184 lines of code coverage difference between runs using the same test cases, and up to 96 percent false positives with respect to fault detection. We also see a small variation among the invariants inferred. Despite our best efforts, we can reduce, but not completely eliminate, all possible variation in the output. We use our findings to provide a set of best practices that should lead to better consistency and smaller differences in test outcomes, allowing more repeatable and reliable testing and experimentation.
  • Keywords
    graphical user interfaces; operating systems (computers); program testing; public domain software; software fault tolerance; GUI; code coverage; fault detection; open source application; operating system platform; user interactive application testing; Delays; Entropy; Java; Operating systems; Testing; software testing; experimentation; benchmarking
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2015 IEEE/ACM 37th IEEE International Conference on Software Engineering (ICSE)
  • Conference_Location
    Florence
  • Type
    conf
  • DOI
    10.1109/ICSE.2015.28
  • Filename
    7194561