Author :
Plaisant, Catherine ; Grinstein, Georges ; Scholtz, Jean
Abstract :
Visual analytics (VA) is the science of analytical reasoning facilitated by interactive visual interfaces. Assessing VA technology's effectiveness is challenging because VA tools combine several disparate components, both low and high level, integrated in complex interactive systems used by analysts, emergency responders, and others. These components include analytical reasoning, visual representations, computer-human interaction techniques, data representations and transformations, collaboration tools, and especially tools for communicating the results of their use. VA tool users' activities can be exploratory and can take place over days, weeks, or months. Users might not follow a predefined or even linear workflow. They might work alone or in groups. To understand these complex behaviors, an evaluation can target the component level, the system level, or the work environment level, and requires realistic data and tasks. Traditional evaluation metrics such as task completion time, number of errors, or recall and precision are insufficient to quantify the utility of VA tools, and new research is needed to improve our VA evaluation methodology.
Keywords :
cognition; data visualisation; interactive systems; interactive visual interfaces; synthetic-data-set generation; user reasoning; visual analytic tools; visual analytics evaluation; Character generation; Collaborative tools; Collaborative work; Data analysis; Data privacy; Guidelines; Laboratories; Statistical analysis; Visualization; design guidelines; insight characterization; insight measurement; synthetic data; visual analytics