DocumentCode :
621887
Title :
Multimodal data analysis and visualization to study the usage of Electronic Health Records
Author :
Weibel, Nadir; Ashfaq, Shazia; Calvitti, Alan; Hollan, James D.; Agha, Zia
Author_Institution :
Univ. of California San Diego, La Jolla, CA, USA
fYear :
2013
fDate :
5-8 May 2013
Firstpage :
282
Lastpage :
283
Abstract :
Understanding interaction with Electronic Health Records (EHR) often means understanding the multimodal nature of the physician-patient interaction and the interaction with other materials (e.g. paper charts), in addition to analyzing the tasks the physician performs on the computerized system. Recent approaches have started to analyze and quantify speech, gaze, body movements, etc., and represent a very promising way to complement classic software usability analysis. However, it is hard to characterize multimodal activity, since it often requires manual coding of hours of video data. We present our approach of using automatic tracking of body movement, audio signals, and gaze in the medical office to achieve multimodal analysis of EHR use.
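As a rough, hypothetical illustration of the kind of analysis the abstract describes (not code from the paper), the sketch below time-aligns synthetic gaze, audio, and body-tracking streams on a shared clock so they can be queried together; all stream names, sampling rates, and thresholds are invented assumptions.
```python
# Hypothetical sketch: align three multimodal streams (gaze, audio energy,
# body motion) on a common 1-second grid. All data here is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t0 = pd.Timestamp("2013-05-05 09:00:00")

# Streams sampled at different (assumed) rates, each indexed by timestamp.
gaze = pd.DataFrame(
    {"gaze_on_screen": rng.integers(0, 2, 600)},
    index=pd.date_range(t0, periods=600, freq="100ms"),  # 10 Hz eye tracker
)
audio = pd.DataFrame(
    {"speech_energy": rng.random(1200)},
    index=pd.date_range(t0, periods=1200, freq="50ms"),  # 20 Hz audio features
)
body = pd.DataFrame(
    {"torso_motion": rng.random(60)},
    index=pd.date_range(t0, periods=60, freq="1s"),      # 1 Hz body tracking
)

# Resample everything to a shared 1-second grid and join the streams.
aligned = (
    gaze.resample("1s").mean()
    .join(audio.resample("1s").mean())
    .join(body.resample("1s").mean())
)

# Example query: seconds where gaze is mostly on the screen while speech
# energy is high (thresholds are arbitrary placeholders).
screen_while_speaking = aligned[
    (aligned["gaze_on_screen"] > 0.5) & (aligned["speech_energy"] > 0.5)
]
print(screen_while_speaking.head())
```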
Keywords :
audio signals; data analysis; data visualisation; medical information systems; modal analysis; EHR; automatic body tracking; computerized system; data visualization; electronic health records; multimodal activity characterization; multimodal data analysis; physician-patient interaction; software usability; video data; Context
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2013 7th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth)
Conference_Location :
Venice
Print_ISBN :
978-1-4799-0296-5
Electronic_ISBN :
978-1-936968-80-0
Type :
conf
Filename :
6563944