Title :
A POMDP framework for human-in-the-loop system
Author :
Lam, Chi-Pang; Sastry, S. Shankar
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Univ. of California, Berkeley, Berkeley, CA, USA
Abstract :
Human operators are involved in many real-world systems, such as automobile and aircraft systems. Traditional human-assistance features, such as warning systems in aircraft and automatic braking systems in automobiles, only monitor the state of the machine in order to prevent human error and enhance safety. We believe that next-generation systems should be able to monitor both the human and the machine and provide appropriate feedback to each. Although having a human in the control loop has its advantages, there is no unified modeling framework for managing the feedback between the human and the machine. In this paper, we present how a partially observable Markov decision process (POMDP) can be used as a unified framework for the three main components of a human-in-the-loop control system: the human model, the machine dynamics model, and the observation model. We use simulations to show the benefits of this framework. Finally, we outline the key challenges in advancing this framework.
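As an illustration only (not taken from the paper), the sketch below shows one way the three components named in the abstract could be composed into a single discrete POMDP with a standard Bayes belief update. All state names, probabilities, actions, and observations here are hypothetical placeholders.

```python
# Illustrative sketch: a tiny discrete POMDP whose hidden state factors into
# (human intent, machine state). All names and numbers are hypothetical.
import itertools

HUMAN_INTENTS = ["attentive", "distracted"]    # human model states
MACHINE_STATES = ["safe", "near_obstacle"]     # machine dynamics states
STATES = list(itertools.product(HUMAN_INTENTS, MACHINE_STATES))
ACTIONS = ["no_feedback", "warn"]              # feedback the system can give
OBSERVATIONS = ["quiet", "alert_signal"]       # what the system can sense

def human_transition(intent, action):
    """Human model: warning tends to restore attention (made-up numbers)."""
    if action == "warn":
        return {"attentive": 0.9, "distracted": 0.1}
    if intent == "attentive":
        return {"attentive": 0.6, "distracted": 0.4}
    return {"attentive": 0.2, "distracted": 0.8}

def machine_transition(machine, intent):
    """Machine dynamics: an attentive human keeps the machine safe more often."""
    p_safe = 0.9 if intent == "attentive" else 0.5
    return {"safe": p_safe, "near_obstacle": 1.0 - p_safe}

def observation_model(state, action):
    """Observation model: machine state is sensed noisily; intent stays hidden."""
    _, machine = state
    p_alert = 0.8 if machine == "near_obstacle" else 0.1
    return {"quiet": 1.0 - p_alert, "alert_signal": p_alert}

def belief_update(belief, action, obs):
    """Bayes filter over the factored hidden state (human intent, machine state)."""
    new_belief = {}
    for s_next in STATES:
        intent_next, machine_next = s_next
        p = 0.0
        for (intent, machine), b in belief.items():
            p += (b
                  * human_transition(intent, action)[intent_next]
                  * machine_transition(machine, intent)[machine_next])
        new_belief[s_next] = p * observation_model(s_next, action)[obs]
    z = sum(new_belief.values())
    return {s: p / z for s, p in new_belief.items()}

# Usage: start uncertain about the hidden state, warn, then observe an alert.
belief = {s: 1.0 / len(STATES) for s in STATES}
belief = belief_update(belief, "warn", "alert_signal")
print({s: round(p, 3) for s, p in belief.items()})
```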
Keywords :
Markov processes; automobiles; decision theory; feedback; POMDP framework; aircraft systems; automatic braking systems; automobile systems; control loop; human-assistance features; human-in-the-loop control system; human model; machine dynamics model; observation model; partially observable Markov decision process; warning systems; Control systems; Data models; Hidden Markov models; Monitoring; Safety; Vehicle dynamics; Vehicles;
Conference_Titel :
2014 IEEE 53rd Annual Conference on Decision and Control (CDC)
Conference_Location :
Los Angeles, CA
Print_ISBN :
978-1-4799-7746-8
DOI :
10.1109/CDC.2014.7040333