DocumentCode
26752
Title
Learning Driver Behavior Models from Traffic Observations for Decision Making and Planning
Author
Gindele, Tobias ; Brechtel, Sebastian ; Dillmann, Rüdiger
Author_Institution
Comput. Sci., Karlsruhe Inst. of Technol., Karlsruhe, Germany
Volume
7
Issue
1
fYear
2015
fDate
Spring 2015
Firstpage
69
Lastpage
79
Abstract
Estimating and predicting traffic situations over time is an essential capability for sophisticated driver assistance systems and autonomous driving. When longer prediction horizons are needed, e.g., in decision making or motion planning, the uncertainty induced by incomplete environment perception and stochastic situation development over time cannot be neglected without sacrificing robustness and safety. Building consistent probabilistic models of drivers' interactions with the environment, the road network, and other traffic participants poses a complex problem. In this paper, we model the decision-making process of drivers by building a hierarchical Dynamic Bayesian Network (DBN) that describes physical relationships as well as the driver's behaviors and plans. This way, the uncertainties in the process on all abstraction levels can be handled in a mathematically consistent way. As driver behaviors are difficult to model, we present an approach for learning continuous, non-linear, context-dependent models for the behavior of traffic participants. We propose an Expectation Maximization (EM) approach for learning the models integrated in the DBN from unlabeled observations. Experiments show a significant improvement in estimation and prediction accuracy over standard models that only consider vehicle dynamics. Finally, a novel approach to tactical decision making for autonomous driving is outlined. It is based on a continuous Partially Observable Markov Decision Process (POMDP) that uses the presented model for prediction.
Keywords
Markov processes; belief networks; decision making; driver information systems; expectation-maximisation algorithm; learning (artificial intelligence); planning (artificial intelligence); road safety; road traffic; EM approach; POMDP; autonomous driving; decision making; driver assistance systems; driver behavior models; expectation maximization approach; hierarchical dynamic Bayesian model; learning; partially observable Markov decision process; planning; safety; traffic observations; traffic situations; Atmospheric measurements; Bayes methods; Behavioral science; Context modeling; Decision making; Particle measurements; Predictive models; Random variables; Road traffic
fLanguage
English
Journal_Title
IEEE Intelligent Transportation Systems Magazine
Publisher
IEEE
ISSN
1939-1390
Type
jour
DOI
10.1109/MITS.2014.2357038
Filename
7014400
Link To Document