Title :
Approximate dynamic programming approach for process control
Author_Institution :
Department of Chemical & Biomolecular Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea
Abstract :
Many process control and scheduling problems can be formulated as Markov Decision Processes (MDPs), which represent multi-stage decision problems under uncertainty. Over the past two decades, the Artificial Intelligence (AI) research community has devoted significant effort to Approximate Dynamic Programming (ADP), which has developed into an effective method for solving MDPs. In this paper, we explore the use of ADP for process control and scheduling applications and examine different design options within ADP, such as pre-decision versus post-decision state value functions and parametric versus nonparametric value function approximators. We show that ADP can be tailored into an effective method for solving stochastic constrained control problems.
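For orientation only, the following is a minimal sketch of one of the design options the abstract mentions: ADP with a parametric (linear) value function approximator, here in the form of fitted value iteration on a toy discrete MDP. The MDP, feature map, and hyperparameters are illustrative assumptions and do not reproduce the paper's method.

```python
# Minimal sketch of ADP with a parametric (linear) value function
# approximator (fitted value iteration). NOT the authors' implementation;
# the toy MDP, features, and constants below are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_states, n_actions = 20, 3   # small discrete MDP (assumed)
gamma = 0.95                  # discount factor (assumed)

# Random transition kernel P[a, s, s'] and reward R[s, a], for illustration.
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
R = rng.normal(size=(n_states, n_actions))

def features(s):
    # Simple polynomial features of the normalized state index.
    x = s / (n_states - 1)
    return np.array([1.0, x, x**2, x**3])

Phi = np.stack([features(s) for s in range(n_states)])  # feature matrix
w = np.zeros(Phi.shape[1])                              # value-function weights

# Fitted value iteration: regress greedy Bellman backups onto the features.
for _ in range(200):
    V = Phi @ w                                       # current value estimates
    Q = R + gamma * np.einsum('ast,t->sa', P, V)      # one-step lookahead values
    targets = Q.max(axis=1)                           # greedy Bellman backup
    w, *_ = np.linalg.lstsq(Phi, targets, rcond=None) # least-squares projection

greedy_policy = (R + gamma * np.einsum('ast,t->sa', P, Phi @ w)).argmax(axis=1)
print("approximate state values:", np.round(Phi @ w, 2))
print("greedy policy:", greedy_policy)
```

A nonparametric approximator (e.g. a local averager) or a post-decision state formulation would replace the regression step above, but the overall backup-and-fit loop is the same.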
Keywords :
Markov processes; approximation theory; decision making; dynamic programming; predictive control; process control; Markov decision process; approximate dynamic programming; artificial intelligence; multistage decision problem; process scheduling; state value function; stochastic constrained control; value function approximator; function approximation; optimization; constraints; stochastic optimal control
Conference_Title :
2010 International Conference on Control, Automation and Systems (ICCAS)
Conference_Location :
Gyeonggi-do
Print_ISBN :
978-1-4244-7453-0
Electronic_ISBN :
978-89-93215-02-1