DocumentCode
2998059
Title
On the minimax feedback control of uncertain dynamic systems
Author
Bertsekas, D.P. ; Rhodes, I.B.
Author_Institution
Stanford University, Stanford, Calif.
fYear
1971
fDate
15-17 Dec. 1971
Firstpage
451
Lastpage
455
Abstract
In this paper the problem of optimal feedback control of uncertain discrete-time dynamic systems is considered, in which the uncertain quantities do not have a stochastic description but instead are known to belong to given sets. The problem is converted to a sequential minimax problem, and dynamic programming is suggested as a general method for its solution. The notion of a sufficiently informative function, which parallels the notion of a sufficient statistic in stochastic optimal control, is introduced, and conditions under which the optimal controller decomposes into an estimator and an actuator are identified.
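The sequential minimax dynamic programming recursion the abstract refers to can be sketched as follows. This is a hypothetical toy example, not code from the paper: the scalar system x_{k+1} = x_k + u_k + w_k, the quadratic costs, the horizon, and the integer control/disturbance sets are all assumptions made for illustration. The disturbance w is set-bounded with no probability distribution attached, so the recursion takes a max over the disturbance set rather than an expectation.

```python
# Hedged sketch of the sequential minimax DP recursion:
#   J_N(x) = g_N(x)
#   J_k(x) = min_u max_{w in W} [ g(x, u) + J_{k+1}(f(x, u, w)) ]
# Toy (assumed) system: x_{k+1} = x + u + w, quadratic costs.

from functools import lru_cache

U = (-1, 0, 1)   # admissible controls (assumed)
W = (-1, 0, 1)   # membership set of the disturbance (assumed, no probabilities)
N = 3            # horizon (assumed)

def stage_cost(x, u):
    return x * x + u * u

@lru_cache(maxsize=None)
def J(k, x):
    """Minimax (worst-case) cost-to-go from state x at stage k."""
    if k == N:
        return x * x  # terminal cost g_N
    return min(
        max(stage_cost(x, u) + J(k + 1, x + u + w) for w in W)
        for u in U
    )

def minimax_control(k, x):
    """Control minimizing the worst-case cost-to-go at (k, x)."""
    return min(
        U,
        key=lambda u: max(stage_cost(x, u) + J(k + 1, x + u + w) for w in W),
    )
```

For instance, from x = 0 one stage before the end, applying u = 0 concedes at most w² = 1 of terminal cost, while any nonzero u only increases the worst case, so the minimax control is 0.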
Keywords
Actuators; Adaptive control; Context modeling; Control systems; Cost function; Feedback control; Minimax techniques; Probability distribution; Stochastic processes; Uncertainty;
fLanguage
English
Publisher
IEEE
Conference_Titel
1971 IEEE Conference on Decision and Control
Conference_Location
Miami Beach, FL, USA
Type
conf
DOI
10.1109/CDC.1971.271035
Filename
4044796