  • DocumentCode
    3786540
  • Title
    A dynamic games approach to controller design: disturbance rejection in discrete-time
  • Author
    T. Basar
  • Author_Institution
    Decision & Control Lab., Illinois Univ., Urbana, IL, USA
  • Volume
    36
  • Issue
    8
  • fYear
    1991
  • Firstpage
    936
  • Lastpage
    952
  • Abstract
    It is shown that the discrete-time disturbance rejection problem, formulated over finite and infinite horizons under perfect state measurements, can be solved by making direct use of some results on linear-quadratic zero-sum dynamic games. For the finite-horizon problem an optimal (minimax) controller exists and can be expressed in terms of a generalized (time-varying) discrete-time Riccati equation. An optimum also exists in the infinite-horizon case, under an appropriate observability condition, and the optimal controller, given in terms of a generalized algebraic Riccati equation, is also stabilizing. In both cases, the corresponding worst-case disturbances turn out to be correlated random sequences with discrete distributions, which means that the problem (viewed as a dynamic game between the controller and the disturbance) does not admit a pure-strategy saddle point. Results for the delayed-state-measurement and nonzero-initial-state cases are also presented.
  • Keywords
    "Riccati equations","Optimal control","Time domain analysis","Minimax techniques","Control systems","Infinite horizon","Delay","Stochastic processes","Frequency measurement","Observability"
  • Journal_Title
    IEEE Transactions on Automatic Control
  • Publisher
    ieee
  • ISSN
    0018-9286
  • Type
    jour
  • DOI
    10.1109/9.133187
  • Filename
    133187
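
The abstract above refers to a generalized (time-varying) discrete-time Riccati equation arising from the linear-quadratic zero-sum game formulation. As a rough illustration only, the Python sketch below implements the standard soft-constrained LQ game Riccati recursion and the resulting minimax state-feedback gains for dynamics x_{k+1} = A x_k + B u_k + D w_k with stage cost x'Q x + u'R u - gamma^2 w'w; the matrix names, the existence test, and the toy numbers are assumptions chosen for illustration and are not taken from the paper itself.

import numpy as np

def minimax_gains(A, B, D, Q, R, Qf, gamma, N):
    """Backward game Riccati recursion for the soft-constrained LQ zero-sum game
    with dynamics x_{k+1} = A x_k + B u_k + D w_k and stage cost
    x'Q x + u'R u - gamma^2 w'w (a standard textbook form, assumed here).
    Returns the minimax feedback gains K_0..K_{N-1} (u_k = -K_k x_k)
    and the value matrices M_0..M_N."""
    n, m_w = A.shape[0], D.shape[1]
    Rinv = np.linalg.inv(R)
    M = Qf.copy()
    Ms, Ks = [M], []
    for k in range(N - 1, -1, -1):
        # Existence (attenuation) condition: gamma^2 I - D' M D must be positive definite.
        if np.min(np.linalg.eigvalsh(gamma**2 * np.eye(m_w) - D.T @ M @ D)) <= 0:
            raise ValueError(f"attenuation condition fails at stage {k}; increase gamma")
        Lam = np.eye(n) + (B @ Rinv @ B.T - (D @ D.T) / gamma**2) @ M
        M_Lam_inv = M @ np.linalg.inv(Lam)
        Ks.append(Rinv @ B.T @ M_Lam_inv @ A)   # minimax feedback: u_k = -K_k x_k
        M = Q + A.T @ M_Lam_inv @ A             # generalized (game) Riccati step
        Ms.append(M)
    return Ks[::-1], Ms[::-1]

# Toy usage; all numbers are hypothetical.
if __name__ == "__main__":
    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    B = np.array([[0.0], [0.1]])
    D = np.array([[0.05], [0.1]])
    Q, R, Qf = np.eye(2), np.eye(1), np.eye(2)
    K, _ = minimax_gains(A, B, D, Q, R, Qf, gamma=5.0, N=20)
    print("first-stage minimax gain K_0 =", K[0])

For an infinite-horizon variant one would iterate this recursion to a fixed point of the corresponding generalized algebraic Riccati equation, again subject to an observability-type condition as mentioned in the abstract.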