The following paper discusses the optimal control of these systems, characterized by a set of first-order state equations, in which performance is measured by a Chebyshev-type functional over the state trajectory. The determination of control functions that minimize the maximum value of a given state function over the trajectory interval is shown to follow directly from the development of a differential minimax cost. This differential minimax cost allows the problem to be formulated as a coordinate minimization in the cost-augmented state space and leads to a set of suboptimal problems whose solutions are shown to converge to the required minimax control. The modifications required to apply standard variational techniques to the reformulated problem are also discussed. The main result of this study is the demonstration of equivalence between the Chebyshev-type control problem and a more conventional Mayer-type formulation.
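To fix ideas, the following is a minimal sketch of the standard state-augmentation device underlying such an equivalence; the notation ($x$, $u$, $f$, $\phi$, $z$, $[t_0, t_f]$) is assumed here for illustration and is not taken from the paper itself. The Chebyshev-type problem is to choose the control to minimize the worst value of a state function along the trajectory,
\[
\min_{u(\cdot)} \; \max_{t_0 \le t \le t_f} \phi\bigl(x(t)\bigr),
\qquad \dot{x}(t) = f\bigl(x(t), u(t), t\bigr), \quad x(t_0) = x_0 .
\]
Introducing an additional scalar coordinate $z$ that bounds $\phi$ along the trajectory converts the minimax cost into a purely terminal cost,
\[
\min_{u(\cdot),\, z} \; z
\quad \text{subject to} \quad
\dot{x} = f(x, u, t), \qquad \phi\bigl(x(t)\bigr) \le z \ \ \text{for all } t \in [t_0, t_f],
\]
which is a Mayer-type problem posed in the cost-augmented state space: the optimal value of $z$ coincides with the minimax value of $\phi$ over the interval.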