Abstract:
Sufficient conditions are obtained for the optimality of a control that depends on a specified set of coordinates of the process state vector. These coordinates are assumed to be measured exactly. Equations are derived for determining the control functions; in limiting cases these equations yield the procedure of the stochastic maximum principle and the stochastic Bellman equation. The application of the conditions is illustrated for linear systems with a quadratic performance criterion.
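For the linear-quadratic case mentioned in the abstract, the optimal feedback law is classically computed by a backward Riccati recursion; by certainty equivalence, the additive process noise does not affect the gains. The sketch below is illustrative only: the system matrices `A`, `B`, the weights `Q`, `R`, and the horizon `N` are assumed values, not taken from the paper.

```python
import numpy as np

# Hypothetical discrete-time linear system x_{k+1} = A x_k + B u_k + w_k
# with cost sum_k (x_k' Q x_k + u_k' R u_k).  All numbers are assumed
# for illustration; they do not come from the paper.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)          # state cost weight
R = np.array([[1.0]])  # control cost weight
N = 50                 # horizon length

P = Q.copy()           # terminal cost P_N = Q
gains = []
for _ in range(N):
    # K_k = (R + B' P B)^{-1} B' P A,  optimal control u_k = -K_k x_k
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)   # Riccati update, stays symmetric
    gains.append(K)
gains.reverse()        # gains[k] is now the feedback gain at stage k

print(gains[0])        # near-steady-state gain for this long horizon
```

Over a long horizon the gain settles to its stationary value, so the closed-loop matrix `A - B @ gains[0]` is stable for this stabilizable, detectable example.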