Abstract:
The problem of optimizing the observation process for the motion of a dynamical system under random perturbations is considered. All types of uncertainty (both external perturbations and measurement errors) are treated as random variables with given statistical characteristics. The transition function of the dynamic process under consideration contains a vector of unknown parameters. Using the Bayesian method, the original problem is reduced to a deterministic optimal control problem. The paper demonstrates the applicability of Bellman's dynamic programming principle to the time-optimal control problem for a nonlinear system. Under the constraints on the control that are examined, necessary and sufficient conditions for optimal control are derived. The results obtained are illustrated with an example. Bibliogr. 4.