stochastic control theory

[stō′kas·tik kən′trōl ‚thē·ə·rē] (control systems) A branch of control theory that aims to predict and limit the magnitudes of the random deviations of a control system by optimizing the design of the controller.
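
A standard concrete instance of this idea is a linear system driven by Gaussian process noise with a quadratic cost on state deviations and control effort. The sketch below is illustrative and not part of the dictionary entry; the system matrices, noise covariance, and cost weights are assumed values. It computes the optimal steady-state feedback gain from the discrete algebraic Riccati equation (via SciPy's `solve_discrete_are`) and compares the average random deviation of the state with and without that controller.

```python
# Minimal sketch (assumed example): discrete-time LQR on a noisy
# double-integrator, showing how an optimized feedback gain reduces
# the expected magnitude of random state deviations.
import numpy as np
from scipy.linalg import solve_discrete_are

# Plant: x[k+1] = A x[k] + B u[k] + w[k], with w[k] ~ N(0, W)
dt = 0.1
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],
              [dt]])
W = 0.01 * np.eye(2)            # process-noise covariance (assumed value)

# Quadratic cost weights: the controller minimizes E[ sum x'Qx + u'Ru ]
Q = np.diag([1.0, 0.1])
R = np.array([[0.01]])

# Optimal steady-state feedback u = -K x from the discrete algebraic Riccati equation
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Monte Carlo: average state deviation with and without the optimal controller
rng = np.random.default_rng(0)

def mean_deviation(use_control, steps=500):
    x = np.array([1.0, 0.0])
    devs = []
    for _ in range(steps):
        u = -K @ x if use_control else np.zeros(1)
        w = rng.multivariate_normal(np.zeros(2), W)
        x = A @ x + B @ u + w
        devs.append(np.linalg.norm(x))
    return float(np.mean(devs))

print("mean |x|, open loop  :", mean_deviation(False))
print("mean |x|, controlled :", mean_deviation(True))
```

In this setting the printed deviation is markedly smaller with the optimized gain, which is the sense in which the controller design limits the system's random deviations.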