adaptive control

[ə′dap·tiv kən′trōl] (control systems) A control method in which one or more parameters are sensed and used to vary the feedback control signals in order to satisfy the performance criteria.

Adaptive control

A special type of nonlinear control system that can alter its parameters to adapt to a changing environment. The environmental changes may be variations in the process dynamics or in the characteristics of the disturbances acting on the process.

A normal feedback control system can handle moderate variations in process dynamics. The presence of such variations is, in fact, one reason for introducing feedback. There are, however, many situations where the changes in process dynamics are so large that a constant linear feedback controller will not work satisfactorily. For example, the dynamics of a supersonic aircraft change drastically with Mach number and dynamic pressure, and a flight control system with constant parameters will not work well. See Flight controls
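A minimal sketch of this idea is model-reference adaptive control (MRAC) with the classic MIT rule: the controller gain is adjusted continuously so that the plant output tracks the output of a reference model, even though the plant gain is unknown. All numerical values below (plant gain, adaptation rate, command signal) are illustrative assumptions, not taken from the text.

```python
# Model-reference adaptive control of a first-order plant via the MIT rule.
# The plant gain k is "unknown" to the controller; only the adaptation law
# sees the tracking error. Numbers here are illustrative assumptions.

def simulate_mrac(k=2.0, k0=1.0, gamma=0.5, dt=0.01, t_end=100.0):
    """Plant:   dy/dt  = -y  + k  * u      (k unknown to the controller).
    Model:   dym/dt = -ym + k0 * uc     (desired closed-loop behavior).
    Control: u = theta * uc; the MIT rule adapts theta from e = y - ym."""
    y = ym = theta = 0.0
    t = 0.0
    while t < t_end:
        uc = 1.0 if (t % 20.0) < 10.0 else -1.0   # square-wave command
        u = theta * uc
        e = y - ym
        # Forward-Euler integration of plant, reference model, adaptation law
        y += dt * (-y + k * u)
        ym += dt * (-ym + k0 * uc)
        theta += dt * (-gamma * e * ym)           # MIT rule: dtheta/dt = -g*e*ym
        t += dt
    return theta, e

theta, e = simulate_mrac()
# theta should approach k0 / k = 0.5, at which point the plant matches the model
```

The adaptation drives the gain toward the value that makes the closed loop behave like the reference model; if the true plant gain k later drifts (aging, operating-point change), the same law re-converges automatically.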

Adaptive control is also useful for industrial process control. Since delay and holdup times depend on the production rate, it is desirable to retune the regulators whenever the rate changes. Adaptive control can also compensate for changes due to aging and wear. See Process control
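One common way such retuning is mechanized is the self-tuning regulator, whose core is an on-line parameter estimator. The sketch below, with illustrative (assumed) plant parameters, uses recursive least squares (RLS) to identify a first-order discrete-time process model from input-output data; a self-tuning regulator would then recompute its controller gains from the estimates at each step.

```python
# Recursive least-squares (RLS) identification of a first-order discrete
# plant y[n] = a*y[n-1] + b*u[n-1] -- the estimation core of a self-tuning
# regulator. True (a, b) are illustrative; the estimator treats them as unknown.

def rls_identify(a=0.9, b=0.5, n_steps=200):
    th = [0.0, 0.0]                       # parameter estimate [a_hat, b_hat]
    P = [[100.0, 0.0], [0.0, 100.0]]      # estimate covariance (large = uncertain)
    y_prev = 0.0
    for n in range(n_steps):
        u = 1.0 if (n // 10) % 2 == 0 else -1.0   # persistently exciting input
        y = a * y_prev + b * u                    # plant response (noise-free)
        phi = [y_prev, u]                         # regressor vector
        # Gain K = P*phi / (1 + phi'*P*phi)
        Pphi = [P[0][0]*phi[0] + P[0][1]*phi[1],
                P[1][0]*phi[0] + P[1][1]*phi[1]]
        denom = 1.0 + phi[0]*Pphi[0] + phi[1]*Pphi[1]
        K = [Pphi[0]/denom, Pphi[1]/denom]
        e = y - (th[0]*phi[0] + th[1]*phi[1])     # one-step prediction error
        th = [th[0] + K[0]*e, th[1] + K[1]*e]
        # Covariance update: P <- P - K*(phi'*P)
        phiP = [phi[0]*P[0][0] + phi[1]*P[1][0],
                phi[0]*P[0][1] + phi[1]*P[1][1]]
        P = [[P[0][0]-K[0]*phiP[0], P[0][1]-K[0]*phiP[1]],
             [P[1][0]-K[1]*phiP[0], P[1][1]-K[1]*phiP[1]]]
        y_prev = y
    return th

a_hat, b_hat = rls_identify()
```

Because the estimates track the data, a change in production rate that alters the process delay or gain shows up in (a_hat, b_hat), and the regulator can be retuned from them without manual intervention.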