State Space


state space

[′stāt ‚spās] (control systems) The set of all possible values of the state vector of a system.

State Space


The state space of a dynamical system is a space in which each point is uniquely associated with a state of the system (in some generalized coordinates). The points of the state space are known as representative points. To each process of change of state of the system (that is, to each motion of the system in these coordinates) there corresponds a definite trajectory of the representative point in the space. The concept of state space is used most often in investigations of the motion of dynamical systems in celestial mechanics, vibration theory, and the theory of automatic control.
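For example, the motion of the representative point can be traced numerically. The following sketch assumes a hypothetical damped harmonic oscillator whose state vector is (position, velocity); the function names and parameter values are chosen purely for illustration.

```python
import numpy as np

# Sketch: a damped harmonic oscillator whose state vector is
# x = (position, velocity). Each point of this two-dimensional state
# space corresponds to exactly one state of the oscillator, and the
# system's motion traces a trajectory of the representative point.

def derivative(x, k=1.0, c=0.2, m=1.0):
    """Time derivative of the state vector x = (position, velocity)."""
    pos, vel = x
    acc = -(k * pos + c * vel) / m      # restoring force plus damping
    return np.array([vel, acc])

def trajectory(x0, dt=0.01, steps=2000):
    """Trace the motion of the representative point by Euler integration."""
    path = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        path.append(path[-1] + dt * derivative(path[-1]))
    return np.array(path)               # one row per representative point

if __name__ == "__main__":
    points = trajectory(x0=(1.0, 0.0))  # released from rest at position 1
    print(points[0], points[-1])        # the trajectory spirals toward the origin
```

Each row of the returned array is one representative point; the sequence of rows is the trajectory of the system's motion through its two-dimensional state space.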

In many technological, biological, and economic systems, the generalized coordinates of the system being considered can assume only discrete values. The states of such systems must also be regarded as discrete, and the representative points are viewed as belonging to a discrete state space. To the changes in state of such systems (that is, to the motion of the systems) there correspond successive jumps of the representative point from one position to another.
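A minimal sketch of such a discrete state space, assuming a hypothetical machine whose single generalized coordinate takes one of three values; the state names, events, and permitted transitions are invented for illustration.

```python
# Sketch: a system whose generalized coordinate takes only discrete values.
# The discrete state space is a finite set of points, and each change of
# state is a jump of the representative point from one point to another.

DISCRETE_STATE_SPACE = {"idle", "running", "failed"}

# Permitted jumps of the representative point between discrete states.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "stop"): "idle",
    ("running", "fault"): "failed",
    ("failed", "repair"): "idle",
}

def step(state, event):
    """Jump to the next discrete state; remain in place if the event does not apply."""
    return TRANSITIONS.get((state, event), state)

if __name__ == "__main__":
    state = "idle"
    for event in ["start", "fault", "repair", "start", "stop"]:
        state = step(state, event)
        print(event, "->", state)
```

Each call to step is one jump of the representative point; the state space itself is simply the finite set of admissible states.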

A different case is presented by systems whose parameters (coordinates) are distributed in space. For such systems, no state space with a finite number of dimensions can provide a one-to-one correspondence between the states of the system (in these coordinates) and the positions of the representative point. An example of such a system is a body whose state is characterized by its temperature field. A geometric image of the state of such a system can be obtained only in a space with an infinite number of dimensions.
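A brief sketch of this point, assuming a hypothetical temperature field along a one-dimensional rod: any finite sampling of the field gives only a finite-dimensional approximation of the state, and refining the sampling raises the dimension without bound.

```python
import numpy as np

# Sketch: a body whose state is its temperature field along a rod.
# The true state space is a space of functions (infinite-dimensional);
# a finite-dimensional state vector is only an approximation obtained
# by sampling the field at n points, and n can be made arbitrarily large.

def sampled_state(temperature_field, n):
    """Approximate the temperature field by an n-dimensional state vector."""
    positions = np.linspace(0.0, 1.0, n)                  # sample points along the rod
    return temperature_field(positions)

if __name__ == "__main__":
    field = lambda s: 300.0 + 50.0 * np.sin(np.pi * s)    # an assumed temperature profile (kelvins)
    for n in (4, 16, 64):
        vec = sampled_state(field, n)
        print(f"{n}-dimensional approximation, first entries: {vec[:3]}")
```

No finite choice of n yields a one-to-one correspondence with the temperature field itself; only the infinite-dimensional space of functions does.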

The state space is a special case of a phase space.