Lipschitz Condition

Lipschitz condition

[′lip‚shits kən‚dish·ən] (mathematics) A function ƒ satisfies such a condition at a point b if |ƒ(x) - ƒ(b)| ≤ K|x - b|, with K a constant, for all x in some neighborhood of b.
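As an informal illustration (not part of the original entry), the Python sketch below performs a sampled check of the pointwise inequality for the assumed choices f(x) = sin(x), b = 0, K = 1 on a small neighborhood; the function name, radius, and tolerance are hypothetical choices for the sketch.

```python
import math

def satisfies_pointwise_lipschitz(f, b, K, radius=1e-2, samples=1000):
    """Sampled check of |f(x) - f(b)| <= K*|x - b| for x near b.

    A finite sample can only give evidence, not a proof.
    """
    fb = f(b)
    for i in range(1, samples + 1):
        for sign in (-1.0, 1.0):
            x = b + sign * radius * i / samples
            # Small tolerance guards against floating-point rounding.
            if abs(f(x) - fb) > K * abs(x - b) + 1e-12:
                return False
    return True

# Example: f(x) = sin(x) at b = 0 with K = 1, since |sin x - sin 0| <= |x|.
print(satisfies_pointwise_lipschitz(math.sin, b=0.0, K=1.0))  # expected: True
```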

Lipschitz Condition


a restriction on the behavior of the increment of a function. If for any points x and x′ in the interval [a, b] the increment of the function satisfies the inequality

|f(x) - f(x′)| ≤ M|x - x′|^α

where 0 < α ≤ 1 and M is some constant, the function f(x) is said to satisfy a Lipschitz condition of order α on the interval [a, b]; this is written as f(x) ∈ Lip α. Every function satisfying a Lipschitz condition of some order α > 0 on the interval [a, b] is uniformly continuous on [a, b]. A function having a bounded derivative on [a, b] satisfies a Lipschitz condition on [a, b] for any α ≤ 1.
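As an illustrative aside (the function √x, the interval [0, 1], and the sampling scheme are assumptions made for this sketch, not part of the entry), the following Python code estimates the smallest admissible constant M by maximizing the quotient |f(x) - f(x′)| / |x - x′|^α over sampled pairs. For √x it stays near 1 when α = 1/2 but grows without bound when α = 1, since the derivative of √x is unbounded near 0.

```python
import itertools
import math

def estimate_holder_constant(f, a, b, alpha, samples=200):
    """Estimate sup |f(x) - f(x')| / |x - x'|**alpha over sampled pairs in [a, b].

    If f is in Lip alpha on [a, b], this stays bounded as the sampling is refined;
    the sampled maximum is only a heuristic lower bound for the true constant M.
    """
    xs = [a + (b - a) * i / samples for i in range(samples + 1)]
    return max(
        abs(f(x) - f(y)) / abs(x - y) ** alpha
        for x, y in itertools.combinations(xs, 2)
    )

# sqrt belongs to Lip 1/2 on [0, 1]: |sqrt(x) - sqrt(x')| <= |x - x'|**0.5,
# so the estimate stays close to 1.
print(estimate_holder_constant(math.sqrt, 0.0, 1.0, alpha=0.5))

# With alpha = 1 the quotient for pairs near 0 behaves like 1/sqrt(x), because the
# derivative of sqrt is unbounded there, so the estimate grows as 'samples' increases.
print(estimate_holder_constant(math.sqrt, 0.0, 1.0, alpha=1.0))
```

By contrast, for a function with a bounded derivative (for example, sin on any interval) the quotient with α = 1 never exceeds the bound on the derivative, by the mean value theorem; this is the content of the last statement above.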

The Lipschitz condition was first examined in 1864 by the German mathematician R. Lipschitz (1832–1903) as a sufficient condition for the convergence of the Fourier series of a function f(x). Although it is historically inaccurate, some mathematicians associate the name of Lipschitz only with the most important case, α = 1; for α < 1 they speak of the Hölder condition.