[i′nish·əl ¦val·yü ‚thir·əm] (mathematics) The theorem that, if a function ƒ(t) and its first derivative have Laplace transforms, and if g(s) is the Laplace transform of ƒ(t), and if the limit of sg(s) as s approaches infinity exists, then this limit equals the limit of ƒ(t) as t approaches zero from the right.
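A compact symbolic statement of the entry (a sketch added here, writing g(s) for the Laplace transform of ƒ(t); the exponential example is illustrative and not part of the original entry):

\[
  \lim_{s \to \infty} s\,g(s) \;=\; \lim_{t \to 0^{+}} f(t).
\]

For example, taking f(t) = e^{-at} gives g(s) = \frac{1}{s+a}, and

\[
  \lim_{s \to \infty} \frac{s}{s+a} = 1 = \lim_{t \to 0^{+}} e^{-at},
\]

so the limit of sg(s) at infinity recovers the initial value ƒ(0⁺) = 1, as the theorem states.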