In probability theory, Lindeberg's condition is a sufficient condition (and under certain conditions also a necessary condition) for the central limit theorem (CLT) to hold for a sequence of independent random variables. Unlike the classical CLT, which requires that the random variables in question have finite variance and be both independent and identically distributed, Lindeberg's CLT only requires that they have finite variance, satisfy Lindeberg's condition, and be independent. It is named after the Finnish mathematician Jarl Waldemar Lindeberg.

Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, and let $X_k : \Omega \to \mathbb{R}$, $k \in \mathbb{N}$, be independent random variables defined on that space. Assume the expected values $\mathbb{E}[X_k] = \mu_k$ and variances $\mathrm{Var}[X_k] = \sigma_k^2$ exist and are finite. Also let
$$s_n^2 := \sum_{k=1}^{n} \sigma_k^2.$$

Because the Lindeberg condition implies
$$\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \to 0 \quad \text{as } n \to \infty,$$
it guarantees that the contribution of any individual random variable $X_k$ (with $1 \le k \le n$) to the variance $s_n^2$ is arbitrarily small for sufficiently large values of $n$.
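For reference, the Lindeberg condition requires that for every $\varepsilon > 0$,
$$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{k=1}^{n} \mathbb{E}\left[(X_k - \mu_k)^2 \, \mathbf{1}_{\{|X_k - \mu_k| > \varepsilon s_n\}}\right] = 0.$$
A short sketch of why this implies the vanishing maximal variance ratio above (a standard splitting argument, included here only as illustration): for every $\varepsilon > 0$ and every $k \le n$,
$$\sigma_k^2 = \mathbb{E}\left[(X_k - \mu_k)^2 \, \mathbf{1}_{\{|X_k - \mu_k| \le \varepsilon s_n\}}\right] + \mathbb{E}\left[(X_k - \mu_k)^2 \, \mathbf{1}_{\{|X_k - \mu_k| > \varepsilon s_n\}}\right] \le \varepsilon^2 s_n^2 + \sum_{j=1}^{n} \mathbb{E}\left[(X_j - \mu_j)^2 \, \mathbf{1}_{\{|X_j - \mu_j| > \varepsilon s_n\}}\right],$$
so dividing by $s_n^2$ and maximizing over $k$ gives
$$\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \le \varepsilon^2 + \frac{1}{s_n^2} \sum_{j=1}^{n} \mathbb{E}\left[(X_j - \mu_j)^2 \, \mathbf{1}_{\{|X_j - \mu_j| > \varepsilon s_n\}}\right].$$
Under the Lindeberg condition the second term tends to $0$, so $\limsup_{n \to \infty} \max_{k \le n} \sigma_k^2 / s_n^2 \le \varepsilon^2$ for every $\varepsilon > 0$, which forces the maximum to tend to $0$.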
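As a purely numerical illustration (not part of the original statement), the sketch below evaluates the two quantities discussed above for the hypothetical choice $X_k \sim \mathrm{Uniform}(-k, k)$, for which $\mu_k = 0$ and $\sigma_k^2 = k^2/3$; the function names and the value $\varepsilon = 0.1$ are assumptions made only for this example.

```python
# Illustrative sketch: exact Lindeberg sums for a hypothetical sequence
# X_k ~ Uniform(-k, k), where mu_k = 0 and sigma_k^2 = k^2 / 3.
import math

def truncated_second_moment(k, a):
    """E[X^2 * 1{|X| > a}] for X ~ Uniform(-k, k), computed in closed form."""
    if a >= k:
        return 0.0
    # density is 1/(2k); by symmetry, integrate x^2/(2k) over [a, k] and double
    return (k**3 - a**3) / (3.0 * k)

def lindeberg_ratio(n, eps):
    """(1/s_n^2) * sum_k E[(X_k - mu_k)^2 * 1{|X_k - mu_k| > eps * s_n}]."""
    variances = [k**2 / 3.0 for k in range(1, n + 1)]
    s_n = math.sqrt(sum(variances))
    total = sum(truncated_second_moment(k, eps * s_n) for k in range(1, n + 1))
    return total / s_n**2

def max_variance_ratio(n):
    """max_k sigma_k^2 / s_n^2; for this sequence the largest variance is k = n."""
    variances = [k**2 / 3.0 for k in range(1, n + 1)]
    return max(variances) / sum(variances)

for n in (10, 100, 1000):
    print(n, lindeberg_ratio(n, eps=0.1), max_variance_ratio(n))
# Both quantities decrease toward 0 as n grows; for this sequence the Lindeberg
# sum becomes exactly 0 once eps * s_n exceeds n (the largest possible |X_k|).
```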