In mathematics, a series or integral is said to be conditionally convergent if it converges but does not converge absolutely.

Definition

More precisely, a series of real numbers $\sum_{n=0}^{\infty} a_n$ is said to converge conditionally if $\lim_{m \to \infty} \sum_{n=0}^{m} a_n$ exists (as a finite real number, i.e. not $\infty$ or $-\infty$), but $\sum_{n=0}^{\infty} \left| a_n \right| = \infty$.

A classic example is the alternating harmonic series $$1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \cdots = \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n},$$ which converges to $\ln(2)$, but is not absolutely convergent (see Harmonic series).
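This behaviour is easy to check numerically. The sketch below (the function name is illustrative, not part of the article) compares partial sums of the alternating harmonic series, which settle near $\ln(2)$, with partial sums of the absolute values, which form the divergent harmonic series:

```python
import math

def alternating_harmonic(n):
    """Partial sum of 1 - 1/2 + 1/3 - ... + (-1)^(n+1)/n."""
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

# Partial sums of the signed series approach ln(2) ~ 0.6931,
# while partial sums of the absolute values grow without bound
# (like ln(n), so they exceed any fixed threshold eventually).
conditional = alternating_harmonic(10**6)
absolute = sum(1.0 / k for k in range(1, 10**6 + 1))
```

With a million terms the signed partial sum is within about $1/(2n)$ of $\ln(2)$, while the absolute sum has already passed 14 and keeps growing.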

Bernhard Riemann proved that a conditionally convergent series may be rearranged to converge to any value at all, including ∞ or −∞; see Riemann series theorem. Agnew's theorem describes rearrangements that preserve convergence for all convergent series.

The Lévy–Steinitz theorem identifies the set of values to which rearrangements of a series of terms in $\mathbb{R}^n$ can converge.

Improper integrals may also be conditionally convergent. A typical example of a conditionally convergent integral is (see Fresnel integral) $$\int_{0}^{\infty} \sin(x^{2})\,dx,$$ where the integrand oscillates between positive and negative values indefinitely, but encloses smaller areas each time.
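The oscillating lobes can be seen numerically by integrating between successive zeros of the integrand, $x_k = \sqrt{k\pi}$. The lobe areas alternate in sign and shrink, so their alternating sum converges to $\sqrt{\pi/8}$, while the sum of their absolute values grows without bound. A sketch using the trapezoid rule (function names and step counts are illustrative):

```python
import math

def fresnel_sine_lobes(n_lobes, steps_per_lobe=2000):
    """Areas of sin(x^2) between successive zeros x_k = sqrt(k*pi),
    each computed with the composite trapezoid rule."""
    lobes = []
    for k in range(n_lobes):
        a, b = math.sqrt(k * math.pi), math.sqrt((k + 1) * math.pi)
        h = (b - a) / steps_per_lobe
        s = 0.5 * (math.sin(a * a) + math.sin(b * b))
        for i in range(1, steps_per_lobe):
            x = a + i * h
            s += math.sin(x * x)
        lobes.append(s * h)
    return lobes

lobes = fresnel_sine_lobes(200)
# Averaging two consecutive partial sums accelerates the
# alternating tail, giving a good estimate of the limit.
partial = sum(lobes)
estimate = partial - 0.5 * lobes[-1]
total_abs = sum(abs(lobe) for lobe in lobes)  # grows like sqrt(n): divergence
```

The accelerated estimate lands close to the exact value $\sqrt{\pi/8} \approx 0.6267$, while the sum of absolute lobe areas is already far larger, reflecting the absence of absolute convergence.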
