Bernstein polynomial

In the mathematical field of numerical analysis, a Bernstein polynomial, named after Sergei Natanovich Bernstein, is a polynomial in the Bernstein form, that is a linear combination of Bernstein basis polynomials.

A numerically stable way to evaluate polynomials in Bernstein form is de Casteljau's algorithm.
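A minimal Python sketch of such an evaluation, assuming the polynomial is given by its Bernstein coefficients β0, ..., βn; the function name de_casteljau and the example coefficients are illustrative choices, not fixed by the algorithm.

def de_casteljau(beta, x):
    """Evaluate sum_nu beta[nu] * b_{nu,n}(x) by repeated linear interpolation."""
    b = list(beta)                 # work on a copy of the coefficients
    n = len(b) - 1                 # degree of the polynomial
    for r in range(1, n + 1):      # n rounds of interpolation
        for j in range(n - r + 1):
            b[j] = (1 - x) * b[j] + x * b[j + 1]
    return b[0]

# Example: coefficients (1, 0, 0) give B(x) = b_{0,2}(x) = (1 - x)^2.
print(de_casteljau([1.0, 0.0, 0.0], 0.25))   # 0.5625 == 0.75**2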

Polynomials in Bernstein form were first used by Bernstein in a constructive proof of the Weierstrass approximation theorem. With the advent of computer graphics, Bernstein polynomials, restricted to the interval x ∈ [0, 1], became important in the form of Bézier curves.

Definition

The n + 1 Bernstein basis polynomials of degree n are defined as

\( b_{\nu,n}(x) = {n \choose \nu} x^{\nu} \left( 1 - x \right)^{n - \nu}, \quad \nu = 0, \ldots, n. \)

where \( {n \choose \nu} \) is a binomial coefficient.

The Bernstein basis polynomials of degree n form a basis for the vector space \( \Pi_n \) of polynomials of degree at most n.

A linear combination of Bernstein basis polynomials

\(B(x) = \sum_{\nu=0}^{n} \beta_{\nu} b_{\nu,n}(x) \)

is called a Bernstein polynomial or polynomial in Bernstein form of degree n. The coefficients \( \beta_\nu \) are called Bernstein coefficients or Bézier coefficients.

Example

The first few Bernstein basis polynomials are:

\( \begin{align} b_{0,0}(x) & = 1, \\ b_{0,1}(x) & = 1 - x, & b_{1,1}(x) & = x \\ b_{0,2}(x) & = (1 - x)^2, & b_{1,2}(x) & = 2x(1 - x), & b_{2,2}(x) & = x^2 \\ b_{0,3}(x) & = (1 - x)^3, & b_{1,3}(x) & = 3x(1 - x)^2, & b_{2,3}(x) & = 3x^2(1 - x), & b_{3,3}(x) & = x^3 \\ b_{0,4}(x) & = (1 - x)^4, & b_{1,4}(x) & = 4x(1 - x)^3, & b_{2,4}(x) & = 6x^2(1 - x)^2, & b_{3,4}(x) & = 4x^3(1 - x), & b_{4,4}(x) & = x^4 \end{align} \)
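A short Python sketch of the defining formula, used here to spot-check two entries of the table above; the helper name bernstein_basis is an illustrative choice.

from math import comb

def bernstein_basis(nu, n, x):
    """Bernstein basis polynomial b_{nu,n}(x) = C(n, nu) x^nu (1 - x)^(n - nu)."""
    return comb(n, nu) * x**nu * (1 - x)**(n - nu)

x = 0.3
print(bernstein_basis(1, 2, x), 2 * x * (1 - x))        # matches b_{1,2}(x) from the table
print(bernstein_basis(2, 4, x), 6 * x**2 * (1 - x)**2)  # matches b_{2,4}(x) from the table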

Properties

The Bernstein basis polynomials have the following properties:

\( b_{\nu, n}(x) = 0 \) if \( \nu < 0 \) or \( \nu > n \).
\( b_{\nu, n}(0) = \delta_{\nu, 0} \) and \( b_{\nu, n}(1) = \delta_{\nu, n} \) where \( \delta \) is the Kronecker delta function.
\( b_{\nu, n}(x) \) has a root with multiplicity \( \nu \) at point x = 0 (note: if \( \nu = 0 \), there is no root at 0).
\( b_{\nu, n}(x) \) has a root with multiplicity \( \left( n - \nu \right) \) at point x = 1 (note: if \( \nu = n \), there is no root at 1).
\( b_{\nu, n}(x) \ge 0 \) for \( x \in [0, 1] \).
\( b_{\nu, n}\left( 1 - x \right) = b_{n - \nu, n}(x). \)

The derivative can be written as a combination of two polynomials of lower degree:

\( b'_{\nu, n}(x) = n \left( b_{\nu - 1, n - 1}(x) - b_{\nu, n - 1}(x) \right). \)

The integral is constant for a given n:

\( \int_{0}^{1} b_{\nu, n}(x) \, dx = \frac{1}{n+1} \quad \text{for all } \nu = 0, 1, \ldots, n. \)
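As a hedged numerical illustration of the derivative recurrence and the integral identity above, one might check them in Python; the helper b and the particular choices of ν, n and x are arbitrary.

from math import comb

def b(nu, n, x):
    return comb(n, nu) * x**nu * (1 - x)**(n - nu) if 0 <= nu <= n else 0.0

nu, n, x, h = 2, 5, 0.4, 1e-6
finite_diff = (b(nu, n, x + h) - b(nu, n, x - h)) / (2 * h)   # central difference for b'
recurrence = n * (b(nu - 1, n - 1, x) - b(nu, n - 1, x))
print(abs(finite_diff - recurrence) < 1e-6)                   # True

# Riemann-sum estimate of the integral over [0, 1]; should be close to 1/(n + 1).
m = 10_000
integral = sum(b(nu, n, k / m) for k in range(m + 1)) / m
print(abs(integral - 1 / (n + 1)) < 1e-4)                     # True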

If \( n \ne 0 \), then \( b_{\nu, n}(x) \) has a unique local maximum on the interval [0, 1] at \( x = \frac{\nu}{n} \). This maximum takes the value:

\( \nu^\nu n^{-n} \left( n - \nu \right)^{n - \nu} {n \choose \nu}. \)

The Bernstein basis polynomials of degree n form a partition of unity:

\( \sum_{\nu = 0}^n b_{\nu, n}(x) = \sum_{\nu = 0}^n {n \choose \nu} x^\nu \left( 1 - x \right)^{n - \nu} = \left(x + \left( 1 - x \right) \right)^n = 1. \)

By taking the first derivative of \( (x+y)^n \) where y = 1-x, it can be shown that

\( \sum_{\nu=0}^{n}\nu b_{\nu, n}(x) = nx \)

The second derivative of \( (x+y)^n \) where y = 1-x can be used to show

\( \sum_{\nu=1}^{n}\nu(\nu-1) b_{\nu, n}(x) = n(n-1)x^2 \)
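These three identities (partition of unity and the two sums just shown) are easy to confirm numerically; a small Python sketch with an arbitrary choice of n and x might look like this:

from math import comb

def b(nu, n, x):
    return comb(n, nu) * x**nu * (1 - x)**(n - nu)

n, x = 7, 0.35
print(abs(sum(b(nu, n, x) for nu in range(n + 1)) - 1) < 1e-12)            # partition of unity
print(abs(sum(nu * b(nu, n, x) for nu in range(n + 1)) - n * x) < 1e-12)   # first identity
second = sum(nu * (nu - 1) * b(nu, n, x) for nu in range(n + 1))
print(abs(second - n * (n - 1) * x**2) < 1e-12)                            # second identity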

A Bernstein basis polynomial can always be written as a linear combination of basis polynomials of higher degree:

\( b_{\nu, n - 1}(x) = \frac{n - \nu}{n} b_{\nu, n}(x) + \frac{\nu + 1}{n} b_{\nu + 1, n}(x). \)
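A quick numerical check of this degree-elevation identity, again with arbitrary illustrative values of ν, n and x:

from math import comb

def b(nu, n, x):
    return comb(n, nu) * x**nu * (1 - x)**(n - nu)

nu, n, x = 2, 5, 0.6
lhs = b(nu, n - 1, x)
rhs = (n - nu) / n * b(nu, n, x) + (nu + 1) / n * b(nu + 1, n, x)
print(abs(lhs - rhs) < 1e-12)   # True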

Approximating continuous functions

Let ƒ be a continuous function on the interval [0, 1]. Consider the Bernstein polynomial

\(B_n(f)(x) = \sum_{\nu = 0}^n f\left( \frac{\nu}{n} \right) b_{\nu,n}(x). \)

It can be shown that

\( \lim_{n \to \infty}{ B_n(f)(x) } = f(x) \, \)

uniformly on the interval [0, 1]. This is a stronger statement than the proposition that the limit holds for each value of x separately; that would be pointwise convergence rather than uniform convergence. Specifically, the word uniformly signifies that

\( \lim_{n \to \infty} \sup \left\{\, \left| f(x) - B_n(f)(x) \right| \,:\, 0 \leq x \leq 1 \,\right\} = 0. \)

Bernstein polynomials thus afford one way to prove the Weierstrass approximation theorem that every real-valued continuous function on a real interval [a, b] can be uniformly approximated by polynomial functions over R.
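A hedged Python sketch of \( B_n(f) \) as defined above, tracking the sup-norm error on a finite grid for one sample function; the choice f(x) = sin(πx) and the grid are illustrative, not from the text.

from math import comb, sin, pi

def bernstein_approx(f, n, x):
    return sum(f(nu / n) * comb(n, nu) * x**nu * (1 - x)**(n - nu)
               for nu in range(n + 1))

f = lambda t: sin(pi * t)
grid = [k / 200 for k in range(201)]
for n in (5, 20, 80):
    sup_err = max(abs(f(x) - bernstein_approx(f, n, x)) for x in grid)
    print(n, sup_err)   # the sup-norm error decreases as n grows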

A more general statement for a function with continuous k-th derivative is

\( {\left\| B_n(f)^{(k)} \right\|}_\infty \le \frac{ (n)_k }{ n^k } \left\| f^{(k)} \right\|_\infty \text{ and } \left\| f^{(k)}- B_n(f)^{(k)} \right\|_\infty \to 0 \)

where additionally

\( \frac{ (n)_k }{ n^k } = \left( 1 - \frac{0}{n} \right) \left( 1 - \frac{1}{n} \right) \cdots \left( 1 - \frac{k - 1}{n} \right) \)

is an eigenvalue of \( B_n \); the corresponding eigenfunction is a polynomial of degree k.
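For k = 2, for example, the degree-2 polynomial \( p(x) = x^2 - x \) can be checked to satisfy \( B_n(p) = \frac{(n)_2}{n^2} p = \left(1 - \frac{1}{n}\right) p \); the Python sketch below is illustrative, with an arbitrary n and a few test points.

from math import comb

def bernstein_op(f, n, x):
    return sum(f(nu / n) * comb(n, nu) * x**nu * (1 - x)**(n - nu)
               for nu in range(n + 1))

p = lambda t: t**2 - t           # a degree-2 eigenfunction candidate
n = 10
eig = (1 - 0 / n) * (1 - 1 / n)  # (n)_2 / n^2 = 1 - 1/n
for x in (0.1, 0.5, 0.9):
    print(abs(bernstein_op(p, n, x) - eig * p(x)) < 1e-12)   # True at each test point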

Proof

Suppose K is a random variable distributed as the number of successes in n independent Bernoulli trials with probability x of success on each trial; in other words, K has a binomial distribution with parameters n and x. Then we have the expected value E(K/n) = x.

By the weak law of large numbers of probability theory,

\( \lim_{n \to \infty}{ P\left( \left| \frac{K}{n} - x \right|>\delta \right) } = 0 \)

for every δ > 0. Moreover, this relation holds uniformly in x, which can be seen from its proof via Chebyshev's inequality, taking into account that the variance of K/n, equal to x(1-x)/n, is bounded from above by 1/(4n) irrespective of x.

Because ƒ, being continuous on a closed bounded interval, must be uniformly continuous on that interval, one infers a statement of the form

\( \lim_{n \to \infty}{ P\left( \left| f\left( \frac{K}{n} \right) - f\left( x \right) \right| > \varepsilon \right) } = 0 \)

uniformly in x. Taking into account that ƒ is bounded (on the given interval) one gets for the expectation

\( \lim_{n \to \infty}{ E\left( \left| f\left( \frac{K}{n} \right) - f\left( x \right) \right| \right) } = 0 \)

uniformly in x. To this end one splits the sum for the expectation in two parts. On one part the difference does not exceed ε; this part cannot contribute more than ε. On the other part the difference exceeds ε, but does not exceed 2M, where M is an upper bound for |ƒ(x)|; this part cannot contribute more than 2M times the small probability that the difference exceeds ε.

Finally, one observes that the absolute value of the difference between expectations never exceeds the expectation of the absolute value of the difference, and that E(ƒ(K/n)) is just the Bernstein polynomial \( B_n(f)(x) \).
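The identity E(ƒ(K/n)) = \( B_n(f)(x) \) underlying this argument can be illustrated with a small simulation; the sample function, the values of n and x, and the number of trials below are arbitrary choices for the sketch.

import random
from math import comb

def bernstein_poly(f, n, x):
    return sum(f(nu / n) * comb(n, nu) * x**nu * (1 - x)**(n - nu)
               for nu in range(n + 1))

random.seed(0)
f, n, x, trials = (lambda t: t * (1 - t)), 30, 0.4, 100_000
# Draw binomial samples K and average f(K / n) over the trials.
samples = [sum(random.random() < x for _ in range(n)) for _ in range(trials)]
mc_estimate = sum(f(k / n) for k in samples) / trials
print(mc_estimate, bernstein_poly(f, n, x))   # the two values nearly agree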

See for instance [1].

See also

Bézier curve
Polynomial interpolation
Newton form
Lagrange form

Notes

^ L. Koralov and Y. Sinai, "Theory of probability and random processes" (second edition), Springer 2007; see page 29, Section "Probabilistic proof of the Weierstrass theorem".

