In number theory, Lochs' theorem is a theorem concerning the rate of convergence of the continued fraction expansion of a typical real number. A proof of the theorem was published by Gustav Lochs in 1964.[1]

The theorem states that for almost all real numbers in the interval (0,1), the number m of terms of the number's continued fraction expansion that are required to determine the first n places of its decimal expansion behaves asymptotically as follows:

\( \lim_{n \rightarrow \infty} \frac{m}{n} = \frac {6 \ln 2 \ln 10}{ \pi^2} \approx 0.97027014 \) (sequence A086819 in OEIS).[2]
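A quick numerical sketch of this behaviour, assuming the mpmath Python library for high-precision arithmetic and taking \( \pi - 3 \) as a stand-in for a "typical" number: since the m-th convergent \( p_m/q_m \) satisfies \( |x - p_m/q_m| < 1/q_m^2 \), it determines roughly \( n \approx 2 \log_{10} q_m \) decimal places, so the ratio m/n can be compared with the constant above.

    import math
    from mpmath import mp

    mp.dps = 2000                # working precision in decimal digits
    x = mp.pi - 3                # a number in (0, 1) taken here as "typical"

    lochs = 6 * math.log(2) * math.log(10) / math.pi ** 2   # ~ 0.97027014

    # Extract partial quotients a_1, a_2, ... of x and track the convergent
    # denominators via the recurrence q_m = a_m * q_{m-1} + q_{m-2}.
    q_prev, q_cur = 0, 1         # q_{-1} = 0, q_0 = 1
    y = x
    for m in range(1, 501):
        a = int(mp.floor(1 / y))
        y = 1 / y - a
        q_prev, q_cur = q_cur, a * q_cur + q_prev
        if m % 100 == 0:
            # |x - p_m/q_m| < 1/q_m^2, so the convergent pins down about
            # n ~ 2*log10(q_m) decimal places of x.
            n = 2 * math.log10(q_cur)
            print(f"m = {m:3d}   m/n = {m / n:.5f}   (Lochs' constant = {lochs:.5f})")

For m in the low hundreds the printed ratios should already lie within a few percent of 0.9703, although the theorem itself only speaks of the limit.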

As this limit is only slightly smaller than 1, it can be interpreted as saying that each additional term in the continued fraction representation of a "typical" real number increases the accuracy of the representation by approximately one decimal place. The decimal system is the last positional system for which each digit carries less information than one continued fraction quotient; going to base 11 (changing \( \ln 10 \) to \( \ln 11 \) in the equation) makes the above value exceed 1.
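Concretely, replacing \( \ln 10 \) with \( \ln 11 \) gives \( \frac{6 \ln 2 \ln 11}{\pi^2} \approx 1.0104 \), so in base 11 a single digit already carries slightly more information than a typical continued fraction quotient.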

The reciprocal of this limit,

\( \frac{\pi^2}{6 \ln 2 \ln 10} \approx 1.03064083 \) (sequence A062542 in OEIS),

is twice the base-10 logarithm of Lévy's constant.
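This follows from the closed form of Lévy's constant, \( \gamma = e^{\pi^2 / (12 \ln 2)} \): \( 2 \log_{10} \gamma = \frac{2}{\ln 10} \cdot \frac{\pi^2}{12 \ln 2} = \frac{\pi^2}{6 \ln 2 \ln 10} \).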

References

[1] Lochs, Gustav (1964), "Vergleich der Genauigkeit von Dezimalbruch und Kettenbruch" [Comparison of the accuracy of decimal fractions and continued fractions], Abhandlungen aus dem Mathematischen Seminar der Universität Hamburg (in German), 27: 142–144, doi:10.1007/BF02993063, MR 0162753.
[2] Weisstein, Eric W., "Lochs' Theorem", MathWorld.
