Published by
Springer-Verlag London, 2001,
x + 164 pages, hardcover, ISBN 1-85233-417-7.
Reviewed in J Sci Expl, AMS Notices, Folha de S. Paulo, Comp Rev, New Scientist.
This essential companion volume to Chaitin's highly successful books The Unknowable and The Limits of Mathematics, also published by Springer, presents the technical core of his theory of program-size complexity, also known as algorithmic information theory. (The two previous volumes are more concerned with applications to meta-mathematics.) LISP is used to present the key algorithms and to enable computer users to interact with the author's proofs and discover for themselves how they work. The LISP code for this book is available at the author's Web site together with a Java applet LISP interpreter.
``No one has looked deeper and farther into the abyss of randomness
and its role in mathematics than Greg Chaitin. This book tells you
everything he's seen. Don't miss it.''
--- John Casti, Santa Fe Institute, Author of Gödel: A Life of Logic
``The Limits of Mathematics contains unconventional, new and
challenging reading at all levels, laymen and experts alike, the only
prerequisite being the willingness to question and, if necessary,
abandon long-held beliefs and prejudices.''
--- Karl Svozil, Complexity magazine
``According to Gregory Chaitin, `some mathematical statements are true
for no reason, they're true by accident'. The Unknowable is his
extremely readable exposition of the randomness at the heart of
arithmetic.''
--- Marcus Chown, New Scientist, December 1999
``[The Unknowable] presents three milestone achievements in
a clear, understandable, hands-on way.''
--- Cristian S. Calude, Mathematical Reviews
Go to the LISP interpreter Web page and use cut and paste to enter and run the LISP source code that's given below.
"Let's suppose that the infinite binary sequence r isn't strong Chaitin random. In other words, that there is a k for which there are infinitely many n-bit prefixes of r such that..."
[Why must two n-bit strings α, β always have mutual information H(α:β) ≥ H(n) − c? Well, using the fact that H(γ) = H(n) + H(γ|n) + O(1) for any string γ from which the length n can be computed,

H(α:β) = H(α) + H(β) − H(α,β)
= (H(n) + H(α|n)) + (H(n) + H(β|n)) − (H(n) + H(α,β|n)) + O(1)
= H(n) + H(α|n) + H(β|n) − H(α,β|n) + O(1).

But obviously

H(α,β|n) ≤ H(α|n) + H(β|n) + O(1)

(relativised subadditivity). So

H(α:β) ≥ H(n) − c,

which was to be proved.]
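The identity behind the first step, mutual information = H(α) + H(β) − H(α,β), has an exact counterpart for Shannon entropy, where everything is computable. The sketch below is not the book's LISP code and says nothing about program-size complexity; it is just a small, hedged illustration of the same identity on a made-up empirical distribution (the `pairs` data is invented for the example).

```python
import math
from collections import Counter

def shannon(counts):
    """Shannon entropy in bits of the empirical distribution given by a Counter."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy paired observations of two correlated bit-string variables A and B.
pairs = [("00", "01"), ("00", "01"), ("11", "10"), ("11", "10"),
         ("00", "01"), ("11", "11"), ("00", "00"), ("11", "10")]

h_a = shannon(Counter(a for a, b in pairs))   # H(A)
h_b = shannon(Counter(b for a, b in pairs))   # H(B)
h_ab = shannon(Counter(pairs))                # H(A,B)

# Same shape as the algorithmic identity: I(A:B) = H(A) + H(B) - H(A,B).
mutual = h_a + h_b - h_ab
print(round(mutual, 4))
```

Subadditivity, H(A,B) ≤ H(A) + H(B), holds here too, so `mutual` is never negative; in the algorithmic setting the analogous relativised inequality is exactly what yields the H(n) − c lower bound above.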