VII. Mathematics in the Third Millennium?

[Based on my talk on ``Mathematics in the third millennium'' at Tor Nørretranders' fabulous Mindship institute, Copenhagen, summer of 1996. Also based on the interview with me conducted by Guillermo Martínez and published June 1998 in the Buenos Aires newspaper Página/12.]


Synopsis

Is math quasi-empirical? (Again!) I don't believe there should be an abrupt discontinuity between how mathematicians work and how mathematical physicists work—it should be a continuum of possibilities.

Randomness & entropy in physics versus lack of structure defined via program-size complexity: Boltzmann, individuals versus ensembles. Wolfram: maybe the universe is like π, pseudo-random!

Mathematical discovery: Discovery versus formal reasoning, Euler versus Gauss, Polya's Mathematics and Plausible Reasoning, reading Euler's Opera Omnia as a child!

Biological complexity, evolution & the origin of life!? My complexity is too hard to increase. Wolfram: because of the ubiquity of universality, maybe evolution is easy!

Nature is a cobbler, un bricoleur (a tinkerer). Contrast biology with elementary number theory.

Material from Guillermo Martínez interview. Beauty of mathematics, simple, powerful, elegant ideas. Is math becoming like biology, messy, complex?

Complicated contemporary physics. No simple equations, no hydrogen atom. Now it's many-body statistical physics. Even fundamental physical theory is like that: the quantum field vacuum is a hotbed of activity. Joke from a book on many-body problems on the progress of physics, i.e., how many bodies does it take to have a problem?


The Beauty of Mathematics

When I was young, one of the things that attracted me a great deal was the beauty of mathematics. I had similar feelings when I read and understood a beautiful mathematical idea as when I saw a beautiful painting, a beautiful woman, or a graceful ballerina. Human society might be a mess, life a chaotic tragedy, but I could escape into the beautiful, clear, sharp, inhuman light of elementary number theory, of the prime numbers, where a few simple powerful elegant ideas were what counted, not power, not violence, not money!

I recall that at school I was good at subjects that required reasoning, not memorization. I was very good at math and physics; everything could be deduced from the basic principles. I was bad at French; there were not enough simple powerful unifying principles.

Take a good look at biology! It's a complicated mess. Are there laws of biology in the same sense as there are laws of physics? Nature is a cobbler, nature patches and reworks biological organisms, they're a mess, but they work, they survive! That's natural selection for you!

I liked physics as well as math. Look at the Bohr model for the hydrogen atom, or at the Schrödinger equation. A few simple equations explained it all!

Well, a funny thing has been happening. Math has been getting more complicated. Look at the immense computer proof-by-cases for the four-color theorem. [That's the assertion that with four colors you can paint any map on the plane in such a way that adjacent countries have different colors.] Look at the human-generated but still monstrous classification of all simple groups: ten thousand pages of proofs written by many, many mathematicians! [Roughly speaking, simple groups play the same role in group theory that the primes play in number theory. For understandable explanations of the proof of the four-color theorem and of the classification of the simple groups, see L.A. Steen, Mathematics Today—Twelve Informal Essays.]

And look at contemporary physics. Now you don't do theoretical physics writing down a simple equation and solving it analytically in closed form, like you did when I was a child. Now it's complicated computer models that you simulate on the computer to see how they behave... [See for example G.W. Flake, The Computational Beauty of Nature—Computer Explorations of Fractals, Chaos, Complex Systems, and Adaptation.]

Complicated contemporary physics! No simple equations, no hydrogen atom. Now it's many-body statistical physics. Even fundamental physical theory is like that. Even the quantum field vacuum is a hotbed of activity. Here's a joke, from a book on many-body problems, on the progress of physics, as measured by how many bodies it takes to have a problem.

Richard Mattuck, in the book he modestly refers to as Feynman Diagrams for Idiots (the official title is A Guide to Feynman Diagrams in the Many-Body Problem), sums up the progress of physics like this. How many bodies does it take to have a problem? In Newtonian physics, it was three bodies. Two gravitating point masses you can solve exactly in closed form; three, no. In general relativity two bodies get you in trouble. For a single mass point, you have the neat Schwarzschild solution, also known as the black hole. But for two bodies, it's complicated numerical work on the computer... Now in quantum field theory, even zero bodies is too much! Because the quantum mechanical vacuum is very complicated, it's a seething sea of creation and annihilation of virtual particles... You can do perturbation expansions to make estimates, but exact closed-form solutions? Forget it! [For an understandable explanation of quantum field theory, see R.P. Feynman, QED—The Strange Theory of Light and Matter... Let me give a particularly dramatic—but more technical—example. Following what's called the lattice gauge theory approach, my colleague Don Weingarten built a massively parallel supercomputer just in order to do Monte Carlo estimates (estimates via statistical sampling) of Feynman path integrals (sums over all histories) in QCD (quantum chromodynamics, the theory of quarks and gluons). Each computation took about a year!]

So what will the mathematics of the future be like? Will there be wonderful new simple powerful ideas, or will things be messy and complicated as in biology? In that case, new kinds of scientific personalities will be needed to do this new kind of mathematics...

Well, it's a funny thing, but if you look back at the work of Gödel, Turing, and my own that I've presented here, this was already happening... You can already clearly see the beginning of a new kind of mathematics, a very different kind of mathematics, one that is more complicated, one that is in a way more like biology...

Here's why I say this...


A new complicated mathematics?

My approach is, in a way, just as complicated as Gödel's and Turing's, except that the complications are different. For Gödel, it's the internal structure of his axiomatic system and his primitive recursive definitional schemes and his Gödel numbering that's complicated. For Turing, it's the universal Turing machine interpreter program, which he spells out in his 1936 paper. And for me it's the LISP interpreter (which corresponds to Turing's complicated universal machine), which you don't see, and the definition of the LISP language, the size of the programmer's manual, which you do see. In my case, the complications are like an iceberg, most of which is below the water!

Peano arithmetic + first-order logic as a formal axiomatic system, the code for Turing's universal machine, the interpreter for my LISP... These are very strange kinds of mathematical objects, completely different from traditional mathematical objects... Look at the primes, at the Riemann ζ function ζ(s), they're so simple... Look at a workable formal axiomatic system, at Turing's universal machine, at a LISP interpreter, they're so complicated...
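Just to give the flavor of the kind of object I'm talking about, here is a toy sketch in Python (not my LISP, and far, far smaller than any real interpreter) of an evaluator for a minimal LISP. Even stripped to the bone like this, compare how much machinery it takes with how little it takes to define the prime numbers!

```python
# Toy evaluator for a minimal LISP. S-expressions are nested Python lists,
# symbols are strings, and the empty list [] doubles as "false"/nil.
# Supported special forms: quote, atom, eq, car, cdr, cons, cond, lambda.

def eval_sexp(e, env):
    if isinstance(e, str):                        # a symbol: look it up
        return env[e]
    op = e[0]
    if op == 'quote':                             # (quote x) -> x, unevaluated
        return e[1]
    if op == 'atom':                              # 't' iff the value is a symbol
        return 't' if isinstance(eval_sexp(e[1], env), str) else []
    if op == 'eq':                                # equality test, 't' or []
        return 't' if eval_sexp(e[1], env) == eval_sexp(e[2], env) else []
    if op == 'car':                               # first element of a list
        return eval_sexp(e[1], env)[0]
    if op == 'cdr':                               # remainder of a list
        return eval_sexp(e[1], env)[1:]
    if op == 'cons':                              # prepend an element to a list
        return [eval_sexp(e[1], env)] + eval_sexp(e[2], env)
    if op == 'cond':                              # (cond (test value) ...)
        for test, value in e[1:]:
            if eval_sexp(test, env) == 't':
                return eval_sexp(value, env)
        return []
    if op == 'lambda':                            # a lambda evaluates to itself
        return e
    # Otherwise it's an application: evaluate the operator and the arguments,
    # bind the parameters, then evaluate the body in the extended environment.
    f = eval_sexp(op, env)
    args = [eval_sexp(a, env) for a in e[1:]]
    new_env = dict(env)
    new_env.update(zip(f[1], args))
    return eval_sexp(f[2], new_env)

# ((lambda (x y) (cons x y)) (quote a) (quote (b c)))  evaluates to  (a b c)
print(eval_sexp([['lambda', ['x', 'y'], ['cons', 'x', 'y']],
                 ['quote', 'a'], ['quote', ['b', 'c']]], {}))
```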

So in a way, in all three cases, Gödel, Turing, and I, we already have a new ``biological'' complicated mathematics, the mathematics of the third millennium, or at least of the 21st century. [As a child I used to dream that I was in the far future, in a library, desperate to see how it had all turned out, desperate to see what science had achieved. And I would take a volume off the shelf and open it, and all I could see were words, words, words, words that made no sense at all... Writing this book brings back long-forgotten thoughts and the unusual lucidity I experience when my research is going well and everything seems inevitable.]


Integrative themes: Information, Complexity, Randomness

In a way, these three words really sum up and tie together an immense complicated scientific and technological paradigm shift at the end of this century and of this millennium. They sum up the new zeitgeist, the new spirit of these times.

Look at DNA, it's biological information... Look at the new field of quantum computing and quantum information theory... Look at the title of my colleague Rolf Landauer's 1991 paper in Physics Today: ``Information is physical''...

Look at how complicated computer hardware and especially software is becoming... At the megabytes and megabytes of code that one is now accustomed to having, and that you need to have, just to use a computer...

Look at the human genome project, it's so much information, a huge database of it, many huge databases... And one needs new software technology to organize it, to search it, to use it... [See D.S. Robertson, The New Renaissance—Computers and the Next Level of Civilization, for a deep information-theoretic analysis of the four different levels of civilization associated with speech, reading and writing, the printing press, and the PC, the Internet and the Web. According to Robertson, the key feature of each of these steps forward has been a substantial increase in the amount of information that can be stored, remembered and processed by the human race. And each of these jumps in information-processing power is associated with major social change.]

Look at artificial intelligence. I think it's happening, I think we're half-way there, we just don't realize it. People used to think, AI pioneers used to think, that they just needed a handful of great ideas, Nobel-prize-winning level ideas, and they would understand how human intelligence works and how to create an artificial intelligence. Instead we're getting chess playing, speech recognition and synthesis, etc., by accretion, by summing the work of an entire planet of hardware and software engineers... It's not a few fundamental new ideas, it's megabytes and megabytes of complicated software that is gradually developing and evolving...

Look at some recent speculations on the nature of consciousness [D.J. Chalmers, The Conscious Mind—In Search of a Fundamental Theory, G.R. Mulhauser, Mind Out of Matter—Topics in the Physical Foundations of Consciousness and Cognition, T. Nørretranders, The User Illusion—Cutting Consciousness Down to Size] where information theory is discussed. Consciousness does not seem to be material, and information is certainly immaterial, so perhaps consciousness, and perhaps even the soul, is sculpted in information, not matter. As science fiction writers are fond of pointing out, ``soul'' is to ``body'' as ``program'' is to ``computer.''

The conventional view is that matter is primary, and that information, if it exists, emerges from matter. But what if information is primary, and matter is the secondary phenomenon! After all, the same information can have many different material representations in biology, in physics, and in psychology: DNA, RNA; DVDs, videotapes; long-term memory, short-term memory, nerve impulses, hormones. The material representation is irrelevant, what counts is the information itself. The same software can run on many machines.

Information is a really revolutionary new kind of concept, and recognition of this fact is one of the milestones of this age.

That really sums up what I have to say, what I see as the moral of the story... What I see as the broad picture... But I can't resist a few more detailed final remarks... Some final words...


Afterthoughts...

What is Ω? It's just the diamond-hard distilled and crystallized essence of mathematical truth! It's what you get when you compress tremendously the coal of redundant mathematical truth... And is math quasi-empirical? (Not that again!) Let me state my position as modestly and uncontroversially as possible: I don't believe there should be an abrupt discontinuity between how mathematicians work and how mathematical physicists work—it should be a continuum of possibilities. No proof is totally convincing. There are just differing degrees of credibility. [I am not saying that math and physics are one and the same; math deals with the world of mathematical ideas and physics deals with the real world; math is quasi-empirical and physics is empirical. In particular there is a big difference between the two subjects that was drummed into me when I was a guest in Gordon Lasher's theoretical physics group. That's the fact that physicists know that no equation is exact—they're merely good approximations in which one ignores lower-order effects, in which one ignores perturbations that operate on smaller scales. As Jacob Schwartz so beautifully put it in an essay in M. Kac, G.-C. Rota, and J.T. Schwartz's anthology Discrete Thoughts—Essays on Mathematics, Science, and Philosophy, physicists know that all equations are approximate, so they prefer short, robust, unrigorous proofs that are stable under perturbations, to long, fragile, rigorous proofs that are not stable under perturbations (but that are perfectly okay in pure mathematics)... I also strongly recommend Gian-Carlo Rota's anthology Indiscrete Thoughts. Among his other fascinating observations on doing mathematics, Rota makes the point that some mathematicians are mental athletes who like finding new proofs and settling old problems, while others are dreamers who prefer to find new definitions and create new theories. I definitely belong to the latter class! Rota makes the point that these two extremely different kinds of mathematical personalities sometimes view each other with thinly veiled contempt!... By the way, these remarks cost Rota some friends. And that's another difference between mathematics and physics: physicists have a sense of humor, mathematicians don't!]
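In symbols, by the way, Ω is just the halting probability of a fixed universal self-delimiting computer U, with each program p that halts contributing 2^{-|p|} to the sum:

\[
\Omega \;=\; \sum_{p \,:\, U(p)\ \mathrm{halts}} 2^{-|p|},
\qquad 0 < \Omega < 1 .
\]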

I should state here that AIT has an intimate connection with physics. Charles Bennett and others have used program-size complexity instead of Boltzmann entropy in their discussion of Maxwell's demon. Two very readable books on this subject were published in 1998. See T. Nørretranders, ``Maxwell's Demon,'' chapter 1 in The User Illusion—Cutting Consciousness Down to Size, and H.C. von Baeyer, Maxwell's Demon—Why Warmth Disperses and Time Passes. Let's compare randomness and entropy in physics with lack of structure as defined via program-size complexity. It's just individuals versus ensembles! In statistical physics you have Boltzmann entropy, which measures how well probability is distributed over an ensemble of possibilities. It's an ensemble notion. In effect, in AIT I look at the entropy/program-size of individual microstates, not at the ensemble of all possible microstates and the distribution of probability across the phase space. For more on the history of these ideas, see David Ruelle's delightful book Chance and Chaos.
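If you want the contrast in formulas (nothing new here, just the standard definitions side by side), Boltzmann–Gibbs entropy is a property of a probability distribution over an ensemble of microstates, while program-size complexity is a property of one individual object:

\[
S \;=\; -k \sum_i p_i \ln p_i
\qquad\text{versus}\qquad
H(x) \;=\; \min\{\, |p| : U(p) = x \,\}.
\]

The first is meaningless until you specify an ensemble and its probabilities p_i; the second needs only the single string x and a fixed universal computer U.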

Some final words on Stephen Wolfram's fascinating, and unfortunately unpublished, ideas. Wolfram has a very different view of complexity from mine. In my view π is not at all complex, but to Wolfram it's infinitely complex, because it looks completely random. Wolfram's view is that simple laws, simple combinatorial structures, can produce very complicated unpredictable behavior. π is a good example. If you didn't know where they came from, its digits would look completely random. In fact, Wolfram says, maybe the universe contains no randomness, maybe everything is actually deterministic, maybe it's only pseudo-randomness! And how could you tell the difference? The illusion of free will arises because the future is too hard to predict, but it's not really unpredictable. [To Wolfram's exceedingly bright and sharp mind, the idea of indeterminacy, of randomness, of something irrational, that escapes the power of reason, of simple unifying principles, that happens for no reason—and that he will never be able to understand—is totally abhorrent. The ancients' horror of the vacuum becomes a modern horror of randomness. To such a mind, I must appear, because of my belief in randomness, as a muddle-headed mystic!... I'm also reminded of Feynman's fury in a conversation we had near the end of his life when I suggested that there might be wonderful new laws of physics waiting to be discovered. Of course! I told myself later: how could he bear the thought that he wouldn't live to see it?... Science and magic both share the belief that ordinary reality is not the real reality, that something more fundamental is hidden behind everyday appearances. They share a belief in the fundamental importance of hidden secret knowledge. Physicists are searching for their TOE, theory of everything, and kabbalists search for a secret name of God that is the key that unlocks all understanding. In a way the two are allies, for neither can bear the thought that there is no secret meaning, no final theory, and that things may be arbitrary, random, meaningless, incompressible and incomprehensible. For a dramatization of this idea, see D. Aronofsky's 1998 film π. See also G. Johnson, Fire in the Mind—Science, Faith, and the Search for Order, and P. Davies, The Mind of God—The Scientific Basis for a Rational World.]

Wolfram also has some fascinating ideas about biology, the origin of life and evolution. One of my big disappointments, the big disappointment in my scientific life, is that I couldn't use my program-size complexity to make a mathematical theory out of Darwin. [I was strongly influenced by von Neumann. For an early report of von Neumann's ideas, see J.G. Kemeny's 1955 article in Scientific American, ``Man viewed as a machine.'' For a statement by von Neumann himself, see ``The general and logical theory of automata'' in volume 4 of J.R. Newman's The World of Mathematics. For a posthumous account assembled by A.W. Burks, see von Neumann's Theory of Self-Reproducing Automata. For samples of contemporary thought on these matters, see P. Davies, The Fifth Miracle—The Search for the Origin of Life, and C. Adami, Introduction to Artificial Life.] My complexity is conserved, it's impossible to make it increase, which is great if you're doing metamathematical incompleteness results, but hell if you want to get evolution. So I asked Wolfram his thoughts on this matter, and his reply was absolutely fascinating. He has amassed much evidence of the ubiquity of universality. In other words, he's discovered that many, many different kinds of simple combinatorial systems achieve computational universality, and have rich, complicated unpredictable behavior. π is just one example... So what's so surprising about getting life, about getting clever organisms that exhibit rich, complicated behavior, that need it to survive? That's easy to do!!! And I suspect that Wolfram is right, I just want to get a copy of his 800-page book on the subject and be able to read it and think about it at my leisure. I have held its two volumes in my hands, briefly, once, during a fascinating visit to Wolfram's home...

A final, very final, word on mathematical discovery. It's been fun, great fun, for me to work on incompleteness and information. But incompleteness results are depressing, and formal systems are a drag, a bore... It's much more fun to think about mathematical discovery, about creativity instead of formal reasoning, and about L. Euler instead of C.F. Gauss. Why Euler versus Gauss? Because Euler published every step in his reasoning, in the discovery process, while Gauss carefully removed all the scaffolding from around his beautiful buildings... Gauss's papers, I'm told, are very hard to read... P.G.L. Dirichlet traveled with Gauss's masterpiece Disquisitiones Arithmeticae everywhere for years. But Euler is a delight to read...

I still remember my childish joy at reading the story of how Euler made some of his great mathematical discoveries in Polya's two-volume Mathematics and Plausible Reasoning. As a child I was lucky enough to get permission to wander through the stacks at the Columbia University mathematics library, and I was fascinated by some of the collected works that I found there: N.H. Abel's collected works (Oeuvres Complètes), which are small but wonderful, and in beautiful old French, and Euler's collected works (Opera Omnia), which are anything but small! They're still being gradually published; Euler left so many manuscripts...

I remember what a joy it was to read a series of papers on number theory by Euler and see the evidence that led him to conjecture a result, and how he gradually filled in the holes in a proof until he finally had a complete proof! What a treat it was for me to translate one of his number theory papers written in Latin... I knew no Latin, I just had a Latin dictionary—but I did know plenty of number theory! Or his paper in French explaining his discovery of a recursion formula for σ(n), the sum of the divisors of the natural number n, what a delight!
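Since I've mentioned it, here is that recurrence of Euler's for σ(n) in a small Python sketch (modern shorthand of mine, of course, not Euler's notation). The numbers 1, 2, 5, 7, 12, 15, ... that drive it are the generalized pentagonal numbers k(3k−1)/2 and k(3k+1)/2, the signs alternate in pairs + + − −, and whenever a term σ(0) appears it is replaced by n itself:

```python
from functools import lru_cache

def sigma_direct(n):
    """sigma(n): the sum of the divisors of n, computed directly."""
    return sum(d for d in range(1, n + 1) if n % d == 0)

@lru_cache(maxsize=None)
def sigma_euler(n):
    """sigma(n) = sigma(n-1) + sigma(n-2) - sigma(n-5) - sigma(n-7)
                + sigma(n-12) + sigma(n-15) - ...
    using the generalized pentagonal numbers k(3k-1)/2 and k(3k+1)/2;
    a term sigma(0) is replaced by n itself, negative arguments are dropped."""
    total, k = 0, 1
    while True:
        hit = False
        for g in (k * (3 * k - 1) // 2, k * (3 * k + 1) // 2):
            m = n - g
            if m < 0:
                continue
            hit = True
            sign = 1 if k % 2 == 1 else -1        # signs: + + - - + + - - ...
            total += sign * (n if m == 0 else sigma_euler(m))
        if not hit:
            break
        k += 1
    return total

# The recurrence agrees with the direct computation -- the kind of numerical
# evidence Euler himself piled up before he had a proof.
assert all(sigma_direct(n) == sigma_euler(n) for n in range(1, 80))
```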

So no more depressing incompleteness results! No more cold, dry formal axiomatic systems! A sensual, joyful theory of discovery, of creation, that's what I want! My theorems may be pessimistic, but I'm an optimist! [For more evidence of this, see my interview in J. Horgan, The End of Science—Facing the Limits of Knowledge in the Twilight of the Scientific Age.] Maybe you or some other reader of this book can find a way to do it! After all, all it takes is ``guts and imagination''! [A memorable phrase from N.C. Chaitin's 1962 film The Small Hours.]