Technology

How Newton got real

March 28, 2014

The Abel Prize, awarded annually by the Norwegian Academy of Science and Letters for achievements in mathematics, is widely regarded as the “maths Nobel”. Only the Fields Medal, awarded just once every four years and restricted to mathematicians under the age of 40, can claim a comparable status. But probably no prize of this stature presents reporters with a greater challenge than the Abel when it comes to explaining to the lay person what it has been given for. Some of the recent citations have a terseness that warns you to ask no further if you don’t want to hear something incomprehensible: “for his revolutionary contributions to geometry”, “for his vast and lasting impact on the theory of numbers”. Such is the world of modern maths.

But this year is a little different, because the work of the 2014 Abel laureate is not an exploration of the abstractions of pure numbers, shapes or curves. Yakov Sinai of Princeton University is more of a mathematical physicist, which is to say that his research focuses on problems linked to physics: to real objects and the ways they behave. Better still, these problems are concerned with a very fundamental and easily grasped aspect of that behaviour: how things move, the remit of the discipline known as dynamics.

That has always been an issue at the core of physics, going right back to Aristotle. His explanations tended towards the teleological, not to say the tautological: things moved as they did because it was in their intrinsic nature to do so. That, for example, is why stones fall to earth. All this changed in the seventeenth century with Galileo and Isaac Newton, who established the basic laws of motion. Galileo explained that objects change their trajectory only in response to forces acting on them: in the absence of any such force, an object moving in a straight line will keep doing so forever. Newton put those ideas into mathematical form, and in particular he worked out what the picture means for gravity. It is gravity that makes stones accelerate as they fall to earth, that holds the moon in orbit around the earth, and that keeps the earth and the other planets in orbit around the sun.
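To get a feel for what Newton’s scheme delivers, here is a minimal numerical sketch (my own illustration, not anything from the prize citation): a single planet tugged towards the sun by an inverse-square force and stepped forward in time. Working in astronomical units and years, a body launched with Earth-like speed simply keeps circling, exactly as the laws say it should.

```python
import math

# A toy Newtonian orbit (illustrative parameters): one planet, an
# inverse-square pull towards the sun, and a simple kick-drift integrator.
# In these units the sun's gravitational parameter G*M is 4*pi^2 AU^3/yr^2.

GM = 4.0 * math.pi ** 2          # gravitational parameter, AU^3 / yr^2
x, y = 1.0, 0.0                  # start 1 AU from the sun
vx, vy = 0.0, 2.0 * math.pi      # speed for a circular orbit: 2*pi AU / yr
dt = 0.001                       # time step, years

for _ in range(int(1.0 / dt)):               # integrate for one year
    r3 = (x * x + y * y) ** 1.5
    ax, ay = -GM * x / r3, -GM * y / r3      # acceleration from gravity
    vx, vy = vx + ax * dt, vy + ay * dt      # update velocity ("kick")
    x, y = x + vx * dt, y + vy * dt          # update position ("drift")

# After one simulated year the planet should be back close to where it
# started, still about 1 AU from the sun.
print(f"x = {x:.3f} AU, y = {y:.3f} AU, r = {math.hypot(x, y):.3f} AU")
```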

One problem with Newton’s picture, which he himself partly appreciated, is that his laws of motion get very complicated once you apply them to more than two objects that feel each other’s force. With just three bodies, the motion can go haywire: even though you can write down the equations of motion exactly, you can’t predict how the bodies will move far into the future, because the tiniest imprecision in our knowledge of the initial motions is eventually amplified into utterly different outcomes. This is the property known as deterministic chaos: “deterministic” because there’s nothing random driving it, “chaos” because the long-term behaviour nonetheless defies prediction.
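You can see the flavour of that sensitivity in a far simpler deterministic rule than Newton’s equations. The sketch below uses the logistic map, a textbook example of chaos chosen purely for illustration (it is not one of the systems discussed here): two starting values differing by one part in a billion soon give trajectories that have nothing to do with one another.

```python
# Deterministic chaos in miniature: iterate the logistic map x -> 4x(1-x)
# from two starting points that differ by a billionth, and watch the gap grow.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x_a, x_b = 0.400000000, 0.400000001   # initial states differing by 1e-9

for step in range(1, 61):
    x_a, x_b = logistic(x_a), logistic(x_b)
    if step % 10 == 0:
        # By around step 30 the billionth-of-a-unit gap has grown to order one.
        print(f"step {step:2d}: x_a = {x_a:.6f}  x_b = {x_b:.6f}  "
              f"gap = {abs(x_a - x_b):.2e}")
```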

And the fact is that nearly all real-world problems of dynamics, such as the flows of the atmosphere that create the weather, involve many, many interacting objects. Even the solar system, populated as it is with several planets and moons, is a chaotic system: we can’t predict what the orbits will be a few hundred million years into the future. What’s more, such movements are often at the mercy of random influences too: a cricket ball flying through the air, say, is buffeted by unpredictable, essentially random eddies. There’s similar randomness in systems that don’t exactly move but which change over time according to similar “dynamical” principles of interaction, cause and effect, such as stock-market prices or the sizes of animal populations.

Sinai has worked on this sort of “complex dynamics”, where the presence of chaos or randomness, or both, makes it impossible to predict the future behaviour exactly. In such cases one is forced to make predictions statistically, which is of course exactly how we handle the weather or climate: estimating the chances of this or that outcome. Sinai was inducted into this area by his doctoral supervisor at Moscow State University in the 1950s: Andrey Kolmogorov, one of the greatest unsung mathematical physicists of the twentieth century, who laid the modern foundations of probability theory.

So Sinai’s research has really been about adapting the elegance of Newtonian mechanics to the messiness of the real world. One of his key insights, developed with Kolmogorov, was to show that even for systems where the trajectories of the component objects are unpredictable, there is a quantity that can be defined and measured rather precisely. It’s called the Kolmogorov-Sinai entropy, and it measures the unpredictability of the behaviour: the greater this entropy, the lower the predictability. The quantity relates the theory of complex dynamics to the theory of information that Claude Shannon developed in the 1940s, which invokes an analogous entropy to measure how much information is contained in a noisy signal. Shannon, an electrical engineer working at Bell Laboratories in New Jersey, was motivated by the demands of the nascent telecommunications industry.
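For a rough feel of Shannon’s side of the analogy, here is a minimal sketch (the example distributions are made up purely for illustration): the entropy of a source is the sum of −p log₂ p over the probabilities p of its symbols, and it is largest when the next symbol is hardest to guess.

```python
import math

# Shannon entropy in a few lines (toy distributions, purely illustrative):
# H = -sum(p * log2(p)) counts the average number of bits of genuine
# surprise that each symbol from a source carries.

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit, as unpredictable as it gets
print(shannon_entropy([0.9, 0.1]))   # heavily biased coin: about 0.47 bits
print(shannon_entropy([1.0]))        # certain outcome: 0 bits, no surprise at all
```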

The Kolmogorov-Sinai entropy reveals something surprising about systems with complex dynamics. You might naively expect those subject to some randomness to be quite different from those for which you can write down exact equations describing the motions, with no randomness at all to interfere (and which are therefore deterministic). But in fact the fundamental distinction is between two classes of purely deterministic systems. Some have zero entropy, meaning that you can predict them precisely as far ahead as you like, while others do have a certain amount of entropy and thus unpredictability—most notably, those that are chaotic. This makes them in some ways more akin to the systems ruffled by randomness.
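To make that distinction concrete, the sketch below (the two systems are illustrative choices, not Sinai’s own examples) coarse-grains two deterministic rules into streams of 0s and 1s and estimates how much fresh Shannon entropy each stream produces per step. For a steady rotation around a circle the per-step estimate falls away as you examine longer and longer blocks of symbols, and would tend to zero in the limit, whereas for the chaotic logistic map it stays pinned near ln 2, about 0.69 nats per step.

```python
import math
from collections import Counter

# An illustrative comparison of two deterministic systems: coarse-grain each
# trajectory into 0/1 symbols and ask how much new Shannon entropy the symbol
# stream generates per step. A regular rotation is perfectly predictable, so
# its estimate shrinks with block length; the chaotic logistic map keeps
# producing fresh unpredictability, roughly ln 2 nats per step.

def symbol_stream(step, n=200_000):
    """Iterate a map on [0, 1) and record 0 if the state is < 0.5, else 1."""
    x, out = 0.123456789, []
    for _ in range(n):
        x = step(x)
        out.append(0 if x < 0.5 else 1)
    return out

def block_entropy(seq, k):
    """Shannon entropy (nats) of the observed distribution of length-k blocks."""
    counts = Counter(tuple(seq[i:i + k]) for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

rotation = lambda x: (x + 0.6180339887) % 1.0   # regular motion: zero entropy
logistic = lambda x: 4.0 * x * (1.0 - x)        # chaotic: positive entropy

for name, step in [("rotation", rotation), ("logistic", logistic)]:
    seq = symbol_stream(step)
    # New entropy per step, estimated from blocks of length 1 to 4.
    rates = [block_entropy(seq, k + 1) - block_entropy(seq, k) for k in range(1, 5)]
    print(name.ljust(9), " ".join(f"{r:.3f}" for r in rates))
```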

In this way, Sinai’s work has helped to show that processes that look like they should be different actually have a deep similarity. American mathematician Jordan Ellenberg, who presented the award address in Oslo on Wednesday, says that it’s rather like the way Newton showed that the downward motion of an apple and the circular motion of the planets are in fact expressions of the same basic law of gravity.

Ellenberg told me that the distinguishing feature of Sinai’s work has been an ability to get to the core of what a problem is all about: to approach physics with the rigour and the “soul of a mathematician.” This prompted Ellenberg to make an observation that speaks more broadly to an important difference between maths and the other sciences—that in maths, “a good definition is as important as a good theorem.” In other words, sometimes understanding can come not from solving a problem so much as from formulating the problem in the right way. That’s something very evident from Albert Einstein’s work too, not to mention Newton’s—and in this sense, folk like Kolmogorov and Sinai are probably more directly the descendants of those towering figures than are many of today’s more practically inclined physicists.