The idea of a variable speed of light, championed by an angry young scientist, could one day topple Einstein's theory of relativity

by Paul Davies / April 20, 2003
Einstein’s famous equation E=mc² is the only scientific formula known to just about everyone. The “c” here stands for the speed of light. It is one of the most fundamental of the basic constants of physics. Or is it? In recent years a few maverick scientists have claimed that the speed of light might not be constant at all. Shock, horror! Does this mean the next Great Revolution in Science is just around the corner?
Well, maybe. According to one of those scientists, Portuguese-born, London-based João Magueijo, cracks are appearing in Einstein’s theory of relativity – the cornerstone of our present understanding of space, time and gravitation. In “Faster than the Speed of Light” (Heinemann) he describes his personal journey through this controversial and emotionally supercharged field.
Magueijo got into the subject while puzzling over the smoothness of the universe, a property illustrated by the recent results from the satellite WMAP (Wilkinson Microwave Anisotropy Probe), showing a snapshot of the universe just 380,000 years after the big bang (see picture). Significantly, the infant cosmos appears uniform in temperature and density to about one part in 100,000.
The mystery here is that light can have travelled no more than 380,000 light years by that epoch, yet different patches of the sky shown in the snapshot might be millions of light years apart. As no force or influence can travel faster than light, these various patches can never have been in causal contact. So why are they so similar?
Magueijo has an answer. Perhaps light travelled much faster in the past, enabling forces to propagate more quickly. In that case, widely separated regions of the universe could have pushed and pulled on each other, and thus smoothed out their differences. The theory is easy to state, but it flies in the face of much accepted wisdom. For a start, cosmologists already line up behind a very different explanation for cosmic smoothness, called inflation. According to this scenario, the universe jumped in size by an enormous factor during the first split second. Any primordial irregularities would then have been stretched to oblivion. WMAP lends strong support to inflation.
More worryingly, constancy of the speed of light is central to the theory of relativity and the other areas of modern physics that this theory penetrates. Physicists will give up this key set of ideas only after a bitter struggle. Magueijo describes just what a struggle he had personally in convincing his colleagues to take the varying speed of light (or VSL) theory seriously.
Although his work, in collaboration with Andreas Albrecht, was eventually published in a leading scientific journal, Magueijo’s theory might have been shrugged aside were it not for some remarkable astronomical observations compiled by John Webb of the University of New South Wales. They centre on the quality of the light coming from distant objects called quasars. Split into a spectrum, quasar light is slashed through with dark lines created when intervening clouds of cold gas selectively absorb certain wavelengths, or colours. On careful inspection, some of these lines decompose into closely spaced sub-lines, called “fine structure.” What Webb and his colleagues noticed is that the fine structure in the quasar light is subtly different from its laboratory counterpart. One explanation for this difference is that the speed of light fell slightly between 10 and 6 billion years ago. The effect is tiny – a few parts in a million – but potentially of huge significance.
Taken on their own, Webb’s results do not foreshadow the collapse of textbook physics. History shows that when a crack appears in an established conceptual framework, one of two consequences may follow. One is that the old theory gets tweaked a bit to accommodate the new findings, but the basic framework remains intact. The other is the disintegration of the whole edifice, which gets replaced by something radically new. The latter is what happened when Einstein replaced Newton’s ideas of space, time and gravitation with his own theory of relativity.
Is it now Einstein’s turn to be toppled? It could be. In spite of the entrenched position of the theory of relativity, few physicists would claim that it is the last word. In particular, it cannot readily be united with the other great product of 20th-century physics – quantum mechanics. All attempts to develop a consistent quantum description of gravitation involve fudging Einstein’s original theory in some way. So sooner or later observational flaws will probably appear in the theory of relativity. It may be that Webb’s quasar data already hint at trouble ahead.
The idea that the speed of light might vary from time to time, or even place to place, isn’t new. Several years ago John Moffat of the University of Toronto published a theory along these lines, but his work seems to have been overlooked. For some reason, the flurry of recent papers on the subject has stirred up a scientific hornet’s nest, provoking a strong negative reaction from mainstream physicists.
I have some personal experience of this. Last August I published a short note in Nature, co-authored with two colleagues from the University of New South Wales. We applied the VSL idea to the theory of black holes, to see what the implications would be for the laws of thermodynamics. The paper appeared in the northern hemisphere silly season, and so it received extensive media coverage, in spite of its modest scope and technical nature. Almost immediately I was deluged with hostile denunciations from colleagues, in some cases couched in tones of barely concealed anger. What was all the fuss about? This was, after all, just a calculation.
Part of the answer may be pique that our work was given such prominence. But there must be more to it than that. Belief in the absoluteness of nature’s laws is a deeply rooted part of the scientific culture: to do science, you have to have faith that something is sacrosanct and utterly dependable. For historical reasons, the absolute constancy of the speed of light forms part of the bedrock on which the scientific edifice is built. Attempts to meddle with that bedrock produce an almost visceral response.
Among the angry outbursts we received was one legitimate concern. Some physicists say that the statement “the speed of light has changed” isn’t wrong, but meaningless. The problem arises from the fact that speed is a quantity that normally has units attached. For example, the speed of light is measured to be 300,000 km per second, 186,000 miles per second or one light year per year, depending on which distance and time units you choose. These different numbers merely reflect convention. To say that c has varied invites the response, varied with respect to what? One could explain the Webb results, for example, by saying that distances or clock rates have changed, leaving c constant by convention.
To avoid such ambiguities over units, one must make up a ratio of two speeds. It so happens that physics gives us another fundamental unit of speed, composed of two basic quantities – the unit of electric charge on the electron and Planck’s constant of quantum mechanics. When this other speed is divided by the speed of light the result is a pure number, known as the fine-structure constant; its value is close to 1/137. It is this ratio – not the speed of light as such – that Webb’s results indicate may have changed slightly.
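As a check on the arithmetic, the fine-structure constant can be computed directly from standard CODATA values. In the Gaussian units implicit in the text, e²/ħ has the dimensions of a speed and dividing it by c gives the pure number; in SI units the same ratio picks up a factor of 4πε₀, as sketched here:

```python
import math

# Standard CODATA values in SI units; e and c are exact by definition.
e = 1.602_176_634e-19        # elementary charge, coulombs
hbar = 1.054_571_817e-34     # reduced Planck constant, joule-seconds
c = 299_792_458.0            # speed of light, metres per second
eps0 = 8.854_187_8128e-12    # vacuum permittivity, farads per metre

# Fine-structure constant: dimensionless, independent of unit conventions.
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)

print(1 / alpha)   # ~137.036
```

Because alpha is a pure number, a measured drift in it is meaningful in a way that a drift in c alone, quoted in some particular units, is not.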
On their own, the astronomical observations can say no more than that. However, if some other branch of physics, such as gravitation, is brought into the picture, then additional, independent units of speed can be defined. But in any given case, there will be a choice as to which quantities are regarded as fixed in order to define the system of units, and which will be deemed to vary.
Magueijo is well aware of this criticism, and he addresses it competently in his book. He argues that if the theory that predicts a variation of the fine-structure “constant” is simple when cast in terms of a varying speed of light, but very complicated if cast in terms of a varying electric charge, then it makes sense to use the former description. There is thus an element of judgement involved in the way the ideas are presented. But this has always been the case in physics, where simplicity and elegance play a guiding role in formulating new theories. To my mind, this puts the matter to rest. It is perfectly meaningful to claim that the speed of light was greater in the past, whether or not it is actually the case.
There are reasons other than purely scientific ones why Magueijo’s book will raise hackles. Not only is his theory confrontational; so are his opinions about the scientific community. He devotes a lot of space to attacking the peer-review system, university administration, journal policy and some of his colleagues. Although a few of these jibes are probably justified, and young researchers may empathise with them, I found his remarks too coarse and flippant. I sympathise with the author for the rough ride he has received in trying to get his ideas across to a sceptical scientific community, but I believe that on balance the quality control mechanisms for theoretical physics work well. It is only right and proper that unconventional new ideas should be battle-tested before gaining currency; that makes them all the stronger if they survive.
Many readers will enjoy this book’s irreverence and iconoclastic message. The work of Magueijo and others could herald the start of a major shake-up in physics, or it may turn out to be a blind alley. We shall have to await the outcome of future research to see. But even if the speed of light is constant after all, there is much fascinating physics and cosmology here, plus some unusual perspectives into the way professional science is conducted.