For half a century the Big Bang theory has dominated attempts to explain the origin of the Universe. It is now being discredited by 16-billion-year-old stars. John Maddox explains the new crisis in cosmology.

by John Maddox / February 20, 1996
Published in the February 1996 issue of Prospect Magazine
Cosmology seems to have staggered from one crisis to another since Galileo was accused of heresy in 1633. Three decades ago, cosmology was recovering from the great dispute between two Cambridge astronomers, Sir Fred Hoyle and Sir Martin Ryle, over whether the Universe had a beginning. Then, just a few weeks ago, Professor Stephen Hawking was telling a packed audience at London’s Albert Hall that the ultimate fate of the Universe cannot be foretold. What is cosmology telling us?

The latest crisis has blown up because of the doubt cast on the idea that the Universe, which means Everything, began between 10 and 20 billion years ago. “In the beginning there was the void,” remember? Then suddenly everything appeared in a flash: a huge hot flash whose energy brought with it all the matter now in the world and the momentum that keeps the whole contraption expanding. This neat echo of Genesis has been the standard view of how our world began. It is known to cosmologists and the world at large as the Big Bang.

The theory has been in the public domain since 1946, when George Gamow, a Russian émigré to the US, put it forward as an explanation of how the Universe seems to be expanding. Later, with his colleagues Ralph Alpher and Robert Herman, Gamow argued that at all stages in its history the Universe must have had a particular temperature. At the outset, the temperature would have been indistinguishable from infinity, whatever that may mean. Now, despite the temperature at the surface of stars such as the Sun (6,000 degrees) or in their interiors (say 15 million degrees), the temperature of the Universe as a whole has fallen below the boiling point of liquid helium, the coldest fixed point of temperature known.

Without Gamow’s Big Bang there would have been no explanation for what remains the most striking feature of the Universe: its apparently continuing expansion. So much had been established by Edwin Hubble, the US astronomer, as long ago as 1929.
He had trained the 100-inch Hooker telescope at Mount Wilson in California on some hundreds of the galaxies of stars lying outside the Milky Way, and had found that the light from the fainter (and so presumably more distant) galaxies was redder than it should have been. The classical interpretation is that the reddening arises in much the same way in which the pitch of a motorcar horn falls as it passes by, so that the reddening of the light is a measure of the speed of recession of the galaxies. The Big Bang made sense of Hubble’s expanding Universe: before Gamow, the prevailing view was of a Universe with neither beginning nor end, one that did not change fundamentally with the passing of time. For the past 50 years the Big Bang has swept all before it.

It has two bright feathers in its cap, of which the chief is that it explains the persistence in the real world of what are called the “light elements,” which include “heavy hydrogen” together with particular isotopes of elements such as helium, lithium, beryllium and boron. These are truly primordial: they are components of the material with which stars such as the Sun and galaxies such as the Milky Way were endowed at their formation. Other elements, carbon, iron and uranium for example, are formed as stars run their course and eventually scatter their thermonuclear debris through interstellar space. The point is that the light elements would be consumed, rather than created, in such processes. But in the course of the Big Bang there should have been a point, perhaps a few minutes after the very beginning, when the temperature was just right for the formation of the light elements. The other evidence for the Big Bang is more spectacular.
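The Doppler reasoning above can be put in one line: for modest shifts, the fractional change in wavelength, usually written z, multiplied by the speed of light gives the recession speed. A minimal sketch, with illustrative wavelengths (these are not Hubble’s actual data):

```python
# Doppler interpretation of the redshift: v ~ c * z for small z,
# where z = (observed - emitted) / emitted wavelength.
C_KM_S = 299_792.458  # speed of light in km/s

def recession_speed_km_s(observed_nm, emitted_nm):
    """Recession speed implied by a shifted spectral line (small-z approximation)."""
    z = (observed_nm - emitted_nm) / emitted_nm
    return C_KM_S * z

# Example: a hydrogen line emitted at 656.3 nm but observed at 661.0 nm
# implies a recession speed of roughly 2,150 km/s.
print(round(recession_speed_km_s(661.0, 656.3)))
```

The approximation holds only for speeds well below that of light, which covers the nearby galaxies Hubble measured.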
Just over 30 years ago, two workers at the Bell Telephone Laboratories, Arno Penzias and Robert Wilson, discovered that the whole of space is filled with microwave radiation corresponding to a temperature of just 2.7 degrees above the absolute zero of temperature (itself some 273 degrees centigrade below the melting point of ice, and about four degrees below the boiling point of liquid helium). That was quickly hailed as proof of the reality of the Big Bang. The microwave radiation (called the cosmic microwave background) is supposed to be, quite literally, a relic of the time, perhaps a few hundred thousand years after the Big Bang and long before stars or even galaxies could have formed, when radiation and matter (mostly atoms of hydrogen) were intimately interacting with each other. With the continued expansion of the Universe, a point would have been reached when the hydrogen became transparent to the radiation, which then filled the Universe as it does today. Since then the temperature of the radiation has been gradually falling, precisely as the original Big Bang theory predicted it would.

So what exactly is the problem with Gamow’s apparently perfect theory? One sign that everything may not be well is the doubt about the age of the Universe. What can be made of the statement that the age is “between about 10 and 20 billion years”? If the temperature of the microwave background is now known to one part in 250 (which is the case), why cannot the age be known more exactly? The difficulty lies in the measurement of distance in the Universe, on which cosmologists have repeatedly stubbed their toes. At least in principle, the speed of a distant galaxy can be measured from its red-shift. Telling how far away it may be is much more difficult. Yet without an estimate of its distance, there is no way of telling for how long it has been receding from our own galaxy, and so no way of telling when all the galaxies were clumped together at the Big Bang.
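The arithmetic behind the age estimate is simple once speed and distance are both in hand: the expansion age is roughly distance divided by speed, which works out to the reciprocal of the Hubble constant. A sketch, with the two Hubble-constant values chosen to illustrate the “young” and “old” ends of the then-current debate:

```python
# Naive expansion age: if a galaxy at distance d recedes at speed v = H0 * d,
# then the time since everything was together is d / v = 1 / H0.
MPC_IN_KM = 3.086e19       # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16 # seconds in a billion years

def hubble_age_gyr(h0_km_s_per_mpc):
    """Expansion age in billions of years, assuming a constant expansion rate."""
    h0_per_s = h0_km_s_per_mpc / MPC_IN_KM  # convert H0 to units of 1/second
    return (1.0 / h0_per_s) / SECONDS_PER_GYR

print(round(hubble_age_gyr(80), 1))  # ~12.2 Gyr for a "fast" Hubble constant
print(round(hubble_age_gyr(50), 1))  # ~19.6 Gyr for a "slow" one
```

In a matter-filled, decelerating universe the true age is only about two thirds of this naive figure, which is how a Hubble constant near 80 km/s per megaparsec pushes the age down towards the troublesome 10 billion years or less.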
Hubble’s own first estimate of the age of the Universe, assuming the brightness of a galaxy (or, rather, its faintness) is a proxy for its distance, was a mere 2 billion years, less than half the age of the earth. This was later corrected to 5 billion years, embarrassingly like the age of the Sun. Only in the late 1950s was that estimate changed again, when astronomers appreciated that distant galaxies would seem fainter than they are because of such things as the absorption of their light by interstellar dust.

What has now put the cat among the pigeons is, in essence, the more accurate measurement of the distance of three galaxies whose recession speed can be measured accurately. It is still a needle-in-a-haystack measurement, depending on finding in the other galaxies individual stars which rhythmically change in their brightness, over days or weeks, with a period that depends only on their intrinsic brightness. By comparing that intrinsic brightness with the measured brightness, the distance can be inferred. Variable stars like these, known as Cepheids, which are in the later stages of their evolution, have been used to map our own and nearby galaxies. When the Hubble Space Telescope was designed, one of its chief objectives was to find such variable stars in distant galaxies so as to extend the accurately-known distance scale to more distant regions of the Universe, although, ironically, the first measurement came 18 months ago from the Canada-France-Hawaii Telescope, a ground-based instrument in Hawaii. Since then, the Hubble telescope has produced two further measurements of variable stars in other galaxies. The shock is that the measurements are all consistent with each other and that collectively they point to a young Universe, one more like 10 billion years old or even less. This is a serious embarrassment, because there are some stars in our own galaxy which are at least 12 billion years old, perhaps even 16 billion. How can the Universe, which is Everything, be younger than some of its parts?
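The brightness comparison behind the distance measurement is just the inverse-square law, conventionally expressed through astronomers’ magnitudes. A sketch with made-up numbers (the magnitudes below are illustrative assumptions, not the actual Hubble or Hawaii measurements):

```python
import math

def distance_parsecs(apparent_mag, absolute_mag):
    """Distance from the inverse-square law, via the distance modulus
    m - M = 5 * log10(d / 10 parsecs)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Illustrative case: suppose a variable star's period implies an intrinsic
# brightness of absolute magnitude M = -6, but it appears at magnitude m = 25.
# The inverse-square law then places it at about 16 million parsecs, roughly
# the distance of the Virgo cluster of galaxies.
d = distance_parsecs(25.0, -6.0)
print(f"{d / 1e6:.0f} million parsecs")  # prints "16 million parsecs"
```

The period-brightness relation supplies the intrinsic brightness; everything else is geometry, which is why these stars make such good milestones.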
This is the new crisis in cosmology. So far, the reaction of cosmologists has been muffled. Those who have never believed in the Big Bang are naturally pleased at the discomfiture of their opponents. More orthodox observers, quite properly, say that this first clutch of measurements does not clinch the issue of the age of the Universe, although they do acknowledge that the new measurements at the very least mean that the amount of matter in the Universe must be less than has recently been thought.

Attention is now turning to the stars whose great age, between 12 and 16 billion years, is at the root of the contradiction which has come to light. These are exceptional stars not just because they are very old, but also because they belong to structures in our own galaxy called globular clusters, some of them containing hundreds of thousands of stars, packed much more closely together than most galactic stars. Is it, perhaps, that these structures are survivors from an earlier epoch of the Universe, predating the Big Bang 10 billion years ago? That would fit with a theory put forward two years ago by Sir Fred Hoyle, the idea being that there have been several successive bouts of matter-creation during an indefinitely long history of the Universe. The link with Genesis is severed.

But can the creation of matter be consistent with physics as now understood? That depends on what is meant by creation. It is now accepted that particles of matter can appear out of thin air. One of Hawking’s claims on our attention is that he has used this idea to show that, if there are black holes consisting of matter so dense that light cannot escape from them, the regions around them should be sources of radiation and even matter. All this is still speculative; so, for that matter, is cosmology as a whole. It will be a rich irony if Hoyle, cast into the wilderness for his scepticism about the Big Bang in the 1950s, is now proved half right.
The more probable outcome will be much more careful measurement of the distances to other galaxies near our own and a more radical account of how the Universe is constructed and where it came from than any yet advanced. Watch this space, but be prepared to wait some time.