One week in October a rather alarming issue of the Economist was brought to my door, bearing the headline: “How science goes wrong.” The accompanying leader article made a compelling case for reform in the way that science is done. What the article failed to point out, however, is that reform is already well underway.
The Economist’s basic arguments are that science has become complacent, and is no longer self-correcting. People are publishing in a hurry, and the peer review process is no longer fit for purpose. The reality is not so simple. What’s happening is that the nature, scale and pace of science are changing more rapidly than ever before. We shouldn’t be surprised that the time-honoured procedures that have served science well for centuries have to evolve, and they are. Experimental particle physics provides a good case study.
Let’s take a look at academic publishing. The reality of life for a young researcher in a modern particle physics experiment is that time scales are long. At Cern, for example, whole cohorts of doctoral students graduated through work undertaken on the Large Hadron Collider experiments before those experiments had even started to take data. And even now that we’ve got data, it’s not necessarily the external publication record that will make one name from a collaboration of 3,000 stand out. These experiments have therefore had to find other ways of recognising excellence, and they do so by using factors such as a student’s record of publications internal to the collaboration, or activity in its working groups. Particle physics may be ahead of the curve here, but in many other areas of science, teams are also getting larger.
What of the peer review process? It’s not perfect, but there’s no better way to critically examine a piece of work than to have it dissected by experts. In many areas of science, post-publication peer review is complementing traditional pre-publication peer review through the submission of articles to pre-print servers. Everything that comes out of our big collaborations at Cern is internally peer reviewed before submission to traditional peer reviewed journals, and everything is also posted on a pre-print server, where its success is determined by post-publication peer review. That’s a high level of scrutiny.
One very important point made by the Economist concerns the value of a negative result, and here I could not agree more. Arguably, this year’s most significant result from Cern was a negative one. By measuring an extremely rare process, the LHCb experiment has managed to rule out a large number of theoretical models for new physics. This kind of result doesn’t generate the same media attention that comes with a discovery, but by focusing theoretical attention in the right place it can be very positive for the evolution of the field.
The Economist highlights one example from my field: the notorious saga of the pentaquark. The point the magazine makes is that science is not quite as objective, self-critical and self-correcting as it would like to think. Having lived through the saga, I’d say it demonstrates the opposite. In a nutshell, a series of apparent observations of an exotic new particle ultimately turned out to be wishful thinking, caused largely by the fact that the analyses were not conducted blind. Humans are very good at seeing patterns where they think patterns should be, and for this reason it’s vital that research procedures keep human intervention to a minimum: that they conduct the analysis blind. I don’t want to claim causality, but blinding techniques in today’s experiments are far tighter than they were at the time of the pentaquarks. And the story was ultimately resolved by constant questioning within the scientific community: precisely the self-correction that the Economist says is missing.
Change in particle physics began a long time ago, and we’ve had time to adapt. I don’t claim we’re perfect—we’re human after all—but we’re conscious of the pitfalls, and we’re addressing them. A prerequisite for being a good scientist is the ability to doubt, to question, and to change the way you think if the evidence—properly verified through repeatable experimentation—tells you to. And there are always enough doubters to compensate for the occasional chaser of rainbows.
I could go on to talk about Cern’s drive for open access publishing and making data available to anyone who wants to study them. But the main point I want to leave you with is this: modern science is evolving. As the Economist says, “it has changed the world beyond recognition, and overwhelmingly for the better.” It’s my firm conviction that it will continue to do so, with as self-critical an eye as it has always had and new tools adapted to the modern scientific endeavour.