How we "count" migration

People are very hard to count, especially in a free society. The failings in Britain's system of counting migration reflect the inherent flaws in any mass sampling system. Although the system could be improved, it will always be tough to predict future trends
November 25, 2007

In late September, the Office for National Statistics (ONS) sharply increased its projection for annual net immigration to Britain—from 145,000 to 190,000. Few numbers light the political touchpaper quite like these. The increase is based partly on data for two new years (2004 and 2005), when net migration was at record levels, and partly on methodological changes. In practical terms, it means that in 2005-06 there was an estimated net inflow of about 500 people a day; in future, the ONS assumes, there will be a few more.

Since scepticism about immigration often expresses itself as cynicism about the statistics, it is worth looking more closely at where the numbers come from. A common assumption is that we count immigrants, or try, and that if we get it wrong, it is because some slip under the wire, through illegal channels, official incompetence or conspiracy. The truth is that we don't even do that. Few statistics about the economy and society are the result of a count. The migration number comes from a sample, and it is a good illustration of sampling's limitations. Some people, hearing how it is done, don't quite believe it.

The International Passenger Survey

The story begins with the International Passenger Survey (IPS). The IPS is a sample of all people entering and leaving the country—including migrants, who are defined as those intending to settle or leave for a year or more. The IPS interviews about 300,000 people a year on boats, at the channel tunnel and, mostly (about 86 per cent), at airports.


You may have seen them in action on a sea crossing. In matching blue blazers, a survey team finds its ferry, stands at the top of the various stairs to the passenger decks and selects a sample by scribbling a description of every tenth person aboard: the rucksacked, the refugees, the suited or the carefree. The plan is to use this description to pick them out later, during the voyage, for a gentle interrogation, hoping for no strange languages; hoping no one's in the shower, asleep under the newspaper, or in a change of clothes; hoping they will be willing to answer questions. For the only way to know whether they are migrants rather than tourists, on business or a booze cruise, is to ask them. This gives the origin of the data an air of pantomime: blazers go a-hunting for oblivious passengers, for the woman in the paisley skirt, the bearded man in the biker T-shirt. And so the tides of people seeking new lives or fleeing old, heading for work, marriage or retirement, enter the official statistics when politely cornered, skulking by slot machines, halfway through a croissant, off to the ladies' loo. (Airports are more straightforward. The IPS teams work in shifts across the day. Passengers are counted as they cross a predetermined line and every nth is interviewed; n varies across sites but is supposedly "never more than 67.")

The sample of those questioned constitutes about 0.2 per cent of total traffic, or 1 in 500. About 1 per cent of this 0.2 per cent are migrants. A complex weighting is then applied to gross the sample contacts up to total passenger numbers supplied by the Civil Aviation Authority (air), the department for transport (sea) and Eurostar/Eurotunnel (tunnel).
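
To see the mechanics in miniature, here is a rough sketch in Python of the two steps just described: interviewing every nth passenger, then grossing the migrants found up to the known passenger total. Every number in it is invented, and the real IPS weighting, which adjusts for site, shift, route and response rates, is far more elaborate.

    import random

    random.seed(42)

    # All figures invented for illustration; the real weighting is more complex.
    TOTAL_PASSENGERS = 500_000   # known from the CAA and port operators
    SAMPLING_INTERVAL = 67       # interview every nth passenger crossing the line
    TRUE_MIGRANT_RATE = 0.01     # roughly 1 per cent of travellers are migrants

    # Simulate the travelling population: True marks a migrant.
    passengers = [random.random() < TRUE_MIGRANT_RATE
                  for _ in range(TOTAL_PASSENGERS)]

    # Systematic sample: every 67th passenger, from a random starting point.
    start = random.randrange(SAMPLING_INTERVAL)
    sample = passengers[start::SAMPLING_INTERVAL]

    # Gross up: each interviewee stands for the passengers not stopped.
    weight = TOTAL_PASSENGERS / len(sample)
    estimate = sum(sample) * weight

    print(f"interviews: {len(sample)}, migrants found: {sum(sample)}")
    print(f"estimated migrants: {estimate:,.0f} vs actual: {sum(passengers):,}")

Run it repeatedly with different seeds and the estimate scatters around the true figure; that scatter is the sampling error the rest of this piece worries about.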

Large though the total number of interviews is in absolute terms, the sheer volume of passengers and the small proportion of migrants among them mean that estimates for any subgroup can rest on very few interviews indeed. The registrar general of Scotland, for example, says that inward migration to Scotland in 2004 was estimated from about 100 contacts.
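
Why 100 contacts is so thin can be put on the back of an envelope. If the number of migrant contacts behaves roughly like a Poisson count, and if (unrealistically) every contact carries the same weight, the relative uncertainty is about one over the square root of the number of contacts:

    import math

    # Back-of-envelope sketch: assumes Poisson-like counts and a single
    # uniform weight per contact, both simplifications of the real survey.
    contacts = 100   # migrant interviews behind the Scottish estimate
    weight = 300     # invented: the people each contact stands for

    estimate = contacts * weight
    se = math.sqrt(contacts) * weight   # Poisson: sd of a count of n is about sqrt(n)
    low, high = estimate - 1.96 * se, estimate + 1.96 * se

    print(f"estimate: {estimate:,}")
    print(f"rough 95% interval: {low:,.0f} to {high:,.0f}")

A hundred contacts means a relative error of around 10 per cent on the count alone, before any of the survey's other frailties enter into it.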

Mervyn King, governor of the Bank of England—in charge of setting interest rates, and with reason to want to know the size of the workforce—has described the survey as "hopelessly inadequate." In November 2006, he told the treasury select committee that estimates of eastern European migration had been "surprisingly low," adding: "We do not have any particularly accurate method in the UK of measuring migration, either gross or net."

Some of the detail of King's evidence is worth repeating: "In 2003," he told the committee, "I think there were 516,000 passenger journeys between the UK and Poland. That is both in and out… Almost all of them were to Gatwick, Heathrow or Manchester. Over the next two years the number of passenger journeys between the UK and Poland went from 516,000 to about 1.8m. Almost all of that increase was in airports other than Heathrow, Gatwick and Manchester. So, from a figure of 11,000 in 2003, there were more than 1m passenger journeys between the UK and Poland only two years later—in airports outside Heathrow, Gatwick and Manchester. Why does this matter? Because most of the people handing out the questionnaires for the IPS were at Heathrow, Gatwick and Manchester."

By definition, we are not good at predicting surprises. Although some would argue that we ought not to have been surprised by the influx of eastern Europeans after EU enlargement, the ONS does not regard itself as being in the business of putting speculative numbers on political change. What it mostly does is to roll forward current trends.

So new patterns of movement can leave old samples looking biased. And from time to time, other glitches turn up in the methodology. For example, for years it had been policy not to survey overnight passenger traffic. This was not thought to cause any bias, until someone wondered about the early-morning arrival of flights from Hong Kong. Since January 2007, IPS shifts at Heathrow's terminal 1 have begun at 5am. Then there is the fact that the IPS does not interview people travelling to and from Ireland, and so relies on Irish data to supplement its own.

The figures are also clouded by doubt about people known as "switchers." To be defined as a migrant, you have to declare an intention to stay (or to leave) for 12 months or more. But plenty of passengers might say they plan to come for a year and then change their minds, or say one thing and mean another; others, who plan only a season's work or some travel, may later decide to stay, perhaps to marry or study; others still come for seasonal work, go home for a while, then come back. In the past, switchers from the EU were assumed to number zero. In future, of those who tell the IPS that they are "possible migrants," it will be assumed that 50 per cent will stay.
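
The arithmetic of that change is simple enough to sketch, with invented figures standing in for the real ones:

    # Invented figures, illustrating the new assumption that half of those
    # recorded by the IPS as "possible migrants" will in fact stay.
    definite_migrants = 180_000   # declared an intention to stay 12 months or more
    possible_migrants = 40_000    # undecided at the point of interview

    old_estimate = definite_migrants                           # switchers counted as zero
    new_estimate = definite_migrants + 0.5 * possible_migrants

    print(f"old: {old_estimate:,}  new: {new_estimate:,.0f}")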

One final curiosity in a list that could go on: imagine, for whatever reason, a sudden increase in the number of people who start coming to Britain for stays of less than 12 months. They would not be defined as immigrants, and therefore not appear in the migration statistics, even though the person on the street might notice that there seemed to be more foreigners around (and they would have access to some public services). The ONS is trying to find ways of estimating this "short-term migration."

One common reaction to these assorted flaws in the data is scorn. But it would be hard to accuse the ONS of incompetence or negligence. One glance at the volumes of guidance issued to IPS staff reveals a fastidious attention to detail. They are warned, for example, that when a passenger declares the purpose of a trip to be sport, they must be asked if this is to participate or as a spectator; if as a participant, whether amateur or professional.

The IPS is not the only source of data about migration, and was never in fact designed for this purpose (it was intended to tell us more about tourism and business travel), but it remains the most important. Among the others is the national census, though, being ten-yearly, it is no use for keeping abreast of sudden changes like the Polish influx. Work permits also measure the entry of workers, but only from outside the European Economic Area, and only when people apply for them. Since 2004, the worker registration scheme has served a similar purpose for immigrants from new EU countries. Asylum applications to the home office tell us something about the number of asylum seekers, but have well-documented flaws of their own. The ONS does not produce estimates for illegal migrants.

Another source of data, likely to be used more in future, is the labour force survey, a rolling sample of about 60,000 households each quarter. Migrants are identified by a question, asked annually, about where they were living 12 months ago; those who answer "abroad" are categorised as immigrants. New national insurance numbers are issued to people who seek employment or benefits, but not to dependants, so they can only help determine the right order of magnitude of immigration. Nor do they tell us how many people are leaving: no one is required to give back a national insurance number on departure, so NI numbers give a cumulative total, not a net flow.
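
The difference between a cumulative total and a net flow deserves a line of invented arithmetic:

    # Made-up numbers: NI registrations track gross arrivals of workers,
    # and nothing is subtracted when earlier arrivals go home.
    years = [
        {"new_ni_numbers": 100_000, "workers_leaving": 80_000},
        {"new_ni_numbers": 120_000, "workers_leaving": 110_000},
    ]

    cumulative_ni = sum(y["new_ni_numbers"] for y in years)
    net_inflow = sum(y["new_ni_numbers"] - y["workers_leaving"] for y in years)

    print(f"NI numbers issued: {cumulative_ni:,}")   # 220,000
    print(f"net inflow of workers: {net_inflow:,}")  # 30,000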

Does all this amount to a scandal? That depends on what you think of the alternatives. Some suggest issuing visas to everyone, EU citizens included. But that would breach the principle of free movement within the EU, and it would not address the problem of people who overstay their visas. Most visas have never required holders to tell the authorities when they leave, so this would not help us to know the net number remaining. And British citizens, who are a big part of migration (both ways), would be missed.

Another alternative would be to attempt actually to count migration rather than sample it. This would mean scanning every passport of every passenger, in and out of the country, at all airports and other border points. But even this would give us only a retrospective figure: we would know who came in and who among them left again, and only at the end of 12 months could those born overseas who remained be classified as migrants.

This is not perfect, nor is it a trivial undertaking. There are about 90m passenger journeys to and from Britain each year. Some countries do not issue machine-readable passports, so the data has to be punched in manually, causing delay. Some people have two passports, legally or otherwise, entering on one and leaving on another. Some fail to leave because they die. And it would require near 100 per cent coverage of all border points. By 2014, Britain expects to have a system something like this with so-called "e-borders," if the IT comes in on schedule.
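
What such counting implies is, in miniature, a record-matching exercise. The toy sketch below uses invented data, assumes one passport per person and ignores every complication just listed; it simply classifies, in retrospect, anyone who stayed for 12 months or more:

    from datetime import date

    # Toy entry/exit log keyed by passport number; all data invented.
    entries = {"P001": date(2006, 1, 10),
               "P002": date(2006, 6, 1),
               "P003": date(2006, 2, 20)}
    exits = {"P002": date(2006, 9, 1),    # left after three months
             "P003": date(2007, 4, 5)}    # left after more than a year

    def is_migrant(passport, as_of=date(2007, 6, 30)):
        """Retrospective test: present, or stayed, for 365 days or more."""
        arrived = entries[passport]
        departed = exits.get(passport, as_of)   # no exit record: still here
        return (departed - arrived).days >= 365

    for p in sorted(entries):
        print(p, "migrant" if is_migrant(p) else "short-term visitor")

The point about retrospection falls straight out: the classification can only be made once the 12 months have run, never at the moment the passport is scanned.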

The ONS's private view of the claim that Britain has lost control of its borders is that we never had it. The pressures in the past were simply lighter. The introduction of e-borders will, the ONS believes, give us a better understanding of who's coming and going. But the system will not predict and it will not capture people's intentions. For this reason, it may be just as bad at picking up sudden changes.

The trouble with sampling

Samples are the norm for trying to capture social and economic change in any free market economy. Poverty, employment, productivity, economic growth, profits, trade, the balance of payments, rates of pay… these are but a selection of the basic indicators of life in Britain that are based on samples, not full counts. In fact, of the dozen or so measures on the front page of the ONS website, on a recent day taken at random, only one was a real count: of the most popular babies' names (they're Olivia and Jack).

The rest? Samples and surveys by the score, day in, day out, a background hum of measurement, the stuff of endless news reports and analysis. Routine as they are, the unavoidable weaknesses of samples also make them spanners in the works of Britain's self-understanding. And their errors can be serious. In estimating GDP, for example, it is hard to survey the new parts of the economy that may be growing fastest, since you might not know they exist until later, when you see the tax returns. So we don't particularly try. The result is that the British growth rate is consistently revised upwards, whereas in the US—where they guess this element of new growth—growth is usually revised down.

Should we give up on surveys and samples? No. These are important numbers, we want to know them as best we can, and we can't count everything (though it is worth noting that political will seems lacking even to pay for this much: the ONS budget is about to be cut). Sampling is, in its way, a minor if flawed miracle, allowing us, among other things, to say how many fish there are in the sea, and to conclude with some confidence that cod stocks in the North Sea are dangerously low. Samples are often the best we can do—or are willing to pay for—in an imperfect world. The sensible precaution is to arm ourselves against their imperfection, to accept that they will often be less robust than expected, sometimes disconcertingly so.

Consider population trends. Two projections ten years apart (in 1955 and 1965) produced estimates of the British population for 1995 that differed by more than 20m. The ONS says the 1965-based projection must have assumed that the fertility rate would remain at around three children per woman. In fact, it fell sharply from its 1964 peak to a then-record low of 1.69 in 1977. In the 1965-based projection, over 1.5m births were projected for the year 2000, well over double the actual figure. As shots at the future go, these wouldn't have hit a barn door.
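
The sensitivity to the fertility assumption is easy to reproduce. A deliberately crude sketch, using an invented round number of women and ignoring age structure, mortality and migration entirely:

    # Crude sketch: annual births approximated as women of childbearing age,
    # spread over a 30-year reproductive span, times the assumed fertility rate.
    WOMEN_OF_CHILDBEARING_AGE = 10_000_000   # invented round number
    REPRODUCTIVE_SPAN = 30                   # years, a simplification

    def births_per_year(total_fertility_rate):
        return WOMEN_OF_CHILDBEARING_AGE / REPRODUCTIVE_SPAN * total_fertility_rate

    assumed = births_per_year(3.0)    # roughly the 1965-vintage assumption
    actual = births_per_year(1.69)    # the rate actually reached by 1977

    print(f"projected births: {assumed:,.0f}; at the real rate: {actual:,.0f}")
    print(f"overstatement: {assumed / actual - 1:.0%}")   # about 78 per cent

Assume a rate of three in a world heading for 1.69 and you overstate annual births by nearly 80 per cent; worse, the error compounds as the phantom children grow up and have phantom children of their own.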

Figures at the other end of life have been about as good. This is also a consequence of people's ability to change, in this case by obdurately living longer. Again, such changes might be expected, but the frequency with which we have been surprised by quite how much longer has been astonishing.

Seen in this light, the failure to project the migration statistics with any accuracy looks pretty much par for the course. And the migration numbers have, after all, seen their own handbrake turns. The assumption that Britain would go on losing population on balance, as it did on the trend up to the mid-1980s and again in the early 1990s, was based mostly on a simple projection of past experience.

A projection is not a prediction, of course, and the ONS feels hard done by when it is accused of failing to predict. It doesn't try and it doesn't claim to. Whether it should is another question. So it ought to be unsurprising that the numbers entirely failed to foresee the very sharp increase of the last few years.

The fact that we often get the numbers wrong is not the result of a conspiracy to suppress sensitive trends. People are a devil to count: such complicated creatures, so fickle, so evolving, so many of us. Picking a few and hoping they are representative of the rest, then assuming they behave consistently, is not a foolish policy, even if aspects could be improved.