
To beat Covid-19, we need to use data. Here's how to get it right

Data-driven engineering is not only the beating heart of civilisation—it is the activity that will ultimately allow us to overcome Covid-19

June 15, 2020
A person using the NHS app on a mobile phone, as the UK continues in lockdown to help curb the spread of the coronavirus. Picture date: Monday April 27, 2020.

Data is often described as the “new oil.” Having spent my career dealing with both, I have always been sceptical of the comparison. But they do share one important property: their ability to empower people to get things done. Today it is data, rather than oil, that drives progress in engineering, enabling us to create the tools and systems that allow us to shape our worlds.

Data-driven engineering is not only the beating heart of civilisation—it is the activity that will ultimately allow us to overcome Covid-19. Scientists and doctors will gather data on the virus itself and how our bodies respond to it, but it is engineering that must transform those insights into safe and effective therapies and vaccines that can be delivered at scale.

As we wait patiently for those crucial medical innovations, it is imperative that we use data right now in a different way: to engineer ways to break the virus’s chains of transmission. As lockdowns are eased, we need to switch from blanket, population-wide measures to a more focused approach that can detect new clusters of infection and stop them growing. Some of the data needed to achieve this ambition is about as intimate as it gets: where we’ve been, who we’ve seen, how close we came and how long we stayed.

It should come as no surprise that people are wary about sharing this data. They see huge technology companies using personal data to generate profit, enabling increasingly precise targeting of everything from products to political parties. They also see the abuse of personal data by governments, with facial recognition systems allegedly being used to oppress minority populations in China and elsewhere.

But gathering and collating personal information can yield powerful insights with immense utility in public health and beyond. When I was writing my latest book, Make, Think, Imagine: The Future of Civilisation, I spoke to Nigel Shadbolt, co-founder and chairman of the Open Data Institute, who believes that “the benefit that accrues from my data being amalgamated with many other people’s to provide insights about general conditions is an inarguable thing…it’s very odd that the narrative isn’t around empowerment.”

Shadbolt is right. Huge datasets combined with machine learning could transform our approach to public health, in ways that extend far beyond the immediate challenge of Covid. They could help us to escape an approach to healthcare which is far too reactive, one which biomedical scientist and entrepreneur Craig Venter has condemned, with some justification, as “medieval.” The more data we have at our disposal, the more empowering that data will become. A key challenge, therefore, is to ensure that concerns about privacy do not derail these efforts. I believe we need to focus on two things.

The first is good engineering. Be it a Bluetooth-based app for recording our interactions, a script for human contact tracers to follow, or a way to conduct rapid virus testing, engineered solutions must first and foremost be effective. Can an app, for example, tell when two neighbours are physically close together but separated by a wall? This sounds obvious, but interventions whose usefulness is in doubt will not be used, and will raise questions about whether those gathering our personal data are competent to do so. And, although it should hardly need saying these days, before anyone can be expected to share their data, they must be confident that it will be properly anonymised, encrypted and stored as securely as possible.
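To make the wall problem concrete, here is a minimal sketch of how a contact-tracing app might turn Bluetooth signal strength (RSSI) into a distance estimate, using a standard log-distance path-loss model. The calibration constants and function name are illustrative assumptions, not details of the NHS app or any other system.

# A minimal sketch (illustrative assumptions, not the NHS app's method) of
# estimating distance from Bluetooth received signal strength (RSSI) using
# a log-distance path-loss model.

def estimate_distance_m(rssi_dbm: float,
                        measured_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Rough distance in metres inferred from an RSSI reading.

    measured_power_dbm: assumed RSSI at one metre (device-specific calibration).
    path_loss_exponent: roughly 2 in free space; higher indoors, where walls
    and bodies weaken the signal unpredictably.
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same reading can mean "two metres apart in the open" or "either side
# of a wall" -- the model cannot tell the difference.
for rssi in (-60, -70, -80):
    print(f"RSSI {rssi} dBm -> roughly {estimate_distance_m(rssi):.1f} m")

Real apps combine many such readings over time, but they still struggle with precisely the neighbours-through-a-wall case described above, which is why effectiveness has to be demonstrated rather than assumed.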

The second critical way to allay fears about privacy involves data governance. There must be complete transparency around which organisations can access personal data, when and why. Nobody should have any grounds to suspect that information about their health or location could ever be used as the basis for discrimination or unfair treatment. That means quarantine measures must be consistently and fairly applied, so that there really is a sense of all being “in this together.” Finally, when new data-gathering measures are introduced in times of crisis, the public needs some reassurance that those exceptional powers will be reined in once the immediate danger has passed. In short, to paraphrase Ronald Reagan, if the public is to trust others with their data, they need to be able to verify that it is being used in a limited and acceptable way.

If we can get both the engineering and the data governance right, trust will follow. And that matters, because if 2018’s Cambridge Analytica scandal showed us anything, it is that trust can evaporate, often surprisingly quickly. In day-to-day life, most of us behave online as if we do not care very much about privacy. But we react strongly when something goes wrong.

In that case, we willingly trusted Facebook with our personal data for the sake of a bit more entertainment and convenience. Now we have a chance to trade a modicum of our privacy to help safeguard the health of our families and our communities at large. In the process, we have the opportunity to take an important step towards a more data-driven and enlightened approach to healthcare. If, however, privacy fears cannot be overcome, it will be a failure of engineering, a failure of leadership, and a big blow to human progress.

John Browne is chairman of the Francis Crick Institute and former CEO of BP. His latest book is Make, Think, Imagine: The Future of Civilisation (Bloomsbury)