The government's data protection plans risk prioritising growth over human rights—but it's not too late to change course

The Data Protection and Digital Information Bill has been pushed back to consultation. Any new version must not weaken our vital data rights

December 19, 2022
Photo: Welsh Traveller / Alamy Stock Photo

Until September, the innocuously named but controversial Data Protection and Digital Information Bill—a piece of legislation that EU officials warned put data movements between the UK and EU at risk—was ploughing forward.

Now, the bill has joined the bonfire of unworkable Tory policies that put ideology before country. Ministers have put the bill back to consultation. But the threat hasn’t passed. In November, members of the European parliament described conversations with UK officials over data protection laws as “all about growth and innovation and nothing about human rights”. Anybody who cares about economic growth or human rights should pay attention to what happens next.

When the government introduced the bill to parliament this year, its message was clear. The bill’s aim was to “reduce the burden on businesses”. Protections in the General Data Protection Regulation (GDPR) were “unnecessary”. Data had metamorphosed from a public good into a tool for profit.

The bill threatened to weaken vital rights. For example, it would have become harder to find out what information businesses hold about us. Some organisations would no longer need to have an independent expert Data Protection Officer, use an impact assessment to justify using data where people are at high risk, or even keep adequate records of their own data use—despite the majority of respondents to the government’s consultation on the bill warning against these changes. And it left the door open for the government to use “Henry VIII” powers to wave through data use and automated decision-making permissions that may not be in the interest of citizens. 

This was a “move-fast-and-break-things” philosophy. Ironically, Liz Truss was the one to hit the brakes.

As ministers pause to take a fresh look, they need to recognise that the bill would not have delivered for business as intended. Overstating GDPR’s threat to growth puts existing protections at risk and misdirects attention from where it is needed—tackling the impact of new technologies and delivering on the potential of data as a public good.

The bill proposed a regime to run parallel to GDPR, which itself cost UK businesses billions to adopt in 2018. Ministers claimed that the legislation would be “compatible with maintaining free flow of personal data from Europe”. In reality, the bill ended up as the worst of both worlds.

Businesses that are happy to work only within the UK would have had to decide whether changing things was worthwhile. For those trading overseas, the bill would have added just another layer of cost and complexity—even outside the EU—as the GDPR has become the global default. Even the Market Research Society and the Advertising Association sounded the alarm—and their members might have been among the key beneficiaries of the bill.

A group of disabled people in Manchester earlier this year launched a legal challenge, claiming that an automated process had unfairly put their benefits at risk by earmarking their claims as fraudulent. In the workplace, hiring and firing is being outsourced to algorithms. In the digital classroom, children’s data disappears into a black hole with untold consequences for their futures. Such scenarios can play out wherever there is a David and a Goliath. And we aren’t safe, even if we keep our data locked up. Anybody can be harmed by a decision made using data, even if the data wasn’t their own.

The changes proposed in the bill could make it more difficult to prevent, detect and seek redress for sharp practices. Instead, we should be agreeing and enforcing red lines that our society will not tolerate, and preventing businesses from hiding behind “trade secrets” justifications to keep us in the dark. Moreover, people who manage our data shoulder a huge responsibility. Their decisions can be irreversible and life-changing. We should find ways to professionalise their roles, as in the legal and medical professions.

In addition to these urgent concerns, the bill missed important opportunities. The average person doesn’t have the time and energy to spend their days deciding who should use their data for what purpose, even for the greater good. There could be an important role for a new type of organisation: data rights intermediaries, which have a fiduciary duty to manage data rights on our behalf. The EU legislated earlier this year to recognise their role, but the UK can build on this. These organisations should be not-for-profit and accountable. Data holders should be required to work with them. We also need common standards and infrastructure that put our data at our fingertips.

All of these concepts may feel beyond the realm of the average person. But they are already part of our lives. The Facebook/Cambridge Analytica scandal, which saw the personal data of millions exploited to manipulate our political preferences, showed us the dark side of targeting and how difficult it is to keep track of our data. The Ofqual algorithm, which assigned low-income students grades that lost them university places, showed us how automated decision-making can go wrong. And Elon Musk’s Twitter takeover has shown us how tenuous our data rights are in the hands of a powerful few. Despite this, media and political engagement on this bill has been limited.

Data has enormous potential as a public good. But we’ve not yet landed on a model to realise this potential responsibly. If this bill comes back, we need to roll up our sleeves and develop a clear vision to make it work for those who matter.