Politics

The new Data Reform Bill is a dangerous and unnecessary approach to data protection

Parliament has an opportunity to shape the legal frameworks that govern use of algorithms. But it may be taking us down a different route

June 07, 2022
The Department for Digital, Culture, Media and Sport held a consultation over proposed data protection reforms last year. Photo: Uwe Deffner / Alamy Stock Photo

Our world is increasingly data-driven. Social media giants, insurance companies and governments are collecting and processing personal data on an enormous scale. In some cases, public bodies buy data from private companies to support their decision-making. In 2018, Big Brother Watch revealed that Durham Police was using commercial consumer profiling data to assist its semi-automated risk assessment tool. The data was from the socio-geo-demographic segmentation tool Mosaic, supplied by Experian, which creates groupings and classifications that go into granular detail. The tool makes connections between individual names and groupings—“Abdi” was associated with “Crowded Kaleidoscope”: “multi-cultural” families living in “cramped” and “overcrowded flats,” and “Denise” meant “Low Income Worker”: a “heavy TV viewer” with “few qualifications.” This pilot project ended in 2020. But it may be a sign of what is to come. In one version of our future, public and private entities work more and more closely to harvest more and more personal data, processing it in ways that lead to life-altering decisions. 

Against this backdrop the Data Reform Bill, announced in the Queen’s Speech last month, is of great significance. There is a pressing need for robust data protection laws, to ensure that the use of data is fair, transparent, and accountable. 

The Bill aims to create “a world class data rights regime.” While we currently have limited detail as to what this might mean in practice, it is clear that major reform of the UK General Data Protection Regulation (UK GDPR) and Data Protection Act 2018 is high on the agenda.

One insight into the possible contents of the Bill is the Department for Digital, Culture, Media & Sport’s data consultation, which closed in November 2021. The consultation proposed removing the requirement to undertake Data Protection Impact Assessments (DPIAs); introducing a fee for Data Subject Access Requests (DSARs); reducing the independence of the Information Commissioner’s Office (ICO); and removing the safeguard that protects people against solely automated decision-making.

The reforms have been packaged as a necessary streamlining of data protection laws that will cut red tape, seize the opportunities made possible by Brexit and reduce burdens on businesses, boosting the economy. But is our current data protection actually a complex web of bureaucracy, or a framework of vital, albeit imperfect, safeguards?

The announcement of the Bill has left us at a crossroads. Parliament has an opportunity to build on the current regulatory framework to ensure that the use of our personal data is fair and transparent, and individuals can seek review and redress when things go wrong. Yet what we know about the main elements of the Bill suggests the government may be taking us down a different route.

The stated aim of the Bill is to create a culture of data protection, which will supposedly remove the need for so-called “tick box exercises.” The consultation frames DPIAs as a menial administrative task, but the requirement to conduct impact assessments ensures that systems are developed safely and lawfully, and can collect and process data without violating individual rights. 

The ability to access our own data has been recognised as a fundamental right. It may, however, be placed at risk in the endeavour to reduce requirements on businesses. The consultation proposed that the process for submitting DSARs, the tool by which individuals can find out what data is held on them and how it is used, should be aligned with that set out by the Freedom of Information Act (FOIA). Under FOIA, there are exemptions to the “right of access,” allowing public authorities to deny requests for information. If such exemptions were to become available for DSARs, the right to access the data about you, and data that belongs to you, would be curtailed—to the advantage of the data controller. Another suggestion was the introduction of a “nominal fee” for processing requests. A fee would put data subjects at a disadvantage compared with requesters under FOIA and would likely deter or exclude many from accessing their own data.

As the rights of data subjects are not usually enforced by the ICO, individuals who feel their requests have been unfairly denied may find themselves with limited avenues for review or redress, even after potentially paying for the request to be processed. 

The ICO itself is also a target of this sweep of reforms. The Bill will reduce its independence by requiring it to be “more accountable to parliament and the public.” In the House of Lords debate following the Queen’s speech, David Anderson signalled that this will increase the “influence of government over nominally independent regulators,” meriting a “degree of wariness.”

A worryingly and unwarrantedly regressive data protection framework may lie ahead, in which fairness, transparency, and possibilities of review and redress are reduced. But this is not the only avenue open to the government. 

The Bill presents an opportunity to enhance transparency by requiring compulsory publication of DPIAs. It could strengthen the rights of individuals in an increasingly data-driven world. It could also improve transparency standards by introducing compulsory reporting of the use of algorithms in decision-making for public authorities, government departments and government contractors using public data. This would enable those who are subject to automated decision-making to hold government to account. Data has the potential to assist government in making fairer, more consistent decisions. But regulation must evolve alongside the growing use of technology, accommodating emerging harms. Now is the time to get it right.