Why on earth is the government mucking about with our data privacy laws?

Thursday 9 September, evening: the UK Government published its long-awaited proposal for a new UK data protection regime. The new framework is the culmination of a journey which Open Rights Group has followed closely, from the National Data Strategy through to the TIGRR report and the Digital Regulation Plan.

We will analyse and respond to the Government's consultation thoroughly, but there is already enough to confirm our worst fears. Behind the fig leaf of ‘tougher penalties and fines for nuisance calls and text messages’, the UK Government has put forward a deregulating approach that would enable data uses based on commercial viability, with little regard to the externalities and resulting harms for UK residents. The UK Government will also try to game the EU adequacy system and allow international transfers of EU data to third countries with lower data protection standards, in an attempt to gain a competitive advantage over EU member states.

This proposal marks a quite fundamental departure from the principles, enshrined in the data protection frameworks of the European Union and the Council of Europe, that innovation and the use of personal data should be centred on human rights and designed to serve mankind. If implemented, it is also likely to undermine the already weak UK adequacy decision — an outcome that would harm local businesses and damage UK aspirations to become an international standard-setter in this field.

But how does the Government justify their new approach? And is it really as bad as it sounds?

The Government’s case for gutting data protection

The DCMS press release does an outstanding job of summarising the UK Government’s failure to articulate a clear case for scrapping GDPR.

For instance, the new data regime is supposed to be ‘ushering in a new golden age of growth and innovation’ by ‘unleash[ing] data’s power across the economy’. However, according to the National Data Strategy, the digital sector already ‘contributed £151bn in output’, ‘accounted for 1.6 million jobs’, and ‘the UK attracted more international venture capital investment into technology businesses in 2020 than France and Germany combined’.

The Government also claim that the new data regime would simplify ‘data use by researchers and developers of AI and other cutting-edge technologies’ in order to ‘cement the UK position as a science and tech superpower’. However, they cannot help but mention that, already and whilst under GDPR, ‘researchers from Moorfields Eye Hospital and the University College London Institute of Ophthalmology [made] a breakthrough in patient care using AI technology’.

In other words, the UK can proudly boast of being an innovation and science powerhouse, home to a thriving digital economy and to remarkable breakthroughs in AI and health-tech… three years after adopting GDPR, the very same framework that corporate lobbyists have described as prescriptive, burdensome, and a threat to Europe’s ambitions in the field of AI.

Being willing to scrap a regulation that actually works may sound very confusing until you realise that corporate lobbyists were holding the Government’s hand during this announcement. The press release includes, among others, comments from the Centre for Information Policy Leadership (CIPL) — whose members include Apple, Amazon, Facebook, Google, and a long list of IT behemoths, social media companies and other multinational corporations — and TechUK, a trade association that also happens to represent the interests of Amazon, Apple, Facebook, Google, together with a long list of tech businesses.

The Covid pandemic…and how data can harm us

The claim that the proposed data protection framework would enable ‘unprecedented and life-saving use of data to tackle the COVID-19 pandemic’ deserves a section of its own. If anything, the UK’s Covid response marked a gargantuan failure of the techno-solutionist approach that this Government is so keen to promote.

Without pretending to draw a complete picture: attempts to build machine learning models to counter the pandemic ended in disaster and loss of lives. The gold rush to digital contact tracing failed, and the NHSX App was quickly shelved as soon as notifications started popping up. Palantir, a US surveillance company, was granted an unprecedented contract to collaborate with the NHS and access patients’ data. Test and Trace was carried out illegally, with contact tracing data being shared on social media at best or used to harass women at worst. Local councils drew up Covid risk scores based on highly sensitive traits — such as “unfaithful & unsafe sex”, financial details, and dangerous pets. And, of course, there were plenty of data breaches in England, in Wales, and after the vaccine rollout.

Far from being exceptional, these examples are but a taste of how personal data can be weaponised against individuals. New technologies have proven to be an effective means of conducting surveillance in public places, at work, and at school, while the use of personal data to discriminate in financial services, housing, and employment is on the rise.

Common sense is needed but, until now, has been noticeably lacking

Governments in Europe, the US, and around the world are rushing to regulate technologies and corporations whose power is a growing source of concern. Indeed, even China is introducing strong data protection rules to close the gap with GDPR.

Thus, you may wonder where the ‘common sense’ is in deregulating UK data protection without a convincing reason, in the face of the growing weaponisation of personal data against individuals and consumers. If we get off the Government’s turbocharged bandwagon for a moment, however, there are lessons we can draw from these experiences.

Firstly, rushed attempts to set up large data pools and ground-breaking digital systems during Covid taught us that more data sharing isn’t a recipe for success in itself: the value of data rests in data quality and in our capacity to understand what the data mean and how to use them. In turn, fine-tuning data and building capacity require patience, progress by small steps, and sustained effort to gain and maintain public trust in their collection and use.

The ‘do first, apologise later’ approach the UK Government would enshrine into law is inherently incompatible with the notion of trust, but that isn’t the only factor that risks undermining it. Poor enforcement has been the Achilles’ heel of GDPR, driven by a complex coordination mechanism among EU Data Protection Authorities. If anything, departing from the EU bloc left the Information Commissioner’s Office with a free hand and the opportunity to enforce without having to engage with regulators in Ireland or Luxembourg.

Furthermore, effective enforcement of GDPR wouldn’t benefit individuals alone. Today’s digital market is riddled with unsafe products given away for free to resource-constrained institutions — such as schools and NGOs — or forced on legitimate businesses by leveraging market power, as with adtech products. Enforcing GDPR, then, would ease compliance for those companies and small organisations that operate in good faith, but whose efforts are jeopardised by the business interests of tech behemoths.

In conclusion, the UK doesn’t need to reform its data protection laws under the auspices of corporate lobbying. What it does need, instead, is the long-overdue enforcement of GDPR.

This article first appeared on the Open Rights Group blog and is reproduced here with the kind permission of the author.