13 November 2019

Tech companies must start taking privacy seriously – or risk a techlash

By Derek McAuley

The use of data will revolutionise our world. It can remind us to set an alarm before we go to bed, save us countless hours choosing what to watch, and even reduce our energy use, saving money and the planet. But at what cost?

Most of what we hear about data is alarming. It seems that all the apps, platforms and websites we use steal our data with the hope of one day monetising everything they’ve learned about us. At times, this narrative isn’t far from the truth.

US tech giants set the standard. The prevailing operating model sees companies harvest massive amounts of data, often ignoring many countries’ data rules, in the hope of becoming rich when a use for the data is eventually found. Recent concerns over Facebook’s ability to manage our data are a perfect example of a failed system. Facebook grew up, like most of the tech giants, in the US and abided by US data law. On becoming international, they re-jigged their Ts and Cs with a sprinkling of GDPR dust while changing very little of their actual operating practices. This created the environment in which concerns about data use have come to the forefront.

With the uncertain times that lie ahead for the UK economy, the government needs to be doing more to tackle the major players who aren’t held to the same account as home-grown tech firms. The laws exist – they need to be robustly enforced.

Techlash, the term for the growing backlash against an industry whose leaders were once seen as prophets guiding us towards a more profitable and convenient future, has merged with these concerns over data.

This has created a kickback from users who are fed up with having their trust abused and their privacy breached. The #DeleteFacebook movement is only the beginning of users turning their backs on technologies that, if used responsibly and sensitively, could revolutionise our productivity and efficiency. Tech has a huge part to play in balancing the economy by allowing anyone, anywhere, to access services and consumer opportunities safely.

For example, our healthcare infrastructure is on the brink of becoming tech-centred. The NHS has pledged to be a world leader in artificial intelligence and machine learning within five years, using AI and personal data to make services accessible anywhere – saving taxpayers money and saving patients time spent on unnecessary trips to surgeries. However, the prevailing model for future NHS services continues to be one of data surveillance. Patients too often feel judged by the use of their data, rather than helped. Without trust in the NHS’s ability to use data responsibly, we will never achieve this, and we will miss out on the financial and social rewards.

Like a modern-day Damocles, do we choose to go back to our old way of life or embrace the greater danger which also offers us the possibility of greater reward?

More and more experts are calling for individual control to bridge the chasm of lost trust in tech – a genuine effort to tidy up the mess that is data collection and control. This, however, presumes an engaged and diligent user. To achieve our goals, we must also consider the needs of the vulnerable in our society: 11.3 million people in the UK don’t have the digital skills they need to thrive in today’s world. Requiring them to become digital experts, actively engaged with their data, in order to benefit from tech is not the way to drive the nation’s productivity and economy, or to protect the young and elderly from exploitation.

The solution comes in two stages. Firstly, we should build future technologies to fully embrace “privacy by design and default”. In many scenarios personal data must be processed in innovative ways, but this can be done without the data actually being shared. For example, the BBC Box provides media recommendations by processing personal data locally, without that data being passed back to the BBC.
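As a minimal sketch of what “processed locally, never shared” can look like in practice, consider the illustrative snippet below. This is not the BBC Box’s actual code, and every name in it is invented: the pattern is simply that the only data fetched from outside is a public catalogue, while the personal viewing history is read and ranked entirely on the user’s own machine.

    # Illustrative only: a recommender in which personal data stays on-device.
    from collections import Counter

    # Hypothetical public catalogue; in a real system this would be fetched
    # anonymously, with no account or tracking involved.
    CATALOGUE = [
        {"title": "Blue Planet II", "genre": "nature"},
        {"title": "Line of Duty", "genre": "drama"},
        {"title": "Springwatch", "genre": "nature"},
        {"title": "The Capture", "genre": "drama"},
    ]

    def recommend(watched, catalogue):
        # `watched` is personal data: it is read and processed locally,
        # and only the resulting list of titles is shown to the user.
        favourite_genres = Counter(item["genre"] for item in watched)
        seen = {item["title"] for item in watched}
        unseen = [c for c in catalogue if c["title"] not in seen]
        return [c["title"] for c in
                sorted(unseen, key=lambda c: favourite_genres[c["genre"]],
                       reverse=True)]

    # The viewing history lives on the device and is never uploaded.
    history = [{"title": "Blue Planet II", "genre": "nature"}]
    print(recommend(history, CATALOGUE))  # ['Springwatch', 'Line of Duty', ...]

The design choice doing the work here is the direction of data flow: public data comes down to the device, and nothing personal goes back up.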

Secondly, we need to build auditing into platforms, so that the behaviour of apps and platforms can be monitored and their trustworthiness, or otherwise, independently verified.

DataBox is a model for this second step. Developed by my colleagues at the University of Nottingham and me, together with our peers at the University of Cambridge and Imperial College London, DataBox is a computer in an individual’s home that collates, curates and mediates access to that individual’s personal data. This enables apps to process data in private. It also logs what information, if any, apps are actually harvesting. The log can be viewed by users themselves, or by a third party who can assess the transparency of the apps used, building up a rating and a review of how each app has behaved. It is precisely this information that is critically lacking: we don’t feel we can trust tech companies, so we need a system that can easily verify what they are doing. Hardly a radical new concept, this type of auditing is already a familiar model in the finance, investment and banking sectors.
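To make the mediation-and-logging idea concrete, here is a deliberately simplified sketch. It is not DataBox’s actual code, and all names are invented; the point is the shape of the architecture: apps never touch the raw data store directly, and every access leaves a record that the user or an independent auditor can inspect.

    # Illustrative sketch of mediated, logged access to personal data.
    import json
    import time

    class PersonalDataStore:
        """Holds one person's data and mediates every app's access to it."""

        def __init__(self, data):
            self._data = data    # raw personal data, never handed out wholesale
            self.audit_log = []  # what each app actually read, and when

        def read(self, app_id, field):
            # Record the access before releasing anything.
            self.audit_log.append(
                {"app": app_id, "field": field, "time": time.time()})
            return self._data.get(field)

        def export_log(self):
            # A user, or a third-party reviewer, can inspect this to rate
            # how an app has actually behaved.
            return json.dumps(self.audit_log, indent=2)

    # A well-behaved app requests only the field it needs...
    store = PersonalDataStore({"steps_today": 8421, "home_address": "redacted"})
    steps = store.read("fitness-app", "steps_today")

    # ...and the log shows exactly what was harvested.
    print(store.export_log())

Because the log is written before any data is released, it – rather than the app’s stated privacy policy – becomes the ground truth about what was harvested.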

The DataBox model sees tech used for its primary purpose: benefiting the individual user. It processes private data in private, analyses it and provides users with value. The data isn’t passed on, or sat on as an investment. It’s a trustworthy process, with sensitivity built into the architectural core of the tech.

Without these efforts to re-instil trust, vast swathes of our population will revolt against tech and be left behind. This, in turn, will reduce our nation’s efficiency and productivity, slowing down our health service, stifling our financial growth and stunting our ability to trade globally. In short, trust is essential to a functioning economy.


Derek McAuley is the Professor of Digital Economy and Horizon Director at the Faculty of Science, University of Nottingham