Open Data Institute’s 2020 virtual summit

The ICO’s work, as regulator for both data protection and freedom of information, sits comfortably alongside that of the ODI. There is a shared passion for transparency wherever possible, and a shared recognition of the value of data, and the benefits that can come from sharing data.

But the shared principle I most want to focus on today is trust. And in particular, trust as an imperative if the type of innovations being discussed today are to achieve their full potential.

Listening to discussions today, I’m hearing a tone of positivity in the face of challenge.

Yes, our world is very different to the one we knew twelve months ago.

This time last year, few of us would have expected – or welcomed – state track and trace systems, for instance, or needing to provide personal data to pop into a café.

And I hear the concerns over what comes next.

These are unpredictable times.

But I hear more positive notes too.

There’s an oft-quoted line from Sun Tzu: “In the midst of chaos, there is also opportunity”.

And that opportunity has already come through in today’s sessions.

The pandemic has brought an acceleration in the uptake of digital services that would otherwise have taken years. This uptake isn’t just about more people using video calls to talk to their families or new platforms to join conferences like this.

We have seen a change in attitude.

People are facing a threat to their health, and feel compelled to accept approaches they may have been less sure of otherwise.

But it does feel as though people are demanding innovation, rather than organisations having to persuade them to accept change.

Health Secretary Matt Hancock said that 99 per cent of GP practices now offer video consultations, compared with less than 10 per cent before the pandemic, and attributed that change to public demand.

And as the Minister, John Whittingdale, touched on in his opening, this change in attitude is also reflected in media coverage of the positive impact data is having on the country’s response to the pandemic.

In my office, we were able to support the sharing of information between public authorities and supermarkets, so that people shielding during COVID-19 could get the support they needed.

That example shows too the greater appreciation of what data can offer, something we’ve seen on a greater scale around work to track the spread of the virus nationally. It feels like the ODI’s encouragement of an open, trustworthy data ecosystem has never been more relevant or important.

And the trust aspect of that is absolute.

It feels like a time of change, not just directly through the pandemic, but in race relations too, and society’s attitude to how to help and support others.

None of us would have chosen the winds that have blown through this year, but that doesn’t mean we can’t adjust our sails to make the most of them.

And so today I want to talk about how we encourage that change, and the crucial role trust plays in that.

I want to talk about the role regulation plays in enabling that innovation.

And I want to talk a little about the areas where I see data ethics and social inequality overlap.

I’ll start by going back to that example of data sharing between public authorities and supermarkets. With the ICO’s advice and guidance, that data sharing allowed supermarkets to understand who most needed delivery spots at a time when demand for those spots was outstripping supply.

It’s an example that illustrates some of the challenges that come in answering who decides how organisations can use personal data.

The public authorities had to consider whether the data sharing was appropriate. Whether it was lawful. Whether they were confident that the information would not be misused by the supermarkets.

The supermarkets had to be confident in the accuracy of the data.

And most fundamentally, there had to be a consideration of the individuals whose data was being shared. Including considerations of transparency, consent and fairness.

We disregard people’s concerns about their privacy at our peril.

Because what sits at the core of all of those considerations is trust. The supermarket data sharing innovation worked because it had people’s trust.

That’s crucial, because people’s personal data is just that: personal. And so it rightly provokes strong reactions.

I see that first hand in my role. I’ve spoken to parents who have seen the devastating effects of how their children’s data was used. Used to predict what content the children might engage with. Used to push inappropriate content in their direction – self-harm comes to mind.

I’ve heard how the Gangs Matrix run in London could change the life of a young man wrongly labelled on a database, which was then shared with other public authorities.

Or the young adults I spoke to who had spent their childhoods in care, and now wanted to access their records – their personal data – as they tried to piece together the memories of their upbringing.

And so we cannot, we must not, underestimate how much people care about their personal data. Any innovation that relies on personal data must take the time to make sure it has public trust.

That is not a conflict. I agree strongly with the Minister’s comments earlier, that protecting people’s rights and encouraging growth are not in conflict. And trust is central to that, giving people confidence to support innovation.

We must take the time to explain to people what is happening to their data. Explaining why an app needs their data, where their data is going, how an algorithm works, what AI processing is going on.

When we ask ‘who decides how we can use personal data’, our first answer must be the people whose data it is.

The relationship between trust in data use and innovation has long been recognised.

Data protection law was born in the 1970s out of a concern that the potential of emerging technology would be lost if people did not trust it enough to embrace innovation.

That concern remains relevant now, and there is still work to do to encourage trust in innovative data use. The law my office regulates is part of that, reassuring people they can support innovation, safe in the knowledge there are checks and balances in place to protect them, with an independent regulator who has their back.

And for businesses, regulation means a fair marketplace, where the corner-cutting or deceitful actions of rival companies do not go unchecked.

The result is a strong data protection law, and an active regulator to ensure those rules are followed.

And so when we ask ‘who decides how we can use personal data’, we must consider the role of the regulator as acting on behalf of the people whose data is being used.

My office wants to help innovation succeed. As a modern regulator, our approach is focused on working alongside organisations, helping them to make changes and improvements to comply with the law to reduce mistakes and misuse of people’s data. Working to get it right at the outset results in better outcomes for businesses and customers.

Helping organisations get it right is the work that we are largely set up to deliver – about three quarters of my staff work in roles of this type.

That work includes our report into the extraction of data from the mobile phones of victims of sexual assault, which set out expectations of the police that have since been accepted as a sensible and empathetic way forward.

We have worked with credit reference agencies to improve their handling of people’s personal information.

And our website is a treasure trove of guidance on data sharing, and use of AI, and data security, and anonymisation – there’s so much information on there.

Our fines are important when serious failings happen, but this collaboration and advice is our bread and butter work.

I’ll flag our Sandbox work here too, which helps organisations using personal data to develop innovative services. It has already been used to great effect, for instance in a project looking to use data to support student mental health and wellbeing at universities, or a scheme looking to detect fraud by bringing together pseudonymised data from a range of financial institutions. If you’re working on something innovative in data, I’d urge you to take a look at the details on our website. You may be a candidate for the Sandbox.

Of course, central to the advice we offer through our sandbox is an emphasis on trust. There are clear requirements in the law around trust, from the role of transparency to the importance of fairness.

That fairness aspect feels particularly important in the context of some of the very important societal conversations we’ve seen across 2020.

Personal data informs decisions that can shape where someone lives, how much they earn and even how much products cost, through access to special deals or discounts. It can shape whether someone is offered a product or service at all, and as we saw earlier this year, it can shape life-changing decisions like A-level results. If those decisions suffer from bias, or if they’re rooted in assumptions and inferences based on people’s race or gender, then there’s a fair chance that data protection rules are not being followed.

This is the area of law where data ethics and societal inequality overlap.

And it comes back to trust. When we ask ‘who decides how we can use personal data’, simply saying ‘people’ is too general – and too simple – an answer.

Or to put it another way, are there enough voices being heard about how personal data is used?

It’s a question I have to consider within my own office. Like any regulator, we rely on consultations to make sure our advice and guidance is practical, understandable and effective. Our Age Appropriate Design Code – which aims to protect children’s data within an internet that wasn’t designed with youngsters in mind – is a good example.

We collaborated and engaged with an enormous number of stakeholders, from parents and children’s groups to app developers and web designers.

But with such an important piece of work, there’s always that thought of whether we could have done more. We wanted to involve the people who the code was aimed at ultimately benefiting, but did we do enough to understand the different perspectives that black and minority ethnic groups may have? Did we consider how the experience of websites is different for children with disabilities?

This questioning of ourselves, asking whether we have been truly consultative, is so important for all of us.

I think all of us would agree that the talk we hear from organisations about public consultation often doesn’t match the genuine public voice that could inform the design of final products. We need to work harder, as regulators and as innovators, to reach out to those whose experiences can inform a truly representative consultation. And maybe we need to approach these conversations not through the prism of complex legislation, but through the prism of trust.

I’ll draw my comments to a close shortly, and am happy to take questions.

But I would like to conclude with a brief reflection.

2021 will mark a new start.

So much has happened this year, from the pandemic, to the Black Lives Matter movement, to the UK leaving the EU.

It feels like we can build on this change and challenge and find opportunity.

We can find opportunity in the growing positivity towards digital innovation.

In the realisation of the value of the data sets that exist within the public and private sectors, in the gradual acceptance of the value of sharing this data, and crucially, in a growing acceptance that such innovation must lead to benefits that are shared by all in society.

2021 can be a year of opportunity.

But none of that is possible, none of that is sustainable, without trust. Society relies on trust. Commerce relies on trust. Government relies on trust. Data is no different: trust is imperative.
