By Simon McDougall, Executive Director – Technology Policy and Innovation, ICO
18 November 2019
How do we balance the interests of society against individual rights, on issues like facial recognition technology?
How do we allocate rights and responsibilities in a world of connected devices and real-time automated decisions?
How much thought does the law require organisations to put into what is ‘right’, and what their customers would reasonably expect to happen to their data?
We face an increasing number of questions like these as we consider the practical implications of the growing use of personal data.
I’m sure many working in the digital economy might be tempted to look in my direction for the answer. It’s the regulator’s job to tell us what the law says, right?
But the reality, as always, is somewhat more complex. The GDPR is a principle-based law. That is its great strength. It sets out general rules, which can then be applied to a range of situations, whether that’s how a local running club looks after members’ contact details, or how big data can be used in health research. It means the law can continue to be relevant, even as technologies are developed that weren’t even thought of when the law was being drafted.
Interpreting that law can be complex, though, and increasingly we see broad ethical questions raised about how data is used. There is debate about where data protection and data ethics overlap, where they are separate, and where they may even conflict.
Working through these issues demands a broad range of skills. In his book 21 Lessons for the 21st Century, Yuval Noah Harari suggests that, to explore a question of data ownership, “we had better call upon our lawyers, politicians, philosophers and even poets to turn their attention to this conundrum”.
We have lawyers at the ICO. We have policy experts, well versed in the debates politicians had in drafting the law. And while we haven’t gone so far as to bring in poets (yet!), we are looking to understand how our world interacts with that of philosophy and ethics.
We’ve just appointed our first data ethics adviser, Ellis Parry, to consider these challenges over the next year. Ellis joins the ICO with a wealth of experience from his time leading data protection for BP and AstraZeneca, working through a range of data protection and ethics issues along the way.
Engaging in data ethics is an innovative step, but I don’t think it will prove a unique one. An international consultation in 2018 found that more than four in five respondents believe authorities should play a role in this area.
I am delighted Ellis is joining us. His brief is not to establish a large data ethics function at the ICO – this is a vibrant area for debate and exploration, but it does not follow that the ICO should seek to ‘own’ it. Ellis will help ensure that the ICO contributes to data ethics discussions in a way that meets the aims of our Information Rights Strategic Plan, and in doing so helps to uphold information rights in the UK. It promises to be an interesting year.
Simon McDougall is Executive Director for Technology Policy and Innovation at the ICO, where he is developing an approach to addressing new technological and online harms. He is particularly focused on artificial intelligence and data ethics. He is also responsible for the development of a framework for auditing the use of personal data in machine learning algorithms.