It’s been an exciting, interesting and challenging first six months for the ICO Sandbox – both for those externally involved in the various projects and for the ICO staff working on the scheme. Ian Hulme discusses the progress so far.
In September 2019, we launched the beta phase of the Sandbox, the ICO initiative to support organisations innovating using data protection by design. The ten projects selected represent a range of industries and organisations looking for solutions to tackle some of the most fundamental questions for today’s society. How can organisations work together to reduce violent crime? What can universities do to better support students with their mental health? And how can new technologies improve health care? Each could offer huge benefits for the public, but each also poses specific and complex data protection challenges.
We’ve expanded our insight through workshops conducted for each individual project; we’re examining their processes and advising on how our existing guidance applies in their unique circumstances. The work has pushed us to consider where additional guidance may help organisations with compliance. It has given us the opportunity to consider real-world examples of the most contemporary data protection issues, and it ensures that innovative products with considerable public benefits do not come at the cost of privacy rights.
Even at this early stage there are key issues starting to emerge:
- Realising the benefits of data in the public sector
Some participants are working to overcome historic data sharing challenges across the public sector; others have focused on the much more recent challenge of how to incorporate big data. The opportunities afforded by personal data combined with powerful new technologies need to be balanced effectively against the rights and freedoms of data subjects, especially considering the legal framework for processing and the expectations of the public.
We are working to ensure that a common understanding is developed around consent and its various legislative definitions, so that all parties understand the differences and apply it consistently while providing transparency information to the public.
We have increased our understanding of the role of digital identity products for vulnerable data subjects and the practical challenges in obtaining consent from children, and those with parental responsibility, where national identity services are less mature.
- The challenge of new technologies
The real-world application of voice biometrics and facial recognition technology (FRT) is posing some interesting challenges. We have been examining how FRT can be used in situations where many other global standards and requirements need to work alongside data protection law. This is feeding into our wider work, consolidating our thinking on an appropriate basis for processing special category data in order to assess racial bias in facial recognition.
Examining how data analytics can be used in a data protection-compliant manner has tested and advanced our understanding of certain aspects of the GDPR. This has involved assessing suitable lawful bases and conditions for processing special category data, identifying data protection risks within processing, and reviewing data sources that may be used in data analytics to ensure the purposes would not be incompatible. This will help us shape future ICO guidance.
We are looking forward to working alongside the organisations to develop some truly ground-breaking projects into fully working solutions, delivering innovative and compliant products and services for the public good. By applying the legislation to new and emerging situations, we are also developing our own understanding, and we are already using this to inform our wider guidance and regulatory approaches.
These projects are potentially a blueprint for future work, laying down the privacy building blocks for other organisations seeking to combine data protection and innovation.
Here are a few words from some of our current participants on how their experiences in the Sandbox have gone so far.
Neal Cohen, Director of Privacy for Onfido, said:
“We are developing cutting edge artificial intelligence technologies to perform remote biometric identity verification. While we think this technology has the potential to do tremendous good by creating a more open and accessible world, we do see very real privacy issues in how this technology is built and then made available to the public.
“In the Sandbox, we had the opportunity to take a deep dive into our AI technology with the ICO, and together, we sought out pragmatic solutions to enable privacy and technology to co-exist. We are optimistic that our work in the Sandbox will not only benefit the humans using our AI technology but also the wider AI industry.”
Andrew Cormack, Chief Regulatory Adviser for Jisc, said:
“Our discussions with the Sandbox have been even more productive than we had hoped and have led to the development of new tools for educational institutions to conduct Data Protection Impact Assessments. We can also see how our work with the ICO – particularly on situations that might involve inferring sensitive data, such as health alerts, from observed behaviour – will help to inform their guidance in future.”
Simon Bristow, Head of Data Privacy for Novartis, said:
“We recognised early on that our project to explore voice technology within healthcare poses unique privacy questions – how can patients be provided with clear information about voice technology, and how can the accuracy and quality of data be ensured? The ICO is helping us to answer these questions and address our project risks.
“At the same time, our participation enables the ICO to gain a detailed understanding in this area, which will lead to additional support for us and other organisations.”
Ian Hulme is Director of Regulatory Assurance at the ICO.