The Information Commissioner’s Office (ICO) is calling on social media and video-sharing platforms to improve their data protection practices so children are safer when using their services. This comes as the regulator sets out its 2024-2025 priorities for protecting children’s personal information online.
Since the introduction of its Children’s code of practice in 2021, the ICO has been working with online services including websites, apps and games to provide better privacy protections for children, ensuring their personal information is used appropriately within the digital world. There has been significant progress and many organisations have started to assess and mitigate the potential privacy risks to children on their platforms.
The new Children’s code strategy builds on the progress to date and sets out the priority areas social media and video-sharing platforms need to improve on in the coming year, as well as how the ICO will continue to enforce the law and drive industry conformance with the code.
“Children’s privacy must not be traded in the chase for profit. How companies design their online services and use children’s personal information has a significant impact on what young people see and experience in the digital world.
“Seven out of ten children told us that they trust our Children’s code to make the internet better and safer for them. That’s why our determination to ensure online services are privacy-friendly for children is stronger than ever.
“I’m calling on social media and video-sharing platforms to assess and understand the potential data harms to children on their platforms, and to take steps to mitigate them.”
– John Edwards, UK Information Commissioner
For 2024 to 2025, the Children’s code strategy will focus on:
- Default privacy and geolocation settings. The ability to ascertain or track a child’s location creates risks, including the misuse of that information to compromise their physical safety or mental wellbeing. This is why children’s profiles must be private by default and geolocation settings must be turned off by default.
- Profiling children for targeted advertisements. Children may not be aware their personal information is being collected, or that it can be used to tailor the adverts they see. This may impact children’s autonomy and control over their personal information, and it could lead to financial harms where adverts encourage in-service purchases or additional app access without adequate protections in place. Unless there is a compelling reason to use profiling for targeted advertising, it should be off by default.
- Using children’s information in recommender systems. Content feeds generated by algorithms may use information such as behavioural profiles and children’s search results. These feeds may create pathways to harmful content such as self-harm, suicidal ideation, misogyny or eating disorders. The design of recommender systems may also encourage children to spend longer on the platform than they otherwise would, leading to children sharing more personal information with the platforms.
- Using information of children under 13 years old. Children under the age of 13 can’t consent to their personal information being used by an online service, and parental consent is required instead. How services gain consent, and how they use age assurance technologies to assess the age of the user and apply appropriate protections, are important for mitigating potential harms.
Further cooperation with other UK regulators such as Ofcom and international counterparts will also be a focus for the ICO, as it aims to raise global data protection standards for the benefit of UK children.
Mr Edwards said:
“Children’s privacy is a global concern, and businesses around the world need to take steps to ensure children’s personal information is used appropriately so it doesn’t leave them exposed to online harms. This week I will be meeting with international regulators and online services to encourage stronger digital protections for children.”
John Edwards is attending the IAPP Global Privacy Summit 2024 in Washington DC this week, where the ICO will be collaborating and sharing expertise in areas such as children’s privacy, artificial intelligence and advertising technology. The Commissioner will also be in Seattle and San Francisco to meet with big tech companies and AI developers, where he will be reinforcing the ICO’s regulatory expectations to the rapidly expanding generative AI development sector and advancing its priorities for children’s privacy.
For more information on the ICO children’s code strategy, visit ico.org.uk/childrenscode.
Notes to editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
- The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
- The ICO can take action to address and change the behaviour of organisations and individuals that collect, use and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit.
- The ICO’s strategic priorities are set out in ICO25, which includes safeguarding and empowering people, particularly vulnerable groups who are exposed to the greatest risk of harm.
- The UK Government included provisions in the Data Protection Act 2018 to create world-leading standards that provide proper safeguards for children when they are online.
- As part of that, the ICO was required to produce an age-appropriate design code of practice to give guidance to organisations about the privacy standards they should adopt when offering online services and apps that children are likely to access and which will process their personal information.
- The first draft of the code went out to consultation in April 2019. It was informed by initial views and evidence gathered from designers, app developers, academics and civil society.
- To report a concern to the ICO, telephone our helpline on 0303 123 1113 or go to ico.org.uk/concerns.