The Information Commissioner's Office (the "ICO") has set out its 2024-2025 priorities for protecting children's personal information online. The ICO initially introduced its Children's Code of Practice (the "Code") back in 2021, and many organisations have already implemented changes to comply with the Code; for example, YouTube turned off autoplay and turned on "take a break" and bedtime reminders by default for accounts of users under 18 years of age.
Since the introduction of the Code, the ICO has been calling on and working with online service providers to improve the protection of children's privacy. John Edwards, the UK Information Commissioner, stated that "children's privacy must not be traded in the chase for profit".
The ICO's latest strategy builds on the improvements that organisations have already made and highlights the ICO's priorities and areas for development. The ICO aims to target the current risks to children's privacy and develop strategies to reduce or eliminate those risks. It stated that its priority in 2024-2025 is to drive further improvements in how social media platforms (SMPs) and video sharing platforms (VSPs) protect children's personal information online. For 2024 to 2025, the ICO has committed to prioritising the following areas.
Default privacy and geolocation settings
The ICO states that children's profiles on SMPs and VSPs must, by default, be set to private and have geolocation settings turned off. If a child's location data can be tracked via their profile, it can potentially lead to the misuse of information and could compromise the child's physical and mental wellbeing, including by facilitating bullying. Personal information on such profiles can include a user's name, gender, pronouns, interests, and identifiable images. Geolocation data is data taken from a user's device that indicates its geographical location, including GPS data or data about connection with local Wi-Fi equipment.
Profiling children for targeted advertisements
SMPs and VSPs can gather large quantities of data about their users, including children, drawing on the posts and content they view, the people they interact with, the accounts they follow, and the groups they join. The ICO states that targeted advertising should also be turned off by default on children's profiles, unless providers can show a compelling reason to use the profile for targeted advertising. Targeted advertising can undermine a child's autonomy and control over their personal information, as they may not be aware that their information is being collected or that the adverts they see are tailored to them; excessive data collection may also not be in the child's best interests. It can also lead to financial harm, for example where adverts suggest in-service purchases.
The use of children’s information in recommender systems
Recommender systems are algorithm-driven and use personal information and profiling to learn a user's preferences and interests in order to suggest content. These content feeds may use children's personal information from their profiles or search results, which can create pathways that expose children to potentially damaging content, such as material relating to self-harm or eating disorders. Recommender systems may also encourage a child to spend more time on the online platform. They can additionally lead to a loss of autonomy where children are unable to make informed decisions about what information to share with a service, because they do not understand how it will be used or are not given sufficient tools to control how recommender systems use it.
The use of information of children under 13 years of age
The ICO has warned providers to ensure that appropriate protection measures have been implemented, particularly for children under the age of 13, who cannot consent to their personal information being used. How online providers obtain consent and assess the age of their users is essential to mitigating potential harm to children. Such harm could include a loss of control over personal information, psychological harm from children accessing content they are too young to view, and financial harm from engaging with in-app purchases or subscriptions without adequate parental oversight.
As part of this process, the ICO will take various steps to ensure that the identified risks are reduced:
- The ICO will gather evidence by inviting input from a range of stakeholders and by engaging with key platforms to clarify the lawful bases they rely on to process children’s personal information and their approach to age assurance.
- The ICO will engage with parents, carers and children to fully understand how it can help protect and safeguard children's privacy.
- Areas where organisations need certainty and guidance will be identified and the relevant support provided.
- The ICO will focus on the most serious risks currently being faced and, where necessary, use its regulatory enforcement powers, such as fines and notices, to ensure compliance.
On 2 August 2024, the ICO published a progress update. The ICO had reviewed a selection of service providers, created accounts using proxies for children, and reviewed a range of academic and governmental sources. It found that some children's profiles were not set to private by default and that some platforms even allowed children to be contacted by strangers. The ICO has written to various platforms to voice these concerns and called on them to change their practices to ensure that children are sufficiently protected. It is also working closely with Ofcom on age assurance to help ensure an aligned approach across the data protection and online safety regimes, which demonstrates how seriously this matter is being monitored.
If you would like assistance or guidance on how to ensure that your digital services safeguard children's personal information and comply with UK data protection law, speak to our Data Protection team today for legal advice and assistance.