Time | Session

09.30 | Registration

09.50 | Welcome

09.55 | Introduction

10.00 | Identifying child users: Exploring age assurance methods

This session will focus on the implementation and assessment of age assurance systems and methods used to identify child users online.
The UK’s Online Safety Act requires the largest online platforms to use age verification or age estimation to prevent children of any age from encountering content that is harmful to them.
Meanwhile, the European Commission is developing a digital identity wallet that will include age verification, enabling compliance with some aspects of the Digital Services Act.
In addition, Australia’s new law banning social media for users under 16, due to take effect in late 2025, has again sparked discussion in the UK and other countries about similar measures. The session will address responses from different stakeholders, as well as the methods, benefits and challenges of implementing age assurance in practice.
It will also consider how to ensure that children can access age-appropriate content, services and safety settings online while protecting their sensitive data.
- Colette Collins-Walsh, Head of UK Affairs, 5Rights
- Julie Dawson, Chief Policy & Regulatory Officer, Yoti
- Andy Lulham, Chief Operating Officer, Verifymy
- Ross Phillipson, Partner, A&O Shearman, Australia
- Chair: Ellie Colegate, PhD Researcher, Online Safety in the UK, ICT Law, University of Nottingham and EPSRC Horizon Centre for Doctoral Training

11.30 | Coffee Break

11.50 | Navigating and managing risks: Framing consent and parental controls around the best interests of children

This session navigates the risks associated with processing children’s data on the basis of consent and the exercise of parental responsibility.
It will explore how companies identify and navigate risks, whether ‘an age of consent’ approach is in the best interests of children, and the practicalities of making subjective assessments relating to consent, competence, best interests, and parental responsibility. Are parental controls effective and how often are they used for those over 13?
The EU GDPR stipulates that where consent is the chosen basis for processing a child’s data in relation to an information society service, the child can provide that consent only from the age of 16. Member states may lower this threshold, but not below 13. In other jurisdictions and contexts, a child may be defined as anyone under 18 for certain purposes.
These differences create an array of compliance requirements for companies operating across jurisdictions.
The panel will explore the notion of children’s best interests in light of issues associated with data processing. How can integration of the two be achieved in a datafied landscape consisting of many stakeholders?
- Beatrice Cavicchioli, Senior Privacy & Product Counsel, k-ID
- Jonathan Dunne, Director, Regulatory Affairs, Google
- Jen Persson, Director, Defend Digital Me
- Dr Claire Bessant, Socio-legal Scholar and Associate Professor, Northumbria University
- Maisie Robinson, LLM Student, University of Birmingham School of Law
- Chair: Anna Gamvros, Partner and Head of Asia Pacific Privacy and Cyber, A&O Shearman, Hong Kong/Australia

13.15 | Lunch

14.00 | Design: Making data rights understandable and accessible for children

The aim of this session is to encourage collaboration in educating children, helping them make informed decisions about their data and have safe, healthy and fun online experiences.
This session will explore how different companies have designed services aimed at children: building child-friendly privacy mechanisms that empower children to understand and exercise their data rights, and developing online services and platforms with child users in mind. It will consider design and User Experience (UX) as they relate, amongst other things, to privacy notices (what, how and when information is shared) and default settings.
Under the GDPR, children have the same rights as adults over their personal data. For example, this means they have the right to be provided with a clear privacy notice which explains how their data will be processed in a “concise, intelligible and accessible” way that child users can understand.
The UK Age Appropriate Design Code, issued by the Information Commissioner, sets out 15 standards focused on data protection by design and on ensuring accessibility.
This session will also cover the role Data Protection Impact Assessments can play in identifying and mitigating risks to children.
- John Kavanagh, Privacy and Security, Public Policy Europe, TikTok
- Erin Stephens, Data Protection Officer, BBC
- Laurie-Anne Ancenys, Partner, Head of the Paris Tech & Data Practice, A&O Shearman, France
- Chair: Hannah Heilbuth, PhD Researcher, University of Nottingham and EPSRC Horizon Centre for Doctoral Training

15.30 | Tea Break

15.50 | What is right for children and their data? Recommendations

This session will draw together insights and recommendations from across the day. It will consider:
1. The emerging best practices in age assurance
2. The management of child consent
3. The exercise of parental responsibility over children’s data
4. Data protection design for children
5. How to ensure the diverse voices and experiences of children are central to approaches taken by companies, policy makers and regulators
6. Emerging risks such as Generative AI
The panel will explore practical actions that can be taken to build on good practice, as well as challenges that require future work.
- Siobhan Pointer, Director of Children’s Strategy, ICO
- Patricia Kosseim, Information and Privacy Commissioner of Ontario (online)
- Beatrice Cavicchioli, Senior Privacy & Product Counsel, k-ID
- Colette Collins-Walsh, Head of UK Affairs, 5Rights
- Steve Wood, Special Advisor on Data Protection, A&O Shearman, UK
- Chair: Dr Oliver Butler, Assistant Professor in Law, University of Nottingham

17.15 | Close