Risk, Hazard and Harm
The UK’s new Data (Use and Access) Bill has attracted attention for being more flexible than the EU GDPR, as the government takes care to ensure the UK does not lose its EU “adequacy” status. Less attention has been paid to the news that the ICO will have a new statutory duty to consider specific risks to children when carrying out its work.
Issues relating to children have been a priority for Data Protection Authorities in the UK, Ireland, France and elsewhere. Consequently, the Global Privacy Assembly in Jersey last October took the initiative to hold a session featuring a panel of teenagers to give their views on their use of social media and their opinions on the benefits and harms.
My article shows that they understand both ends of the spectrum. At one end, they see mobile devices as tools for discovery and self-expression; at the other lies the easy slide into social media addiction and strain on their mental health.
The discussion on finding a balanced way forward is mainly conducted by legislators and regulators who are much older than the young digital natives they seek to protect. It was certainly a shock to me to discover that some 80% of children in developed Western countries have a digital footprint before they are two years old, largely due to the actions of their family members. This does not necessarily lead to harm, however difficult harm may be to define or assess. But combined with unfettered facial recognition software, it carries the risk of a surveillance society.
Matthew Johnson, Director of Education at MediaSmarts, Ottawa, Canada, drew a neat distinction at the GPA between risk and hazard as applied to children:
“If a risk is the possibility that a person might experience harm, a hazard is a risk which they cannot manage because they either are not fully aware of it or not developmentally ready for it. Some risks are hazards for everyone, while others will be risks for some but hazards for others.
This distinction … comes from a recognition that children need not an absence of risk but a safe way to learn to manage risk.
Digital media literacy, therefore, is about teaching young people how to manage appropriate risks, to help them recognize hazards they’re not ready to manage, and to empower them and the adults who care for them to advocate for online spaces free of things that are hazards to everyone.”
In a GPA session on online harms, Anu Talus, Chair of the European Data Protection Board (EDPB) and Finland’s Data Ombudsman declared:
“Harms do not feature in the EU GDPR. However, the notion of risk appears in over 40 recitals and articles. It has become one of the criteria which data controllers should take into account when deciding on the most appropriate technical and organisational measures.”
The notion of hazard, as a signpost to harm, is not much discussed. However, I think it could be useful, together with the phrase “guard rails,” the metaphor often deployed.
Controllers must take responsibility for processing personal data, and DPAs must deal with complaints in an appropriate manner. Both should deploy resources, and can adopt corrective measures, according to the risk to the data subject. DPAs decide their own enforcement priorities, for example, which risks of harm or which sectors to pursue on their own initiative at national or EU level.
Identifying harms
Emily Keaney, Deputy Commissioner at the ICO, said that identifying harms is at the heart of what the ICO tries to get organisations to do. Some organisations need help in thinking through this issue: what kinds of action might cause harm? Processing people’s data carries potential risks, but the ICO also wants organisations to benefit from growth.
The ICO has published a harms taxonomy(1) listing wider societal harms, which includes a tool for assessing different types of harm by their severity and likelihood. It also relates to the ICO’s recently published fining guidance. Harms include actual or potential harm to people: physical, psychological, economic or financial harm, discrimination, reputational harm or loss of human dignity.
Professor Andy Phippen of Bournemouth University said there is much more interest in ‘stranger danger’ than in data breaches. Excessive tracking of children can be problematic in terms of their rights, which depend on context.
Jade Nester, Head of Data Public Policy, Europe, at TikTok, said that the company uses Privacy by Design and works with other teams to try to be creative in its harms mitigation strategy.
Ofcom works with the ICO and other regulators in the Digital Regulation Cooperation Forum. On 16 December last year, Ofcom published its first major policy statement for the Online Safety regime. Providers now have a duty to assess the risk of illegal harms in their services, with a deadline of 16 March 2025 and the prospect of substantial fines.
I look forward to meeting you and speakers from some of these influential organisations at our timely 11 March event on children’s privacy, as the UK’s Data (Use and Access) Bill progresses through Parliament.
Best regards,
Stewart Dresner
Publisher, Privacy Laws & Business
January 2025