DPAs turn up the volume on children’s issues but some companies seem not to hear.
ChatGPT is the lead article on the front page of this edition, reflecting its particular importance to the privacy law community, partly because of its impact on children and vulnerable adults.
While AI and the metaverse sometimes seem to sit at the science fiction end of the spectrum, children’s issues are a longstanding theme which many national regulators are addressing. Every company with a marketing channel accessible to children (even if it does not address young people directly) should take children’s issues seriously. Companies should ask themselves: what are our practices on transparency? What practical steps are we taking to communicate with audiences of different ages, using different formats?
Regulators in many countries, from Ireland to California, regard the UK ICO’s Age appropriate design code: a code of practice for online services (Children’s Code)(1) as influential. Norway’s Data Protection Authority is active in protecting children in the context of schools and gaming. Ireland’s Data Protection Commission has published guidance based partly on feedback received from children.
The aim is for children to be protected and empowered through knowledge. But what if companies are not paying attention to such initiatives? For some companies, it takes regulatory intervention to shift from watching to taking action.
A model of good corporate behaviour in the last two months came from OpenAI, the company behind ChatGPT, whose operations in Italy were suspended for a month. The company entered into discussions with the Garante and then fixed several of the problems identified by the regulator, an outcome which reflected well on both sides.
By contrast, on 31 May, the US Federal Trade Commission announced that it was fining Amazon and demanding consumer compensation in two separate cases in which dangers to children are cited. The cases relate to the gathering of data for training AI. To quote the FTC:
Amazon/Alexa: The proposed order requires Amazon to pay $25 million and delete children’s data, geolocation data, and other voice recordings. The FTC goes further, clearly accusing the company of deceit:
“FTC and Department of Justice charge Amazon with violating children’s privacy law by keeping kids’ Alexa voice recordings forever and undermining parents’ deletion requests.”(2)
“The Federal Trade Commission and the Department of Justice will require Amazon to overhaul its deletion practices and implement stringent privacy safeguards to settle charges the company violated the Children’s Online Privacy Protection Act Rule (COPPA Rule) and deceived parents and users of the Alexa voice assistant service about its data deletion practices….”
“Amazon’s history of misleading parents, keeping children’s recordings indefinitely, and flouting parents’ deletion requests violated COPPA and sacrificed privacy for profits,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection. “COPPA does not allow companies to keep children’s data forever for any reason, and certainly not to train their algorithms.”
Amazon/Ring: In a separate case, also announced on 31 May, the FTC charged home security camera company Ring (bought by Amazon in 2018) with “compromising its customers’ privacy by allowing any employee or contractor to access consumers’ private videos and by failing to implement basic privacy and security protections, enabling hackers to take control of consumers’ accounts, cameras, and videos.”(3)
The details are certainly damaging for Amazon’s reputation. “As a result, hackers continued to exploit account vulnerabilities to access stored videos, live video streams, and account profiles of approximately 55,000 US customers, according to the complaint. Bad actors not only viewed some customers’ videos but also used Ring cameras’ two-way functionality to harass, threaten, and insult consumers - including elderly individuals and children - whose rooms were monitored by Ring cameras, and to change important device settings …. For example, hackers taunted several children with racist slurs, sexually propositioned individuals, and threatened a family with physical harm if they didn’t pay a ransom.”
Investors can raise standards on children’s issues
It is well known that companies’ annual shareholder meetings are often the time and place where campaigners and major pension funds raise environmental issues with top management. However, the potential role of investors in children’s privacy issues has attracted little attention.
At last month’s CPDP Conference in Brussels, Luca Tosoni, Legal Director at Norway’s DPA, explained that Norway’s sovereign wealth fund is using its investments as a force for good on children’s issues. The fund holds, on average, 1.5% of the world’s listed companies, with stakes in 9,228 companies worldwide, including Apple, Nestlé, Microsoft and Samsung.
Its ethos is “Respecting children’s rights is an inherent part of good business practice and risk management. We expect companies to integrate children’s rights into their corporate strategy, policies, risk management, and reporting.”
The fund uses this potentially influential position by drawing management’s attention to how their businesses “affect children’s welfare through the marketing and use of [companies’] products and services, including children’s digital safety online and the right to privacy.”
The fund stresses the importance of metrics. “Companies should, where relevant, define qualitative and quantitative indicators that enable monitoring and tracking of their impact on children’s rights. The performance of preventive and corrective actions should be tracked.”(4)
Will other investors follow this principled lead?
The economic role of data to be explored in PL&B events
While the classic starting point for data protection law in Europe, represented by the European Union and the Council of Europe, is privacy rights and company legal duties, many jurisdictions place data protection rights within a policy framework of promoting innovation in products and services.
Several sessions at Who’s Watching Me?, PL&B’s 36th International Conference, 3-5 July in Cambridge and online, can be seen in this context, including children’s issues, gaming, the metaverse, EU initiatives and privacy trends in the UK, Mexico, the Middle East and Asia.
Our next event, Harnessing Data, Valuing Privacy, on 14 September in London, addresses the tension between the benefits of collecting and using data and protecting privacy rights while doing so.
We shall address these issues with our host Alex Dittel, Partner, Wedlake Bell; Tom Reynolds, Chief Economist at the UK’s ICO; and David Jevons, Partner, Oxera, who will explore topics such as the value of data in different contexts: commercial value versus privacy, and private impact versus social impact.
We look forward to meeting you at these events.
Publisher, Privacy Laws & Business