The fence at the top of the cliff

After English-language hosting of the Global Privacy Assembly (GPA) in Bermuda and then Jersey over the last two years, it was a welcome contrast for the Privacy Laws & Business editorial team to be in Seoul, Korea for this year’s event in September. We met some of our Report subscribers for the first time, such as data privacy regulators and companies from Japan, Hong Kong and Qatar.

This October edition of PL&B International Report includes articles reporting from the conference (Australia’s privacy protections for children, and agentic AI), and we will have more in the December edition.

Several sessions and one-to-one conversations pointed me to developments that are indicators of future trends.

First, a memorable statement by Michael Webster, Privacy Commissioner, New Zealand, when explaining that Data Protection Authorities want to act when issues impact many people: “We want to be the fence at the top of the cliff rather than the ambulance at the bottom.”

World Bank as a data privacy influencer

International companies may well be surprised that data privacy laws are developing in poorer countries that may not yet appear on their risk registers.

Taylor Reynolds, Practice Manager, Digital Safeguards at the Washington DC-based World Bank (WB), explained that the Bank is influential in developing regulatory infrastructure in poorer countries. The organisation regards privacy laws as a necessary part of digital transformation plans intended to improve governance and economic development in low- and middle-income countries.

A WB blog published on 2 October 2025 explained the link between raising economic engagement and the need for a reliable law-based identity system:

“…smartphones are the primary way that people in developing countries access the internet, so often those without them are cut off from the communication, information, and income-generating opportunities that digital channels provide. … Lack of identification (ID) also remains a barrier to accessing and using mobile technology. A SIM card — essential for connecting to mobile networks and storing owner information — often requires ID to purchase it. In many countries, services like mobile money and digital work platforms also require the SIM to be registered in the user’s name to verify identity and enable transactions or payments.”(1)

Reynolds identified five major challenges in establishing new national Data Protection Authorities: stable funding to maintain independence; staffing and training; enforcement capacity to set priorities; the need to raise public awareness; and DPAs’ expanding mandates, such as AI.

The WB gives grants to the poorest countries, and loans where repayment is feasible, backed up by technical assistance. Reynolds gave the example of Jordan, where the WB convened workshops for both the public and private sectors. In Somalia, data privacy is part of a digital initiative in which the WB has helped with the scaling-up of digital identity services. In May this year, the WB brought together 11 national DPAs to share experience and develop DPA technical skills.

I asked whether the WB was influenced by US rather than European models of regulating privacy. Reynolds stated that they did not have one preferred model. Rather, the aim is to fit data privacy into national development objectives. If the staff in the recipient country make a credible case for visiting another country to learn from its way of working, for example, on conducting investigations or tackling cyber-security, the WB will give financial support for travel for this purpose.

Is there a risk of a perception of US dominance when delivering aid? No, he said, because the help is delivered through more than 130 local offices around the world.

Data Governance for EdTech

While social media and online games attract most of the attention concerning children and other vulnerable people, educational technology is developing fast and collecting an increasing quantity of children’s data. This was the focus of a GPA conference session to launch a new UNICEF publication, Data Governance for EdTech.(2)

As with any tech, there are opportunities, such as improving data literacy and nurturing good digital citizens; giving children a voice; and using analytics on learning data to help inform decisions about individuals.

The edtech issue is that computer programs and apps are made widely available to schools at low or no cost, giving students and teachers access to useful resources but collecting personal data in ways that are not immediately obvious to them. Edtech companies may offer free equipment and software to schools, but users “pay” with their data.

It is difficult to define an educational purpose. Online exam supervision (proctoring) software uses video, audio and screen sharing. It also tracks movements and computer interactions, and may use AI to detect irregular behaviour by students. There is a risk that edtech tools could become a determinant of users’ identity without users realising it. Another risk is that the personal data can be sold for unrelated purposes.

Those who guide the purchase of edtech resources include teachers, librarians, and procurement staff, but few of them are aware of the need to balance the opportunities against the risks. Help is available in the form of a publication in French and English on the CNIL’s website.(3) The CNIL has also been active in issuing guidance on AI and mobile apps.

AI agreement links Catalan and Brazilian DPAs

At conferences, to widen my circle of contacts, I generally sit next to people I don’t know. At the conference gala dinner, I joined a table with Meritxell Borràs i Solé, Commissioner, and Xavier Urios, Legal Advisor, at the Catalan Data Protection Authority (Autoritat Catalana de Protecció de Dades), and their Basque colleagues.

The Commissioner informed me that the next day, 18 September, she would sign a memorandum of understanding with Waldemar Gonçalves, the President of Brazil’s DPA (the ANPD), to collaborate on the design, implementation, and promotion of tools for the development of Artificial Intelligence systems that are “safe, trustworthy and respectful of fundamental rights.”(4)

The Catalan-developed fundamental rights assessment model guides AI developers in identifying potential risks to fundamental rights and in mitigating them. The agreement will provide for technical support and the exchange of knowledge and experience with the ANPD regarding the deployment of regulatory AI sandboxes to define good practices in the use of AI systems before they reach the market.

Also at the GPA, 19 DPAs, including Spain’s AEPD, adopted a statement on AI governance.

Fresh perspective

Attending this GPA for PL&B gave me a fresh perspective on Korea’s tech-driven economy, the 10th largest in the world. There is a tangible societal focus on education and business, and privacy law is fitted into this model. Mediation is the Personal Information Protection Commission of Korea’s preferred way to resolve data privacy disputes, avoiding the time and expense of litigation characteristic of the US. Presentations by leading Korean companies and academics showed that the EU-Korea digital trade agreement,(5) which includes data privacy elements, provides a win-win.

Next PL&B events

We have announced our next two PL&B events in London.

Subscribers to PL&B Reports may attend these events for free, and others are welcome for a fee.

We look forward to meeting you at both events.

Regards

Stewart Dresner
Publisher, Privacy Laws & Business

October 2025

REFERENCES
  1. World Bank - Mobile phone ownership is widespread. Why is digital inclusion still lagging?
  2. UNICEF - Data Governance for EdTech
  3. CNIL - The CNIL publishes two FAQs on the use of AI systems in educational establishments
  4. Catalan Data Protection Authority - Catalonia and Brazil to cooperate on the development of trustworthy AI systems
  5. European Commission - EU-South Korea Free Trade Agreement and Digital Trade Agreement
