ICO fines TikTok £12.7 million in a case concerning children’s data

The ICO announced on 4 April that it has fined TikTok £12.7 million for misusing children’s data. The ICO estimates that 1.4 million UK children under 13 were on TikTok in 2020, contrary to TikTok’s own terms of service. The regulator based its fine on TikTok’s failure to obtain parental consent and to run sufficient checks on who was using its platform, including adequate checks to identify and remove underage children.

The ICO’s investigation was prompted by ICO staff who were concerned that their own children (under 13) were using the platform and had not been removed.

John Edwards, Information Commissioner, said:

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had.”

The ICO originally intended to fine TikTok £27 million. After considering representations from TikTok, the regulator decided not to pursue its provisional finding related to the unlawful use of special category data, so this potential infringement was not reflected in the final fine of £12.7 million.

The fine reflects the fact that the ICO is now enforcing its Children’s Code with vigour. The regulator has made the use of children’s data a regulatory priority, and said last September that it was looking into over 50 different online services and had four ongoing investigations.

A TikTok spokesman said:

"TikTok is a platform for users aged 13 and over. We invest heavily to help keep under 13s off the platform and our 40,000 strong safety team works around the clock to help keep the platform safe for our community. While we disagree with the ICO's decision, which relates to May 2018 - July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps."

TikTok further commented on how it tries to identify users who are under the age of 13:

"We take a number of steps to keep underage users off TikTok. Our safety team is specifically trained to be alert to signs that an account may belong to an underage user, flag any suspected underage account and send them for review. We use other information, such as keywords and in-app reports from our community, to help further identify and remove potential underage accounts. We were also the first platform to voluntarily disclose the number of suspected underage accounts removed from the platform. We're constantly exploring new and innovative solutions to keep U13s off the platform."

TikTok’s content moderation team identifies and removes users suspected to be under the age of 13; it removed over 17 million such accounts worldwide in the last three months of 2022. TikTok has in recent years established a European Privacy and Data Protection team based in Ireland and the UK.

See ICO - ICO fines TikTok £12.7 million for misusing children’s data

At PL&B’s Conference, there will be two sessions dealing directly with children’s issues, with speakers from companies, privacy advocacy groups and law firms:

  • Privacy in the gaming industry
  • The Age Appropriate Design Code Crosses the Pond: Keeping Pace with Kids’ Data Regulation