The UK's privacy watchdog has imposed a £12.7 million ($15.8 million) fine on TikTok for multiple violations of data protection law, particularly in how the app handled the personal information of minors. According to the Information Commissioner's Office (ICO), in 2020 TikTok permitted as many as 1.4 million children under the age of 13 to use the app, in breach of its own policies.
The ICO stated that companies providing "information society services" to children under the age of 13 must obtain consent from their parents or guardians. According to the regulator, TikTok did not follow this rule and should have been aware that children under 13 were using the app. The ICO, an independent public agency, added that TikTok did not do enough to identify underage users and remove their accounts, despite concerns raised by some senior employees.
According to the regulator, TikTok violated the UK General Data Protection Regulation in various ways between May 2018 and July 2020. Among other failings, the ICO alleges that TikTok did not properly inform users, in a clear and straightforward way, about how their data was collected, used, and shared.
Consequently, TikTok users, including children, were unlikely to be able to make informed decisions about whether and how to interact with the app. Furthermore, the ICO stated that TikTok failed to ensure that it was processing the information it held on UK users legally, fairly, and transparently.
TikTok has asserted that it has taken several actions to address the violations for which it is being fined. It still relies on an age gate, requiring users to provide a date of birth to create an account (which underage users can bypass simply by lying), but TikTok says it has strengthened its systems and the training of its safety moderation staff to spot indications that an account may belong to someone under 13, so that such accounts can be flagged and sent for review.
TikTok also says it responds promptly to requests from parents to remove minors' accounts, and that it uses other signals provided by users, such as keywords and in-app reports, to help identify accounts that may belong to minors.
TikTok further claims to have increased transparency and accountability in this area. It publishes regular reports on the number of underage users removed from the platform, reporting that over 17 million suspected underage accounts were removed globally in the final three months of 2022.
However, it does not provide this information on a country-by-country basis. Additionally, TikTok offers family pairing to assist parents in monitoring their children’s use of the app.
The fine is lower than originally anticipated. After publishing the preliminary findings of its investigation, which began in February 2019, the ICO warned the company that it could face a fine of up to £27 million ($33.7 million). The investigation was launched at around the same time that the US Federal Trade Commission imposed a $5.7 million penalty on TikTok for violating child privacy regulations.