ICO (UK) - TikTok ICO
ICO - TikTok ICO | |
---|---|
Authority: | ICO (UK) |
Jurisdiction: | United Kingdom |
Relevant Law: | UK GDPR |
Type: | Investigation |
Outcome: | Violation Found |
Started: | |
Decided: | |
Published: | 04.04.2023 |
Fine: | 12,700,000 GBP |
Parties: | TikTok Information Technologies UK |
National Case Number/Name: | TikTok ICO |
European Case Law Identifier: | n/a |
Appeal: | Unknown |
Original Language(s): | English |
Original Source: | ICO (in EN) |
Initial Contributor: | Bernardo Mendonca |
The UK DPA launched an investigation against TikTok and fined it £12.7 million for misusing children’s data.
English Summary
Facts
The UK DPA launched an investigation against TikTok to ascertain whether it was processing children's data without proper parental consent. The investigation revealed that TikTok allowed over 1 million UK children under the age of 13 to use its platform in 2020, contrary to its own terms of service. The DPA also found that personal data belonging to these children were used without parental consent and that TikTok “did not do enough” to implement an age verification mechanism or to remove the underage children. Additionally, TikTok failed to provide proper information to its users about how their data were collected and processed.
Holding
Under UK GDPR, when data controllers offer information society services to children under 13, they must obtain consent from their parents or carers. The DPA held that TikTok failed to do that between May 2018 and July 2020 even though it ought to have been aware that under 13s were using its platform. According to the UK Information Commissioner, 'that means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll'.
Moreover, the DPA stated that TikTok failed to provide its users with proper information, in a way that is easy to understand, about how their data were collected, used, and shared. In the DPA's view, TikTok did not ensure that personal data belonging to its UK users were processed lawfully, fairly, and in a transparent manner.
For these reasons, the DPA found a breach of several articles of the UK GDPR and imposed a fine of £12.7 million on TikTok, taking into account the serious impact on its users.
Comment
We have observed a tendency among EU DPAs to impose sanctions on online service providers, or to restrict access to their services, where they lack effective mechanisms for verifying users' age. Examples include: ES PS/00554/2021 (pornographic site), IT 9870832 (ChatGPT), and IT 9574709 (TikTok).
Further Resources
Share blogs or news articles here!
English Machine Translation of the Decision
The decision below is reproduced from the English original published by the ICO.
More than one million UK children under 13 were estimated by the ICO to be on TikTok in 2020, contrary to its terms of service. Personal data belonging to children under 13 was used without parental consent. TikTok “did not do enough” to check who was using its platform and take sufficient action to remove the underage children that were.

The Information Commissioner’s Office (ICO) has issued a £12,700,000 fine to TikTok Information Technologies UK Limited and TikTok Inc (TikTok) for a number of breaches of data protection law, including failing to use children’s personal data lawfully.

The ICO estimates that TikTok allowed up to 1.4 million UK children under 13 to use its platform in 2020, despite its own rules not allowing children that age to create an account.

UK data protection law says that organisations that use personal data when offering information society services to children under 13 must have consent from their parents or carers. TikTok failed to do that, even though it ought to have been aware that under 13s were using its platform. TikTok also failed to carry out adequate checks to identify and remove underage children from its platform.

The ICO investigation found that a concern was raised internally with some senior employees about children under 13 using the platform and not being removed. In the ICO’s view, TikTok did not respond adequately.

“There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.

“As a consequence, an estimated one million under 13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.

“TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.” - John Edwards, UK Information Commissioner

Details of the contraventions

The ICO found that TikTok breached the UK General Data Protection Regulation (UK GDPR) between May 2018 and July 2020 by:

- Providing its services to UK children under the age of 13 and processing their personal data without consent or authorisation from their parents or carers;
- Failing to provide proper information to people using the platform about how their data is collected, used, and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and
- Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner.

The original ICO notice of intent for TikTok set the fine at £27 million. Taking into consideration the representations from TikTok, the regulator decided not to pursue the provisional finding related to the unlawful use of special category data. As a result, this potential infringement was not included in the final fine of £12.7 million.

Since the conclusion of the ICO’s investigation of TikTok, the regulator has published the Children’s code to help protect children in the digital world. It is a statutory code of practice aimed at online services, such as apps, gaming platforms and web and social media sites, that are likely to be accessed by children. The code sets out 15 standards to ensure children have the best possible experience of online services. For more information visit ico.org.uk/childrenscode.

Notes to editors

The Information Commissioner’s Office (ICO) upholds information rights in the public interest, promoting openness by public bodies and data privacy for individuals. The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA 2018), the UK General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000, the Environmental Information Regulations 2004 and the Privacy and Electronic Communications Regulations 2003.

Since 25 May 2018, the ICO has the power to impose a civil monetary penalty (CMP) on a data controller of up to £17 million (20m Euro) or 4% of global turnover. This CMP was issued under the DPA 2018 for infringements of the UK GDPR. Any monetary penalty is paid into the Consolidated Fund, which is the Government’s general bank account at the Bank of England, and is not kept by the ICO.

To report a concern to the ICO, telephone our helpline 0303 123 1113 or go to ico.org.uk/concerns.