DPC (Ireland) - IN-21-9-1
Latest revision as of 12:25, 20 September 2023
| DPC - IN-21-9-1 | |
|---|---|
| Authority: | DPC (Ireland) |
| Jurisdiction: | Ireland |
| Relevant Law: | Article 5(1)(a) GDPR, Article 5(1)(f) GDPR, Article 24 GDPR, Article 25 GDPR, Article 60 GDPR, Article 63 GDPR |
| Type: | Investigation |
| Outcome: | Violation Found |
| Started: | |
| Decided: | 01.09.2023 |
| Published: | |
| Fine: | 345,000,000 EUR |
| Parties: | TikTok |
| National Case Number/Name: | IN-21-9-1 |
| European Case Law Identifier: | n/a |
| Appeal: | Unknown |
| Original Language(s): | English |
| Original Source: | DPC (Ireland) (in EN) |
| Initial Contributor: | mg |
Following an EDPB binding decision, the Irish DPA fined TikTok €345 million for several violations of the principles of fairness, transparency and privacy-by-default in the context of processing activities involving the data of minors.
English Summary
Facts
In 2021, the Dutch and French DPAs asked the Irish DPA for mutual assistance pursuant to Article 61 GDPR in the context of the processing activities undertaken by TikTok, the controller. The Irish DPA was the Lead Supervisory Authority (LSA) under Article 56 GDPR.
TikTok allegedly processed children's personal data without sufficient age identification mechanisms and with user settings that were not private. In particular, TikTok did not identify children under the age of 13 and made the social media content of all minors public by default.
Holding
National procedure leading to the Irish Draft Decision
First, the Irish DPA assessed whether TikTok’s default settings complied with the principles of data minimisation (Article 5(1)(c) GDPR), integrity and confidentiality (Article 5(1)(f) GDPR) and privacy by design and by default (Article 25 GDPR). Under Article 24 GDPR, compliance with these principles must be ensured through appropriate technical and organisational measures.
The Irish DPA noted that new TikTok users were presented with a pop-up window where they could choose between ‘Go Private’ or ‘Skip’. If the user decided to skip, their account was made public by default. The DPA also stressed that TikTok’s privacy policy lacked transparency concerning the processing of minors’ data. Therefore, the controller violated the above-mentioned provisions.
Second, the DPA examined whether age verification mechanisms were sufficient to guarantee compliance with the same provisions mentioned above.
TikTok asked users to confirm their date of birth via an age gate. Users who entered a date of birth showing they were under 13 were blocked and had no second opportunity to create an account. TikTok did not require ID documents (‘hard identifiers’) to be uploaded. However, TikTok stated that it removed users below 13 following reports by other users.
Aware of the fact that no age verification mechanism can fully guarantee that children below 13 do not access the service, the DPA embraced a risk-based approach. According to the DPA, TikTok never assessed the existence of these risks in its DPIA. However, the DPA also acknowledged that the controller put in place several measures – most of which are redacted in the text of the decision, but including content moderation – to prevent children below 13 from accessing its services. In light of the above, the DPA initially found no violation on this point.
Third, the DPA found that TikTok did not comply with transparency requirements pursuant to Articles 12 and 13 GDPR, in particular because the privacy policy did not explain the implications of the public-by-default option. However, the DPA did not identify a broader violation of the principle of transparency (Article 5(1)(a) GDPR).
Relevant and reasoned objections by other DPAs and consistency mechanism
The Irish DPA adopted a Draft Decision under Article 60(3) GDPR, and other supervisory authorities raised relevant and reasoned objections pursuant to Article 60(4) GDPR. As it was not possible to reach consensus concerning the age verification mechanism, the Irish DPA referred the case to the EDPB in order to trigger the consistency mechanism (Article 63 GDPR).
The relevant and reasoned objections concerned the age verification mechanism and TikTok’s use of dark patterns.
Concerning the age verification mechanism, the EDPB stressed that the principle of ‘appropriateness’ is strictly related to the principle of ‘effectiveness’. The EDPB noted that the age gate could easily be circumvented, as shown by the high number of TikTok users below 13. Under Article 25(1) GDPR, the measures taken by the controller must take into account the ‘state of the art’ of technology at a given time. The EDPB endorsed the Italian DPA’s view that age verification can be performed effectively with the help of a ‘trusted third party’. However, the EDPB considered that it did not have conclusive information to ascertain the ‘state of the art’ of age verification technologies during the period covered by the investigation. Therefore, despite serious doubts about the effectiveness of the measures, the EDPB could not establish a violation on this point.
Concerning the use of ‘dark patterns’, the EDPB stressed that the Irish DPA had considered only potential violations of the principle of transparency, ignoring the principle of fairness, which is also enshrined in Article 5(1)(a) GDPR. The EDPB found that TikTok violated this principle by ‘nudging’ users towards non-private options in several ways.
The EDPB adopted a binding decision and the Irish DPA had to amend its draft decision accordingly.
In light of the above, the Irish DPA ordered TikTok to bring its processing activities into compliance with the GDPR. The DPA also issued a €345 million administrative fine.