CNIL (France) - SAN-2022-020

CNIL - Délibération SAN-2022-020
Authority: CNIL (France)
Jurisdiction: France
Relevant Law: Article 3(2)(a) GDPR
Article 5(1)(e) GDPR
Article 12 GDPR
Article 13 GDPR
Article 13(2)(a) GDPR
Article 21 GDPR
Article 25(2) GDPR
Article 32 GDPR
Article 35(1) GDPR
Article 55(1) GDPR
Article 56 GDPR
Type: Investigation
Outcome: Violation Found
Started: 17.11.2020
Decided: 10.11.2022
Published:
Fine: 800,000 EUR
Parties: Discord
National Case Number/Name: Délibération SAN-2022-020
European Case Law Identifier: n/a
Appeal: n/a
Original Language(s): French
Original Source: CNIL (in FR)
Initial Contributor: n/a

The French DPA imposed a fine of €800,000 on Discord, an online communication platform. Among other things, the controller had no data retention policy, did not enforce a sufficiently strong password policy, and had failed to carry out a required data protection impact assessment.

English Summary

Facts

The French DPA (CNIL) started an investigation into Discord, a company based in the United States (controller). The controller provided a free online service that allowed data subjects to communicate using text, voice and video.

The DPA's investigation service identified several shortcomings.

During the investigation, the controller stated that it did not have a written data retention policy. The investigation service confirmed that there were 2,474,000 French data subject accounts in the controller’s database that had not been used for more than three years and 58,000 accounts that had not been used for more than five years. During the procedure, the controller adopted a data retention policy under which user accounts are deleted after two years of inactivity.
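
A retention rule of this kind is typically enforced by a scheduled clean-up job that removes accounts whose last activity falls outside the retention window. The Python sketch below only illustrates that general pattern under an assumed schema (an "accounts" table with a "last_active_at" column); it is not the controller's actual implementation.

    # Illustrative only: enforcing a two-year inactivity retention rule.
    # The schema (an "accounts" table with a "last_active_at" column) is hypothetical.
    import sqlite3
    from datetime import datetime, timedelta, timezone

    RETENTION = timedelta(days=2 * 365)  # two years of inactivity

    def purge_inactive_accounts(conn: sqlite3.Connection) -> int:
        """Delete accounts whose last activity predates the retention window."""
        cutoff = datetime.now(timezone.utc) - RETENTION
        cur = conn.execute(
            "DELETE FROM accounts WHERE last_active_at < ?", (cutoff.isoformat(),)
        )
        conn.commit()
        return cur.rowcount  # number of accounts removed

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, last_active_at TEXT)")
        print(f"Deleted {purge_inactive_accounts(conn)} inactive accounts")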

The investigation service found that the information the controller provided regarding data retention periods was incomplete. There were no specific periods or criteria for determining these retention periods. The controller changed this element of the privacy policy during the procedure.

The investigation service also addressed a specific problem with the application for Microsoft Windows: when a data subject who was logged in to a voice room closed the controller’s application window by clicking on the "X" icon at the top right of the application, the application continued to run in the background and the data subject remained logged in. In the majority of Microsoft Windows applications, however, clicking on the "X" closes the application. This 'background minimization' was enabled by default from the first use after installation, and the data subject was not informed about it. During the procedure, the controller implemented a pop-up window that alerts data subjects, the first time the application window is closed, that the application is still running. The pop-up also informs data subjects that this behaviour (remaining logged in after the window is closed) can be changed in the settings.
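
The remedial pattern described here (warn the user the first time the window is closed and let them change the behaviour in the settings) can be illustrated with a minimal desktop sketch. The example below uses Python's tkinter with hypothetical setting names; it mirrors only the general pattern, not the controller's actual code.

    # Minimal sketch of a "close-to-background" window with a first-time warning.
    # The settings dictionary and its keys are hypothetical illustrations.
    import tkinter as tk
    from tkinter import messagebox

    settings = {"minimize_to_background": True, "close_warning_shown": False}

    def on_close(root: tk.Tk) -> None:
        if settings["minimize_to_background"]:
            if not settings["close_warning_shown"]:
                # Tell the user once that the app keeps running (and any voice
                # connection stays open) after the window is closed.
                messagebox.showinfo(
                    "Still running",
                    "The application keeps running in the background and you stay "
                    "connected to voice rooms. You can change this in the settings.",
                )
                settings["close_warning_shown"] = True
            root.withdraw()  # hide the window; a real app would restore it from a tray icon
        else:
            root.destroy()   # actually exit the application

    if __name__ == "__main__":
        root = tk.Tk()
        root.title("Close-to-background demo")
        root.protocol("WM_DELETE_WINDOW", lambda: on_close(root))
        root.mainloop()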

At the time of the online investigation, when creating an account, the controller accepted a password of six characters including letters and numbers. The controller also adjusted this during the proceedings: it now required data subjects to use a password of at least eight characters, with at least three of the four different character types. Also, after ten unsuccessful login attempts, the controller now required a captcha prompt to be solved, which was previously not the case.
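
The strengthened rules described above (at least eight characters drawn from at least three of the four character classes, plus a captcha after ten failed login attempts) can be expressed as a small validation routine. The Python sketch below is purely illustrative; the function names are invented and the thresholds simply restate the figures in the decision.

    # Illustrative password-policy check: at least 8 characters and at least three
    # of the four character classes (lower case, upper case, digits, special characters).
    import string

    def password_meets_policy(password: str) -> bool:
        if len(password) < 8:
            return False
        classes = [
            any(c.islower() for c in password),
            any(c.isupper() for c in password),
            any(c.isdigit() for c in password),
            any(c in string.punctuation for c in password),
        ]
        return sum(classes) >= 3

    def captcha_required(failed_attempts: int, threshold: int = 10) -> bool:
        """Require a captcha once an account has accumulated `threshold` failed logins."""
        return failed_attempts >= threshold

    assert password_meets_policy("Str0ngpass")      # 8+ characters, 3 classes
    assert not password_meets_policy("abc123")      # too short, only 2 classes
    assert captcha_required(10) and not captcha_required(9)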

The investigation service also determined that the controller had previously deemed it unnecessary to carry out a data protection impact assessment (DPIA). During the procedure, the controller carried out two impact assessments, in which the controller concluded that its processing was not likely to result in a high risk to individuals' rights and freedoms.

Holding

Competence of the DPA

The DPA determined that the controller processed personal data of French data subjects and held that the GDPR was applicable pursuant to Article 3(2)(a) GDPR. The DPA determined that the controller offered services intended for data subjects in the European Union by considering several factors. Among other factors, it considered that almost all pages on the controller’s website and in the controller’s application were available in French at the time of the investigation.

The DPA determined that it was competent to handle this case because the 'one-stop shop' mechanism (Article 56 GDPR) did not apply: the controller did not have an establishment on the territory of any EU Member State. Therefore, each national supervisory authority was competent to monitor GDPR compliance on its own territory (Article 55 GDPR).

Failure to define and respect a data retention period appropriate to the purpose (Article 5(1)(e) GDPR)

The DPA confirmed that the controller did not have a written data retention policy at the time of the investigation. The DPA also confirmed that there were 2,474,000 French data subject accounts in the controller’s database that had not been used for more than three years and 58,000 accounts that had not been used for more than five years. The DPA held that this was a violation of Article 5(1)(e) GDPR, because the controller could not rely on the contractual relationship to keep storing indefinitely the accounts of data subjects who were inactive but had not unsubscribed: since a new account could be created free of charge at any time, an inactive data subject who wished to use the service again could simply create a new one.

Failure to comply with the obligation to provide information (Article 13 GDPR)

The DPA stated that at the time of the investigation, the information regarding data retention periods was incomplete. There were no specific periods or criteria for determining these periods. The DPA held that this was a violation of Article 13 GDPR, because retention periods were stated in a generic manner and were not sufficiently explicit.

Failure to ensure data protection by default (Article 25(2) GDPR)

The DPA also found a violation of Article 25(2) GDPR regarding the "X" icon at the top right corner of the controller’s Windows application. The DPA determined that the behaviour of the controller's application differed from that of other Windows applications. The DPA considered that data subjects who clicked the "X" icon in the controller’s application, without actually closing it, could still be heard by other members in the voice room even though they thought they had left it.

The DPA stated that data subjects could not reasonably expect the application to keep running after clicking the 'X' icon, because communication apps in general either inform the data subject about this 'background minimization' or provide the option to data subjects to enable it themselves. The DPA stated that because of this situation, the data subject's personal data could be communicated to third parties without the data subject necessarily being aware of this. The DPA noted that this setting, without sufficiently clear and visible information, could present significant risks for data subjects, in particular for intrusion into their private life.

Failure to ensure the security of personal data (Article 32 GDPR)

At the time of the online investigation, the controller accepted a password of six characters including letters and numbers for creating a user account. The DPA considered that the controller's passwords were not strong enough, taking into account the undemanding password policy and the volume of personal data processed by the controller. This resulted in a risk of compromise for the user accounts in question, including the personal data these accounts contained. The DPA referred to its own recommendations for passwords (deliberation No. 2017-012 of 19 January 2017), under which passwords should comprise at least eight characters containing at least three of the four categories of characters (upper case, lower case, numbers and special characters), and authentication should include a restriction on access to the user account, such as a temporary lockout after several failed login attempts.
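
Such access restrictions are commonly implemented as a small throttling layer in front of authentication, with a lockout whose duration grows with the number of consecutive failures. The Python sketch below is a minimal illustration, assuming an in-memory attempt counter and illustrative back-off values; it is not any particular product or the controller's real system.

    # Sketch of a per-account lockout whose duration grows with each failed attempt,
    # one possible form of the access restriction described in the recommendation.
    import time
    from collections import defaultdict

    failed_attempts = defaultdict(int)   # account -> consecutive failures
    locked_until = defaultdict(float)    # account -> time until which login is blocked

    def login_allowed(account, now=None):
        now = time.time() if now is None else now
        return now >= locked_until[account]

    def record_failure(account, now=None):
        """Increase the lockout delay with each consecutive failed attempt."""
        now = time.time() if now is None else now
        failed_attempts[account] += 1
        delay = min(2 ** failed_attempts[account], 3600)  # 2s, 4s, 8s, ... capped at 1h
        locked_until[account] = now + delay

    def record_success(account):
        failed_attempts.pop(account, None)
        locked_until.pop(account, None)

    # Example: three consecutive failures lock the account for 8 seconds.
    for _ in range(3):
        record_failure("alice", now=0.0)
    assert not login_allowed("alice", now=5.0)
    assert login_allowed("alice", now=10.0)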

Failure to carry out a data protection impact assessment (Article 35 GDPR)

The controller had previously considered that it was not necessary to carry out a DPIA. The DPA considered that the controller should have done so, given the large scale of the personal data processed and the fact that the controller's service was also intended to be used by children aged fifteen, of which the controller was fully aware, according to the DPA.

Fine

The DPA imposed a fine of 800,000 euros on the controller. The amount of the fine was based on several factors, and took into account the efforts made by the controller throughout the procedure to become GDPR compliant.

Comment

The DPA also investigated breaches of Articles 12 and 21 GDPR, which the investigation service had identified. However, the DPA did not follow its investigation service on these points and held that the controller did not violate these articles.

Further Resources


English Machine Translation of the Decision

The decision below is a machine translation of the French original. Please refer to the French original for more details.


Deliberation of the restricted formation no SAN-2022-020 of 10 November 2022 concerning the company DISCORD INC.

The Commission nationale de l'informatique et des libertés, meeting in its restricted formation composed of Mr Alexandre LINDEN, chairman, Mr Philippe-Pierre CABOURDIN, vice-chairman, Ms Anne DEBET, Ms Christine MAUGÜÉ, Mr Alain DRU and Mr Bertrand du MARAIS, members;

Having regard to Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of personal data and the free movement of such data;

Having regard to Law No. 78-17 of 6 January 1978 on information technology, files and freedoms, in particular Articles 20 et seq;

Having regard to Decree No. 2019-536 of 29 May 2019 taken for the application of Law No. 78-17 of 6 January 1978 relating to information technology, files and freedoms;

Having regard to Deliberation No. 2013-175 of 4 July 2013 adopting the internal rules of procedure of the Commission nationale de l'informatique et des libertés;

Having regard to Decision No. 2020-272C of 14 August 2020 of the President of the National Commission on Data Processing and Individual Liberties to instruct the Secretary General to carry out or have carried out a mission to verify any processing of personal data relating, in whole or in part, to data relating to the marketing or use of the products or services associated with the "DISCORD" trademark;

Having regard to the decision of the President of the Commission nationale de l'informatique et des libertés appointing a rapporteur before the restricted formation, dated 24 December 2021;

Having regard to the report of Mrs Valérie PEUGEOT, Commissioner-Rapporteur, notified to the company DISCORD INC. on 25 February 2022;

Having regard to the written observations submitted by DISCORD INC. on 15 April 2022;

Having regard to the reply of the rapporteur to these observations, notified on 12 May 2022 to the company's counsel;

Having regard to the written observations of DISCORD INC. received on 12 July 2022;

Having regard to the other documents in the file;

The following were present at the meeting of the restricted formation of 15 September 2022:

- Mrs Valérie PEUGEOT, Commissioner, heard in her report;

As representatives of DISCORD INC:

- [...]

The company DISCORD INC. having had the floor last;

The restricted formation adopted the following decision:

I. Facts and procedure

1. DISCORD INC (hereinafter "the Company"), headquartered at 444 De Haro Street #200, San Francisco, CA 94107 (UNITED STATES), was established in 2015. As of January 2021, it had approximately 300 employees.

2. For the years 2019 and 2020, the company had revenues of approximately $[...] and approximately $[...] respectively.

3. DISCORD is a voice over IP (technology that allows users to chat via their microphone and/or webcam over the Internet) and instant messaging software, allowing users to create servers, as well as text, voice and video rooms. DISCORD is thus a platform for people with similar interests to share and communicate. The software is available on Windows, Mac, Linux, iOS and Android and can also be accessed directly through a web browser, from the URL "https://discord.com", or via an application. Popular among the gaming community for providing a way to communicate with each other and develop a community outside of the games themselves, DISCORD has become a comprehensive social network with a wide range of ways to interact. The application became very popular during the Covid-19 lockdowns, especially with a young audience.

4. The software is free to use as a whole, but DISCORD offers the possibility of subscribing to improve one's profile, add features on servers, have more throughput for file exchange, etc.

5. In January 2021, approximately [...] DISCORD user accounts were registered worldwide, including over [...] in France.

6. The company does not have an establishment in the European Union but has appointed a representative, in accordance with Article 27 of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 (hereinafter "GDPR"), namely the Irish company VERASAFE.

7. Pursuant to Decision No. 2020-272C of the President of the Commission nationale de l'informatique et des libertés (hereinafter the "Commission" or the "CNIL") of 14 August 2020, the CNIL carried out an online control mission on the "discord.com" website and on the DISCORD mobile application on 17 November 2020.

8. On 29 December 2020, an off-site inspection mission to the company was also carried out by sending a questionnaire to the company.

9. On 5 and 12 February 2021, the company sent elements of its response to the CNIL. By e-mail dated 8 March 2021, the CNIL delegation requested additional information from the company's counsel, which was sent by the company's counsel on 23 and 24 March 2021.

10. For the purposes of investigating these elements, the Commission's Chairperson appointed Ms Valérie PEUGEOT as rapporteur on 24 December 2021, on the basis of Article 39 of Decree no. 2019-536 of 29 May 2019.

11. On 25 February 2022, the rapporteur sent the company a report detailing the breaches of the GDPR that she considered to have occurred in this case. The report proposed that the Commission's restricted formation impose an administrative fine in respect of the breaches of Articles 5(1)(e), 12, 13, 21(1), 25(2), 32 and 35 of the GDPR. It also proposed that the decision to impose a fine should be made public, but that it should no longer be possible to identify the company by name after a period of two years from its publication.

12. On 15 April 2022, the company submitted its observations in response to the penalty report.

13. The rapporteur replied to the company's observations on 12 May 2022.

14. On 12 July 2022, the company submitted new observations in response to the rapporteur's observations.

15. By letter dated 10 August 2022, the rapporteur informed the company's counsel that the investigation was closed, pursuant to Article 40, III, of amended Decree no. 2019-536 of 29 May 2019.

16. By letter dated 11 August 2022, the company was informed that the case was included on the agenda of the restricted formation of 15 September 2022.

17. The company and the rapporteur presented oral observations at the meeting of the restricted formation.

II. Reasons for the decision

A. On the processing operations at issue and the applicability of the GDPR

18. Article 3(2)(a) of the GDPR provides that "This Regulation shall apply to the processing of personal data relating to data subjects who are within the Union by a controller or processor who is not established in the Union, where the processing activities relate to:

(a) the supply of goods or services to those data subjects within the Union, whether or not payment is required from those data subjects [...]".

19. DISCORD INC. processes personal data of users (hereinafter "the processing operations at issue") when they create a DISCORD account and for the provision of the functionalities enabled by the software.

20. The restricted panel notes, without this being contested by the company in the context of the present procedure, that DISCORD INC. processes personal data of users located in France. According to the information provided by the company during the inspection procedure, DISCORD had more than [...] users in France in January 2021. Furthermore, it emerges from the report of the online observations of 17 November 2020 that, both on a computer from the URL "discord.com" and on the DISCORD application on a mobile phone, all the pages are accessible in French, with the exception of the privacy policy, which was available in English only at the time of the online observations, but which is now accessible in French. Furthermore, the various processes implemented by DISCORD INC. through its website and its application are directly linked to the services it offers. This includes, for example, processing in connection with the creation of an account, the provision of the DISCORD messaging platform and social network or purchases made. Finally, the privacy policy of DISCORD INC. refers to the GDPR and the company VERASAFE located in Ireland has been appointed as representative under Article 27 of the GDPR.

21. Consequently, the restricted panel notes that the processing operations at issue concern an offer of services intended for persons residing in the European Union and deduces that these processing operations are subject to the GDPR pursuant to Article 3(2)(a) of this Regulation.

B. On the competence of the CNIL

22. Article 55(1) of the GDPR provides that "each supervisory authority shall be competent to carry out the tasks and exercise the powers vested in it pursuant to this Regulation in the territory of its Member State".

23. Article 56(1) of the GDPR provides that "without prejudice to Article 55, the supervisory authority of the principal or sole establishment of the controller or processor shall be competent to act as lead supervisory authority with regard to the cross-border processing carried out by that controller or processor, in accordance with the procedure laid down in Article 60".

24. Furthermore, under Article 16 of the Data Protection Act, "the restricted formation shall take measures and impose sanctions against controllers or processors who fail to comply with the obligations arising from Regulation (EU) 2016/679 of 27 April 2016 and this Act [...]".

25. The restricted panel notes, without this being contested by the company in the context of the present procedure, that the "one-stop shop" mechanism provided for by Article 56 of the GDPR is not intended to apply in this case, as DISCORD INC. does not have an establishment on the territory of a Member State of the European Union. Consequently, each national supervisory authority is competent to monitor compliance with the GDPR on the territory of the Member State to which it belongs in accordance with Article 55 of the Regulation, for processing operations carried out by DISCORD INC. on persons residing on that territory. The CNIL is therefore competent to monitor compliance with the GDPR of processing operations carried out by DISCORD INC. involving persons residing on French territory.

C. On the failure to comply with the obligation to define and respect a data retention period proportionate to the purpose of the processing

26. Under Article 5(1)(e) of the GDPR, personal data must be "kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data are processed [...]".

27. The rapporteur notes that the company has not defined a data retention policy and that its register of processing activities does not mention any retention period for the personal data processed. Thus, data have been kept for more than six years, since the DISCORD service was launched, and the company does not regularly delete or archive data after a certain period. It notes that there are 2,474,000 French user accounts in the DISCORD database that have not been used for more than three years and 58,000 accounts that have not been used for more than five years, without the company providing any particular explanation or justification for keeping these inactive accounts.

28. The rapporteur recalls that the CNIL's reference framework on the processing of personal data implemented for the purpose of managing commercial activities of 3 February 2022 specifies - with regard to commercial activities involving the creation of an online account by customers - that the data is intended to be kept until the user deletes the account. However, it points out that users often stop using these accounts without deleting them, which means that these accounts are retained indefinitely. In such cases, the Commission recommends that accounts should be considered inactive after two years and deleted at the end of this period, unless the user expresses the wish to keep the account active.

29. In its defence, the company states that it did not have a written data retention policy in February 2021, but argues that it was nevertheless in compliance with Article 5 of the GDPR, as it had determined and implemented retention periods directly encoded in the DISCORD service itself. It states that the retention period implemented corresponds to the duration of the contractual relationship with its users, as well as to periods determined in accordance with its legal obligations and its security obligations, which it is required to respect, without however specifying them.

30. In addition, the company raises the unenforceability of the CNIL recommendations, in particular the CNIL reference framework of 3 February 2022, which postdates the online audit conducted on 17 November 2020 and which itself reserves the case, "for commercial activities that involve the creation of an online account by customers (e.g. dating sites or social networks), [where] data may be kept until the user deletes the account". The company also points to the specific nature of the Discord service, which is a communication service involving the maintenance of so-called inactive accounts for the benefit of the users themselves.

31. The Panel notes that in the course of the monitoring procedure the company stated: "Discord does not have a written data retention policy. [The company] [...] is currently developing a data retention policy to delete inactive accounts when the company can conclude that the user has abandoned his account". In this respect, the register of processing activities provided by the company during the monitoring procedure does not mention any retention period for the personal data processed.

32. The findings of the CNIL inspection delegation confirm that there were 2,474,000 French user accounts in the DISCORD database that had not been used for more than three years and 58,000 accounts that had not been used for more than five years.

33. The restricted formation recalls that the obligation to keep data "for no longer than is necessary for the purposes for which they are processed [...]" results from Article 5(1)(e) of the GDPR, which is a mandatory provision. The Commission has consistently held that the retention of online accounts created free of charge without any action by users beyond a certain period of time leads to the retention of data indefinitely, in breach of the GDPR. The restricted formation considers that the company cannot rely in this case on the maintenance of a contractual relationship to keep indefinitely the accounts of users who are totally inactive but who have not unsubscribed, since the account was created free of charge and an inactive user who wishes to use the service again can do so by recreating an account at any time.

34. Thus, the panel considers that the company has failed to fulfil its obligations under Article 5(1)(e) of the GDPR, the nature of the service offered to users being irrelevant.

35. It notes, however, that DISCORD INC. now has a written policy on the duration of the retention of personal data processed, which provides in particular for the deletion of accounts after two years of inactivity by the user. The panel therefore considers that the company has now brought itself into compliance with the obligations arising from Article 5(1)(e) of the GDPR.

D. On the breach of the transparency obligation

36. Article 12(1) of the GDPR provides that "the controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 as well as any communication under Articles 15 to 22 and Article 34 in respect of the processing to the data subject in a concise, transparent, comprehensible and easily accessible manner, in clear and simple language, in particular for any information specifically intended for a child".

37. The rapporteur noted that, during the inspection carried out on 17 November 2020, the CNIL delegation observed that after clicking on the link entitled "Privacy" in the footer, a page opened in the browser with the words "DISCORD PRIVACY POLICY". While the privacy policy was easily accessible from the registration form, it was only available in English, in its version dated 23 June 2020, at the time of the online check.

38. In its defence, the company states that the privacy policy was already communicated to users in French at the time of the CNIL inspection. However, a technical problem that occurred on 16 November 2020 temporarily prevented the French translation of the privacy policy from appearing on the website during the inspection. It added that it had identified the technical problem and quickly implemented the necessary measures to resolve it, specifying that the French version of the privacy policy had become accessible again on 3 December 2020.

39. In the light of these elements, the rapporteur proposes that the restricted formation should not uphold the breach of Article 12 of the GDPR.

40. The restricted formation takes note of the elements provided by the company and considers that this breach is not established.

E. On the breach of the obligation to inform individuals

41. Article 13 of the GDPR lists the information to be provided by the controller to data subjects when their personal data are collected directly from them. Article 13(2) of the GDPR provides that "in addition to the information referred to in paragraph 1, the controller shall provide the data subject, at the time the personal data are obtained, with the following additional information necessary to ensure fair and transparent processing:

(a) the period of time for which the personal data will be kept or, where this is not possible, the criteria used to determine that period [...]".

42. The guidelines on transparency under Regulation (EU) 2016/679, clarifying the provisions of Article 13, state that "the retention period [...] should be formulated in such a way that the data subject can assess, depending on his or her situation, what the retention period will be in the case of specific data or for specific purposes. The controller cannot simply state in general terms that personal data will be kept for as long as the legitimate purpose of the processing requires. Where appropriate, different storage periods should be mentioned for the different categories of personal data and/or the different purposes of processing, including periods for archival purposes".

43. The rapporteur notes that the retention periods were stated in a generic manner, without being sufficiently explicit, as they were specified in the following terms: 'We generally retain personal data for the time necessary to fulfil the purposes defined in this document. To dispose of personal data, we may anonymise it, delete it or take other necessary steps. Data may remain for some time in the form of back-up copies or for commercial purposes.' The rapporteur therefore concludes that there is a breach of the obligation to provide information.

44. In its defence, the company states that Article 13(2)(a) of the GDPR does not require the retention period as such to be provided, but instead leaves it to the controller to provide the 'criteria used to determine the retention period'. It added that in order to comply with this obligation, DISCORD INC. provided users with the said criteria, namely the duration necessary to achieve the purposes explicitly described in the privacy policy. Finally, the company adds that it has developed an information note that provides more details regarding the retention of personal data and that a link to the page "How long does Discord keep your information" has been included directly in the Privacy Policy.

45. The Panel considers that at the time of the online check, the retention periods were stated in a generic manner and were not sufficiently explicit. Information on the retention periods was lacking, as there were no specific periods or criteria for determining them. In any event, the restricted panel recalls that recourse to "criteria used to determine this duration" is only permitted when it is not possible to provide a precise duration, which is not the situation here with regard to the processing operations carried out by the company. As a result, individuals could not know the retention periods established by DISCORD INC. even though this information is important in order to guarantee "fair and transparent processing", since it helps to ensure that users have control over the processing of their data.

46. 46. Consequently, the Panel considers that the company has failed to fulfil its obligations under Article 13(2)(a) of the GDPR. It nevertheless notes the measures taken by DISCORD INC. in the course of the procedure and considers that the company has now complied on this point.

F. On the breach of the obligation to respect the right to object

47. Article 21(1) of the GDPR provides that "the data subject shall have the right to object at any time, on grounds relating to his or her particular situation, to the processing of personal data relating to him or her based on Article 6(1)(e) or (f), including profiling based on those provisions. The controller shall no longer process the personal data, unless it can demonstrate compelling legitimate grounds for the processing which override the interests and rights and freedoms of the data subject, or for the establishment, exercise or defence of legal claims".

48. The rapporteur noted that among the personal data processing operations carried out by DISCORD INC. is the processing operation "Use data to improve Discord", the purpose of which is to use the information collected via the services to "help [the company] improve the content and functionality of the services, better understand [its] users and improve the service". According to the company's processing register, the purpose of improving the service leads to the collection of the following data: IP address, user ID, operating system, chat servers joined, contacts/friends, games played, possible subscription to the premium version of Discord, purchases made, functionalities used, activities on the platform, etc. The company adds that when users object to their data being used to improve the DISCORD service, they must go to the settings and deactivate the functionality. In that case, the company removes the association between the pseudonymous alias and the user ID, which, according to the company, prevents it from linking the data collected under that alias to the user's account. The rapporteur noted that the information provided by the company shows that the personal data of the person concerned continue to be processed, even though the user has expressed his wish to object. The mere break in the link between, on the one hand, the usage data processed and stored under the pseudonymous alias and, on the other hand, the user's identifier associated with his or her account does not appear to be sufficient to consider that the user's right to object to the processing of his or her data for this purpose is duly taken into account and effective.

49. In its defence, the company considers that the possibility of deactivating the "Use data to improve Discord" function with a slider button does not constitute the exercise of the right to object within the meaning of Article 21(1) of the GDPR. It stated that it had made a clear distinction in its replies to the delegation of control:

- on the one hand, a setting option that can be used via an online slider button without the need for a justification relating to the user's particular situation: in this case, the data are pseudonymised;

- on the other hand, the right to object, which is exercised in accordance with Article 21(1) of the GDPR. In this case, users shall submit their request to DISCORD or its representative, giving reasons relating to their particular situation to enable the company to assess whether this condition is met. This right shall not be exercised via the slider button.

50. The company considers that sanctioning it on this point would have the effect of dissuading organisations from implementing "privacy by design" and of pushing them to provide only the objection mechanism provided for in Article 21 of the GDPR.

51. In the light of these elements, the rapporteur proposes that the restricted formation should not retain this failure.

52. The restricted formation notes that it emerges from the elements communicated by the company that the existence of the slider button allowing the deactivation of the "Use data to improve Discord" function is a setting that is not intended to constitute the exercise of the right to object within the meaning of Article 21 of the GDPR and that DISCORD INC. offers an objection procedure that complies with this article.

53. In these circumstances, the Panel considers that the breach of Article 21 of the GDPR is not established.

G. On the breach of the obligation to ensure data protection by default

54. Article 25(2) states that "the controller shall implement appropriate technical and organisational measures to ensure that, by default, only personal data which are necessary for each specific purpose of processing are processed. This applies to the amount of personal data collected, the extent of their processing, their storage period and their accessibility. In particular, these measures shall ensure that, by default, personal data are not made accessible to an indeterminate number of natural persons without the involvement of the natural person concerned".

55. The rapporteur notes that, by default, the user has to perform several actions to exit the DISCORD application on Windows and Linux. The application is set to remain active even when the user closes the main window (by selecting the "X" icon in the top right-hand corner), thus allowing voice communication to continue without taking up any space on the computer desktop. Only a small indicator shows that the application is active. This indicator was present in the taskbar, which is located at the bottom right of the screen in Microsoft Windows, next to the date and time. The rapporteur concludes that this setting of the application led to the possibility that the user's personal data could be communicated to third parties through the voice room, even though the user thought, in the absence of specific information that was sufficiently visible and clear, that the collection of such data had ceased when he or she had chosen to close the application window.

56. In its defence, the company states that one of the primary functionalities of DISCORD is to be able to chat with friends, often while doing something else, such as playing a video game or surfing the web. According to the company, the user only wants to see the game he is playing on his screen and any intrusion on his screen would affect his game and disturb him. It considers that when a user logged into a voice chat room clicks on the "X" icon in the top right-hand corner, he or she does not think of leaving the application and is aware that he or she is still logged into the chat room. According to the company, they are repeatedly told that to leave a chat room they must click on the "logout" button (a phone icon with a cross in a red circle).

57. In addition, the company considers that it is essential to take into account the functioning of similar applications in order to assess the level of expectation of DISCORD users. The company refers to the EDPB Guidelines on data protection by design and by default, which state that "processing should correspond to the reasonable expectations of the data subjects". According to the company, the rapporteur cannot conclude that there is a breach of data protection by design without first determining whether users have been misled or whether the processing and operation of the application is unexpected, harmful or discriminatory to its users. According to the company, the user can legitimately expect the DISCORD application to function in this way because, on the one hand, applications on Windows and Linux operating systems function in the same way (according to the company, when the user clicks on the "X" icon, he or she expects that this action will only close the window and put the application in the background) and, on the other hand, the user's expectation when clicking on the "X" icon is to put the application in the background so that they can continue chatting while performing other actions on their computer. Furthermore, the company explains that the closing modalities can be changed through a setting available to users, who can decide on a single action to be performed to close the application.

58. Finally, the company states that it has now implemented a pop-up window that tells users of Windows and Linux operating systems that the DISCORD application is still running when the window has been closed and that these settings can be changed directly by the user.

59. Firstly, the restricted formation notes that, if a user connected to a voice room closes the application window by clicking on the "X" icon at the top right (in Microsoft Windows), he or she is in fact only putting the application in the background and not leaving it; he or she is therefore still connected to the voice room. However, in Microsoft Windows, and more generally in the symbolism commonly used in computing, clicking on the "X" at the top right of the last visible window of an application usually exits it. Minimising the application in the background is usually achieved by clicking on a "-" icon. However, the behaviour of DISCORD is different. Therefore, the restricted formation considers that the user should be given specific information so that he is aware of this difference. However, this was not the case at the time the online check was carried out.

60. The restricted formation specifies that, while there are applications with communication functions that are in reality only minimised to the background after the user has clicked on the close cross, generally such applications either inform the user, with a pop-up window the first time the cross is clicked, that the application will go into the background but continue to run, or do not activate the background-minimisation behaviour by default, leaving it to the user to set it manually. In the present case, before the pop-up window mentioned above was put in place, background minimisation took place by default from the first use after installation, without any warning or clear information.

61. Consequently, the company cannot validly maintain that the application's operation corresponds to the user's expectations, insofar as other applications inform the person or allow the user to make this specific setting himself.

62. Secondly, the restricted formation noted that this default setting of the application - which provided that it was not exited when the main window was closed - led to the possibility that the user's personal data could be communicated to third parties without the user necessarily being aware of this. Indeed, the user was not necessarily aware that his or her words continued to be transmitted and heard by the other members present in the voice room. The restricted formation noted that such a setting, in the absence of sufficiently clear and visible information, presented significant risks for users, in particular of intrusion into their private life.

63. Consequently, the Restricted Panel considers that the company failed to fulfil its obligations under Article 25(2) of the GDPR, which requires data protection by default.

64. It nevertheless notes that DISCORD INC. has now set up a "pop-up" window which, when the window has been closed for the first time, alerts those connected to a voice room that the DISCORD application is still running and that these settings can be changed directly by the user.

H. On the breach of the obligation to ensure data security

65. According to Article 32 of the GDPR, "taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of the processing as well as the risks, varying in likelihood and severity, to the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia, as necessary: [...]

(b) means to ensure the continuing confidentiality, integrity, availability and resilience of the processing systems and services;

(c) [...]

(d) a procedure to regularly test, analyse and evaluate the effectiveness of the technical and organisational measures to ensure the security of the processing".

66. The rapporteur noted that, when creating an account on DISCORD, a password consisting of six characters including letters and numbers was accepted. The rapporteur considered that such passwords, without sufficiently complex criteria and without any additional security measures, do not ensure the security of the personal data processed by the company or prevent unauthorised third parties from gaining access to those data.

67. In its defence, the company contests the rapporteur's analysis and considers that it has put in place measures to ensure a high level of security for its users' access to its system, including measures to prevent brute force attacks: limiting login attempts to one per second; verification by email or SMS to validate the login when the company receives a login request from an IP address outside the area of the previous login IP address; rejection of commonly used and compromised passwords and implementation of a "captcha" for logins from new IP address ranges.

68. The company also made changes to its password security processes in the course of the sanction proceedings:

- it now requires French users to set passwords that are at least eight characters long, containing characters from at least three of the four categories (lower case letters, upper case letters, numbers and special characters);

- after ten unsuccessful login attempts, the company requires the resolution of a "captcha".

69. The restricted formation considers that the length and complexity of a password remain basic criteria for assessing its strength. In this respect, it notes that the need for a strong password is also emphasised by the Agence nationale de la sécurité des systèmes d'information (ANSSI).

70. By way of clarification, the restricted panel recalls that to ensure a sufficient level of security and meet the requirements of password strength, if authentication provides for restricted access to the account, the CNIL recommends, in its deliberation No. 2017-012 of 19 January 2017, that the password should comprise at least eight characters, containing at least three of the four categories of characters (upper case, lower case, numbers and special characters) and that authentication should include a restriction on access to the account, such as a timeout on access to the account after several failed attempts (temporary suspension of access, the duration of which increases with the number of attempts), the implementation of a mechanism to protect against automated and intensive submissions of attempts (such as a "captcha") and/or the blocking of the account after several unsuccessful authentication attempts.

71. In the present case, the Panel notes that a password of six characters including letters and numbers was accepted at the time of the online check. The restricted formation considered that, in view of the undemanding rules governing their composition, as well as the volume of personal data to be protected, the robustness of the passwords accepted by the company was too low, leading to a risk of compromise of the associated accounts and the personal data they contained, despite the additional security measures put in place before the sanction procedure.

72. Under these circumstances, in view of the risks incurred by individuals, the restricted panel considers that the above facts constitute a breach of Article 32 of the GDPR, since the company's password management policy was not sufficiently robust and binding to guarantee data security within the meaning of that Article.

73. However, the Commission notes that, in the course of the sanction procedure, the company made changes in this respect and complied with the provisions of Article 32 of the GDPR.

I. On the failure to carry out a data protection impact assessment

74. Article 35(1) of the GDPR provides that 'where a type of processing, in particular through the use of new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall carry out an analysis of the impact of the proposed processing operations on the protection of personal data prior to the processing'.

75. Recital 91 of the GDPR provides, inter alia, that an impact assessment "should apply in particular to large-scale processing operations intended to process a considerable volume of personal data at regional, national or supranational level, which may affect a significant number of data subjects [...]".

76. The rapporteur considers that the company should have carried out a data protection impact assessment (hereinafter "DPIA"), in the light of two criteria that make it possible to consider that the processing was likely to give rise to a high risk: the collection of data on a large scale and the collection of data concerning vulnerable persons. The rapporteur considers that the processing operations carried out by the company are likely to result in a high risk to the rights and freedoms of natural persons and concludes that DISCORD INC. has failed to comply with the obligations of Article 35 of the GDPR by not carrying out a data protection impact assessment.

77. In its defence, DISCORD INC. states that it considered that a DPIA was not necessary as it processes only very limited data, namely those necessary to allow users to create their account, to provide its services, to fulfil its commitments to its users and to comply with its legal obligations; it does not carry out any of the processing operations listed in Article 35(3) of the GDPR as requiring an impact assessment; it does not carry out any of the processing operations for which the CNIL has considered that a DPIA is required, in accordance with the list of processing operations for which such an analysis is necessary published on 6 November 2018; it does not process data on "children", since it only addresses users over the age of fifteen who have a sufficient degree of maturity to use its services.

78. The company further points out that the G29 guidelines on DPIA and how to determine whether processing is "likely to result in a high risk" for the purposes of Regulation (EU) 2016/679 recall that a DPIA is not automatic even when the processing meets two of the nine criteria set out among those to be considered.

79. Finally, the company explains that, even though it was not required to do so under the GDPR, it has since carried out two DPIAs for its processing related to the DISCORD service and its core services, which concluded that the processing is not likely to result in a high risk to individuals' rights and freedoms.

80. Firstly, the panel considers that by processing the data of more than [...] users in France, DISCORD INC. is carrying out large-scale processing of personal data. Moreover, the panel notes that the application is also intended to be used by children aged fifteen, of which DISCORD INC. is fully aware, since it itself states that it is "committed to protecting the privacy of children and has therefore put in place measures to ensure that no child under the minimum age defined for each country can access Discord services and create an account".

81. The restricted formation recalls that, according to recital 38 of the GDPR, "children deserve specific protection with regard to their personal data because they may be less aware of the risks, consequences and safeguards involved and of their rights in relation to the processing of personal data" and that, pursuant to Article 1 of the International Convention on the Rights of the Child, "a child means every human being below the age of eighteen years". Although, pursuant to Article 8 of the GDPR and Article 45 of the Data Protection Act, a minor may consent alone to the processing of personal data in relation to the direct provision of information society services from the age of fifteen, the fact remains that a minor between the ages of fifteen and eighteen is still a child, and therefore a vulnerable person.

82. The restricted formation recalls, by way of clarification, that the aforementioned G29 guidelines on DPIA, amended and last adopted on 4 October 2017, set out a list of nine criteria to be taken into account in order to give a more concrete vision of processing operations that require an impact assessment due to a high inherent risk. These criteria include the collection of personal data on a large scale and the collection of data on vulnerable persons. The Guidelines add that "in most cases, the controller may consider that a processing operation meeting two criteria requires a DPIA".

83. With regard to the first criterion relating to large-scale data collection, the Guidelines explain that account should be taken, in particular, of the number of data subjects, the volume of data and/or the range of different data elements processed, the duration or permanence of the data processing activity and the geographical scope of the processing activity. With regard to the second criterion relating to the collection of data relating to vulnerable persons, the Guidelines indicate that the processing of data relating to vulnerable persons is a criterion because of the increased power imbalance between the data subjects and the controller, which means that the former may be unable to consent or object easily to the processing of their data or to exercise their rights. Among these vulnerable persons, the Guidelines mention children "who may be seen as unable to object or to give informed and considered consent to the processing of their data".

84. Consequently, the Panel considers that the company should have carried out an impact assessment of the data processing implemented, in view of the volume of data processed by the company and the use of its services by children.

85. Secondly, the Restricted Panel notes that, while the processing operations carried out by the company are not included in the "list of types of processing operations for which an analysis relating to data protection is required" published by the CNIL (Deliberation No. 2018-327 of 11 October 2018), they are also not included in the "list of types of processing operations for which an impact analysis relating to data protection is not required" (Deliberation No. 2019-118 of 12 September 2019).

86. Thirdly, the restricted panel notes that the company has, in the context of the present procedure, carried out two DPIAs, which were transmitted to the CNIL and concluded that the processing is not likely to give rise to a high risk for the rights and freedoms of individuals. It nevertheless notes that although the impact assessments carried out concluded that there was no high risk, it is nonetheless imperative that they be carried out beforehand in order to ensure this.

87. In view of all these elements, the restricted panel considers that the company has failed to comply with the obligations of Article 35 of the GDPR.

III. On the corrective measures and their publicity

88. Under the terms of Article 20, III, of the Act of 6 January 1978 as amended, "When the data controller or its processor does not comply with the obligations resulting from Regulation (EU) 2016/679 of 27 April 2016 or from this Act, the president of the Commission nationale de l'informatique et des libertés may also, where applicable, after having sent him the warning provided for in I of this article or, where applicable, in addition to a formal notice provided for in II, refer the matter to the restricted formation of the Commission with a view to pronouncing, after an adversarial procedure, one or more of the following measures: [...]

7° With the exception of cases where the processing is implemented by the State, an administrative fine that may not exceed 10 million euros or, in the case of a company, 2% of the total annual worldwide turnover for the previous financial year, whichever is higher. In the cases referred to in paragraphs 5 and 6 of Article 83 of Regulation (EU) 2016/679 of 27 April 2016, these ceilings shall be increased to EUR 20 million and 4% of the said turnover respectively. The restricted formation shall take into account, in determining the amount of the fine, the criteria specified in the same Article 83."

89. Article 83 of the GDPR provides that "each supervisory authority shall ensure that administrative fines imposed pursuant to this Article for infringements of this Regulation referred to in paragraphs 4, 5 and 6 are, in each case, effective, proportionate and dissuasive", before specifying the elements to be taken into account in deciding whether to impose an administrative fine and in deciding the amount of that fine.

90. Firstly, as regards the principle of imposing a fine, the company asked the restricted formation not to impose a fine on it, insofar as it contested all the breaches of which it was accused. As regards the amount of the fine, the company considers that its good faith and willingness to cooperate were not effectively taken into account in the rapporteur's proposal.

91. The restricted formation recalls that it must take into account, when imposing an administrative fine, the criteria specified in Article 83 of the GDPR, such as the nature, seriousness and duration of the breach, the measures taken by the controller to mitigate the damage suffered by the data subjects, the degree of cooperation with the supervisory authority and the categories of personal data affected by the breach.

92. The Panel notes that the breaches committed by the company concern obligations relating to the fundamental principles of personal data protection and that five breaches are established.

93. The restricted panel then notes that the processing implemented by DISCORD INC. concerns a very large number of persons located in France, since more than [...] DISCORD user accounts were registered in France in January 2021, including minors.

94. Consequently, the restricted formation considers that an administrative fine should be imposed in respect of the breaches of Articles 5(1)(e), 13, 25(2), 32 and 35 of the GDPR.

95. The restricted formation recalls that breaches relating to Articles 5(1)(e) and 13 of the GDPR are breaches of principles which may be subject, under Article 83 of the GDPR, to an administrative fine of up to EUR 20 000 000 or up to 4 % of the worldwide annual turnover in the preceding business year, whichever is higher.

96. The restricted formation also recalls that administrative fines must be both dissuasive and proportionate. In particular, it considers that the company's activity and financial situation must be taken into account when determining the amount of the administrative fine. In this respect, it notes that DISCORD INC. had a turnover of approximately USD [...] in 2019 and more than USD [...] in 2020.

97. Furthermore, the panel notes the efforts made by the company to comply throughout the procedure, as well as the fact that its business model is not based on the exploitation of personal data.

98. Therefore, in view of these elements, the restricted panel considers that the imposition of an administrative fine of EUR 800,000 appears justified.

99. Secondly, with regard to the publicity of the decision to impose a penalty, the company maintains that such a measure would cause it unjust harm and does not appear justified in view of the level of protection of personal data that it provided to its users at the time of the inspection and that it continues to provide to all its users. It considers that such publicity would lead users to believe that their personal data are not being processed in accordance with the law, whereas they are properly informed of the processing of their data, the security of their data is ensured by robust measures and the exercise of their rights is respected.

100. The restricted formation considers that the publicity of the sanction is justified in view of the number of persons concerned, the number of breaches committed and their seriousness.

FOR THESE REASONS

The CNIL's restricted formation, after deliberation, decides:

- to impose an administrative fine of 800,000 (eight hundred thousand) euros on DISCORD INC. for breaches of Articles 5(1)(e), 13, 25(2), 32 and 35 of the GDPR;

- to make public, on the CNIL website and on the Légifrance website, its decision, which will no longer identify the company by name at the end of a period of two years from its publication.

The Chairman

Alexandre LINDEN

This decision may be appealed to the Council of State within four months of its notification.