CNIL (France) - SAN-2019-001

From GDPRhub
CNIL - SAN-2019-001
Authority: CNIL (France)
Jurisdiction: France
Relevant Law: Article 4(11) GDPR

Article 6(1)(a) GDPR

Article 7 GDPR

Article 12 GDPR

Article 13 GDPR

Type: Complaint
Outcome: Upheld
Decided: 21.01.2019
Published: 22.01.2019
Fine: 50,000,000 EUR
Parties: Google LLC Vs. noyb and La Quadrature du Net
National Case Number: SAN-2019-001
European Case Law Identifier: n/a
Appeal: Conseil d'Etat
Original Language: French

Original Source: CNIL (in FR)

The CNIL imposed a record fine of €50 million on Google.

English Summary

Facts

The NGO noyb filed a complaint with the CNIL about the following practice: Google makes the use of a phone running Android conditional on the acceptance of Google's terms and conditions and privacy policy, rendering the device unusable otherwise.

The NGO la Quadrature du Net (LQDN) filed a complaint with the CNIL about Google's lack of lawful basis to process personal data for targeted advertising purposes.

The CNIL decided to join the two complaints and rule on them in a single decision, following an extensive investigation.

Google's main arguments were that the complaints were inadmissible and that there was a violation of the company's right to a fair trial (Art. 6 ECHR), in particular because of the language used (French) and the time allotted to respond.

Dispute

Is Google's acceptance system for its terms and conditions and privacy policy in line with the transparency and information obligations?

Is there a legal basis for the processing?

Is the case admissible? Is the company's right to a fair trial violated?

Holding

On admissibility, the CNIL replied that the admissibility of the complaints would in any case have no influence on the legality of the procedure, because the CNIL's competence is not conditional on the receipt of a complaint: the DPA can initiate proceedings ex officio on the basis of its own findings.

On the alleged violation of the defendant's rights to a fair trial, the CNIL rejected both arguments.

On the failure to comply with transparency and information obligations:

In essence, the CNIL acknowledged that Google has made progress in terms of transparency and the control given to users over their personal data. It then turned to the notion of information accessibility, according to which the data subject must be able to determine in advance which processing operations will be performed. The CNIL noted that Google has scattered the information across several documents, not all of which are directly accessible, and that Google's design choices fragment the information (buttons and links must be clicked to access the relevant information). According to the CNIL, the amount of information to be read before the data processing operations can be identified is too large. Finally, the data subject has to cross-reference the information to understand what processing operations are being carried out.

The CNIL therefore concluded that there is a general lack of accessibility of information. Interestingly, the CNIL also conceded that exhaustive information at the first level would be counterproductive and would not meet the requirement of transparency.

The CNIL goes on to point out that the processing operations carried out by Google are "particularly massive and intrusive" and that the data come from many sources.

The information provided to the user must be clear and comprehensible, in accordance with Art. 12 GDPR, and it is in the light of the processing operations carried out that the clear and comprehensible nature must be analysed.

In short, with regard to the information made available by Google, the CNIL considered that:

-         the purposes of the processing operations are described in a way that is far too generic given the scope and consequences of the processing operations carried out;

-         the description of the purposes does not allow users to measure the extent of the processing and the degree of intrusion into their private sphere;

-         the description of the data collected is imprecise and incomplete.

The lack of clarity and understandability must also be analysed according to the legal basis on which the processing operation is based (in this case: consent). The CNIL states that Google's formulations do not allow the user to distinguish between personalized advertising (carried out using user data, and on the basis of their consent) and other forms of targeting based on legitimate interest.

Finally, the CNIL stresses that Google's efforts with regard to the tools it makes available to users (information pop-up, privacy check-up, dashboard) only partially contribute to the objective of transparency. As the information must be provided at the time the data is collected, the tools in question are only made available once the Google Account has been created, in other words after a multitude of data processing operations have been carried out.

Regarding the lack of a legal basis for the implementation of processing operations:

Google declared that it only relies on consent for processing operations related to targeted advertising, and complies with the GDPR in this respect.

Consent must be informed. In view of the dispersal of the information, the CNIL considered that this requirement was not met. In particular, it is not possible to view all the Google services, sites and applications referred to by Google in its Terms of Use and Privacy Policy.

Consent must be specific and unambiguous. The CNIL noted first of all that when creating a Google account, the user has the possibility to modify certain parameters. However, settings related to account customization and display of targeted ads were enabled by default. The CNIL concluded that:

- consent was not validly obtained because it was not given through a positive act but by an opposition to the processing operation (opt-out)

- consent was not specific because the acceptance of the T&Cs and the privacy policy was only possible in bulk, thus preventing a granular choice of processing. The CNIL noted that at the very least, before the user is given the choice of accepting or refusing everything, he or she should be given the opportunity to give his or her specific consent for each processing operation.

For all these reasons, the CNIL decided to impose a penalty of €50 million and, in addition, to have the decision published.

On 19 June 2020, the Supreme Administrative Court (Conseil d'Etat) fully confirmed the CNIL's decision in CE - N° 430810.

Comment

Share your comments here!

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the original. Please refer to the French original for more details.

Deliberation SAN-2019-001 of 21 January 2019
 

National Commission for Information Technology and Civil Liberties
Deliberation No. SAN-2019-001 of January 21, 2019
Deliberation of the restricted formation n° SAN - 2019-001 of January 21, 2019 pronouncing a pecuniary sanction against the company GOOGLE LLC
Status: EFFECTIVE

The National Commission for Information Technology and Civil Liberties, in its restricted formation, composed of Mr Jean-François CARREZ, President, Mr Alexandre LINDEN, Vice-President, Mrs Dominique CASTERA, Mrs Marie-Hélène MITJAVILE and Mr Maurice RONAI, members ;

Having regard to Council of Europe Convention No. 108 of 28 January 1981 for the Protection of Individuals with regard to Automatic Processing of Personal Data ;

Having regard to Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of personal data and on the free movement of such data ;

Having regard to Law No. 78-17 of 6 January 1978, as amended, relating to data processing, data files and liberties, in particular Articles 45 et seq;

Having regard to decree n° 2005-1309 of 20 October 2005 as amended, taken for the application of the law
No. 78-17 of 6 January 1978 as amended relating to data processing, data files and liberties ;

Having regard to deliberation n° 2013-175 of 4 July 2013 adopting the internal regulations of the National Commission for Data Processing and Liberties;

Having regard to Decision No. 2018-199C of 20 September 2018 of the President of the National Commission for Data Processing and Liberties to instruct the Secretary General to carry out or have carried out a mission to verify any processing relating to the use of the Android operating system for multifunction mobile phones, including the creation of a Google account;

Having regard to the decision of the President of the Commission nationale de l'informatique et des libertés appointing a rapporteur to the restricted formation, dated 2 October 2018;

Having regard to the report of Mr François PELLEGRINI, Commissioner-Rapporteur, dated 22 October 2018;

Having regard to the written observations submitted by Google LLC. on 22 November 2018;

Having regard to the observations in response by the Commissioner-Rapporteur of 7 December 2018;

Having regard to the observations in response submitted by Google LLC. on 4 January 2019 and the oral observations made during the session of the restricted formation;

Having regard to the other documents on file;

Were present, at the session of the restricted formation of January 15, 2019:

Mr. François PELLEGRINI, Commissioner, heard in his report;

As representatives of Google LLC. :

As counsel for the company Google LLC. ;

[…]

Ms. Eve JULLIEN, Government Commissioner, who made no comment;

The company having had the last word;

After deliberation, adopted the following decision:

Facts and procedure

Founded in 1998, Google LLC. (hereinafter Google or the Company) is a limited liability company incorporated under U.S. law and headquartered in Mountain View, California, USA.

A wholly owned subsidiary of ALPHABET since 2015, the company had a turnover of $109.7 billion (approximately EUR 96 billion) in 2017. It has more than 70 offices in about 50 countries and employs about 70,000 people worldwide. In France, it has one establishment, Google France Sarl, located at 8 rue de Londres in Paris (75009), which has approximately 600 employees and a turnover of approximately 325 million euros in 2017.

Since its inception, the company has developed a variety of services for businesses and individuals (e.g. the Gmail email service, the Google Search search engine, YouTube etc.). It has also designed the operating system for Android mobile terminals, which includes the Google Play application store. In addition, the company is active in advertising sales.

In 2016, this operating system had 27 million users in France.

On 25 and 28 May 2018, the Commission nationale de l'informatique et des libertés (hereinafter CNIL or the Commission) received two collective complaints lodged under Article 80 of Regulation (EU) No 2016/679 of the European Parliament and of the Council of 27 April 2016 (General Data Protection Regulation, hereinafter RGPD or the Regulation) by the None Of Your Business (hereinafter NOYB) association and the La Quadrature du Net (hereinafter LQDN) association respectively. Cumulatively, these complaints include the claims of 9974 individuals.

In its complaint, the NOYB association states in particular that users of Android mobile terminals are required to accept Google's privacy policy and general terms and conditions of use of services and that without such acceptance, they will not be able to use their terminal.

The LQDN association considers that, irrespective of the device used, Google does not have a valid legal basis for processing personal data for the purposes of behavioural analysis and advertising targeting.

On 1 June 2018, the CNIL submitted the above-mentioned complaints to its European counterparts via the European information exchange system with a view to designating a possible lead authority in accordance with the provisions of Article 56 of the RGPD.

In application of decision no. 2018-199C of 20 September 2018 of the President of the Commission, an online check was carried out on the following 21 September to verify the conformity of any processing relating to the use of the Android operating system for mobile equipment, including the creation of a Google account, with the law of 6 January 1978 relating to information technology, files and liberties (hereafter referred to as the Data Protection Act or the Act of 6 January 1978) and the RGPD.

The online control report n° 2018-199/1 was notified to the companies GOOGLE LLC. and Google France SARL on September 24th and 25th, 2018.

The two companies were also notified of the above-mentioned complaints by letters from the CNIL on September 28, 2018.

In order to investigate these elements, the President of the CNIL appointed Mr. François Pellegrini as rapporteur on October 2, 2018, pursuant to Article 47 of the Act of January 6, 1978.

At the end of his investigation, the rapporteur notified the companies Google LLC. and Google France SARL, on 22 October 2018, of a report detailing the breaches of Articles 6, 12 and 13 of the RGPD which he considered to be constituted in the present case.

That report proposed that the restricted formation of the CNIL impose a financial penalty of EUR 50 million on Google LLC and that the penalty be made public. It was also proposed that the decision be published in a publication, newspaper or other medium designated by the panel.

The report also included an invitation to attend the meeting of the restricted formation on 10 January 2019. The company was given one month to submit its written comments.

By letter of 7 November 2018, the company requested a hearing with the rapporteur, which was denied by letter of 13 November 2018. The company also requested an in camera hearing and a postponement of the session, which were denied by letter dated 15 November 2018.

On 22 November 2018, the company filed written comments on the report. These comments were answered by the rapporteur on 7 December 2018.

By letter of 11 December 2018, the company, which had 15 days from receipt of the rapporteur's reply, requested the Chairman of the Panel to postpone the meeting and to extend the deadline for submitting further comments. The request was accepted by the Chairman of the restricted panel on 13 December 2018, who decided, firstly, to postpone the deadline for submitting these observations by two weeks - until 7 January - and, secondly, to postpone the meeting until 15 January 2019.

On 4 January 2019, the company submitted new observations in response to those of the rapporteur.

All of the observations were reiterated orally by the company and the rapporteur at the session of the restricted panel on 15 January 2019.

II Reasons for the decision

1 On the competence of the CNIL

Article 55(1) of the GDPR provides: "Each supervisory authority shall be competent to exercise the tasks and powers conferred on it in accordance with this Regulation in the territory of the Member State to which it belongs."

Article 56(1) of the GDPR provides: "Without prejudice to Article 55, the supervisory authority of the principal or sole establishment of the controller or processor shall be competent to act as lead supervisory authority regarding the cross-border processing carried out by that controller or processor, in accordance with the procedure laid down in Article 60."

The company maintains first of all that the CNIL is not competent to carry out this procedure and that it should have forwarded the complaints received to the Irish Data Protection Commission (hereinafter DPC), which, as Google's lead authority, would be responsible for dealing with complaints relating to cross-border processing, in accordance with the cooperation procedure laid down in Article 60 of the GDPR. The company considers that Google Ireland Limited should be regarded as its principal place of business within the European Union for some of the cross-border processing it carries out, in particular those operations that are the subject of the complaints received by the CNIL. Consequently, the DPC should, in its view, be regarded as its lead supervisory authority and, as such, be responsible for dealing with the complaints received by the CNIL.

As evidence of the fact that Google Ireland Limited is its principal place of business within the Union, it points out that this company has been Google's head office for its European operations since 2003 and that it is the entity in charge of several organisational functions necessary to carry out these operations for the Europe, Middle East and Africa zone (general secretariat, taxation, accounting, internal audit, etc.). It also states that the conclusion of all advertising sales contracts with clients based in the European Union is the responsibility of this company. The company employs more than 3,600 people and has, among other things, a dedicated team in charge of managing requests made within the European Union in relation to confidentiality and a privacy officer. Finally, it states that an operational and organisational reorganisation is under way to make Google Ireland Limited the data controller for certain processing of personal data concerning European nationals.

It also considers that the definition of principal place of business must be distinguished from that of controller and that, if the European legislature had intended the concept of principal place of business to be interpreted as the place where decisions on processing operations are taken, it would have expressly so stated.

It then considers that, given the cross-border nature of the processing of advertising personalisation, the significant number of Android users in Europe and the issues raised in relation to such processing, the cooperation and consistency mechanisms provided for in Articles 60, 64 and 65 of the RGPD should have applied. In particular, it specifies that the European Data Protection Board (hereinafter EDPB) should have been seized in case of doubt as to the determination of the lead authority.

Finally, the company considers that any informal discussions that may have taken place between the other European supervisory authorities on this procedure are without legal effect as they took place without the involvement of the EDPB.

(a) On Google Ireland Limited's status as principal place of business

Article 4(16) of the GDPR defines the principal establishment, "(a) as regards a controller established in several Member States, [as] the place of its central administration within the Union, unless decisions as to the purposes and means of processing personal data are taken at another establishment of the controller within the Union and that establishment has the power to enforce those decisions, in which case the establishment which has taken such decisions shall be regarded as the principal establishment."

Recital 36 of the GDPR states: "The principal establishment of a controller in the Union should be determined according to objective criteria and should involve the effective and actual exercise of management activities determining the main decisions as to the purposes and means of the processing in the context of a stable system."

The restricted formation considers that it follows from these provisions that, in order to qualify as a principal place of business, the establishment concerned must have decision-making powers with regard to the personal data processing operations in question. The status of principal establishment presupposes the effective and real exercise of management activities which determine the main decisions as to the purposes and means of processing.

Therefore, the existence of a principal establishment is assessed in concreto, in the light of objective criteria, and the principal establishment cannot automatically correspond to the head office of the controller in Europe.

The restricted formation notes that this analysis is also the one adopted by all European supervisory authorities, as attested by the Article 29 Working Party guidelines of 5 April 2017 on the designation of a lead supervisory authority for a controller or processor (WP244). These guidelines state that the central administration is the place where decisions are taken as to the purposes and means of processing personal data, and that this place has the power to enforce those decisions.

The restricted formation further notes that the same guidelines indicate that "the General Regulation does not allow for forum shopping" and that the conclusions cannot be based exclusively on statements by the organisation in question.

The decision-making powers of Google Ireland Limited must therefore be assessed in order to determine whether it can be classified as a principal place of business.

In that regard, the restricted formation notes that Google Ireland Limited does indeed have substantial financial and human resources which enable Google to provide services in Europe, in particular through the sale of advertising services.

However, although that evidence attests to that participation, the restricted formation considers that it does not allow Google Ireland Limited to be qualified as a principal place of business. The evidence provided does not in itself show that Google Ireland Limited had, at the date on which the proceedings were brought, any decision-making power as to the purposes and means of the processing covered by the privacy policy presented to the user when creating an account while configuring a mobile telephone under Android. These elements only reveal the involvement of this entity in the company's various activities (financial and accounting activities, sale of advertising space, contracting, etc.).

The restricted formation also notes that Google Ireland Limited is not mentioned in the Company's Privacy Policy dated 25 May 2018 as the entity where the main decisions are taken as to the purposes and means of the processing covered by the privacy policy presented to the user when creating an account while configuring a mobile phone under Android.

It also points out that Google Ireland Limited has not appointed a data protection officer who would be responsible for any processing of personal data that it might carry out in the European Union. The Commission further notes that the Android operating system is developed solely by Google LLC.

Finally, the restricted formation notes that the company itself indicated, by letter dated 3 December 2018 addressed to the DPC, that the transfer of responsibility from Google LLC. to Google Ireland Limited for certain processing of personal data concerning European nationals would be finalised on 31 January 2019. It subsequently indicated that it would update its privacy policy, which will come into force on 22 January 2019.

In the light of all these elements, the restricted formation considers that Google Ireland Limited cannot be considered as the principal place of business of Google LLC. in Europe within the meaning of Article 4(16) of the RGPD, since it has not been established that it has a decision-making power with regard to the processing covered by the privacy policy presented to the user at the time of the creation of his account when configuring his mobile phone under Android.

In the absence of a principal establishment allowing the identification of a lead authority, the CNIL was competent to initiate this procedure and to exercise all of its powers under Article 58 of the RGPD.

(b) On the application of cooperation and consistency procedures

In the first place, the company argues that the CNIL should have referred the matter to the EDPB because of the uncertainty as to which supervisory authority should act as lead authority.

The restricted formation considers first of all that the absence of a principal place of business of a controller within the European Union does not in itself create uncertainty as to the identification of a supervisory authority that can act as lead authority. It only results from the absence of a principal place of business that the identification of a lead authority is not necessary and that the one-stop-shop mechanism is not intended to apply.

The restricted formation then notes that the CNIL immediately communicated the complaints received to all the supervisory authorities, via the European information exchange system, with a view to identifying a possible lead authority, in accordance with the provisions of Article 56 of the RGPD.

The restricted formation notes that, in the context of this procedure, no supervisory authority, nor the Chair of the Board, deemed it necessary to refer the matter to the EDPB because of uncertainty as to the identification of the lead authority or the competence of the CNIL.

It further notes that the analysis concluding that Google LLC. had no principal place of business in Europe for the processing operations covered by the complaints, and the resulting lack of a lead authority, was shared by the DPC.

The Commission thus notes that the DPC publicly stated on 27 August 2018 - in a press article in the Irish Times - that it was not the lead authority for the processing operations that could be carried out by the company: the DPC is not Google's 'principal regulator' (nor, in data protection terms, its 'lead supervisor') [...] Google LLC, a US company, is the controller and Google has no access to the one-stop-shop mechanism. […]. The current position is that Google is subject to the control of all European regulators [...] .

It does not therefore follow from the investigation that there were any doubts or diverging views among the supervisory authorities that would have required a referral to the EDPB under Article 65 of the GDPR. Moreover, taking into account the guidelines already adopted at European level to guide national authorities in identifying the possible lead authority, there was no new issue justifying a referral to the EDPB by the CNIL under Article 64.

In view of all these elements, the restricted formation considers that the CNIL was not obliged to refer the matter to the EDPB with a view to identifying a lead authority.

Secondly, while the company argues that the CNIL should have cooperated in the investigation of complaints and in the follow-up to be given to them, the restricted formation recalls, as mentioned above, that the CNIL communicated the complaints to all supervisory authorities of the European Union, via the European information exchange system, as soon as they were received, with a view to identifying a possible lead authority.

The restricted formation thus observes that a cooperation procedure was indeed initiated with the supervisory authorities, in accordance with the provisions of Article 56 of the GDPR, initially solely on the question of identifying the respective competences of these authorities.

The restricted formation notes that this stage of disseminating information and determining a possible lead authority is a prerequisite for the possible application of the one-stop-shop mechanism provided for in Article 60 of the GDPR.

The restricted formation then notes that since this step and the resulting exchanges did not lead to the identification of a principal establishment or, consequently, a lead authority, no other obligation of cooperation was subsequently imposed on the CNIL, in particular under Article 60 of the RGPD.

Lastly, the restricted formation recalls that, in the interests of consistency, in accordance with the guidelines set out in Article 63 of the RGPD, the CNIL informed and consulted its European counterparts on several occasions on the investigations it had carried out and took the utmost account of the guidelines adopted by the EDPB with a view to ensuring uniform application of the Regulation.

In the light of these elements, the restricted formation considers that the cooperation and consistency procedures have not been disregarded.

Moreover, the restricted formation notes that, except where expressly provided for, which is not the case here as regards the determination of the lead authority, the supervisory authorities are not required to inform controllers when implementing cooperation actions or to enable them to participate in exchanges between authorities.

2. on procedure

First, the company maintains that the admissibility of the complaints filed by the associations None Of Your Business and La Quadrature du Net has not been established.

The Panel considers that the question of the admissibility of the abovementioned complaints does not in any event have any influence on the legality of the present procedure, since referral to the Panel is not necessarily conditional on receipt of a complaint and may result from a self-referral based on the findings made by the Commission's services. It points out that the CNIL's task is to monitor the application of the Regulation and ensure compliance with it and that, to that end, it has the power to conduct investigations, in accordance with Article 57(1)(a) and (h) of the RGPD.

Moreover, the restricted formation notes that Article 80 of the GDPR provides for the possibility for a person to mandate a non-profit association, which has been validly constituted in accordance with the law of a Member State, whose statutory objectives are of public interest and which is active in the field of the protection of the rights and freedoms of data subjects in the context of the protection of personal data, to lodge a complaint on their behalf.

With regard to LQDN, the restricted formation notes that it is a French association created on 3 January 2013. According to its statutes, the purpose of this association is, inter alia, to carry out actions to ensure the defence of fundamental rights and freedoms in the digital space [...].

As regards NOYB, the Commission notes that it is a non-profit-making association validly constituted on Austrian territory since 12 June 2017. According to its statutes, its purpose is, inter alia, to represent the rights and interests of users in the digital field (including consumer rights, fundamental rights of privacy, data protection, freedom of expression, freedom of information and the fundamental right to an effective remedy).

The restricted formation also notes that these two associations received a representation mandate under Article 80 of the GDPR from the persons who referred the matter to them.

Secondly, the company argues that the proceedings brought against it infringed its right to a fair trial as provided for, inter alia, in article 6 of the Convention for the Protection of Human Rights and Fundamental Freedoms.

On this point, it maintains, firstly, that the report proposing a penalty and the rapporteur's replies to its observations were sent to it solely in French and, secondly, that the refusal to extend the deadline for submitting its initial observations limited the time available to it to prepare its defence. It also considers that the postponement of the sitting and the additional time finally granted to it to produce its second observations were still not sufficient.

The panel notes first of all that the notification of a penalty report in French is in accordance with the legal obligation laid down in Article 111-1 of the Code on relations between the public and the administration, which provides that "the use of the French language is prescribed in exchanges between the public and the administration, in accordance with the provisions of Law No. 94-665 of 4 August 1994 on the use of the French language."

Moreover, there is no legal or supranational provision requiring the CNIL to translate the documents it produces.

In addition, the company has an establishment on French territory, the company Google France SARL. This company has several hundred employees and has been notified of all the documents relating to the procedure. It also notes that the main documents supporting the procedure (Confidentiality rules, Terms of use, etc.) were the company's own documents, which are also available in English on other media.

In the light of these elements, the Panel considers that the company had in any event sufficient material and human resources to ensure that the documents were translated into English in sufficient time to enable it to examine them and make its comments within the time limit set for it.

The Panel then recalls that Article 75 of Decree 2005-1309 of 20 October 2005 as amended provides that the controller has a period of one month to make comments in response to the report sent to it, followed by a further period of fifteen days after the deadline set for the rapporteur to respond.

The restricted panel underlines that the requests made by the company on 11 December 2018 for an extension of the deadline for submitting its comments in response to the elements provided by the rapporteur and for a postponement of the meeting were granted. This postponement gave the company an additional 15 days to produce its second observations in relation to the initial deadline and thus prepare its defence for the restricted panel session. It was also able to present its oral observations on the day of the restricted panel session in addition to its written submissions.

Finally, the panel points out that the findings of fact in the present procedure mainly concerned institutional documents drawn up by the company itself.

In the light of these elements, the restricted panel considers that Google LLC's rights of defence have been guaranteed.

3. On the scope of the investigations

In its defence, the company argues first of all that the rapporteur has confused the Android operating system with the Google account, despite the fact that they are separate services which carry out different processing activities.

In particular, it states that when configuring their mobile device under Android, users have a clear choice whether or not to create a Google account and that the Privacy Policy explains to them how Google services can be used with or without a Google account (e.g. viewing YouTube videos without creating an account etc.).

It then argues that the scope of the inspection chosen by the CNIL (namely, the creation of a Google account when configuring a new device using the Android operating system) is limited, in that it represents a scenario concerning only 7% of users.

Finally, it states that the findings were made on an older version of the Android operating system.

First of all, the restricted panel indicates that it does not call into question the existence of separate services, linked respectively to the Android operating system and to the Google account, implementing different processing activities.

However, it observes that the facts covered by the investigations correspond to the scenario chosen for the online inspection, namely the path of a user and the documents to which he could have had access during the initial configuration of his mobile device using the Android operating system. This path included the creation of an account. These facts therefore relate to the processing operations covered by the privacy policy presented to the user when creating an account while configuring a mobile phone running Android.

Next, while it is true that the user does have the choice of whether to create an account and may use some of the company's services without one, the restricted panel notes, however, that when configuring a device running Android, the option of creating a Google account or signing in to an existing account appears at the very beginning of the configuration process, without any specific action on the part of the user.

The user is also prompted to create or sign in to a Google account: when he clicks on the Learn More or Ignore links available at this stage of device configuration, he is presented with the following information: Your device works best with a Google Account. If you don't have a Google Account, you will not be able to do the following [...] Enable device protection features.

The restricted panel thus considers that this path, when an account is created, establishes a continuum of use between the processing operations carried out by the operating system and those carried out through the Google account, and justifies the scenario chosen for the online inspection. This sequence of information and choices presented to the user does not, however, preclude a differentiated analysis of the various processing activities involved under the legal framework, on the basis of all the facts found in this inspection scenario.

Moreover, with regard to the company's observation that this scenario concerns only 7% of users (most users of a device running Android signing in to a pre-existing account), the restricted panel recalls that under Article 11.I.2 of the French Data Protection Act, the CNIL has broad discretion as to the scope of the inspections it may undertake. A specific inspection scenario, such as the one used in the present case, may make it possible to make findings that reflect a more comprehensive privacy policy.

Moreover, the restricted panel notes that the company states in its submission dated 7 December 2018 that the scope of the processing of personal data that occurs for Google account holders when using a device running Android is largely similar to the processing that occurs for Google account holders when they use Google services on a computer or device not running Android, and that [...] providing the same Privacy Policy and Terms of Service helps ensure consistency and user awareness and, importantly, serves as a reminder to existing account holders of the nature of the data collection and the purposes for which it is collected. Consequently, users who configure their mobile phone under Android by associating an existing account with it are, with regard to the information communicated to them, in a situation similar to that of users creating a new account.

Finally, with regard to the version of the Android operating system used to make the findings, the restricted panel notes that the argument put forward by the company is irrelevant, since the documents provided by the company show that the user's path is similar in a more recent version of the operating system.

Moreover, the restricted panel notes that the statistics on the distribution of the use of successive versions of the Android operating system, made available on the official website of Android developers (https://developer.android.com/about/dashboards/), show that the version used during the monitoring is among the most frequently used versions (statistics covering a period of one week in October 2018 based on the connection data of the terminals that have connected to the Google Play Store).

4. On the failure to comply with transparency and information obligations

Article 12(1) of the General Data Protection Regulation provides: 1. The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. The information shall be provided in writing, or by other means, including, where appropriate, by electronic means. When requested by the data subject, the information may be provided orally, provided that the identity of the data subject is proven by other means.

Article 13(1) of the same text provides that: Where personal data relating to a data subject are collected from the data subject, the controller shall, at the time when personal data are obtained, provide the data subject with all of the following information:

the identity and the contact details of the controller and, where applicable, of the controller's representative;

the contact details of the data protection officer, where applicable;

the purposes of the processing for which the personal data are intended as well as the legal basis for the processing;

where the processing is based on point (f) of Article 6(1), the legitimate interests pursued by the controller or by a third party;

the recipients or categories of recipients of the personal data, if any;

where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available; [...]

The rapporteur considers that the information provided to users by the company does not meet the objectives of accessibility, clarity and comprehensibility laid down in Article 12, and that certain information made mandatory by Article 13 is not provided to users.

In its defence, the company considers that the information it provides to its users meets the requirements of Articles 12 and 13 of the GDPR.

First of all, the company considers that the Privacy Policy and Terms of Use document, accessible when creating an account, constitutes first-level information in line with the Article 29 Working Party Guidelines on transparency under Regulation (EU) 2016/679 (WP260) of 25 May 2018. It states that this document provides a good overview of the processing operations carried out and that the legal basis of the processing need not be mentioned at this first level of information. Information on the data storage period can be found in the Export and Delete Your Information section of the Privacy Policy.

The company then argues that the information to be given to individuals must, in the light of Articles 12 and 13 of the Regulation, be assessed as a whole. It explains that, in addition to the documents entitled Privacy Policy and Terms of Use, the information it provides is also conveyed through several other terms and conditions. It explains that additional information messages may appear, when creating an account, under each of the privacy settings. In addition, an email is sent to users when they create an account, stating: You can change the privacy and security settings of your Google Account at any time, create reminders to remember to check your privacy settings, or check your security settings. This email contains clickable links to a variety of settings tools.

These other control tools, available to users after account creation from the account management interface, include a Privacy Check-up tool that allows users to choose the privacy settings that suit them, covering in particular personalised ads, location history, web activity and applications.

The company also offers a Dashboard tool that allows users to see a full picture of their use of Google services such as Gmail or YouTube.

Finally, the company points out that when a user has clicked on Create an account without having disabled the settings for personalized ads, a pop-up window confirming the creation of an account appears to remind the user that the account is configured to include personalization features. The company states that the user path is configured in this way to slow down the progress of users who would not have spontaneously chosen more privacy-protective settings.

As a preliminary point, the restricted panel takes note of the progress made by the company in recent years in its user information policy, towards the greater transparency and greater control over their data that users expect. For the following reasons, however, it considers that the requirements of the GDPR, the implementation of which must be assessed in the light of the concrete scope of the personal data processing operations in question, are not met.

First of all, the restricted panel recalls that pursuant to Article 12 of the Regulation, information must be provided in an easily accessible form. This accessibility requirement is further clarified by the Guidelines on transparency, in which the Article 29 Working Party considered that "a central consideration of the principle of transparency highlighted in these provisions is that the data subject should be able to determine in advance what the scope and consequences of the processing entails, in order not to be taken by surprise at a later point about the ways in which their personal data has been used". [...] In particular, for complex, technical or unexpected data processing, the Working Party's position is that controllers should separately and clearly spell out what the most important consequences of the processing will be: in other words, what kind of effect the specific processing described in a privacy statement or notice will actually have on the data subject. The restricted panel also recalls that the accessibility obligation of Article 12 depends in part on the ergonomic choices made by the controller.

In the case in point, the restricted panel notes that the general information architecture chosen by the company does not comply with the obligations of the Regulation. Indeed, the information that must be communicated to individuals under Article 13 is excessively scattered across several documents: the Privacy Policy and Terms of Use document displayed during account creation, and then the Terms of Use and the Privacy Policy, accessible in a second step via clickable links in the first document. These documents themselves contain buttons and links that must be activated to view additional information. This ergonomic choice leads to a fragmentation of information, forcing the user to multiply the clicks needed to access the various documents. The user must then carefully consult a large amount of information before being able to identify the relevant paragraph(s). Nor does the user's work stop there, as he must still cross-check and compare the information collected in order to understand what data is collected under the various settings he may have chosen.

The restricted panel notes that, given this architecture, some information is difficult to find.

For example, in the case of ads personalisation processing, in order to know what information is collected about him for this purpose, a user must perform numerous actions and combine several documentary resources. First, he must read the general Privacy Policy and Terms of Use document, then click on the More Options button and then on the Learn More link to display the Ads Personalisation page. He then has access to a first description of the ads personalisation processing, which proves to be incomplete. To complete the information on the data processed for this purpose, he must consult in full the section on offering personalised services contained in the Privacy Policy, accessible from the general Privacy Policy and Terms of Use document.

Likewise, with regard to the processing of geolocation data, the restricted panel notes that the user must follow a similarly unintuitive path. He must complete the following steps: consult the Privacy Policy and Terms of Use, click on More Options and then on the Learn More link to display the Location History page, and read the text displayed. As this text is only a short description of the processing, however, the user must return to the Privacy Policy and consult the section Information about your geographical location to access the rest of the information. Even then the information is not complete, since this section contains several clickable links relating to the various sources used to geolocate the user.

In the two cases described, five actions are necessary for the user to access the information related to the personalization of the ads and six for the geolocation.

The restricted panel also points out that if the user wishes to obtain information on the retention periods of his personal data, he must first consult the Privacy Policy from the main document, then go to the section entitled Export and delete your information, and finally click on the click here hyperlink contained in a general paragraph on retention periods. It is therefore only after four clicks that the user reaches this information. Moreover, the restricted panel notes that the title chosen by the company, Export and delete your information, does not readily allow the user to understand that this section gives access to information on retention periods. Consequently, the restricted panel considers that, in this case, the multiplication of the necessary actions, combined with the choice of unexplicit titles, does not meet the requirements of transparency and accessibility of information.

All of these factors result in an overall lack of accessibility of the information provided by the company in the context of the processing operations in question.

Secondly, the restricted panel considers that the clear and comprehensible nature of the information provided, as required by Article 12 of the GDPR, must be assessed taking into account the nature of each processing operation in question and its concrete impact on the data subjects.

First of all, it is essential to stress that the data processing operations carried out by the controller are particularly massive and intrusive.

The data collected by Google come from a wide variety of sources. This data is collected both from the use of the telephone and from the use of the company's services, such as the Gmail email service or the YouTube video platform, but also from data generated by user activity when visiting third-party sites that use Google services, thanks in particular to Google Analytics cookies deposited on those sites.

In this respect, the Privacy Policy shows that at least twenty services offered by the company are likely to be involved in the processing, which may concern data such as web browsing history, application usage history, data stored locally on the equipment (such as address books), equipment geolocation, etc. As a result, a large amount of data is processed as part of these services via or in connection with the Android operating system.

The investigation of the case shows that, in addition to data from external sources, the company processes at least three categories of data:

data produced by the individual (for example, name, password, telephone number, e-mail address, means of payment, and content created, imported or received, such as writings, photos or videos);

data generated by his or her activity (e.g. IP address, unique user identifiers, mobile network data, data related to wireless networks and Bluetooth devices, timestamps of actions performed, geolocation data, technical data of the devices used including data related to sensors (accelerometer, etc.), videos viewed, searches performed, browsing history, purchases, applications used, etc.);

data derived or inferred from data provided by the person or generated by his activity. With respect to this category, the Privacy Policy lists a number of purposes that can only be achieved by generating data from the other two categories. For example, the ads personalisation carried out by the company requires inferring users' interests from their activity in order to offer them to advertisers. In the same way, the purposes of providing personalised content, search results and recommendations require inferring new information from the information declared by the person, produced by him, or generated by his activity.

Moreover, while the very large number of data processed makes it possible to characterize the massive and intrusive nature of the processing carried out, the very nature of some of the data described, such as geolocation data or the content consulted, reinforces this observation. Considered in isolation, the collection of each of these data is likely to reveal with a significant degree of precision many of the most intimate aspects of people's lives, including their habits, tastes, contacts, opinions and even their movements. The result of the combination of these data considerably reinforces the massive and intrusive nature of the processing in question.

Consequently, it is in the light of the specific characteristics of these personal data processing operations, as just recalled, that the clear and comprehensible character, within the meaning of Article 12 of the GDPR, of the information provided for in Article 13 of the Regulation must be assessed. The restricted panel considers that these requirements are not met in this case.

Specifically, the restricted panel notes that the information provided by the company does not enable users to understand sufficiently the specific consequences of the processing operations for them.

Indeed, the purposes announced in the various documents are described as follows: to offer personalised services in terms of content and ads, to ensure the security of products and services, to provide and develop services, etc. They are too generic with regard to the scope of the processing implemented and its consequences. The same is true of the vague indication given to users: The information we collect is used to improve the services offered to all our users. [...] The information we collect and how we use it depends on how you use our services and how you manage your privacy settings. As a result, the description of the purposes does not allow users to gauge the extent of the processing operations and the degree of intrusion into their privacy that these are likely to entail. In particular, the restricted panel considers that such information is not provided in a clear manner, either at the first level of information provided to users through the Privacy Policy and Terms of Use document, or at the other levels of information offered by the company.

The restricted panel also notes that the description of the data collected, which could clarify the scope of these purposes and prevent the user from being taken by surprise at a later stage as to how his data have been used and combined, is particularly imprecise and incomplete, both at the first level of information and in the other documents provided.

For example, the Privacy Policy and Terms of Use document and the Privacy Policy document state: This may include more complex information (...) such as the ads you find most useful, the people you are most interested in on the Web, or the YouTube videos you might like.

In view of the above, the restricted panel considers that the user is not in a position, especially on reading the first level of information presented in the Privacy Policy and Terms of Use, to measure the impact of the main processing operations on his privacy. While it notes that exhaustive information at the first level would be counterproductive and would not comply with the transparency requirement, it considers that the first level should contain terms that convey the number and scope of the processing operations carried out. It also considers that other forms of presentation, adapted to data-combination services, could provide an overall view of the characteristics of this combination, according to the purposes pursued, from the Privacy Policy stage onwards.

The lack of clarity and comprehensibility must also be noted with regard to the statement of the legal basis for the ads personalisation processing. Indeed, the company first states in the Privacy Policy: We ask for your permission to process your information for specific purposes, and you are free to withdraw your consent at any time. For example, we ask for your permission to provide you with personalized services, such as advertisements [...]. The legal basis relied on here is therefore consent. Elsewhere, however, the company states that it relies on legitimate interest, in particular to carry out marketing actions to make our services known to users and especially to use advertising to make many of our services available free of charge to users.

The restricted panel stresses that, although the company indicated before it that the only legal basis for ads personalisation processing is consent, the investigation shows that this clarification is not made known to users. The above wording does not allow users to clearly understand the distinction between ads personalisation properly so called, based on the combination of multiple user data and which, according to the company, is based on consent, and other forms of targeting, for example using the browsing context, which are based on legitimate interest. The restricted panel underlines the particular importance of the requirement of clarity with regard to these processing operations, given their place in the company's processing activities and their impact on individuals in the digital economy.

With respect to information on retention periods, the restricted panel notes that the page entitled How Google's information is retained contains four categories:

Information retained until you delete it;

Information with an expiration date;

Information retained until you delete your Google Account;

Information that is kept for a long period of time for specific reasons.

However, the restricted panel notes that, for the last category, only a very general explanation of the purpose of such retention is provided, and no specific period or criteria used to determine that period are specified. This information is among that which must be provided to data subjects under Article 13(2)(a) of the Regulation.

Finally, while the company argues that multiple information tools are made available to users at the time of, and after, the creation of their account, the restricted panel notes that these arrangements do not meet the transparency and information requirements of Articles 12 and 13 of the GDPR.

First of all, the restricted panel notes that the tools referred to by the company do contribute, to a certain extent, to the objective of transparency throughout the life of the account and the use of Google's services. However, it considers that they do not sufficiently contribute to the information provided for in Article 13, which must be given at the time the data in question are obtained. As recalled in the Article 29 Working Party's Guidelines on transparency, Article 13 specifies the information to be provided to data subjects at the start of the processing cycle.

While data other than those strictly necessary for the creation of the account are collected throughout the life of the account, such as browsing or purchase history, the moment of its creation marks the user's entry into the ecosystem of Google's services, the particularly massive and intrusive nature of which has been recalled above. This stage marks the beginning of a multitude of processing operations: collection, combination, analysis, etc. Therefore, insofar as the account-creation process is crucial to understanding the processing operations and their impact, and since the proposed user path itself invites the data subject to focus his attention at this stage, the information provided for in Article 13 of the Regulation given at this point must, in itself, be sufficient in the light of the requirements of that provision and of Article 12 of the Regulation.

Moreover, both the pop-up window appearing when the account is created and the electronic message sent as soon as the account is created contain only summary or highly targeted information on the processing operations carried out and cannot be regarded as sufficient prior information.

The pop-up window merely states: This Google Account is set up to include personalization features (such as recommendations and personalized ads) that are based on information stored in your account. The email message indicates the main features of the Google Account and the existence of control tools.

The Privacy Check-up tool essentially allows the user to configure the collection of information such as browsing history or places visited. The Dashboard, for its part, is an information panel that groups together, for each Google service, an overview of the account holder's usage.

Nevertheless, these Privacy Check-up and Dashboard tools, like the email mentioned above, can only be used after the account-creation stage, which, as stated above, is essential for informing the user. Moreover, although their existence and usefulness are made known to users, they presuppose an active and proactive approach on the users' part. For these reasons, these tools cannot be regarded as providing sufficient information for the purposes of Article 13 of the Regulation.

In the light of all these factors, the restricted panel considers that a failure to comply with the transparency and information obligations provided for in Articles 12 and 13 of the Regulation is established.

5. On the failure to have a legal basis for the ads personalisation processing operations

Article 6 of the GDPR provides that: Processing shall be lawful only if and to the extent that at least one of the following applies:

the data subject has given consent to the processing of his or her personal data for one or more specific purposes;

processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;

processing is necessary for compliance with a legal obligation to which the controller is subject;

processing is necessary in order to protect the vital interests of the data subject or of another natural person;

processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;

processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.

The company was criticised for not validly obtaining the consent of individuals for the processing of personalised advertising. It was also considered that the company could not rely on a legitimate interest in such processing operations.

In its defence, the company stated that it relies solely on consent for ads personalisation processing.

Article 4(11) of the Regulation defines consent as "any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her".

Article 7 of the same text lays down the conditions applicable to it:

1. In cases where processing is based on consent, the controller shall be able to demonstrate that the data subject has given his or her consent to the processing of personal data relating to him or her.

2. If the data subject's consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding.

3. The data subject shall have the right to withdraw his or her consent at any time. Withdrawal of consent shall not jeopardise the lawfulness of the processing operation based on the consent given prior to such withdrawal. The data subject shall be informed thereof before giving his or her consent. Withdrawing consent is as simple as giving consent.

4. In determining whether consent is freely given, utmost account should be taken of, inter alia, whether the performance of a contract, including the provision of a service, is contingent upon it.

In the first place, the company asserts that users' consent is informed.

It considers that simple and clear information is presented to the user when creating an account, allowing the user to understand how the company uses the data for ads personalisation purposes. The company refers in particular to the summary entitled "Privacy Policy and Terms of Use", the sections dedicated to ads personalisation contained in the Privacy Policy, and the additional information message entitled "Ads Personalisation" appearing in the account creation settings options.

Second, the company asserts that users' consent is specific and unambiguous.

In particular, the company argues that, when setting up the account, the user is given the opportunity to choose whether or not personalised ads will be displayed. The company considers that this possibility allows the user to consent to the use of his or her data independently of the other choices he or she may express with respect to other purposes of the processing associated with the Google account (e.g. YouTube search history).

It also considers that the methods it implements for collecting consent for ads personalisation purposes are consistent with the CNIL's recommendation of 5 December 2013 on cookies. In particular, it specifies that brief information on ads personalisation is available, followed by an "I agree" button (the "Privacy Policy and Terms of Use"), preceded by a "More options" button that gives users the possibility of disabling several processing operations, including those for ads personalisation purposes.

It also maintains that the solution accepted in the public formal notice of the President of the CNIL No. MED-2018-023 of 29 November 2018 allows the user to consent to all purposes via an "accept all" button.

Finally, it considers that explicit consent for the processing of data for ads personalisation purposes, within the meaning of Article 9(2)(a) of the GDPR, could not be required, since the data at issue are not sensitive data.

Regarding the informed nature of consent

As a preliminary point, the restricted panel specifies that this informed nature must be examined in the light of the previous developments concerning the lack of transparency and information for users when creating their account. It considers that the shortcomings previously identified necessarily affect the information provided to users and thus the informed nature of consent.

The restricted panel notes that the Article 29 Working Party Guidelines of 10 April 2018 on consent under Regulation 2016/679 (WP259) specify that the controller must ensure that consent is provided on the basis of information that allows data subjects to easily identify who the controller is and to understand what they are consenting to. [It] must clearly describe the purpose of the data processing for which consent is sought.

These guidelines also specify that: For consent to be informed, it is necessary to inform the data subject of certain elements that are crucial to make a choice. [...] At least the following information is necessary in order to obtain valid consent [...]:

the identity of the controller,

the purpose of each of the processing operations for which consent is sought,

the (types of) data collected and used,

the existence of the right to withdraw consent,

information on the use of the data for automated decision making [...] and

information on the possible risks associated with the transmission of the data due to the lack of an adequacy decision and appropriate safeguards [...].

As it pointed out in relation to the failure to comply with transparency and information obligations, the restricted panel considers that information on the ads personalisation processing operations is excessively scattered across separate documents and as such is not easily accessible. In this respect, the restricted panel refers to the previous developments on the multiple actions required of a user who wishes to find the available information on the ads personalisation processing operations.

Moreover, as has also been noted in relation to the failure to comply with transparency obligations, the information provided is not sufficiently clear and comprehensible, in that it is difficult for a user to gain an overall understanding of the processing operations to which he or she may be subject and of their scope.

By way of illustration, the information published in the "Ads Personalisation" section, accessible from the document "Privacy Policy and Terms of Use" via the "More options" button, contains the following statement: "Google may show you ads based on your activity within Google services (for example, in Search or on YouTube, as well as on Google partner websites and applications)". The restricted panel notes that it is not possible to learn which Google services, sites and applications the company refers to, for example through clickable links. Consequently, the user is not able to understand the ads personalisation processing to which he or she is subject, or its scope, even though this processing involves a plurality of services (for example: Google Search, YouTube, Google Home, Google Maps, Play Store, Google Photos, Google Play, Google Analytics, Google Translate, Play Books) and the processing of a large amount of personal data. Users are not able to form an accurate perception of the nature and volume of the data that is collected.

In view of these elements, the restricted panel considers that users' consent to the ads personalisation processing is not sufficiently informed.

Regarding the specific and unambiguous nature of consent

Recital 32 of the Regulation provides that: "Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her [...]. Silence, pre-ticked boxes or inactivity should not therefore constitute consent."

Recital 43 of the GDPR states that: "Consent is presumed not to be freely given if it does not allow separate consent to be given to different personal data processing operations despite it being appropriate in the individual case."

The Article 29 Working Party Guidelines on consent referred to above specify that, to comply with the requirement that consent be specific, a controller that seeks consent for various different purposes should provide a separate opt-in for each purpose, to allow users to give specific consent for specific purposes.

In the present case, the restricted panel notes that, when creating an account, the user has the possibility to change some of the settings associated with the account. To access these settings, the user must click on the "More options" button displayed before the "Create Account" button. The restricted panel also notes that the account personalisation settings, which include the choice of displaying personalised ads, are pre-ticked by default, which, absent any action to the contrary by the user, indicates his or her agreement to the processing of his or her data for the purposes mentioned (e.g. YouTube search history, display of personalised ads, etc.). The user has the possibility of unticking these settings if he or she does not wish such processing to be carried out.

The restricted panel notes that, at the time of account creation, if the user does not click on the "More options" button to configure the account, he or she must tick the boxes "I accept the Google terms of use" and "I agree that my information will be used as described above and detailed in the privacy policy". The user must then press the "Create Account" button. A pop-up window then appears, entitled "Simple confirmation", containing the following text: "This Google Account is set up to include personalisation features (such as recommendations and personalised ads), which are based on information stored in your account. To change your personalisation settings and the information saved in your account, select More options".

If the user does not click on "More options", he or she must then select the "Confirm" button to complete the account creation.

In view of the above, the restricted panel notes that, while the user has the ability to change the configuration of the account settings prior to account creation, a positive action on his or her part is required to access those settings. The user can thus complete the creation of the account, and accept the processing operations linked to it, in particular the ads personalisation processing, without ever clicking on "More options". Consequently, the user's consent is not validly obtained in this case, since valid consent requires a positive act by which the person specifically and distinctly consents to the processing of his or her data for ads personalisation purposes, separately from the other purposes of processing.

The restricted panel also considers that the actions by which the user proceeds with the creation of the account (ticking the boxes "I accept the Google terms of use" and "I agree that my information will be used as described above and detailed in the privacy policy", then selecting "Create Account") cannot be regarded as an expression of valid consent. The specific nature of consent is not respected since, by these actions, the user agrees at once to all the processing of personal data carried out by the company, including ads personalisation.

Furthermore, the restricted panel notes that, when the user clicks on "More options" to access the configuration of the account settings, these settings, and in particular the one relating to the display of personalised ads, are all pre-ticked by default. Thus, the possibility given to users to configure their account settings amounts, in this case, not to a positive action expressing consent, but to an action allowing them to object to the processing.

Finally, the restricted panel notes that this analysis is corroborated by the Article 29 Working Party Guidelines on consent, which state that a controller must be aware that consent cannot be obtained through the same action as the one by which a data subject accepts a contract or the general terms and conditions of a service, and that the GDPR does not allow controllers to use pre-ticked boxes or opt-out constructions requiring an action by the data subject to signal refusal (e.g. opt-out boxes).

The restricted panel notes in this respect that, while some user journeys may include a functionality allowing the user to consent in a single action to the processing of his or her data for different related purposes, such a facility can only be considered compliant if the different processing purposes have first been presented separately and the user has been able to give specific consent for each purpose, by a clear positive act, with no boxes pre-ticked. For this type of user journey to be considered compliant, the possibility to give specific consent for each purpose must be offered to individuals before the possibility to accept or refuse everything, without their having to take any particular action, such as clicking on "More options", to access it. In view of the above, the restricted panel considers that this type of user journey offers guarantees different from those at issue in the present case, as it allows the user to give specific and distinct consent to the processing of his or her data for a given purpose, by means of a clear positive act, with this possibility offered immediately and prior to the accept-all functionality.

Therefore, in this case, since the ads personalisation processing is pre-enabled by default, it cannot be considered to have been accepted by the user through a specific and unambiguous positive act.

Secondly, while the company maintains that the methods it implements for collecting consent for ads personalisation purposes comply with the CNIL's recommendation of 5 December 2013 on cookies, the restricted panel recalls that the rules specifically applicable to cookies used for targeted advertising are laid down by the separate provisions of Article 32-II of the French Data Protection Act, resulting from the transposition of the ePrivacy Directive of 12 July 2002 (as amended by Directive 2009/136/EC). The invocation of the recommendation of 5 December 2013 is consequently, and in any event, inoperative.

Thirdly, contrary to what Google maintains, the requirements laid down for the collection of consent are not intended to establish a consent regime more protective than that imposed by the GDPR, wrongly modelled on the criteria applicable to the collection of consent for the processing of so-called sensitive personal data.

The restricted panel notes that the modalities for expressing consent have been clarified and defined by Article 4(11) of the Regulation, which states that consent means: "any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her".

These same modalities for expressing consent apply in the same way, whether consent is obtained, under Article 6 of the GDPR, for the implementation of a processing operation for a specific purpose, or whether it is obtained, under Article 9 of the GDPR, to lift the prohibition in principle on the processing of so-called sensitive personal data.

Therefore, in order to be considered valid, the consent obtained must be a specific, informed and unambiguous expression of will, which, as the restricted panel has previously pointed out, is not the case here.

In the light of all of these factors, the restricted panel considers that the consent on which the company relies for ads personalisation is not validly obtained.

III - Sanction and publication

Article 45-III 7° of the law of 6 January 1978 provides: Where the controller or its processor does not comply with the obligations resulting from Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 referred to above or from this Act, the Chairman of the Commission Nationale de l'Informatique et des Libertés may also, where appropriate after sending the warning provided for in I of this Article or, where appropriate, in addition to a formal notice provided for in II, refer the matter to the restricted panel of the Commission with a view to the pronouncement, after adversarial proceedings, of one or more of the following measures: [...] 7° Except where the processing is carried out by the State, an administrative fine not exceeding EUR 10 million or, in the case of an undertaking, 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher. In the cases referred to in Article 83(5) and (6) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016, these ceilings are raised to EUR 20 million and 4% of that turnover respectively. In determining the amount of the fine, the restricted panel shall take into account the criteria set out in Article 83 of Regulation (EU) 2016/679.

The company considers that an administrative fine of EUR 50 million is disproportionate.

It points out that a formal notice would have enabled it to take steps to comply, and that the direct imposition of an administrative fine does not appear to be the most appropriate corrective measure.

The company further considers that not all the criteria set out in Article 83 of the GDPR were taken into account in the assessment of the proposed fine. In this respect, it refers in particular to the impossibility of taking corrective measures in the absence of a prior formal notice.

The company further points to the low number of users affected by the infringements, indicating that out of [...] people who configure a device running the Android operating system per day, only [...] create an account.

First of all, the restricted panel notes that, pursuant to Article 45 of Law No. 2018-493 of 20 June 2018, the President of the CNIL has discretion over the course of proceedings and can therefore choose, depending on the circumstances of the case, the follow-up to be given to investigations, by closing a file, issuing a formal notice, or referring the matter to the restricted panel with a view to the adoption of one or more corrective measures, without it being for the restricted panel to review the course of action chosen by the President. The restricted panel, once seized, has full jurisdiction to rule on the materiality and legal qualification of the facts, then to assess whether the breaches it has characterised justify, in principle, the pronouncement of one of the corrective measures mentioned in III of Article 45 of the law of 6 January 1978, and, finally, to determine the amount of any fine.

Furthermore, the restricted panel recalls that where an administration takes a decision in the light of a set of criteria laid down by a text, it must take all of those criteria into account, but is not required, in the grounds for its decision, to address each of them; it may confine itself to mentioning those it considers relevant and the corresponding factual elements.

In the present case, the restricted panel considers that the facts and breaches referred to above justify the imposition of an administrative fine on the company, for the following reasons.

Firstly, the restricted panel wishes to stress the particular nature of the breaches of the lawfulness of processing and of the transparency and information obligations. Article 6 of the GDPR, which defines the cases in which processing is lawful, is a central provision of personal data protection, in that it allows processing to be carried out only if one of the six conditions it lists is met. The transparency and information obligations are likewise essential, in that they condition the exercise of individuals' rights and thus enable them to retain control over their data. In this respect, Article 6 as well as Articles 12 and 13 are among the provisions whose infringement is most severely sanctioned under Article 83(5) of the GDPR.

The restricted panel thus considers that the obligations relating to transparency and legal bases constitute fundamental guarantees enabling individuals to retain control over their data. Disregard of these essential obligations therefore appears particularly serious in itself.

Secondly, the restricted panel notes that the breaches identified continue to this day and constitute ongoing violations of the Regulation. This is neither a one-off, time-limited failure by the company to comply with its obligations, nor a breach which the controller has spontaneously brought to an end since the matter was referred to the restricted panel.

Thirdly, the seriousness of the breaches must be assessed in the light in particular of the purpose of the processing operations, their scope and the number of data subjects.

In this respect, the restricted panel notes that although, according to the company, the scenario used in the CNIL's online investigations directly corresponds to only 7% of its users, the number of persons concerned is, in itself, particularly large. It also points out that users who configure their mobile phone under Android by associating an existing account with it are, with regard to the documents presented to them and therefore the violations of the Regulation, in a situation similar to that of users creating an account for the first time, which the company did not contest in its letter of 7 December 2018.

Furthermore, the restricted panel recalls that the company carries out data processing on a considerable scale, in view of the predominant place occupied by the Android operating system on the French market for mobile operating systems and the extent to which people in France browse the internet from their phones. The data of millions of users are thus processed by the company in this context.

The processing covered by the privacy policy presented to users when they create their account, when configuring a mobile phone running Android, also appears to be of considerable scope in view of the number of services involved, at least twenty or so, and the variety of data processed via or in connection with the Android operating system. In addition to the data provided by the user when creating the account and using the operating system, the restricted panel recalls that a multitude of data resulting from the user's activity is also generated, such as web browsing history, application usage history, device geolocation, purchases, etc. Likewise, data are inferred from information provided by the data subject or from his or her activity, in particular in the context of ads personalisation. This amounts to a large volume of particularly revealing information about people's lifestyle habits, opinions and social interactions. Consequently, the data processed by the company have a direct impact on users' identity and privacy.

Moreover, the restricted panel notes that the company uses multiple technological processes to combine and analyse data from different services, applications or external sources. These undeniably have a multiplier effect on the precise knowledge that the company has of its users.

As a result, the restricted panel considers that the company carries out data-combination operations with an almost unlimited potential, resulting in massive and intrusive processing of user data.

In view of the scope of the data processing operations, in particular ads personalisation, and the number of data subjects, the restricted panel stresses that the breaches previously identified are particularly serious. A lack of transparency regarding these large-scale processing operations, as well as the absence of valid user consent to the ads personalisation processing, constitute substantial breaches of users' privacy and run counter to the legitimate aspirations of individuals who wish to retain control over their data.

In this respect, the strengthening of individuals' rights is one of the main thrusts of the Regulation. The European legislator recalls that "rapid technological developments and globalisation have brought new challenges for the protection of personal data. The scale of the collection and sharing of personal data has increased significantly. Technology allows both private companies and public authorities to make use of personal data on an unprecedented scale (...) Technology has transformed both the economy and social life" (recital 6). It thus underlines that "those developments require a strong and more coherent data protection framework (...) backed by strong enforcement, given the importance of creating the trust that will allow the digital economy to develop across the internal market. Natural persons should have control of their own personal data" (recital 7). Finally, the European legislator notes that Directive 95/46/EC has not prevented "a widespread public perception that there are significant risks to the protection of natural persons, in particular with regard to online activity" (recital 9).

The restricted panel therefore considers, in view of the scale of the processing operations deployed and the overriding need for users to retain control over their data, that users must be put in a position to be sufficiently informed of the scope of the processing operations implemented and to give their valid consent to them, failing which confidence in the digital ecosystem would be fundamentally undermined.

Fourthly, the restricted panel wishes to emphasise that the breaches must be assessed in the light of the company's economic model, and in particular of the place occupied by the processing of users' data for advertising purposes via the Android operating system. In view of the benefits that the company derives from these processing operations, it must pay particular attention to its responsibility under the GDPR in their implementation.

It follows from all of the above, and from the criteria duly taken into account by the restricted panel, in view of the maximum amount incurred, established on the basis of 4% of the turnover indicated in point 2 of this decision, that a financial penalty of EUR 50 million is justified, as well as an additional publication penalty for the same reasons.

Account shall also be taken of the predominant position occupied by the company on the market for operating systems, the seriousness of the infringements and the value of this decision in informing the public, in determining the duration of its publication.

FOR THESE REASONS

The restricted panel of the CNIL, after deliberation, decides:

to impose a financial penalty on Google LLC in the amount of 50 (fifty) million euros;

to send this decision to Google France Sarl with a view to its enforcement;

to make its decision public on the CNIL website and on the Légifrance website, where it will be anonymised two years after its publication.

The Chairman

Jean-François CARREZ

This decision may be appealed to the Council of State within four months of its notification.



Nature of the decision: PENALTY
Date of publication on legifrance: 22 January 2019