IMY (Sweden) - DI-2022-2351/2372/2373/2374/2375

From GDPRhub
|ECLI=


|Original_Source_Name_1=Decision
|Original_Source_Link_1=https://gdprhub.eu/images/b/bf/IMY_%28Sweden%29_-_DI-2022-23512372237323742375.pdf
|Original_Source_Language_1=Swedish
|Original_Source_Language__Code_1=SE


|Type=Complaint
}}


The Swedish DPA reprimanded Google LLC for refusing several requests for the removal of search results under [[Article 17 GDPR|Article 17(1)]] and [[Article 21 GDPR|21 GDPR]]. The DPA held that Google could not refuse to assess the continuing relevance and accuracy of the contents of the web pages concerned just because they were behind a paywall.


== English Summary ==


=== Facts ===
The Swedish DPA assessed five complaints from data subjects who had requested the removal of search results in accordance with [[Article 17 GDPR|Article 17(1)]] and [[Article 21 GDPR|21 GDPR]]. All five requests had been denied on the ground that the contents of the web pages in question were inaccessible to Google due to paywalls, and that Google was therefore unable to assess whether the contents were no longer relevant or were inaccurate.


After the complaints were filed, Google acted on the requests. Nevertheless, the Swedish DPA assessed Google's explanation for refusing the requests in the first place.


=== Holding ===
The DPA held that the data subjects had exercised their right to object to the processing in accordance with the GDPR. Pursuant to [[Article 17 GDPR|Article 17(1)(c) GDPR]], the burden of proof for the existence of overriding legitimate grounds for continued processing lies on the controller, not the data subject. The DPA highlighted that by denying the requests due to Google's alleged inability to access the contents of the relevant web pages, Google was placing the burden of proof on the data subjects. By this logic, the data subjects would have had to gather additional information from the relevant web pages themselves in order for Google to act on the requests. As a result, the data subjects would in practice have been forced to pay to be able to exercise their rights.


Against this background, the Swedish DPA reprimanded Google for violating [[Article 12 GDPR|Articles 12]], [[Article 17 GDPR|17]] and [[Article 21 GDPR|21 GDPR]].


== Comment ==
''The uploaded decision is one of five nearly identical decisions issued by the Swedish DPA on this matter.''


== Further Resources ==


== English Machine Translation of the Decision ==
The decision below is a machine translation of the Swedish original. Please refer to the Swedish original for more details.


<pre>
Google LLC
Diary number: DI-2022-2374
Your diary number: 5-4751000031502
Date: 2022-07-26

Decision following supervision under the General Data Protection Regulation - Google LLC's removal of search results for the Lexbase directory service

Decision of the Privacy Authority
The Privacy Authority finds that Google LLC has processed personal data in breach of Article 12(2) of the GDPR, by stating in its response to the complainant's request for removal under Articles 17(1)(c) and 21 on 16 November 2021, without being able to demonstrate compelling legitimate grounds for refusing the request, that the request is not granted because Google does not have access to the content of the URL in question, and thus did not facilitate the complainant's exercise of his rights.
The DPA reprimands Google LLC under Article 58(2)(b) of the GDPR for breach of Article 12(2).


Summary of the decision
</pre>
The Privacy Authority (IMY) has examined Google's handling of the complainant's requests for Google's search engine not to display search results to the Lexbase directory service for searches related to the complainant's name (removal request), following a complaint from an individual. The open part of the directory service indicates whether a person appears in the database, which consists of documents from courts and public authorities, including criminal and civil judgments. Behind a paywall, access to these documents is offered to anyone. Google initially refused the requests, but in the course of the procedure has granted them. IMY has therefore not found any reason to investigate the complaint further in this respect.

However, IMY has found grounds to investigate the complaint further as regards Google's justification in its message to the complainant as to why Google initially rejected the request. According to this reasoning, the request was rejected because Google did not have access to the content behind the payment gateway and was therefore unable to assess whether it was inaccurate or out of date.

IMY states in the decision that the complainant has exercised his/her right to object to the processing under the GDPR by submitting his/her request. In such circumstances, it is the controller (Google) and not the data subject (the complainant) who bears the burden of proof to show that there are compelling legitimate grounds for refusing the request. Such a reason could be, for example, that there is a legitimate interest for the public to be able to access a particular publication through searches related to a particular person's name on a search engine such as Google's. However, such a public interest normally requires that the person requesting removal is a public figure or has a role in public life, which it had not been shown that the complainant was, either at the time of the refusal decision or subsequently. The fact that the directory service has a certificate of publication and that the documents behind the paywall are public does not, either individually or taken together, mean that the publication is in the public interest or can be considered to have been made for journalistic purposes. No other factor constituting decisive legitimate reasons has been brought to light.

If, as in the present case, there are no compelling legitimate grounds for refusing a request for removal based on an objection, the request shall be granted without any balancing exercise being required. Google's response in the case thus gave the erroneous impression that the complainant had to submit additional evidence to enable such a balancing exercise to be carried out. The only way for the complainant to access the evidence was to pay the owner of the directory service - in other words, the complainant was effectively forced to pay to exercise his rights. Google has thus not facilitated the complainant's exercise of his rights and has processed his personal data in breach of Article 12(2) of the General Data Protection Regulation.
 
As Google has complied with the request and taken several remedial measures, and as there were difficult trade-offs involved, IMY has considered it appropriate to refrain from imposing a penalty on Google for the breach found and to limit itself to issuing a reprimand under Article 58(2)(b) of the GDPR. As mentioned above, IMY has also considered four other complaints with similar circumstances during the same period, with the same conclusions and outcome.
 
 
Description of the supervision case
The Privacy Authority (IMY) has initiated an enforcement action against Google LLC
(Google) following a complaint that Google has denied two requests (Google ref no 5-4751000031502) for Google to not display three URLs in its search services for
searches related to the complainant's name.
The complaint
The complaint essentially states the following.
The complainant contacted Google on 14 November 2021 requesting that Google remove search results relating to an internet directory service (ref no 5-4751000031502), which show the complainant's full name, age and postal address and that the complainant is listed in the directory service's database of court documents.
The complainant argued that the request should be granted because the websites
contain information on a criminal conviction. The complainant has served the
sentence of 10 months' imprisonment. The sentence has no connection with the
complainant's professional activity. The complainant is not a public person and has no role in public life, nor did he have one at the time of the offence. The complainant suffers great harm from the display. The complainant lost his job as a result of the conviction and has faced obstacles in both his professional and private relationships as a result of the publication. Furthermore, the pages show the complainant's full personal identity number, which is an extra-protected personal data whose processing without consent is specifically regulated by Swedish law in addition to the General Data Protection Regulation. As can be understood, the complainant has thus objected to the processing in accordance with Article 17(1)(c) of the GDPR.
 
In its refusal decision of 16 November 2021, Google denied the complainant's request for removal on the grounds that "Google does not have access to the content of the [directory service]. Therefore, based on the information available to us, we cannot conclude that the information is inaccurate or out of date. We have therefore decided not to take action in respect of this URL(s)".
On 22 November 2021, the complainant submitted the content of the URLs in the form
of a screenshot of the website and a copy of the judgment.
Subsequently, in a refusal decision dated 3 December 2021, Google denied the
complainant's request, stating that "Google has concluded that the information about you on these URLs - taking into account all the circumstances of the case known to us - is still relevant for data processing purposes. Therefore, the reference to this document in our search results is justified on grounds of public interest".
The complainant then objected again on 5 December 2021 to the processing under
Article 21 of the GDPR and sent a reminder on 21 December. Nevertheless, in breach
of Article 12 of the GDPR, Google has not responded within one month with a review
of the request or information on the extension, thus preventing the complainant from asserting his rights as a data subject. Google has still not responded.
In particular, the complaint points out that Google did not invoke any specific legal basis, did not inform in accordance with Article 21(4) of the GDPR about the right to object under Article 21(1) and did not inform that the data subject may obtain information on the assessment upon request as proposed in the Article 29 Working Party's Guidelines on Transparency (WP260 rev.01).
What Google has stated
Google has indicated in a statement to IMY on 28 March 2022 that, after further review, Google has granted the request and executed the removal on 16 March 2022.
In its submission to IMY of 25 April 2022, Google has stated in essence the following.
General
When Google receives a request for deletion under the right to be forgotten, Google
makes an assessment in accordance with the case law of the European Court of
Justice and the guidelines of the European Data Protection Board (EDPB), in the light of the requirements of Article 17(1)(c) and Article 21(1) of the GDPR. These
requirements are part of the balancing test that Google carries out when dealing with right to be forgotten requests, where Google considers whether there are compelling legitimate reasons for the URLs to continue to be displayed in the event of a search on the data subject's name, and balances all relevant interests in accordance with the EDPB guidelines. In light of Articles 12(1), 12(2) and 12(4), Google also provides an FAQ on the process explaining how individuals make requests and how Google evaluates such requests, including examples of common scenarios where Google removes content (e.g. lack of public interest and content relating to minors) and where Google does not (e.g. strong public interest, where for example the information relates to the data subject's professional life or otherwise his/her role in public life).
The handling in this case
The URLs in this dossier link to content on the Lexbase website. Lexbase is a legal
directory service containing public information from Swedish courts and authorities. It allows users to search for individuals and companies that have been the subject of criminal or civil claims in Swedish courts. Lexbase holds a certificate of publication. It is possible to search for individuals and company names, but the full information on convictions is available behind a paywall.
As Google does not have access to the information behind the paywall, it is not
possible for Google to assess the full content of the URL. It is therefore also not
possible to verify whether the information provided by the complainant is correct.
Google has therefore assumed that the material on Lexbase is an accurate copy of
public records.
One of the factors Google takes into account when determining whether information is in the public interest is where the information comes from. Government records play a crucial role in keeping society informed about matters of public interest, and the government's decision to publish and make it publicly available is a strong indication that it believes there is a public interest in the information. Lexbase gets its information from Swedish courts, i.e. from publicly available sources. The source is reliable and the information can, as a general rule, be assumed to be in the public interest when it is available in the public domain.
Databases similar to Lexbase also exist in other jurisdictions, and what Google often sees is that registrants may be inclined to mischaracterize the content behind the paywall, provide incomplete or inaccurate information, or provide subjective descriptions of situations that speak in their favour. Cases are difficult to assess because Google can only make a judgement based on the information it receives in the case in question. It also involves assessing the credibility of different sources. The fact that some information comes from a public source, published by a government agency or a court, will naturally carry more weight when what is in the other scale is the subjective version of the complainant.
Google notes that Google is a separate data controller from the website operator
Lexbase. The distinction is why Google has an obligation to delete search results
under data protection law, even when the website owner has not necessarily violated
any law in publishing the material in question. Google is aware that Lexbase has been the subject of considerable public debate and that the incomplete display of an individual's presence in the database may be seen as problematic and have negative consequences for the individual. However, it remains the case that material on Lexbase is part of a public record and remains available from official sources.
Therefore, in these very difficult borderline cases, when balancing all available facts, Google made the judgment that the URLs should not be deleted.
Based on the above considerations, Google decided to reject the complainant's
request on 16 November and 3 December 2021. The decision also clearly stated that
the nature of the website was the reason why Google rejected the request.
Google communicated the decision and the reasons for it to the complainant in
accordance with Article 12(1) and 12(4) of the GDPR. This gave the complainant the opportunity to evaluate whether the decision taken by Google achieved the purpose of his request. The complainant had the possibility to submit a further request if he was not satisfied with the outcome. Google also informed the complainant that he could lodge a complaint with his local data protection authority.
The fact that the complainant then provided screenshots from Lexbase did not change
Google's decision. This in light of the above considerations regarding Lexbase as a
whole. It was not possible for Google to verify that the screenshots submitted by the complainant constituted all the information behind the paywall on Lexbase. It was therefore also not possible for Google to make a full assessment of the credibility of the complainant's allegations. However, Google had knowledge that the material in question on Lexbase was in all likelihood an accurate copy of a public record. In view of the freedom of information in these complex cases, where the balance of interests is difficult, the decision to refuse deletion in the complainant's case appeared to be reasonable for Google.
After receiving several requests for information from IMY, seeing requests from other registrants related to Lexbase, and reflecting more deeply on Lexbase cases, Google decided to evaluate the removal practices regarding Lexbase and other similar databases. Google carefully considered the questions raised by IMY and re-evaluated the decision in the present case. As mentioned above, Google decided to
grant the complainant's request and remove the URL after receiving IMY's request
for information.
However, for the reasons set out above, Google continues to believe that the present case is a borderline case with respect to the final decision to delete the URL. In such borderline cases, further information or reflection may legitimately lead to a different conclusion, without necessarily an initial misjudgment. Google
therefore does not consider that the earlier decision not to delete the URL was
unreasonable or incorrect based on the information available at the time. Over time, however, Google has evaluated the approach to act with caution around deletions regarding Lexbase. Many less serious crimes are listed on Lexbase and it is reasonable to remove such URLs for individuals who have no role in public life.
Google's practices are regularly updated and improved based on feedback from data
subjects and data protection authorities.
What this means in the future for Lexbase-related URL deletion requests is that
Google will ask registrants for screenshots of the entire content behind the paywall.
Google will then take this evidence into account when evaluating individual cases and, in line with EDPB guidelines, delete information where there is no public interest, for example for minor or long-standing crimes.
Finally, as regards Google's compliance with Article 14(1)(c) and 14(2)(b) of the
GDPR, Google relies on its legitimate interests in reproducing personal data made
publicly available by websites that allow indexing by search engines, and third parties' legitimate interests in accessing these data, in accordance with Article 6(1)(f) of the GDPR. Google informs users of this in the Google Privacy Policy:
"In certain circumstances, Google also collects information about you from publicly
available sources. For example, if your name appears in the local newspaper,
Google's search engine may index the article and display it to others if they search for your name." (in the section entitled: "Information Google collects")
"We process your data for our legitimate interests and for the legitimate interests of third parties, while applying appropriate security measures to protect your privacy" (in the section entitled: "Compliance and cooperation with public authorities") Users can click on "third parties" to see the following example of how Google may process personal data on the basis of Article 6(1)(f) of the GDPR: "We may also process your data if someone searches for your name and we display search results for websites that contain publicly available information about you."
In reassessing this case, Google considers that it could have expressed its decision more clearly in its communications with the complainant. Google will review these communication practices. Google has also updated its internal instructions and communicated these to all staff handling removal requests. Google regularly trains its staff and these updates will be part of this training going forward.
Grounds for the decision
Applicable provisions etc.
General Data Protection Regulation and complementary national legislation
Article 12(2) of the GDPR requires the controller to facilitate the exercise of the data subject's rights in accordance with, inter alia, Articles 17 and 21.
 
According to Article 17(1)(c) of the GDPR, a data subject has the right to have
personal data relating to him or her erased if he or she objects to the processing
under Article 21(1) and there are no overriding legitimate grounds for the controller's processing. According to Article 17(3)(a), this shall not apply to the extent that processing is necessary for the exercise of the right to freedom of expression and information.
According to Article 21(1) of the GDPR, the data subject shall have the right to object at any time, on grounds relating to his or her particular situation, to the processing of personal data concerning him or her which is based on Article 6(1)(f) of the GDPR, including profiling based on these provisions. The controller may no longer process personal data unless it can demonstrate compelling legitimate grounds for doing so which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.
Article 85 of the GDPR requires Member States to reconcile by law the right to privacy under the Regulation with freedom of expression and information, including processing for journalistic purposes. In Sweden, this has been done through Chapter 1 of the Swedish Data Protection Act. The second paragraph of this provision provides, inter alia, for exemptions from the provisions of the General Data Protection Regulation in the case of processing of personal data for journalistic purposes.
Court of Justice of the European Union
The European Court of Justice has ruled in GC and others that a search engine
provider must, within the limits of its responsibility, competence and ability, ensure that the processing of personal data in the search engine business complies with the requirements of the data protection rules.
The specific nature of the activity does not exempt it from the prohibitions and
restrictions on the processing of sensitive and criminal data. However, these shall be applied within the scope of the responsibilities, powers and possibilities of the search engine provider as controller of the processing carried out in the course of its activities.
The prohibitions and restrictions therefore apply to the reference to the web pages
where the data appear as a result of the display of search results, in the context of the examination which the search engine provider must carry out at the request of the data subject.
The CJEU has also clarified in GC and others how the right to erasure applies in relation to the GDPR and criminal data, stating that the interest of internet users in accessing a website containing such data through searches related to a data subject's name must be taken into account and that this right to freedom of information is protected by Article 11 of the EU Charter of Fundamental Rights. The Court stated that while the rights of the data subject under Articles 7 and 8 of the Charter generally outweigh the interest of internet users, the balance may be affected in particular cases by the nature of the information and its sensitivity to the privacy of the data subject and the public interest in the information, which may vary depending, inter alia, on the role played by the data subject in public life (paragraph 66). In addition, the processing of criminal data may constitute a particularly serious interference with the data subject's fundamental right to respect for private life and protection of personal data, given the sensitivity of such data (paragraph 67). It must therefore be examined, having regard to the circumstances and the seriousness of the interference, whether, in view of the important reasons of public interest which are referred to in Article 9(2)(g) and subject to the conditions laid down therein, it is strictly necessary that search results linking to a website containing criminal data be displayed in searches relating to the name of the data subject in order to protect the freedom of information of internet users who might be interested in accessing the website through such searches, a freedom enshrined in Article 11 of the Charter (paragraph 68).
European Data Protection Board (EDPB)
The European Data Protection Board (EDPB), in its guidelines on the application of the above-mentioned provisions of the GDPR and the rulings of the European Court of Justice on the removal of search hits, has stated, inter alia, the following.
The right to object entails stronger safeguards for data subjects as it does not
include any limitation of the grounds on which data subjects can request deletion
under Article 17(1).
The GDPR provides a presumption in favour of the data subject and it is for the
controller to demonstrate "compelling legitimate grounds for the processing"
(Article 21(1)). This means that a search engine provider that receives a deletion
request based on the specific situation of the data subject must delete the personal data under Article 17(1)(c), unless it can demonstrate 'overriding legitimate grounds' for the display of the specific search result, which compared to Article 21(1) is 'overriding legitimate grounds [...] which outweigh the interests, rights and freedoms of the data subject'. If the search engine provider can demonstrate 'overriding legitimate reasons', including exceptions under Article 17(3), the search engine provider may justify not removing a search hit. However, if the search engine provider cannot demonstrate overriding legitimate reasons, the data subject shall be entitled to deletion under Article 17(1)(c). Indeed, requests for removal involve a balancing of the reasons relating to the specific situation of the data subject and the decisive legitimate reasons of the search engine provider. In making such an assessment, it may be relevant to consider the balance between the protection of privacy and the interests of Internet users in accessing the information, as established by the Court of Justice of the European Union, as well as the balance struck by the European Court of Human Rights in cases concerning freedom of the press.
The criteria for removal developed by the Article 29 Working Party in the Guidelines on the implementation of the judgment of the Court of Justice in Google Spain and Google can therefore still be used to assess a request for removal based on the right to object (Article 17(1)(c)).
In this respect, a request for deletion will be based on the specific situation of the data subject (e.g. whether a search result puts the data subject at a disadvantage when looking for a job, or harms the data subject's reputation or privacy) and this situation will be taken into account when balancing personal rights and the right to information, in addition to the classical criteria for dealing with deletion requests, such as
• if he or she is not a public figure,
• if the information in question is not related to the person's professional
life but affects his or her private life,
• if the information consists of hate speech, defamation, libel or similar offences
against his or her freedom of expression as determined by a court of law
• if the information appears to be verified facts but is factually inaccurate, and
• if the data relate to a relatively minor offence committed a long time ago and
are harmful to the data subject.
However, no assessment of these criteria is required in the absence of evidence
of compelling legitimate reasons for refusing the request.
One circumstance that may militate against removal is if the person concerned
plays a role in public life or is a public figure.
A role in public life normally refers, for example, to politicians, senior public
officials, business people and practitioners of regulated professions (e.g. lawyers
and doctors). It can be argued that the public should be able to search for
information relevant to their public roles and activities. The rule of thumb is
whether having access to the specific information - through a search on the name
of the data subject - would protect the public from improper public or professional
conduct. A public figure is someone who, by virtue of their duties or obligations,
has a certain degree of media exposure. It may be someone who holds public
office, uses public resources or plays a role in public life in politics, economics, the arts, the social sphere, sport or any other field. There may be information about public figures that is purely private and should not appear in search results, such as information about their health or family members. However, as a rule of thumb, if the person concerned is a public figure, and the information in question is not purely private, this militates against deletion. In determining the balance, the case law of the European Court of Human Rights (ECtHR) is particularly relevant.
As the European Court of Justice explains in its judgment in G.C. and others, C-136/17, Article 17(3)(a) of the GDPR 'expresses that the right to the protection of personal data is not an absolute right, but [...] must be understood in relation to its function in society and balanced against other fundamental rights in accordance with the principle of proportionality'. It therefore 'explicitly requires a balancing of the fundamental rights to respect for private life and protection of personal data, as enshrined in Articles 7 and 8 of the Charter, with the fundamental right to freedom of information guaranteed by Article 11 of the [EU Charter of Fundamental Rights]'.
The CJEU further states that "where a search engine provider receives a request
for removal relating to a link to a web page containing [sensitive personal data or
data relating to infringements] [...], it must examine, on the basis of all the
relevant circumstances of the case and having regard to the seriousness of the
interference with the data subject's fundamental right to respect for private life
and protection of personal data, as laid down in Articles 7 and 8 of the [EU
Charter of Fundamental Rights], whether, having regard to the important
[reasons] in the public interest [....] it is strictly necessary to include the link to the website in question among the search results displayed following a search for the name of the data subject in order to protect the freedom of information of internet users who may be interested in accessing that website through such a search, a freedom enshrined in Article 11 of the Charter".
In conclusion, depending on the circumstances of the case, search engine
providers may refuse a removal request if they can show that it is strictly
necessary to include the search hit in the search results in order to protect the
freedom of information of internet users.
The Article 29 Working Party has stated that in assessing whether a search hit should be removed, depending on the context, it may be relevant to consider whether the information on a website was published for journalistic purposes and that the fact that the information was published by a journalist whose profession is to inform the public is a factor to be taken into account in the balancing exercise. 8
With regard to the meaning of the concept, the EDPB has noted that the Court of
Justice of the European Union, when examining the right to removal from search
engines, distinguishes between the legitimacy that the publisher of a website may
have in disseminating information and the legitimacy of the search engine provider. In doing so, the CJEU has recognised that a publisher's activities may be exclusively for journalistic purposes, in which case the publisher would benefit from the exemptions that Member States may establish in such situations on the basis of Article 9 of the Directive (now Article 85(2) of the GDPR). In this respect, the judgment of the European Court of Human Rights of 28 June 2018 in M.L. and W.W. v. Germany suggests that the balancing of interests in question may lead to different results depending on the type of request (distinguishing between, on the one hand, a request for erasure directed against the original publisher - whose activities are central to the protection of freedom of expression - and, on the other hand, a request for removal directed against a search engine provider - whose main interest is not to publish the original information about the data subject, but to enable the identification of any available information about the person and thus to establish the person's profile). 9
European Court of Human Rights
The case-law of the European Court of Human Rights shows that when data subjects
request a ban on the publication on the internet of old media reports on criminal
proceedings against them, a fair balance must be struck between the right to privacy of the persons concerned and, inter alia, the public's right to freedom of information. In seeking this fair balance, account should be taken of the important role of the press in a democratic society, which includes reporting on and commenting on court proceedings. In addition to the role of the media in conveying such information and opinions, there is also the right of the public to receive the information. In this context, the European Court of Human Rights has recognised that the public has an interest not only in being informed about a current event, but also in being able to investigate past events.
However, the scope of the public interest in criminal proceedings may vary and evolve over time, taking into account, inter alia, the circumstances of the case.
The Privacy Authority (IMY) has provided guidance in a legal opinion 11 on the various factors that are relevant to the balancing of interests when a data subject requests the removal of a search hit, in particular when the linked page relates to a publication in a news medium. Among other things, it considers that if a search result leads to a publication that falls under both
the responsible publisher system of the freedom of expression principles and the
media ethics system, it should, as a starting point, be given independent weight in the balancing of interests as a factor against removal, although the balancing of interests should be made on the basis of an overall assessment of all the relevant
circumstances of the case. 12 It follows, therefore, that if an activity is covered only by the system of responsible publishers under the freedom of expression foundations, this cannot in itself be given independent weight, since this system does not, on its own, have the same protective effects for individuals as the media ethics system, and may also cover activities which have no journalistic purpose (since this is not a precondition for the granting of voluntary constitutional protection through the so-called 'certificate of publication'). Such a factor must therefore be assessed on a case-by-case basis. IMY has stated in a legal opinion on the concept of "journalistic purposes" 13 that the concept should be given a broad interpretation. It emphasises that journalistic activities are those aimed at disseminating information, opinions or ideas to the public, regardless of the medium through which this is done. The concept thus has a broader meaning than in everyday language and covers not only professional journalists and traditional mass media, but all persons engaged in activities aimed at disseminating information, opinions or ideas to the public. However, IMY stresses that the concept cannot be given such a broad meaning as to include all information made available on the internet and containing personal data. In this context, IMY points out that,
according to the European Court of Justice, the concept does not cover the processing carried out by a search engine provider through the provision of the search engine and that in the Swedish constitutional legislative work it has been established that the same should apply to a purely public search service relating to criminal convictions. 14
Furthermore, the statement clarifies how Chapter 1, Section 7 of the Data Protection Act is to be applied when there are other purposes parallel to the journalistic purpose, and what significance it has that the term "exclusively" is not included in Chapter 1, Section 7(2) of the Data Protection Act and Article 85 of the GDPR. Among other things, IMY considers that, based on the wording of the relevant provisions, there is no requirement that processing be carried out exclusively for journalistic purposes for the exemption to apply. 15 However, IMY considers that it should be required that the journalistic purpose is the main purpose of the processing for the exemption to apply. 16
The Privacy Authority's assessment
Has Google handled the complainant's requests correctly?
IMY is required by Article 57(1)(f) of the GDPR to deal with complaints about the
incorrect processing of personal data and, where appropriate, to investigate the
subject matter of the complaint.
 
It appears from the information in the case that Google has taken steps to respect the complainant's rights by removing the relevant search hit from searches related to the complainant's name. In the light of the above, IMY does not consider it necessary to investigate the complaint further in this respect.
However, IMY finds grounds to investigate the complaint insofar as it concerns the
wording used by Google in the refusal decision of 16 November 2021 of the
complainant's removal request concerning the Lexbase directory service, i.e. that
Google did not have access to the material behind the paywall.
The investigation shows that the open part of the directory service shows the
complainant's full name, age and location and that the complainant is "present" in the database.
It also states that "[the directory service] is a web service offering public information from Swedish courts and other authorities. Here you can find judgments in criminal and/or civil cases concerning individuals".
Google justified its rejection decision on the grounds that 'Google does not have access to the content of the [directory service]' and '[b]ased on the information available to [Google] ... cannot conclude that the information is inaccurate or out of date'.
However, IMY notes that the complainant, through his request, objected to the
processing on the basis of Article 17(1)(c) and Article 21 of the GDPR with reference to his specific situation. Under Article 21(1), in such circumstances, the burden of proof is on the controller (i.e. Google) and not on the data subject (i.e. the complainant) to show that there are compelling legitimate grounds for refusing the request.17 Such a ground could be, for example, that there is a legitimate public interest in being able to access a particular publication through searches related to a particular person's name on a search engine such as Google's. However, such a public interest normally requires that the person requesting removal is a public figure or has a role in public life, which it had not been shown that the complainant was, either at the time of the refusal decision or subsequently. The fact that the directory service has a certificate of publication and that the documents behind the paywall are public does not, either individually or taken together, mean that the publication is in the public interest or can
be regarded as having been made for journalistic purposes. 18 Thus, in the absence of conclusive justification for refusing the request, Google should, as a general rule, have granted it on the basis of the evidence available at that time, which it did not do. Furthermore, the reasoning given by Google to the complainant in the decision - that Google did not have access to the content behind the paywall of the directory service - gave the erroneous impression that the complainant had to submit such evidence in order to have his request granted. Since the content is behind a paywall, anyone wishing to access it has to pay the owner of the directory service (currently €98). The consequence of Google's action was thus that the data subject (the complainant) in practice had to pay a third party (Lexbase) in order to exercise his rights (the right to erasure) with the controller (Google). Google has thus not facilitated the exercise of the complainant's rights under the GDPR and has thus processed the complainant's personal data in breach of Article 12(2).
 
Choice of intervention
Articles 58(2)(i) and 83(2) of the GDPR provide that IMY has the power to impose
administrative fines in accordance with Article 83. Depending on the circumstances of the case, administrative fines shall be imposed in addition to or instead of the other measures referred to in Article 58(2), such as injunctions and prohibitions.
Furthermore, Article 83(2) sets out the factors to be taken into account when deciding whether to impose administrative fines and when determining the amount of the fine.
In the case of a minor infringement, IMY may, as indicated in recital 148, instead of imposing a penalty, issue a reprimand under Article 58(2)(b). Account will be taken of aggravating and mitigating circumstances of the case, such as the nature, gravity and duration of the infringement and any relevant previous infringements.
IMY notes the following relevant facts. The breach has affected an individual and
involved difficult trade-offs between competing fundamental rights and freedoms.
During the procedure, Google has respected the complainant's rights and has taken
several remedial measures and improved its information to data subjects.
IMY concludes that, on the basis of an overall assessment of the circumstances, in
particular taking into account the difficult balancing of competing fundamental rights and freedoms, there are grounds to refrain from imposing a fine on Google for the infringement found.
In view of the above, IMY also considers that this is a minor breach within the meaning of recital 148, which requires Google LLC to be reprimanded under Article 58(2)(b) of the GDPR for the breach found.
This decision has been taken by the special decision-maker Olle Pettersson, lawyer, on the recommendation of Martin Wetzler, lawyer.
Olle Pettersson, 2022-07-26 (This is an electronic signature)
Copy to
The complainant
 
How to appeal
If you wish to appeal against the decision, you should write to the Data Protection
Authority. In your letter, please state which decision you are appealing and the change you are requesting. The appeal must be received by the Office no later than three weeks from the date on which you received the decision. If the appeal is received in time, the Office will forward it to the Administrative Court in Stockholm for review.
You can email the appeal to the Data Protection Authority if it does not contain any personal data that is sensitive to privacy or that may be covered by confidentiality. The Authority's contact details are given on the first page of the decision.

Latest revision as of 10:19, 25 August 2022

IMY - DI-2022-2351/2372/2373/2374/2375
Authority: IMY (Sweden)
Jurisdiction: Sweden
Relevant Law: Article 12(2) GDPR
Article 17(1) GDPR
Article 21(1) GDPR
Type: Complaint
Outcome: Upheld
Started:
Decided: 26.07.2022
Published:
Fine: n/a
Parties: Google LLC
National Case Number/Name: DI-2022-2351/2372/2373/2374/2375
European Case Law Identifier: n/a
Appeal: Unknown
Original Language(s): Swedish
Original Source: Decision (in SE)
Initial Contributor: n/a

The Swedish DPA reprimanded Google LLC for refusing several requests for removal of search results under Articles 17(1) and 21 GDPR. The DPA held that Google could not refuse to assess the continuing relevance and accuracy of the contents of the web pages concerned just because they were behind a paywall.

English Summary

Facts

The Swedish DPA assessed five complaints from data subjects who had requested removal of search results in accordance with Articles 17(1) and 21 GDPR. All five requests had been denied on the ground that the contents of the web pages in question were inaccessible to Google due to paywalls, and that Google was therefore unable to assess whether the contents were no longer relevant or inaccurate.

After the complaints were filed, Google acted on the requests. Nevertheless, the Swedish DPA assessed Google's explanation for refusing the requests in the first place.

Holding

The DPA held that the data subject had exercised their right to object to the processing in accordance with the GDPR. Pursuant to Article 17(1)(c) GDPR, the burden of proof for the existence of overriding legitimate grounds for the continued processing is on the controller and not the data subject. The DPA highlighted that by denying the requests due to Google's alleged inability to access the contents of the relevant web pages, Google was placing the burden of proof on the data subjects. By this logic, the data subjects would have had to gather additional information from the relevant web pages themselves in order for Google to act on the requests. As a result, the data subjects would in practice be forced to pay to be able to exercise their rights.

Against this background, the Swedish DPA reprimanded Google for violating Articles 12, 17 and 21 GDPR.

Comment

The uploaded decision is one of five nearly identical decisions issued by the Swedish DPA on this matter.

Further Resources

Share blogs or news articles here!

English Machine Translation of the Decision

The decision below is a machine translation of the Swedish original. Please refer to the Swedish original for more details.

Google LLC
Diary number: DI-2022-2374
Your diary number: 5-4751000031502
Date: 2022-07-26
Decision following supervision under the General Data Protection Regulation - Google LLC's removal of search results for the Lexbase directory service
Decision of the Privacy Authority
The Privacy Authority finds that Google LLC has processed personal data in breach of Article 12(2) of the GDPR, 1 by stating in its response to the complainant's request for removal under Articles 17(1)(c) and 21 on 16 November 2021, without being able to demonstrate compelling legitimate grounds for refusing the request, that the request was not granted because Google did not have access to the content of the URL in question, and thus did not facilitate the complainant's exercise of his rights.
The DPA reprimands Google LLC under Article 58(2)(b) of the GDPR for breach of Article 12(2).
Postal address: Box 8114, 104 20 Stockholm
Website: www.imy.se
E-mail: imy@imy.se
Phone: 08-657 61 00

Summary of the decision
The Privacy Authority (IMY) has examined Google's handling of the complainant's 
requests for Google's search engine not to display search results to the Lexbase 
directory service for searches related to the complainant's name (removal request), 
following a complaint from an individual. The open part of the directory service 
indicates whether a person appears in the database, which consists of documents 
from courts and public authorities, including criminal and civil judgments. Behind a paywall, access to these documents is offered to anyone. Google initially refused the requests, but in the course of the procedure has granted them. IMY has therefore not found any reason to investigate the complaint further in this respect.
However, IMY has found grounds to investigate the complaint further as regards Google's justification in its message to the complainant as to why Google initially rejected the request. According to this reasoning, the request was rejected because Google did not have access to the content behind the paywall and was therefore unable to assess whether it was inaccurate or out of date. IMY states in the decision that the complainant has exercised his right to object to the processing under the GDPR by submitting his request. In such circumstances, it is the controller (Google) and not the data subject (the complainant) who bears the burden of proof to show that there are compelling legitimate grounds for refusing the request. Such a ground could be, for example, that there is a legitimate interest for the public to be able to access a particular publication through searches related to a particular person's name on a search engine such as Google's. However, such a public interest normally requires that the person requesting removal is a public figure or has a role in public life, which the complainant had not been shown to be, either at the time of the refusal decision or subsequently. The fact that the directory service has a certificate of publication and that the documents behind the paywall are public does not, either individually or taken together, mean that the publication is in the public interest or can be considered to have been made for journalistic purposes. No other factor constituting decisive legitimate reasons has been brought to light. If, as in the present case, there are no compelling legitimate grounds for refusing a request for removal based on an objection, the request shall be granted without any balancing exercise being required. Google's response in the case thus gave the erroneous impression that the complainant had to submit additional evidence to enable such a balancing exercise to be carried out. The only way for the complainant to access the evidence was to pay the owner of the directory service - in other words, the complainant was effectively forced to pay to exercise his right. Google has thus not facilitated the complainant's exercise of his rights and has processed his personal data in breach of Article 12(2) of the General Data Protection Regulation.

As Google has complied with the request and taken several remedial measures, and as there were difficult trade-offs involved, IMY has considered it appropriate to refrain from imposing a penalty on Google for the breach found and to limit itself to issuing a reprimand under Article 58(2)(b) of the GDPR. As mentioned above, IMY has also considered four other complaints with similar circumstances during the same period, with the same conclusions and outcome.


Description of the supervision case
The Privacy Authority (IMY) has initiated an enforcement action against Google LLC (Google) following a complaint that Google has denied two requests (Google ref no 5-4751000031502) for Google not to display three URLs in its search services for searches related to the complainant's name.
The complaint
The complaint essentially states the following.
The complainant contacted Google on 14 November 2021 requesting that Google 
remove search results relating to an internet directory service (ref no 5-
4751000031502), which show the complainant's full name, age and postal address and 
that the complainant is listed in the directory service's database of court documents.
The complainant argued that the request should be granted because the websites 
contain information on a criminal conviction. The complainant has served the 
sentence of 10 months' imprisonment. The sentence has no connection with the 
complainant's professional activity. The complainant is not a public person and has no role in public life, nor did he have one at the time of the offence. The complainant suffers great harm from the display. The complainant lost his job as a result of the conviction and has faced obstacles in both his professional and private relationships as a result of the publication. Furthermore, the pages show the complainant's full personal identity number, which is an extra-protected personal data whose processing without consent is specifically regulated by Swedish law in addition to the General Data Protection Regulation. As can be understood, the complainant has thus objected to the processing in accordance with Article 17(1)(c) of the GDPR.

In its refusal decision of 16 November 2021, Google denied the complainant's request for removal on the grounds that "Google does not have access to the content of the [directory service]. Therefore, based on the information available to us, we cannot conclude that the information is inaccurate or out of date. We have therefore decided not to take action in respect of this URL(s)".
On 22 November 2021, the complainant submitted the content of the URLs in the form 
of a screenshot of the website and a copy of the judgment.
Subsequently, in a refusal decision dated 3 December 2021, Google denied the 
complainant's request, stating that "Google has concluded that the information about you on these URLs - taking into account all the circumstances of the case known to us - is still relevant for data processing purposes. Therefore, the reference to this document in our search results is justified on grounds of public interest".
The complainant then objected again on 5 December 2021 to the processing under 
Article 21 of the GDPR and sent a reminder on 21 December. Nevertheless, in breach 
of Article 12 of the GDPR, Google has not responded within one month with a review 
of the request or information on the extension, thus preventing the complainant from asserting his rights as a data subject. Google has still not responded.
In particular, the complaint points out that Google did not invoke any specific legal basis, did not inform in accordance with Article 21(4) of the GDPR about the right to object under Article 21(1) and did not inform that the data subject may obtain information on the assessment upon request as proposed in the Article 29 Working Party's Guidelines on Transparency (WP260 rev.01).
What Google has stated
Google has indicated in a statement to IMY on 28 March 2022 that, after further review, Google has granted the request and executed the removal on 16 March 2022.
In its submission to IMY of 25 April 2022, Google has stated in essence the following.
General
When Google receives a request for deletion under the right to be forgotten, Google 
makes an assessment in accordance with the case law of the European Court of 
Justice and the guidelines of the European Data Protection Board (EDPB), in the light of the requirements of Article 17(1)(c) and Article 21(1) of the GDPR. These 
requirements are part of the balancing test that Google carries out when dealing with right to be forgotten requests, where Google considers whether there are compelling legitimate reasons for the URLs to continue to be displayed in the event of a search on the data subject's name, and balances all relevant interests in accordance with the EDPB guidelines. In light of Articles 12(1), 12(2) and 12(4), Google also provides an FAQ on the process explaining how individuals make requests and how Google evaluates such requests, including examples of common scenarios where Google removes content (e.g. lack of public interest and content relating to minors) and where Google does not (e.g. strong public interest, where for example the information relates to the data subject's professional life or otherwise his/her role in public life).
The handling in this case
The URLs in this dossier link to content on the Lexbase website. Lexbase is a legal 
directory service containing public information from Swedish courts and authorities. It allows users to search for individuals and companies that have been the subject of criminal or civil claims in Swedish courts. Lexbase holds a certificate of publication. It is possible to search for individuals and company names, but the full information on convictions is available behind a paywall.
As Google does not have access to the information behind the paywall, it is not 
possible for Google to assess the full content of the URL. It is therefore also not 
possible to verify whether the information provided by the complainant is correct. 
Google has therefore assumed that the material on Lexbase is an accurate copy of 
public records.
One of the factors Google takes into account when determining whether information is in the public interest is where the information comes from. Government records play a crucial role in keeping society informed about matters of public interest, and the government's decision to publish and make it publicly available is a strong indication that it believes there is a public interest in the information. Lexbase gets its information from Swedish courts, i.e. from publicly available sources. The source is reliable and the information can, as a general rule, be assumed to be in the public interest when it is available in the public domain.
Databases similar to Lexbase also exist in other jurisdictions, and what Google often sees is that data subjects may be inclined to mischaracterize the content behind the paywall, provide incomplete or inaccurate information, or provide subjective descriptions of situations that speak in their favour. Cases are difficult to assess because Google can only make a judgement based on the information it receives in the case in question. It also involves assessing the credibility of different sources. The fact that some information comes from a public source, published by a government agency or a court, will naturally carry more weight when what is in the other scale is the subjective version of the complainant.
Google notes that Google is a separate data controller from the website operator 
Lexbase. The distinction is why Google has an obligation to delete search results 
under data protection law, even when the website owner has not necessarily violated 
any law in publishing the material in question. Google is aware that Lexbase has been the subject of considerable public debate and that the incomplete display of an individual's presence in the database may be seen as problematic and have negative consequences for the individual. However, it remains the case that material on Lexbase is part of a public record and remains available from official sources. 
Therefore, in these very difficult borderline cases, when balancing all available facts, Google made the judgment that the URLs should not be deleted.
Based on the above considerations, Google decided to reject the complainant's 
request on 16 November and 3 December 2021. The decision also clearly stated that 
the nature of the website was the reason why Google rejected the request.
Google communicated the decision and the reasons for it to the complainant in 
accordance with Article 12(1) and 12(4) of the GDPR. This gave the complainant the opportunity to evaluate whether the decision taken by Google achieved the purpose of his request. The complainant had the possibility to submit a further request if he was not satisfied with the outcome. Google also informed the complainant that he could lodge a complaint with his local data protection authority.
The fact that the complainant then provided screenshots from Lexbase did not change 
Google's decision. This in light of the above considerations regarding Lexbase as a 
whole. It was not possible for Google to verify that the screenshots submitted by the complainant constituted all the information behind the paywall on Lexbase. It was therefore also not possible for Google to make a full assessment of the credibility of the complainant's allegations. However, Google had knowledge that the material in question on Lexbase was in all likelihood an accurate copy of a public record. In view of the freedom of information in these complex cases, where the balance of interests is difficult, the decision to refuse deletion in the complainant's case appeared to be reasonable for Google.
After receiving several requests for information from IMY, seeing requests from other data subjects related to Lexbase, and reflecting more deeply on Lexbase cases, Google decided to evaluate the removal practices regarding Lexbase and other similar databases. Google carefully considered the questions raised by IMY and re-evaluated the decision in the present case. As mentioned above, Google decided to
grant the complainant's request and remove the URL after receiving IMY's request 
for information.
However, for the reasons set out above, Google continues to believe that the present case is a borderline case with respect to the final decision to delete the URL. In such borderline cases, further information or reflection may legitimately lead to a different conclusion, without this necessarily implying an initial misjudgment. Google therefore does not consider that the earlier decision not to delete the URL was unreasonable or incorrect based on the information available at the time. Over time, however, Google has adjusted its approach to act with caution around deletions regarding Lexbase. Many less serious crimes are listed on Lexbase and it is reasonable to remove such URLs for individuals who have no role in public life.
Google's practices are regularly updated and improved based on feedback from data 
subjects and data protection authorities.
What this means in the future for Lexbase-related URL deletion requests is that Google will ask data subjects for screenshots of the entire content behind the paywall. Google will then take this evidence into account when evaluating individual cases and, in line with EDPB guidelines, delete information where there is no public interest, for example for minor or long-standing crimes.
Finally, as regards Google's compliance with Article 14(1)(c) and 14(2)(b) of the 
GDPR, Google relies on its legitimate interests in reproducing personal data made 
publicly available by websites that allow indexing by search engines, and third parties' legitimate interests in accessing these data, in accordance with Article 6(1)(f) of the GDPR. Google informs users of this in the Google Privacy Policy:
"In certain circumstances, Google also collects information about you from publicly available sources. For example, if your name appears in the local newspaper, Google's search engine may index the article and display it to others if they search for your name." (in the section entitled "Information Google collects")

"We process your data for our legitimate interests and for the legitimate interests of third parties, while applying appropriate security measures to protect your privacy." (in the section entitled "Compliance and cooperation with public authorities")

Users can click on "third parties" to see the following example of how Google may process personal data on the basis of Article 6(1)(f) of the GDPR: "We may also process your data if someone searches for your name and we display search results for websites that contain publicly available information about you."
In reassessing this case, Google considers that it could have expressed its decision more clearly in its communications with the complainant. Google will review these communication practices. Google has also updated its internal instructions and communicated these to all staff handling removal requests. Google regularly trains its staff and these updates will be part of this training going forward.
Grounds for the decision
Applicable provisions etc.
General Data Protection Regulation and complementary national legislation
Article 12(2) of the GDPR requires the controller to facilitate the exercise of the data subject's rights in accordance with, inter alia, Articles 17 and 21.

According to Article 17(1)(c) of the GDPR, a data subject has the right to have 
personal data relating to him or her erased if he or she objects to the processing 
under Article 21(1) and there are no overriding legitimate grounds for the controller's processing. According to Article 17(3)(a), this shall not apply to the extent that processing is necessary for the exercise of the right to freedom of expression and information.
According to Article 21(1) of the GDPR, the data subject shall have the right to object at any time, on grounds relating to his or her particular situation, to the processing of personal data concerning him or her which is based on Article 6(1)(f) of the GDPR, including profiling based on these provisions. The controller may no longer process personal data unless it can demonstrate compelling legitimate grounds for doing so which override the interests, rights and freedoms of the data subject or for the establishment, exercise or defence of legal claims.
Article 85 of the GDPR requires Member States to reconcile, by law, the right to the protection of personal data under the Regulation with the freedom of expression and information, including processing for journalistic purposes. In Sweden, this has been done through Chapter 1, Section 7 of the Data Protection Act. The second paragraph of that provision provides, inter alia, for exemptions from the provisions of the General Data Protection Regulation in the case of processing of personal data for journalistic purposes.
Court of Justice of the European Union
The European Court of Justice has ruled in GC and others that a search engine 
provider must, within the limits of its responsibility, competence and ability, ensure that the processing of personal data in the search engine business complies with the requirements of the data protection rules.
The specific nature of the activity does not exempt it from the prohibitions and 
restrictions on the processing of sensitive and criminal data. However, these shall be applied within the scope of the responsibilities, powers and possibilities of the search engine provider as controller of the processing carried out in the course of its activities. 
The prohibitions and restrictions therefore apply to the reference to the web pages 
where the data appear as a result of the display of search results, in the context of the examination which the search engine provider must carry out at the request of the data subject.
The CJEU has also clarified in GC and others how the right to erasure applies in 
relation to the GDPR and criminal data, stating that the interest of internet users in accessing a website containing such data through searches related to a data subject's name must be taken into account and that this right to freedom of information is protected by Article 11 of the EU Charter of Fundamental Rights. The Court stated that while the rights of the data subject under Articles 7 and 8 of the Charter generally outweigh the interest of Internet users, the balance may be affected in particular cases by the nature of the information and its sensitivity to the privacy of the data subject and the public interest in the information,which may vary depending, inter alia, on the role played by the data subject in public life (paragraph 66). In addition, the processing of criminal data may constitute a particularly serious interference with the data subject's fundamental right to respect for private life and protection of personal data, given the sensitivity of such data (paragraph 67). It must therefore be examined, having regard to the circumstances and the seriousness of the interference, whether, in view of the important reasons of public interest which referred to in Article 9(2)(g) and subject to the conditions laid down therein, it is strictly necessary that search results linking to a website containing criminal data be displayed in searches relating to the name of the data subject in order to protect the freedom of information of Internet users who might be interested in accessing the website through such searches, a freedom enshrined in Article 11 of the Charter (paragraph 68). 4European Data Protection Board (EDPB)
The European Data Protection Board (EDPB), in its guidelines on the application of the above-mentioned provisions of the GDPR and the rulings of the Court of Justice of the European Union on the removal of search hits, has stated, inter alia, the following.5
The right to object entails stronger safeguards for data subjects as it does not 
include any limitation of the grounds on which data subjects can request deletion 
under Article 17(1).
The GDPR provides a presumption in favour of the data subject, and it is for the controller to demonstrate "compelling legitimate grounds for the processing" (Article 21(1)). This means that a search engine provider that receives a removal request based on the specific situation of the data subject must erase the personal data under Article 17(1)(c), unless it can demonstrate "overriding legitimate grounds" for displaying the specific search result, which, in the wording of Article 21(1), means "compelling legitimate grounds [...] which override the interests, rights and freedoms of the data subject". If the search engine provider can demonstrate such overriding legitimate grounds, including the existence of an exception under Article 17(3), it may justify not removing a search hit. If, however, the search engine provider cannot demonstrate compelling legitimate grounds, the data subject is entitled to erasure under Article 17(1)(c). Requests for removal thus involve a balancing of the grounds relating to the specific situation of the data subject against the compelling legitimate grounds of the search engine provider. In making this assessment, it may be relevant to consider the balance between the protection of privacy and the interest of internet users in accessing the information, as established by the Court of Justice of the European Union, as well as the balance struck by the European Court of Human Rights in cases concerning freedom of the press.
The criteria for removal developed by the Article 29 Working Party in the 
Guidelines on the implementation of the judgment of the Court of Justice in Google 
Spain and Google,6 can therefore still be used to assess a request for removal 
based on the right to object (Article 17(1)(c)).
In this respect, a request for removal will be based on the specific situation of the data subject (e.g. that a search result puts the data subject at a disadvantage when looking for a job, or harms the data subject's reputation or privacy), and this situation is taken into account when balancing personal rights against the right to information, in addition to the classical criteria for handling removal requests, such as:
• the data subject is not a public figure,
• the information in question does not relate to the person's professional life but affects his or her private life,
• the information constitutes hate speech, slander, libel or similar offences in the area of expression committed against him or her, as established by a court decision,
• the information appears to be verified fact but is factually inaccurate, and
• the data relate to a relatively minor offence committed a long time ago and cause harm to the data subject.
However, where the search engine provider presents no evidence of compelling legitimate grounds for refusing the request, no assessment against these criteria is required.
One circumstance that may militate against removal is if the person concerned 
plays a role in public life or is a public figure.
A role in public life normally refers, for example, to politicians, senior public 
officials, business people and practitioners of regulated professions (e.g. lawyers 
and doctors). It can be argued that the public should be able to search for 
information relevant to their public roles and activities. The rule of thumb is 
whether having access to the specific information - through a search on the name 
of the data subject - would protect the public from improper public or professional 
conduct. A public figure is someone who, by virtue of their duties or obligations, 
has a certain degree of media exposure. It may be someone who holds public 
office, uses public resources or plays a role in public life in politics, economics, the arts, the social sphere, sport or any other field. There may be information about public figures that is purely private and should not appear in search results, such as information about their health or family members. However, as a rule of thumb, if the person concerned is a public figure, and the information in question is not purely private, this militates against deletion. In determining the balance, the case law of the European Court of Human Rights (ECtHR) is particularly relevant.
As the European Court of Justice explains in its judgment in GC and Others, C-136/17, Article 17(3)(a) of the GDPR 'expresses that the right to the protection of personal data is not an absolute right, but [...] must be understood in relation to its function in society and balanced against other fundamental rights in accordance with the principle of proportionality'. It therefore 'explicitly requires a balancing of the fundamental rights to respect for private life and protection of personal data, as enshrined in Articles 7 and 8 of the Charter, with the fundamental right to freedom of information guaranteed by Article 11 of the [EU Charter of Fundamental Rights]'.
The CJEU further states that "where a search engine provider receives a request 
for removal relating to a link to a web page containing [sensitive personal data or 
data relating to infringements] [...], it must examine, on the basis of all the 
relevant circumstances of the case and having regard to the seriousness of the 
interference with the data subject's fundamental right to respect for private life 
and protection of personal data, as laid down in Articles 7 and 8 of the [EU 
Charter of Fundamental Rights], whether, having regard to the important 
[reasons] in the public interest [...] it is strictly necessary to include the link to the website in question among the search results displayed following a search for the name of the data subject in order to protect the freedom of information of internet users who may be interested in accessing that website through such a search, a freedom enshrined in Article 11 of the Charter".
In conclusion, depending on the circumstances of the case, search engine 
providers may refuse a removal request if they can show that it is strictly 
necessary to include the search hit in the search results in order to protect the 
freedom of information of internet users.
The Article 29 Working Party has stated that in assessing whether a search hit should be removed, depending on the context, it may be relevant to consider whether the information on a website was published for journalistic purposes and that the fact that the information was published by a journalist whose profession is to inform the public is a factor to be taken into account in the balancing exercise. 8
With regard to the meaning of the concept of journalistic purposes, the EDPB has noted that the Court of Justice of the European Union, when examining the right to removal from search engines, distinguishes between the legitimacy that the publisher of a website may have in disseminating information and the legitimacy of the search engine provider. In doing so, the CJEU has recognised that a publisher's activities may be exclusively for journalistic purposes, in which case the publisher would benefit from the exemptions that Member States may establish in such situations on the basis of Article 9 of the Directive (now Article 85(2) of the GDPR). In this respect, the judgment of the European Court of Human Rights of 28 June 2018 in M.L. and W.W. v. Germany suggests that the balancing of interests in question may lead to different results depending on the type of request, distinguishing between, on the one hand, a request for erasure directed against the original publisher, whose activities are central to the protection of freedom of expression, and, on the other hand, a request directed against a search engine provider, whose main interest is not to publish the original information about the data subject, but to enable the identification of any available information about the person and thus to establish a profile of him or her.9

European Court of Human Rights
The case-law of the European Court of Human Rights shows that when data subjects 
request a ban on the publication on the internet of old media reports on criminal 
proceedings against them, a fair balance must be struck between the right to privacy of the persons concerned and, inter alia, the public's right to freedom of information. In seeking this fair balance, account should be taken of the important role of the press in a democratic society, which includes reporting on and commenting on court proceedings. In addition to the role of the media in conveying such information and opinions, there is also the right of the public to receive the information. In this context, the European Court of Human Rights has recognised that the public has an interest not only in being informed about a current event, but also in being able to investigate past events.
However, the scope of the public interest in criminal proceedings may vary and evolve over time, taking into account, inter alia, the circumstances of the case.
IMY has provided guidance in a legal opinion11 on the various factors relevant to the balancing of interests when a data subject requests the removal of a search hit, in particular where the linked page concerns a publication in a news medium. Among other things, it is considered there that if a search result leads to a publication covered both by the responsible-publisher system under the fundamental laws on freedom of expression and by the media ethics system, this should, as a starting point, be given independent weight in the balancing of interests as a factor against removal, although the balancing should be made on the basis of an overall assessment of all the relevant circumstances of the case.12 It follows that if an activity is covered only by the responsible-publisher system under the fundamental laws on freedom of expression, this cannot in itself be given independent weight, since that system does not, on its own, have the same protective effects for individuals as the media ethics system, and may also cover activities that have no journalistic purpose (since such a purpose is not a precondition for obtaining voluntary constitutional protection through a so-called certificate of publication). Such a factor must therefore be assessed on a case-by-case basis.

IMY has stated in a legal opinion on the concept of "journalistic purposes"13 that the concept should be given a broad interpretation. It emphasises that journalistic activities are those aimed at disseminating information, opinions or ideas to the public, regardless of the medium through which this is done. The concept thus has a broader meaning than in everyday language and covers not only professional journalists and traditional mass media, but all persons engaged in activities aimed at disseminating information, opinions or ideas to the public. However, IMY stresses that the concept cannot be given such a broad meaning as to include all information made available on the internet that contains personal data. In this context, IMY points out that, according to the Court of Justice of the European Union, the concept does not cover the processing carried out by a search engine provider through the provision of the search engine, and that the Swedish constitutional legislative work has established that the same applies to a purely public search service relating to criminal convictions.14
Furthermore, the opinion clarifies how Chapter 1, Section 7 of the Data Protection Act is to be applied where there are other purposes parallel to the journalistic purpose, and what significance it has that the word "exclusively" is not included in Chapter 1, Section 7, second paragraph of the Data Protection Act or in Article 85 of the GDPR. Among other things, IMY considers that, based on the wording of the relevant provisions, there is no requirement that processing be carried out exclusively for journalistic purposes for the exemption to apply.15 However, IMY considers that the journalistic purpose must be the main purpose of the processing for the exemption to apply.16

Privacy Authority's assessment
Has Google handled the complainant's requests correctly?
IMY is required by Article 57(1)(f) of the GDPR to deal with complaints about the 
incorrect processing of personal data and, where appropriate, to investigate the 
subject matter of the complaint.

It appears from the information in the case that Google has taken steps to respect the complainant's rights by removing the relevant search hit from searches related to the complainant's name. In the light of the above, IMY does not consider it necessary to investigate the complaint further in this respect.
However, IMY finds grounds to investigate the complaint insofar as it concerns the reasoning used by Google in its decision of 16 November 2021 refusing the complainant's removal request concerning the Lexbase directory service, namely that Google did not have access to the material behind the paywall.
The investigation shows that the open part of the directory service shows the 
complainant's full name, age and location and that the complainant is "present" in the database.
It also states that "[the directory service] is a web service offering public information from Swedish courts and other authorities. Here you can find judgments in criminal and/or civil cases concerning individuals".
Google justified its refusal decision on the grounds that "Google does not have access to the content of the [directory service]" and that "[u]nder the data available to [Google] ... [Google] cannot conclude that the information is inaccurate or out of date".
However, IMY notes that the complainant, through his request, objected to the processing on the basis of Article 17(1)(c) and Article 21 of the GDPR with reference to his specific situation. Under Article 21(1), in such circumstances, the burden is on the controller (i.e. Google), not on the data subject (i.e. the complainant), to show that there are compelling legitimate grounds for refusing the request.17 Such a ground could be, for example, that there is a legitimate public interest in being able to access a particular publication through searches related to a particular person's name on a search engine such as Google's. However, such a public interest normally requires that the person requesting removal is a public figure or has a role in public life, and it had not been shown that the complainant was such a person, either at the time of the refusal decision or subsequently. The fact that the directory service has a certificate of publication and that the documents behind the paywall are public does not, either individually or taken together, mean that the publication is in the public interest or can be regarded as having been made for journalistic purposes.18

Thus, in the absence of compelling grounds for refusing the request, Google should, as a general rule, have granted it on the basis of the evidence available at the time, which it did not do. Furthermore, the reasoning given by Google to the complainant in the decision - that Google did not have access to the content behind the directory service's paywall - gave the erroneous impression that the complainant had to submit such evidence in order to have his request granted. Since the content is behind a paywall, anyone wishing to access it must pay the owner of the directory service (currently €98). In practice, the consequence of Google's conduct was thus that the data subject (the complainant) had to pay a third party (Lexbase) in order to exercise his right to erasure vis-à-vis the controller (Google). Google has therefore not facilitated the exercise of the complainant's rights under the GDPR and has processed the complainant's personal data in breach of Article 12(2).

Choice of intervention
Articles 58(2)(i) and 83(2) of the GDPR provide that IMY has the power to impose 
administrative fines in accordance with Article 83. Depending on the circumstances of the case, administrative fines shall be imposed in addition to or instead of the other measures referred to in Article 58(2), such as injunctions and prohibitions. 
Furthermore, Article 83(2) sets out the factors to be taken into account when deciding whether to impose administrative fines and when determining the amount of the fine. 
In the case of a minor infringement, IMY may, as indicated in recital 148, instead of imposing a penalty, issue a reprimand under Article 58(2)(b). Account will be taken of aggravating and mitigating circumstances of the case, such as the nature, gravity and duration of the infringement and any relevant previous infringements.
IMY notes the following relevant facts. The breach has affected an individual and 
involved difficult trade-offs between competing fundamental rights and freedoms.
During the procedure, Google has respected the complainant's rights and has taken 
several remedial measures and improved its information to data subjects.
IMY concludes that, on the basis of an overall assessment of the circumstances, in 
particular taking into account the difficult balancing of competing fundamental rights and freedoms, there are grounds to refrain from imposing a fine on Google for the infringement found.
In view of the above, IMY also considers that this is a minor infringement within the meaning of recital 148, and that Google LLC shall therefore be reprimanded under Article 58(2)(b) of the GDPR for the infringement found.
This decision has been taken by the special decision-maker Olle Pettersson, lawyer, on the recommendation of Martin Wetzler, lawyer.
Olle Pettersson, 2022-07-26 (This is an electronic signature)
Copy to
The complainant

How to appeal
If you wish to appeal against this decision, write to the Swedish Authority for Privacy Protection (IMY). In your letter, state which decision you are appealing against and the change you are requesting. The appeal must have been received by IMY no later than three weeks from the date on which you received the decision. If the appeal has been received in time, IMY will forward it to the Administrative Court in Stockholm for review.

You can email the appeal to IMY, provided that it does not contain any personal data that is sensitive to privacy or that may be covered by confidentiality. IMY's contact details are given on the first page of the decision.