Personvernnemnda (Norway) - PVN-2021-17 (20/02389)


Revision as of 22:57, 24 January 2022

PVN - PVN-2021-17 Sletting av søketreff i søkemotoren Google (deletion of search results in the Google search engine)

Court: PVN (Norway)
Jurisdiction: Norway
Relevant Law: Article 6(1)(f) GDPR; Article 10 GDPR; Article 17(1)(c) GDPR
Decided: 07.12.2021
Published: 07.12.2021
National Case Number/Name: PVN-2021-17 Sletting av søketreff i søkemotoren Google
Appeal from: Datatilsynet (case no. 20/02389-5)
Original Language(s): Norwegian
Original Source: personvernnemnda.no (in Norwegian)

In this “right to be forgotten” case, the Norwegian Privacy Appeals Board ordered Google to delete search results concerning a data subject’s criminal past.

English Summary

Facts

The data subject had been convicted of accessing, possessing and publishing images and videos containing sexual assaults on children. The criminal offences had taken place over a period of 20 years. During this time, the data subject worked as a psychiatrist and as an expert witness in cases relating to family and child welfare.

After having served their prison sentence, the data subject contacted Google and asked to have five search results removed from the search engine. The data subject argued that the search results had a severe negative impact on them and their children. Google denied the request, stating that the search results were a matter of public interest.

The data subject brought the case before Datatilsynet (the Norwegian Data Protection Authority). Datatilsynet concluded that Google had to delete the search results. Google filed an appeal, and the case was brought before the Privacy Appeals Board (PVN). The PVN had to consider only two of the five search results, as the rest had been deleted before the start of the proceedings.

Holding

The PVN first remarked that Google, as a controller, had a legal basis for its processing in article 6(1)(f) GDPR. However, under article 17(1)(c) cf. article 21(1) GDPR, upon receiving an erasure request from the data subject, the search engine company had to delete the search results unless there were overriding legitimate grounds for continuing the processing. The PVN highlighted that the controller bore the burden of proving such legitimate grounds.

The PVN then held that when a public figure who works closely with children is convicted of sexual offences against children, there is an obvious legitimate interest in making this information accessible to a wide audience. That interest diminishes over time; however, the PVN considered the time elapsed in this case (almost four years) to be rather short.

On the other hand, the PVN noted that the search results contained information on criminal convictions, which warrants a higher level of protection, cf. article 10 GDPR. In addition, the information in the search results was not accurate and up to date, cf. article 5(1)(d) GDPR. The PVN also highlighted that the search results led to private web pages with articles containing general criticism of the Norwegian child protective services; the public interest in accessing this information through a name search of the data subject was limited.

Lastly, the PVN held that the most compelling reason for deleting the search results was the harmful effect they had on the data subject and their family: the data subject was struggling to find a new job, and their children were being excluded from social activities.

The PVN thus concluded that Google had to delete the search results regarding the data subject’s criminal past.

As a side note, the PVN criticised Datatilsynet for waiting over a year after receiving the complaint before asking Google to delete the search results. The PVN highlighted that long-lasting proceedings could render the right to be forgotten illusory.

Comment


Further Resources


English Machine Translation of the Decision

The decision below is a machine translation of the Norwegian original. Please refer to the Norwegian original for more details.

Decision of the Privacy Board 7 December 2021 (Mari Bø Haugstad, Bjørnar Borvik, Line Coll, Hans Marius Graasvold, Ellen Økland Blinkenberg, Morten Goodwin)
The case concerns a complaint from Google LLC (Google) against the Data Inspectorate's decision of 14 June 2021 on the deletion of search results in the Google search engine.
Background to the case
A, to whom the search results apply, was sentenced by the district court in April 2018 to one year and ten months in prison for, among other things, acquiring, storing and making available pictures, films and magazines depicting sexual abuse of children. The material comprised approximately 200,000 data files, of which approximately 50,000 were unique photos and 3,000 were unique videos. The criminal acts had been going on for a period of about 20 years. As an aggravating factor, the court emphasized the convicted person's professional background as a psychiatrist and expert in child welfare cases and parental disputes.
The verdict was originally public, but following an inquiry from A, the district court decided in September 2018 that the verdict could only be reproduced publicly in anonymised form, cf. the Courts of Justice Act § 130 first paragraph letter a and ECHR article 8. Particular emphasis was placed on consideration for A's children, who found the spread of the verdict very stressful.
In November 2018, the Court of Appeal reduced the sentence to one year and eight months. A's authorization as a psychiatrist has been revoked.
A contacted Google on October 14, 2019 and requested the deletion of the following search results for his name in the search engine:
a) […]
b) […]
c) […]
d) […]
e) […]
Google rejected the deletion request on October 18, 2019.
A brought Google's refusal to the Data Inspectorate on 19 October 2019 and asked for assistance in deleting search results a) to e) by searching for his name in Google.
A requested a response from the Norwegian Data Protection Authority on 2 February 2020 and again on 26 August 2020.
The Data Inspectorate notified Google on 13 October 2020 that the Authority would make a decision ordering Google to remove the following search results for A's name in the search engine:
a) […]
b) […]
c) […]
When the Authority dealt with the case, search results d) and e) no longer appeared in Google when searching for A's name. The Norwegian Data Protection Authority therefore limited its processing of the case to search results a) to c).
At the time of the Data Inspectorate's processing of the case, search result a) led to an article with information about the police seizure of abuse material in A's home, stating that he was convicted of these matters and that he had previously worked as an expert in child welfare cases.
Search result b) led to a post published on a private website. The post mentions the conviction of A and his background as an expert. A's private family relationships are also mentioned.
Search result c) led to an article on […].no about a documentary about child welfare in Norway.
In a letter to the Norwegian Data Protection Authority on 2 November 2020, Google maintained its decision not to remove the search results leading to these URLs, giving among other things the following reasons:
«The URLs in question relate to the data subject's crime of possessing child pornography. The criminal activities went on for a 20 years long period and he was only recently convicted for them. The criminal activities involved possession of material where children were subject to aggravated sexual abuse by adults, forced sexual intercourse between children as well as children forced to perform sexual acts on themselves. This all went on at the same time while the data subject was practicing as a child psychiatrist and a consultant in a Child Expert Commision… ».
The Norwegian Data Protection Authority ordered Google on 14 June 2021 to remove search results a) to c) for A's name. The deadline for implementing the order was set at 14 July 2021.
Google filed a timely appeal against the Data Inspectorate's decision on 5 July 2021. Google also stated that it would temporarily block search results a) and b) pending a final decision in the case. Search result c) no longer appeared at this time when searching for A's name.
The Norwegian Data Protection Authority assessed the complaint, but found no reason to change its decision to remove search results a) and b). The Norwegian Data Protection Authority forwarded the case to the Privacy Board on 12 October 2021. The parties were informed of the case in a letter from the Board on 13 October 2021, and were given the opportunity to comment. A submitted comments via email on October 18, 2021. Google submitted comments in a letter on October 26, 2021.
The case was discussed at the tribunal's meeting on 7 December 2021. The Privacy Board had the following composition: Mari Bø Haugstad (chair), Bjørnar Borvik (deputy chair), Line Coll, Hans Marius Graasvold, Ellen Økland Blinkenberg and Morten Goodwin. Secretariat leader Anette Klem Funderud was also present.
The Data Inspectorate's assessment in outline
The Norwegian Data Protection Authority initially states that it is the competent supervisory authority to process the case, and that section 3 of the Personal Data Act does not apply to cases concerning the deletion of search results.
The Norwegian Data Protection Authority then refers to Article 6(1)(f) of the Privacy Regulation as a relevant legal basis for the processing, as well as Article 21(1) on the data subject's right to object, and the right to be forgotten in Article 17.
The Data Inspectorate proceeds from the premise that consideration for the data subject generally weighs heaviest when the person in question actively requests that the search result not appear when searching for the data subject's name, cf. the European Court of Justice in Google Spain and Google, C-131/12 of 13 May 2014.
In its assessment of whether there are compelling legitimate grounds for the processing that take precedence over the data subject's interests, rights and freedoms, the Data Inspectorate relies on the guidelines on the deletion of search results prepared by the Article 29 Working Party (the precursor to the European Data Protection Board under the Privacy Regulation).
The Authority notes that the search results contain information whose processing is in principle prohibited under Article 10, and states that there is a presumption that the data subject shall have the right to have the search results deleted upon objecting to the processing, unless special considerations apply. This clear starting point applies in particular where the search results concern personal data covered by Article 10, cf. the Privacy Board's case PVN-2020-08. That the information the search results lead to is covered by Article 10 indicates, in the Data Inspectorate's assessment, that the search results should be deleted, cf. GC and Others v CNIL, section 67 (C-136/17 of 24 September 2019).
The Norwegian Data Protection Authority agrees with Google that the fact that these are serious offences committed in recent times nuances the starting point that search results concerning personal information about criminal convictions and offences should in principle be deleted. The Authority indicates that a specific assessment must be made in each individual case.
The Data Inspectorate assumes that the case is in the public interest, but points out that newspapers such as VG.no and Dagbladet.no, after a press ethics assessment, nevertheless decided to cover the criminal case without revealing A's identity. In the Authority's assessment, this suggests that the search results should be deleted.
The newspapers mention that the convicted person worked as a psychiatrist in child welfare cases, and the Authority assumes that this is information of public interest. The fact that he has kept images of child abuse can weaken confidence in his ability to work as a psychiatrist, especially in cases involving children. This basically speaks against deletion. Nevertheless, the Data Inspectorate points out that A no longer has a licence and can no longer work as a psychiatrist. He also cannot work with children in positions where a certificate of good conduct is required. The Authority therefore believes that the information in the search results is less relevant for future employers or clients, which speaks in favour of deletion, cf. PVN-2019-02.
With regard to whether the data subject has had a role in the public eye, the Authority assumes that the criminal acts for which he was convicted, and which are mentioned in the search results, are acts he committed as a private person and not in the exercise of his professional role. This speaks in favour of deleting the search results, especially as the person in question no longer has the authorization to continue as a psychiatrist and expert.
The Data Inspectorate further emphasizes the negative consequences of the search results for A and his family, cf. Google Spain and Google and PVN-2020-08. He has now served his sentence and is trying to move on. The Authority points out that search engines should not function as a pillory or a public criminal record. A experiences that the outside world wants nothing to do with him, and he has problems finding a job. The children experience negative consequences in the local community, and one of the search results includes a picture of A in which he appears to be pictured with his children. Search result b) also contains personal information about the data subject's children. The Data Inspectorate points out that consideration for the children's best interests speaks in favour of deletion.
After a specific balancing of the data subject's privacy interests against the public interest, the Data Inspectorate concludes that there are no compelling legitimate grounds indicating that the complainant's privacy interests must give way to the public interest.
Google's views on the matter in outline
There is no legal basis for demanding the removal of the relevant search results. The search results constitute processing strictly necessary for exercising the right to freedom of expression and information, cf. Article 17 no. 3 letter a of the Privacy Regulation. There are also compelling legitimate grounds for the processing, cf. Article 21 no. 1; the balancing of interests (including the freedom of information of the public) indicates that the search results should still be available.
The Norwegian Data Protection Authority has made an overly narrow assessment of the "right to be forgotten" criteria, which are expressed in the Article 29 Working Party's guidelines.
The search results concern serious crime of public interest, committed by the data subject
The search results concern serious crime of public interest, and access to the information is strictly necessary for the general public. The Norwegian judiciary is based on openness. Both court hearings and criminal convictions are, as a general rule, public. The public has the right to receive information about public, non-confidential criminal convictions. Most criminal convictions, including the conviction in question, are handed down with "no restrictions on access to public reproduction". The court has the opportunity to make exceptions for reasons of privacy. That was not done in this case. It is then paradoxical for public coverage to be limited by the Norwegian Data Protection Authority.
The information on the websites is correct, relevant and not superfluous considering the severity of the acts in question, as well as the fact that it is of recent date. The criminal acts lasted over a 20-year period and the data subject was recently convicted. The acts for which he was convicted were committed while the data subject practised as a psychiatrist for children and as a high-profile member of the Child Expert Commission.
The information concerns the data subject's professional life and role in public
The criminal acts led to the data subject losing both his job and his authorization. The general public has a clear interest in having access to information so strongly linked to the professional practice of a high-profile person in his field. It is incorrect when the Data Inspectorate assumes that the acts were not committed in the exercise of the professional role; both the district court and the court of appeal gave weight to the defendant's professional practice as an aggravating factor.
In view of the data subject's position in his field, he must be regarded as a public person, cf. PVN-2017-17. In PVN-2020-08, the tribunal concluded that there was a public interest in articles related to an owner of a tour operator company who was convicted of fraud and threats against journalists. The information was of public interest even after the company went bankrupt, which was over five and a half years before the tribunal's decision. This suggests that the search results in this case should not be deleted.
Other media's decision not to identify the data subject is not relevant
The Norwegian Data Protection Authority has pointed out that newspapers have written about the case without naming the data subject. The fact that other media have chosen not to name the data subject is irrelevant in the balancing of interests.
The Norwegian Data Protection Authority has also pointed out that the Privacy Board has previously emphasized that personal data had been published by a recognized news source as an argument against removal, and that the opposite should apply when the data has not been published by such a source. The Data Inspectorate thereby in reality acts as a control body for the press and goes beyond its mandate as an administrative body. It is not the Data Inspectorate's task to assess the importance or degree of public interest in factually correct information published by someone exercising their freedom of expression; nor is it Google's task. It is important to distinguish between the balancing of interests under the right to be forgotten and assessments of the frank use of freedom of expression. The fact that data published in a journalistic context is an argument against removal does not mean that information not published in a journalistic context is an argument for removal.
The interests of the data subject do not outweigh the interests of the general public in having access to information
In arguing for deleting the search results, the Data Inspectorate has pointed out that the data subject has lost his authorization and can no longer work as a psychiatrist or in any other job that requires a police certificate, and that he has problems finding work.
The relevant information may be of public interest in situations other than those where the data subject must present a police certificate. In Norwegian law, the use of police certificates is strictly regulated: the employer must have a legal basis for requiring one, and absent such a basis the employer cannot ask about any criminal history. The information in the search results is relevant to A's future professional life, since there is a clear public interest in accessing the information for anyone who may want to engage him in a future role in which children may be involved. Google disagrees with the Norwegian Data Protection Authority that the possibility of obtaining a police certificate reduces the public's interest in easily accessible information about serious crime committed by the data subject.
The data subject has not substantiated his claim that it is this information that makes it difficult for him to get a job. In any case, given that the acts were not a one-off event, the public interest in the information must outweigh the negative consequences for the data subject.
The importance of freedom of expression and information
The Personal Data Act shall always be interpreted and applied in accordance with the freedom of expression and information protected in Article 100 of the Constitution, Article 10 of the ECHR and Article 19 of the UN Covenant on Civil and Political Rights. This applies not only to decisions to remove search results, but also to the assessment of the negative consequences such removal has for the public's rights in practice.
It is true that deleting search results does not affect the availability of the information on the website as such. But reducing the audience's ability to find the relevant website constitutes an intensive interference with the freedom of information and expression.
It is not up to the authorities to decide the specific forms in which legitimate debate is exercised. In this case, keeping the information available is strictly necessary for the sake of the public's freedom of expression and information.
Summary
The relevant search results lead to factually correct and relevant information about serious crime incompatible with the data subject's work, committed intentionally over a long period of time. The information is relatively recent and not outdated. It was published for journalistic purposes and is based on court decisions that are by law public. On this basis, Google considers that there are compelling legitimate grounds related to freedom of expression and information that outweigh the interests of the data subject.
A's views on the case in outline
He is constantly confronted with information others have found about him online, and he has difficulty finding a job as his name is exposed a lot online. This also applies to positions of an administrative nature without client contact. Those who seek him out online, including artisans, will have nothing to do with him.
The situation also affects his children, for whom he has sole care. One child was refused membership of the local football club after attending twice, because the parents of the other children recognized him. The reason given by the management, who regretted it all, was that the team's coaches would withdraw if his child continued, and that the other parents had expressed discomfort at being near him. Their attitude was mainly based on the information they had read online. This underscores the importance of having the search results removed.
The district court decided on 30 September 2018 that the judgment could only be reproduced publicly in anonymised form.
He has reported to the police websites that illegally use pictures of him and his children. However, the police have dropped the cases, as the owners of the servers have refused to state who is responsible for the websites where the illegal material is located.
He is critical of Google's presentation of the facts of the case, which he believes is both deficient and partly incorrect. Google mentions how many photos the police found, not how many photos he was convicted of. The sharing of material was not an active action, as Google claims, but occurred because it is a feature of the file-sharing program that cannot be turned off. Google describes circumstances that were withdrawn from the indictment and presents the case as larger than it was. The same thing happens online, and it is impossible to counter such claims.
Although he has committed a criminal act, he also has human rights. Now his entire life is exposed in detail online, including his sex life and sexual orientation.
The Privacy Board's assessment
The Privacy Board will decide whether Google should be ordered to delete the two search results a) and b) mentioned in the introduction to the decision. Both search results are temporarily blocked by Google pending a final decision in the case, which means that they do not appear when searching for A's name in the search engine. The tribunal has found it appropriate to consider both search results that the Data Inspectorate decided on, even though the website that search result a) led to no longer existed at the time of the tribunal's decision.
The legal basis for the assessment
When a search engine collects personal data and presents search results to the public, this represents a processing of personal data regulated by the Privacy Regulation, cf. Article 4 no. 2. The search engine provider is the data controller for the processing of personal data that takes place in that connection, cf. Article 4 no. 7. This has also been established by the European Court of Justice, both in judgment C-131/12 of 13 May 2014 (Google Spain and Google) and in judgment C-136/17 of 24 September 2019 (GC and Others v CNIL). This is also in line with established administrative practice from the Norwegian Data Protection Authority and the Privacy Board, see for example PVN-2019-02, PVN-2020-08, PVN-2020-14 and PVN-2021-05.
Although the above-mentioned judgments of the Court of Justice concern the interpretation of the EU Data Protection Directive (Directive 95/46), they are also relevant for the interpretation of the GDPR. This is because the regulation's provisions on what constitutes processing of personal data, and on who is to be regarded as controller, continue the corresponding provisions of the directive.
Like the Norwegian Data Protection Authority, the tribunal assumes that it is Article 6(1)(f) GDPR that provides Google with a legal basis for collecting personal data and presenting search results to the public upon a name search. This case mainly concerns personal data about criminal convictions and offences, information that is covered by Article 10 GDPR. The tribunal refers to GC & Others v CNIL, which in the Board's assessment provides the necessary clarification of the legal bases where the information collected either belongs to a special category of personal data under Article 9 of the regulation or constitutes personal data falling under Article 10, cf. PVN-2020-08. This means that the question must be decided on the basis of a balancing of interests (a necessity assessment) in which the interest in privacy must be weighed against the interest in freedom of information. Privacy is enshrined in Article 8 ECHR and Articles 7 and 8 of the EU Charter, while freedom of expression (including freedom of information) is enshrined in Article 10 ECHR and Article 11 of the EU Charter.
As A has objected to the processing of personal data, cf. Article 21(1) GDPR, it follows from Article 17(1)(c) that Google has a duty to erase the data (i.e. remove the search result) unless there are compelling legitimate grounds for the processing which override the data subject's interests, rights and freedoms. It follows directly from the wording of Article 21(1) that it is the controller who must demonstrate that such compelling legitimate grounds exist, whereas under Article 14 of the directive it was the person demanding erasure who had the burden of proof. In other words, a balance must be struck between the data subject's interest in having the search result deleted and the public's interest in gaining access to this information by conducting a name search in a search engine. In this connection, the tribunal points out that deleting search results from a search engine is not a question of removing the information from the Internet as such. All information will still be available on the original websites, where it will also still appear upon a name search. Furthermore, the information will remain available through search engines, but then keywords other than the person's name must be used, such as the subject matter of the case.
In Google Spain and Google, the Court of Justice points out that the search engine provider's processing of personal data differs from, and represents something in addition to, the processing done by the publisher of the web pages (paragraph 35). This means that the outcome of the balancing of interests on which the directive provides guidance may be different for the search engine provider than for the publisher of the web pages. This may partly be because the interests justifying the processing of personal data may differ, and partly because the consequences of the processing will differ. The Court of Justice held in Google Spain and Google that a search engine provider may be obliged to delete a search result at the data subject's request, even if the publisher of the website has a legal basis for publishing the information on its own website (paragraph 88). This position is continued in GC & Others v CNIL (paragraph 52).
Regarding the specific balancing of interests for the search engine provider's processing of personal data, the Court of Justice stated in Google Spain and Google, paragraph 81 (rendered here from the official Danish translation):
"Although the data subject's rights protected by those articles also, as a general rule, override the interests of internet users, that balance may, however, in specific cases depend on the nature of the information in question and its sensitivity for the data subject's private life, as well as on the public's interest in having that information, which may vary, in particular, according to the role played by that person in public life."
The European Data Protection Board has published guidelines concerning the right to be forgotten in search engine cases, cf. "Guidelines 5/2019 on the criteria of the Right to be Forgotten in the search engines cases under the GDPR (part 1) – version adopted after public consultation" of 7 July 2020. The guidelines were published in the wake of Google Spain and Google and GC & Others v CNIL, replacing the guidance previously developed by the Article 29 Working Party. However, paragraph 31 of the guidelines points out that the criteria developed by the Article 29 Working Party remain relevant when deciding on requests for deletion of search results. The tribunal assumes that the guidelines express the administrative practice of the supervisory authorities in the EU and the EEA, and in this way provide some guidance for the balancing of interests to be carried out. The tribunal refers to paragraphs 26 to 33, and 44 to 52, of the guidelines.
The Article 29 Working Party itself emphasises that "no single criterion is, in itself, determinative", and furthermore that "the list of criteria is non-exhaustive and will evolve over time", cf. also the Court of Justice's assessment in GC & Others v CNIL, where the court in paragraph 66 confines itself to stating that a balancing of interests must be carried out, and then gives a fairly loose instruction as to which elements are to be included in that balancing:
"In any event, when a request for de-referencing is made, the operator of the search engine must, in the light of the reasons of substantial public interest referred to in Article 8(4) of Directive 95/46 or Article 9(2)(g) of Regulation 2016/679, and in compliance with the conditions laid down in those provisions, ascertain whether the inclusion of a link to the web page in question in the list of results displayed following a search on the basis of the data subject's name is necessary for internet users potentially interested in accessing that web page by means of such a search to exercise their right to freedom of information, protected by Article 11 of the Charter. While the data subject's rights protected by Articles 7 and 8 of the Charter generally override the freedom of information of internet users, that balance may, however, in specific cases depend on the nature of the information in question and its sensitivity for the data subject's private life, as well as on the public's interest in having that information, which may vary according to the role played by that person in public life (see, to that effect, judgment of 13 May 2014, Google Spain and Google, C-131/12, EU:C:2014:317, paragraph 81)."
The tribunal emphasises that the criteria do not fully replace the specific discretionary balancing of interests that must be carried out; they merely provide the national supervisory authorities with some guidance for the balancing that underlies the assessment of requests for deletion of search results.
The Court of Justice held in Google Spain and Google that, in the balancing of interests, there is a presumption ("those rights in principle override") that the data subject is entitled to have the search result deleted, unless special considerations apply. In the tribunal's assessment, this presumption must be even stronger when it comes to information on criminal convictions and offences covered by Article 10 GDPR. The Court of Justice made the same point in paragraph 67 of GC & Others v CNIL:
"In addition, where the processing relates to the special categories of data referred to in Article 8(1) and (5) of Directive 95/46 or in Articles 9(1) and 10 of Regulation 2016/679, the interference with the data subject's fundamental rights to privacy and protection of personal data, as noted in paragraph 44 of this judgment, is liable to be particularly serious because of the sensitivity of those data."
The specific balancing of interests
The tribunal then turns to the specific balancing of interests, and will first consider the circumstances that weigh against deleting the search results.
First, it is significant whether the information appearing in the search results is of public interest (including whether it is still relevant) and whether it concerns a person who plays a role in public life. Public figures and persons who play a role in public life must, depending on the circumstances, tolerate intrusions into their privacy to a greater extent than others. The reason is, among other things, the public's interest in gaining access to information about the exercise of the public role, cf. Borgarting Court of Appeal, LB-2020-18230.
Regarding who is to be regarded as a public figure or as playing a role in public life, the tribunal stated in PVN-2018-07:
"There is no clear definition of what a public figure is or who is considered to play a role in public life. The Data Protection Authority mentions as examples of public figures 'public officials with senior positions, such as ministers, politicians or directors. […] Well-known business persons or persons in regulated professions such as lawyers, doctors or the like', but states that the examples are not exhaustive and that an overall assessment must be made. The tribunal agrees with that assessment."
When the criminal offences were discovered, A undoubtedly had a role in public life. He is a trained psychiatrist and was appointed as an expert in many child welfare cases and parental disputes before the courts. He was also a member of the Child Expert Commission, which reviews all expert reports used in child welfare cases. That a person holding such roles is convicted of possessing images of child abuse is without doubt of public interest. Possession and sharing of images of child abuse undermines confidence in his ability to work as a psychiatrist, especially in cases involving children. The Norwegian Board of Health Supervision has found A unfit to practise his profession properly and has revoked his authorisation, cf. section 57 of the Health Personnel Act. This shows the strong connection between what he was convicted of and his role in public life. This, as a starting point, weighs against deleting the search results.
The public interest in convictions weakens over time, and this may be a factor favouring deletion of search results. The rationale is that people who have served their sentence should be given an opportunity to put the past behind them and move on with their lives. In this case, three years and eight months have passed since the district court's judgment in April 2018, and three years since the Court of Appeal dealt with the sentencing appeal in November of the same year. The criminal offences for which he was convicted were committed over a period of 20 years, from 1997 to 2017. In view of the seriousness of the case, the time that has elapsed is relatively short. In the tribunal's assessment, this too weighs against deletion.
When the tribunal has nevertheless concluded that Google should be ordered to delete the relevant search results, this is due to other weighty considerations which together indicate that Google's interest in processing the information, and the public's interest in easy access to it, must give way.
First, the search results concern information covered by Article 10 GDPR. This is a circumstance which, as a starting point, dictates that the search results be deleted unless special considerations apply. This holds even though both court hearings and criminal judgments are, as a general rule, public under Norwegian law. The right to attend court proceedings and obtain access to court decisions is a different matter from the question of deleting search results from a search engine. The public's right of access to a court decision is regulated by the Courts of Justice Act and is not affected by whether a search engine is ordered to delete a search result appearing upon a name search. For the sake of order, the tribunal notes that the district court decided that the judgment could only be reproduced publicly in anonymised form. It is not known whether the Court of Appeal made a similar decision or whether its judgment was public without restrictions. The tribunal has not found this decisive for the question to be considered in this case.
Another factor of significance for whether search results should be deleted is whether the information is correct and up to date, cf. also Article 5(1)(d) GDPR. The article in question discusses how many photos and videos the police found (over 190,000 photos and 12,000 videos), and not what A was convicted of (approximately 50,000 unique photos and 3,000 unique videos). The consideration of correct and up-to-date information is therefore a factor indicating that the search result should be deleted.
Also significant for the balancing of interests is the context in which the information was originally published, cf. the Article 29 Working Party's criteria nos. 10 and 11, and whether the information is of public interest. Search result b) leads to a private website that mainly publishes critical articles about the child welfare service. The article is critical of A's professional assessments in child welfare cases, and the author argues that A has, among other things, destroyed families. The allegations made are subjective and to some extent unvarnished. Search result a) led to the website "[…]", which is also critical of the child welfare service. In connection with the article on this page, a picture of A with his young children was published. The article therefore had a clearly harassing character, cf. the tribunal's decision PVN-2016-10. In the tribunal's assessment, identifying A by name and photo is not necessary in order to debate the child welfare service. The public's interest in gaining access to this information by conducting a name search for A in a search engine is, in the Board's assessment, very limited. In continuation of this, the tribunal also places a certain emphasis on the fact that the question of whether A will later be able to work as a psychiatrist does not depend on the information being available when searching on Google. The Norwegian Board of Health Supervision will in any case have access to this information in its assessment of that question. The question may arise differently in cases where the data subject pursues a profession or holds a title that is not legally protected.
The tribunal also places some emphasis on the fact that the main editor-controlled media that covered the criminal case against A did not find reason to identify him. In the tribunal's view, editors and journalists are particularly well placed to assess what is in the public interest, and the press ethics norms provide a sound approach to this assessment. The tribunal refers here to its decision PVN-2016-10. The Oslo District Court's decision that the judgment against A may only be reproduced publicly in anonymised form expresses a corresponding balancing of interests.
The most important factor in the tribunal's decision to order deletion of the search results is the consideration for the data subject's privacy and the harmful effects A has described, both with regard to himself and especially his children. A has explained that the information about him online makes it difficult for him to get a job, including work of an administrative nature. In addition, he experiences that, for example, craftsmen he contacts want nothing to do with him because they have read about him online. As for A's children, he has described a specific incident where one of the children was refused participation in the local football club because the parents of the other children recognised A from the online coverage. The tribunal notes that it is difficult to know what is due to the relevant search results and what is due to knowledge of the case based on the attention and coverage such a case attracts, regardless of search results. There is no doubt, however, that the search results make the relevant information far more accessible, and that the right to be forgotten more easily becomes a reality if the search results are deleted.
The tribunal notes that it is not a precondition for deleting the search results that the processing of the relevant personal data has had negative consequences for the data subject or his family. However, where such consequences are sufficiently substantiated, that will clearly be a factor pointing in the direction of granting the data subject's claim for deletion. This also follows from the Article 29 Working Party's guidelines:
"The data subject is not obliged to prove any damage in order to request the deletion of information; in other words, damage is not a condition for the exercise of the right recognised by the Court. However, if there is evidence that the availability of the search result is detrimental to the data subject, this factor clearly indicates that the information should be deleted."
The tribunal considers it sufficiently substantiated that A has in various contexts been confronted with the information that appears when his name is searched on Google, and that he experiences this as a great burden. It has also affected his children, who are now […] years old.
In conclusion, and without this being relevant to the actual balancing of interests, the tribunal points out that the Data Protection Authority's processing time in this case has been unacceptably long. As early as 19 October 2019, A requested the Authority's assistance in deleting a number of search results. Only on 13 October 2020, and after A had twice requested an answer to his request for assistance, was Google notified that the Authority was considering a decision that three of the search results should be deleted. The final decision to delete the three search results was not made by the Norwegian Data Protection Authority until 14 June 2021, almost 20 months after A had first requested the Authority's assistance. The Authority's processing time in this case means that the right to erasure under Article 17 has, in the Board's assessment, largely lost its practical significance for A. This is very serious. The tribunal recalls that the state also has a positive duty to protect people's privacy under Article 8 ECHR, and it follows from the case law of the ECtHR that the right to be forgotten is among the interests Article 8 protects, see M.L. and W.W. v. Germany (application nos. 60798/10 and 65599/10), judgment of 28 June 2018, and Hurbain v. Belgium (application no. 57292/16), judgment of 22 June 2021. The tribunal notes that if the processing time becomes so long that the right to erasure loses much of its practical significance, this could constitute a violation of the positive duty under Article 8 ECHR to respect people's privacy, which may ultimately result in the state incurring liability for the failure.
The Data Inspectorate's decision is upheld.
The decision is unanimous.
Conclusion
The Data Inspectorate's decision to order the deletion of search results a) and b) is upheld.


Oslo, 7 December 2021
Mari Bø Haugstad
Chair