Privacy Corner

by Eva Jarbekk



The last newsletter reviewed some key issues. Since then, I have received many comments on the article about the level of detail required regarding whom personal data has been shared with. For those who did not read the previous newsletter: the ECJ has handed down an important decision stating that requests for access must be answered with information about the specific data processors you have shared information with – and, in many cases, also which of your own employees have accessed the information. We will surely return to this topic later.

This time I will take a slightly wider sweep of issues that have come up recently. There is more than enough to choose from! And if any of you have a particular interest in data protection practices in the other Nordic countries – send me an e-mail and I'll see if we have a place available for a separate Nordic Privacy Round Table in April. We will be joined by leading privacy experts from all the Nordic countries.

Third country transfers and the EU-US Data Privacy Framework

We will not write much about third country transfers in this newsletter, but it can't be ignored completely.

As is known, the EU Parliament has commented on the proposed agreement on transfers between the EU and the US. The Parliament is highly critical of the agreement and believes that it does not ensure equivalent data protection when European citizens' personal data is transferred to the US. A number of points are criticized. In particular, it is highlighted that the understanding of what is "proportional" and "necessary" is not the same in the US as in Europe, and that this is particularly problematic when the rules are to be enforced in practice in the US. Another central criticism is that the agreement is based on a so-called "Executive Order", which any US president can change at their own discretion.

Read the EU Parliament's statement here.

It is interesting that the comments from the EU Parliament largely coincide with the criticism that NOYB has put forward against the agreement.

The EDPS is also expected to comment on the proposed agreement shortly. Those comments will likely be broadly the same as the EU Parliament's.

The matter will then be considered further by the European Commission, which can either adopt the agreement as it stands, even though the EU Parliament has objected, or try to improve it through further negotiations with the US. However – it may be difficult for the US to meet the criticism being raised, because doing so would require quite large changes to the US legal system. There is good reason to believe that we will have to carry out "Transfer Impact Assessments" for the use of US services for a long time to come.

Portuguese case regarding suspended transfers

In 2021, the Portuguese Data Protection Authority (CNPD) received several complaints about a national census conducted by a public body, the National Statistics Institute (INE). Respondents could complete the survey online. INE used the American service Cloudflare as data processor and supplier of the online service. Cloudflare reportedly has over 200 data centres in 100 countries.

The complaints concerned several issues, including the legality of processing personal data for statistical purposes. Most relevant in this context, however, is that complaints were also made about transfers of personal data to a third country without an adequate level of protection.

The CNPD carried out investigations and it turned out that, at that time, around 2.5 million forms with the personal data of over six million people living in Portugal had been sent to the service.

The CNPD was highly critical of the fact that the data controller had entered into a contract with the data processor that allowed transfers to the US (on the basis of SCCs) without any additional measures to secure the transfers. Moreover, the agreement allowed the data processor to cooperate with other (sub-)processors established in third countries without a corresponding level of protection. The CNPD also highlighted INE's lack of control over, and knowledge of, encryption tools.

The CNPD ordered INE to suspend all data flows to the US and other third countries that do not offer an adequate level of protection, either via Cloudflare, Inc. or via any other company. INE was fined 4.3 million euros. The case can be appealed.

I do not know the background of the case in detail. However, I know of many companies that use good American services and do not negotiate the contract terms offered by them. It is not entirely certain that this is wise going forward, regardless of whether you are in the public or the private sector. Although the Norwegian Data Protection Authority has been reticent about inspecting transfers, it will have to apply the current rules if it receives a complaint.

Cookie Banner Taskforce – new analysis from the EDPB

On 18 January, the EDPB published its report on the work carried out by the "Cookie Banner Taskforce". The taskforce was created in September 2021 to coordinate responses to the many complaints regarding the incorrect use of cookies, after NOYB over a short period submitted 101 complaints (the so-called "101 Dalmatians"). The aim of the taskforce was to promote cooperation, sharing of information and best practice between the data protection authorities.

The EDPB points out that the report is not a decision on the specific complaints that have been submitted, because these must also be resolved in the context of national laws that implement the ePrivacy Directive.

The report reviews nine types of situations that the EDPB has found. Briefly summarised, the recommendations are as follows:

  • The ePrivacy Directive concerns the placement of cookies, as implemented in national legislation. In contrast, the GDPR concerns the processing of personal data collected through cookies.
  • The GDPR's one-stop-shop mechanism does not apply to violations of the ePrivacy Directive.
  • The cookie banner's first layer should have a button that allows users to reject all cookies.
  • Cookie banners shall not have pre-ticked choices and shall not influence or pressure users into accepting cookies. It must not be made more difficult for users to reject cookies than to accept them, for example by displaying misleading "reject" buttons. So-called "dark patterns" shall not be used, and the colour and shape of the buttons shall be neutral.
  • Users shall receive clear and easily understandable information about the purpose of the cookies used.
  • Users who consent to the placement of cookies shall be able to withdraw consent at any time. It should be as easy to withdraw consent as it is to give it.

The report is available here.

Two Danish cases regarding cookies and statistics, as well as new guidelines regarding cookie walls

The Danish Data Protection Authority has made two important decisions regarding cookies and consent. The topic is the consent requirements in Article 4 of the GDPR.

In one case, the Data Protection Authority decided that access to a website may be made conditional on either consenting to cookies for marketing and statistics or actually paying for the service. The price for access was so low that the Data Protection Authority considered that the data subject had a real choice. At the same time, the Data Protection Authority stated that the use of information for statistics was not a necessary part of the alternative to payment, and the company was ordered either to substantiate such necessity or to allow for a separate consent.

The Data Protection Authority's opinion is that visitors to a website, under certain conditions, can be considered to have a real and voluntary choice when the website offers visitors content against obtaining consent to the processing of personal data, as long as the company also offers an alternative way of accessing the content that does not involve the processing of personal data. However, this requires that the content offered by the company must be largely the same, regardless of whether the visitors give consent or, for example, pay to access the content or service.

This is a bit intricately worded in the decision, but I understand it to mean that they cannot "bundle" statistics production into the consent solution and that a separate consent is required for this, or that they have to demonstrate the "necessity" they have for using the information. It is somewhat unclear whether the requirement of necessity is then linked to consent or to the use of a legitimate interest – presumably it is the latter.

The second case concerned the sharing of articles in a newspaper with friends. The reader could share articles either by consenting to cookies for marketing and statistics or by taking out a subscription. The Data Protection Authority has the following important statements in the decision:

"A consent is not given voluntarily if the data subject does not have a real or free choice and control over information about him-/herself. Any form of inappropriate pressure or influence on the data subject's free will means that the consent is invalid.

A data controller can to a certain extent motivate the data subject to give consent by the fact that there is an advantage associated with consent. Membership in a company's loyalty scheme can, for example, involve discounts that may motivate the customer to consent to receiving advertising material from the company. The discount or the benefits that a consent to a loyalty scheme entails do not exclude that the consent can be considered to be voluntary.

However, it is important to be aware of whether a lack of consent entails negative consequences for the data subject who does not want to give consent, e.g., in the form of additional costs."

Here, the Data Protection Authority found that it was not a genuine consent, as the services were not the same in scope depending on whether one paid or consented. The Data Protection Authority emphasized that by consenting users gained access to share "unlocked" articles, while by paying they gained access to all articles. Because these were not comparable services, they did not consider the consent to be a free choice.

Also in this case, the Data Protection Authority could not see that statistics were a necessary part of the alternative to payment.

Both cases are mentioned on the Danish Data Protection Authority's website here.

And NOYB is steadfast in its work against cookies

In this context, an article from NOYB about the "Pay or OK" principle is interesting. NOYB criticizes an online newspaper that offers either consent to cookies – or a subscription for 6 euros a month. NOYB estimates that the newspaper earns between 30 and 90 cents per user per month from advertising, and thus would earn between 6 and 20 times more on an ad-free, privacy-friendly version. They write, not surprisingly, that a legal clarification is required as to whether this is legal. Read about the case here.
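NOYB's multiplier can be checked with a line of arithmetic: a EUR 6 monthly subscription against an estimated EUR 0.30 to 0.90 in monthly advertising revenue per user gives roughly a 6.7x to 20x difference. A quick sketch (the underlying figures are NOYB's estimates, not mine):

```python
# Sanity check of NOYB's estimate: a EUR 6/month subscription versus
# EUR 0.30-0.90/month in per-user advertising revenue.
subscription = 6.00
ad_revenue_low, ad_revenue_high = 0.30, 0.90

ratio_high = subscription / ad_revenue_low   # best case for ads: 20x
ratio_low = subscription / ad_revenue_high   # worst case for ads: ~6.7x

print(f"Subscription earns {ratio_low:.1f}x to {ratio_high:.0f}x more per user")
```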

For how long must documentation of consent be kept?

The Danish Data Protection Authority has prepared a guide on how long personal data must be kept in order to document that consent has been obtained. The guide will likely be of practical use to many.

The duty to be able to document compliance with the GDPR, including that the data subject has consented, must be seen in the context of the GDPR's other rules, in particular those on data minimization and storage limitation. Article 11 of the GDPR is important here. It states that if the purposes of processing personal data do not (any longer) require the identification of the data subject, the data controller is not obliged to maintain, acquire or process additional information in order to identify the data subject.

The documentation requirement in Article 7 therefore only applies as long as the processing is ongoing. After the processing has ended – e.g., because the data subject has withdrawn consent – there is thus no obligation to demonstrate that consent was obtained by storing the given consent or the personal data processed on the basis of it.

The GDPR's general requirement for documentation therefore does not require that a copy of each given consent be kept. The requirement can instead be fulfilled by so-called system documentation describing the general procedures for obtaining consent, e.g., what information is provided when consent is obtained.

In other words, the consequence is that personal data processed on the basis of consent shall, as a general rule, be deleted immediately once the processing has ended and/or consent has been withdrawn. And this also applies to the consent itself.
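As a purely illustrative sketch (the class and method names below are my own invention, not taken from the Danish guide), the principle can be expressed as a consent registry that retains generic system documentation but deletes the individual consent record when consent is withdrawn:

```python
from datetime import datetime, timezone

class ConsentRegistry:
    """Hypothetical sketch of the retention principle described above:
    generic 'system documentation' (procedures, consent text versions) is
    retained to demonstrate compliance, while the individual consent record
    is deleted when processing ends or consent is withdrawn."""

    def __init__(self):
        self.records = {}               # per-subject consent records
        self.system_documentation = []  # generic procedures / consent texts

    def document_procedure(self, description):
        # Retained: describes HOW consent is obtained, not WHO consented.
        self.system_documentation.append(description)

    def record_consent(self, subject_id, purpose):
        self.records[subject_id] = {
            "purpose": purpose,
            "given_at": datetime.now(timezone.utc),
        }

    def withdraw(self, subject_id):
        # Withdrawal ends the processing; the consent record itself is
        # deleted, along with personal data processed on the basis of it.
        self.records.pop(subject_id, None)
```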

The guide was a consequence of a specific case where the Data Protection Authority determined that a storage period of 5 years in order to document the validity of an obtained consent was not in line with the rules on storage limitations. It might be a good idea to take a closer look at how long documentation is kept. Read about the cases here.

Viking Line is fined for incorrect storage of HR information

A former employee of Viking Line complained to the Finnish Data Protection Authority that he had not received all the personal data he had requested in an access request. Viking Line had stored health information about the former employee for 20 years. For example, diagnostic data had been stored together with data on sick leave. According to the complainant, some of the information was incorrect because the system lacked relevant diagnostic codes.

The Data Protection Authority carried out an inspection and found several serious deficiencies. Viking Line should have separated information about the employee's state of health from other personal data about the employee; it was unlawful to store diagnostic data together with other work-related data. It was also objectionable that some of the data was incorrect. Health information should have been deleted when there was no longer a need to store it. Viking Line had also not informed its employees correctly about the processing. Viking Line was ordered to remedy the deficiencies and to provide the complainant with all the information he was entitled to.

Emphasis was placed on the fact that even incorrect diagnostic information was stored for a long time and that this constitutes a significant privacy risk for those affected.

Viking Line was fined 230,000 euros and the Data Protection Authority cooperated with the data protection authorities in Norway, Sweden and Estonia on determining the size of the fine.

The case shows, again, that the management of HR information is important and that IT systems sometimes set limits on how information is processed. Even if you have a deficient IT system, the requirements of the GDPR must be complied with. You are not treated more leniently if an error is due to the IT system not being optimal. Storage of employee HR information over a long period of time is probably not entirely unusual in practice, but it is reasonably clear that much information shall in fact be deleted and that the Data Protection Authority requires this.

Another issue is that we are seeing an increasing degree of cooperation between data protection authorities, which can lead to increased focus on HR data in particular.

Swedish decision against Google is legally binding

In 2020, Google was fined SEK 50,000,000 by the Swedish Data Protection Authority (Integritetsskyddsmyndigheten – IMY) for failing to remove search results. The fine was originally a whopping SEK 75,000,000.

The case concerned, in part, Google making too narrow an assessment of which web addresses should actually be removed from the search results when a person demands their deletion. In part, it concerned the fact that when Google removed search results, the company notified the website owner in a way that enabled the owner to find out which web address was concerned and who had asked for the search results to be removed. This allowed content to be republished under a different URL that would still show up in Google searches. The IMY also held that Google, in its removal form, provided information in a way that prevented individuals from requesting deletion.

The decision was appealed, and IMY won in the Administrative Court of Appeal (Kammarrätten). It is now clear that the case will not be admitted for further hearing in higher courts, and it is thus final. As is known, there are a number of cases in Europe involving large amounts that are not yet final, and it can be difficult to advise in a field until decisions are final. Now this one has landed.

The Norwegian Data Protection Authority and PostNord

PostNord has avoided a fine but must change how its app works following a decision from the Norwegian Data Protection Authority. After two breach notifications from PostNord and tips from the public, the Data Protection Authority carried out further investigations into the security of the "mypostnord" service. The app is designed for private customers who use the company's services.

PostNord used the mobile number as the only authentication for users of the app. This meant that a person who was assigned a recycled phone number could access the profile of the number's previous owner. They could then see other people's names, addresses and, in some cases, information about packages. So far, not particularly sensitive information.
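The weakness can be illustrated with a small, purely hypothetical sketch (none of these names reflect PostNord's actual system): when the phone number is the only credential, whoever currently controls the number is treated as the account holder – including the new owner of a recycled number. Tying the profile to a secret the original owner holds closes the gap:

```python
# Hypothetical illustration of the flaw: the phone number is the only
# credential, so a recycled number hands the new owner the old profile.
profiles = {"+4799999999": {"name": "Previous Owner", "address": "Oslo"}}

def login_by_number_only(phone_number):
    # Whoever controls the number right now "is" the account holder.
    return profiles.get(phone_number)

# A remedy in the spirit of the order: require a second factor bound to
# the account (e.g. a password or app-level PIN), so control of the
# number alone is not enough.
credentials = {"+4799999999": "secret-pin-1234"}

def login_with_second_factor(phone_number, pin):
    if credentials.get(phone_number) != pin:
        return None
    return profiles.get(phone_number)
```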

In addition to the confidentiality provisions in the GDPR, PostNord has a duty of confidentiality under the Norwegian Postal Act.

Following an order from the Data Protection Authority, PostNord must now take measures to prevent new owners of telephone numbers from gaining access to the previous owner's information.

Is the fact that someone has a guardian a special category of personal data?

The short answer is yes. The Swedish Data Protection Authority, IMY, has recently considered the issue. The fact that someone has been appointed a guardian provides indirect information about the individual's health situation and is therefore a special category of personal data. It is interesting that IMY refers to the now quite old Lindqvist judgment from the ECJ and writes that special categories of personal data must be interpreted broadly, so that information indirectly revealing sensitive content is included.

The consequence is, as many know, stricter requirements for the legal basis under Article 9 compared to Article 6, as well as stricter requirements for, e.g., access control and information security.

Read more about this from IMY here.

Privacy-friendly Apple also receives a fine

Apple often talks warmly about privacy and has, for example, made it difficult for others to track users on iPhones. Towards the end of last year, however, Apple was fined €8,000,000 by the French Data Protection Authority (CNIL) for violating the rules on targeted advertising and the use of cookies. As justification for the size of the fine, the CNIL referred, among other things, to Apple's profits from advertising revenue. To my knowledge, this is the first privacy fine issued to Apple.

The case began with a complaint from a data subject. The CNIL investigated and concluded that Apple, without consent, collected identifiers from App Store visitors running an older version of the operating system. The data was used to personalize advertisements in the App Store.

CNIL did not consider the use strictly necessary for the provision of the service. Here one can see that they take the same strict line as some other data protection authorities when it comes to what is "necessary" in order to deliver a service.

Apple had pre-ticked acceptance of targeted advertising in "Settings" on the phones.

One of the most interesting aspects of the case is that the CNIL had jurisdiction even though the App Store is managed from a company in Ireland. This is because the CNIL applied the ePrivacy Directive; the GDPR's so-called "one-stop-shop" mechanism did not apply.

Data protection and competitors

In Germany, an interesting case has emerged regarding whether competitors in the private sector can take action against each other due to unfair trading practices based on violations of the GDPR. The German Federal Court has referred the issue to the ECJ.

The case has its background in the pharmaceutical sector. An operator sells products via Amazon without obtaining consent as part of the ordering process. A competitor wants to complain about this. The operator selling the products online maintains that the information is not health information and that all legal requirements are otherwise met.

The case before the ECJ concerns whether this possible violation of the GDPR can be pursued by another pharmacist through a competition law action.

Many in the private sector will follow this case. There seems to be an increasing number of cases where competitors are using violations of the GDPR against each other, which is hardly surprising. If there is to be fair competition, this must also apply in the area of data protection.

Again, increased focus on the role and independence of the Data Protection Officer

On 9 February 2023, the ECJ pronounced its judgment in the X-FAB Dresden case (C-453/21). The ECJ clarified the criteria for assessing whether there is a conflict of interest between the position of the Data Protection Officer, or DPO, and other tasks that a DPO carries out. It was emphasized that it must be ensured that the DPO is not given tasks that may impair the performance of the DPO's supervisory role in the company. The judgment makes it absolutely clear that the DPO cannot determine the purposes and methods of processing activities.

The decision is in line with the EDPB's guide for the DPO's role. What may possibly be a conflict of interest must be assessed specifically on a case-by-case basis. There have been several decisions that shed light on where the boundaries are. Previous decisions have shown that the DPO cannot have roles as head of risk management, compliance or internal audit.

Whether there was actually a conflict in this case was left to the national court to decide, and the ECJ specified that all relevant circumstances must be taken into account, in particular the organisational structure of the data controller or its data processor(s) and all applicable rules, including any internal guidelines. The role at issue was that of "chair of the works council", which in a Norwegian context can be compared to a kind of local, company-internal trade union leader. It is not immediately obvious that there is a conflict of roles here, but it will be interesting to see what the German courts decide.

As many know, the EDPB is now carrying out an investigation into the role of DPO. The Norwegian Data Protection Authority has also carried out several investigations into this in 2022, and there have in part been questions about the DPO's professional expertise. It is likely that there will be more decisions about the DPO's role in 2023.

Right from the introduction of the GDPR, it has been clear that the DPO must be independent. At the same time, there has been an acute shortage of people with sufficient expertise in data protection, so many have probably both acted as DPO and at the same time advised on how the organisation should process personal data. Supervisory practice is now shifting towards the DPO having the independence and controlling function that the GDPR describes. The DPO is a second-line function, while first-line responsibility for data protection issues should be handled by other functions, typically the legal department. We are moving towards a clearer division of roles.

Meta sues "scraper"

Finally: as mentioned in my January newsletter, in November 2022 Meta received a massive fine of NOK 2.5 billion from the Irish Data Protection Authority because personal data on Facebook pages could be scraped by third parties. Facebook was criticized for not having sufficient built-in data protection, so that the information could be leaked.

In the meantime, Meta has become aware of a company that has carried out such covert scraping and has taken it to court. The company, Voyager Labs, is American and offers surveillance services to law enforcement agencies in the United States. Its software purports to predict whether individuals are particularly predisposed to commit criminal acts, using artificial intelligence that needs large data sets to make predictions as good as possible. The software and associated data were used by, among others, the police in Los Angeles.

According to the court documents, Meta claims that Voyager created over 38,000 fake accounts on Facebook, Instagram, Twitter, YouTube, LinkedIn and Telegram and then used these to scrape information about over 600,000 people.

At present, Meta is not seeking any form of compensation; initially, it only wants the practice to cease. The legal basis is that the company violates Facebook's guidelines and terms of use, which can therefore be treated as a breach of contract. Had this happened in Europe, the GDPR could have been invoked as well. The case shows that there are significant legal challenges with international platforms. Read more about the case here.

Do you have any questions?