Newsletter

Privacy corner

by Eva Jarbekk



We are now entering July, and the first half of 2023 is behind us. With our privacy glasses on, it is fairly clear that we are dealing with a developing set of regulations subject to frequent adjustments. Important cases and changes have often come from the EU courts and the EDPB just before the European summer holidays – typically a little before mid-July, with an interrupted summer holiday here up north as the result. That may happen this year as well. We'll see how it goes.

This is the fifth summer since the GDPR entered into force. Privacy protection is far from perfect, but overall it has become better than it was – although I agree that some of the documentation requirements in the EDPB's guides are (unnecessarily) extensive and difficult to work with. I have often thought that as people's personal data is used in ever new areas, one of the most important things is that we are informed about how it is used. It is not such a big deal if not everyone cares; it is perfectly OK that some do.


What has been most important in the first half of the year? I think four topics stand out.


The largest fine to date has now been imposed on Meta for breaching the rules on transfers to third countries/the USA – an estimated NOK 14 billion. The fine will be disputed, of course, but going forward, companies risk being affected by the fact that this is intended as a "signal fine" – a warning to others. I don't think the "barber on the corner" will be affected, but for many large companies this is important. And if you do not get a fine, but are ordered to stop the transfers you rely on? In practice, that is as bad as a fine for most.


Another very important case is the now so-called SRB case regarding pseudonymised personal data. Could it be that life has been breathed into the Breyer judgment from 2016, so that data the holder cannot re-identify can be considered anonymous and thereby "escape" the GDPR? The consequences could be significant. The deadline for appealing the SRB judgment expires after this newsletter is written. Many have said that this view of pseudonymised data is a more pragmatic interpretation of the GDPR. I find it unlikely that it will mean less work if it becomes the new standard – one would then have to document and assess what possibilities the recipient of pseudonymised personal data has for re-identification. But obviously – it is tempting to treat pseudonymised data as anonymised. Together with good colleagues, I wrote an article about this in Digi, which may be worth reading (in Norwegian). Read it here.


For readers in Norway, it is also worth noting that the Privacy Appeal Board (Nw.: Personvernnemnda) has become less important. This is shown by the SATS case, where the possibility of appealing to the Privacy Appeal Board was replaced by having to bring the case before the courts – because the case was relevant to several countries, and then the national Privacy Appeal Board cannot be used. Many Norwegian and Scandinavian companies have similar activities abroad, as SATS does. And it is more expensive to bring a case before the courts than before the Privacy Appeal Board. This has nevertheless become a reality, and we see it in several cases from the Norwegian Data Protection Authority.


The trend from the data protection authorities and court practice is that the right of access is interpreted quite literally, in light of the underlying principles of privacy and transparency. This makes for good privacy protection – but it may come as a surprise to some data controllers. Imagine having to disclose all the data processors you use and where the personal data is located. A challenge – yes, for the vast majority.


Below you will find my comments on some of the other most exciting cases in the last month. I'm sure that a lot will happen in the autumn as well.


I wish you a really good summer!

The Finnish Data Protection Authority orders the suspension of the use of Google Analytics and reCAPTCHA

The Finnish Data Protection Authority has issued a reprimand to the Meteorological Institute of Finland due to the transfer of personal data to the US using surveillance technologies. The Meteorological Institute has used both "reCAPTCHA" and Google Analytics (GA) on its website.


The Finnish Data Protection Authority concluded that the Meteorological Institute had no legal basis for the transfer of personal data that the use of reCAPTCHA and GA entailed. The Meteorological Institute had also not stopped the transfer of data immediately following the Court of Justice of the European Union's Schrems II decision. Nor had they carried out a DPIA (Data Protection Impact Assessment).


There has been a lot of talk about GA in the past year, and in Norway we are still waiting for the Norwegian Data Protection Authority's conclusion in the case against Telenor. It is still conceivable that a decision will come before the summer holidays start. The Authority's room to be flexible – for instance by emphasizing that no access requests have been made in connection with GA – may seem to narrow when the European decisions against GA are so similar. Still, I think it should be doable. The SRB case shows that "established" truths within the GDPR can be challenged.


Read more on the Finnish case here.

The deadline for responding to access requests is not extended due to an employee's illness

There have been a number of cases about access requests recently, and I believe there will be even more. Most people are becoming aware of what they have the right to know, and there are cases expanding this as well. The Swedish Spotify case is one of them. The case below from the Belgian Data Protection Authority shows that it is not easy for the data controller to cite illness as an excuse for a delayed response, either.


A former tenant requested access to personal data pursuant to GDPR Art. 15 from the lessor (the data controller). The access request was sent on 2 December 2019. The data controller replied on 31 December 2019 that the response time had to be extended by two months. However, the complainant did not receive a response until 2 September 2020, i.e., nine months after the request was first sent. The response also failed to answer all the questions in the request, as the data controller claimed the requested data was not covered by GDPR Art. 15.


The person concerned then lodged a complaint against the data controller with the Belgian Data Protection Authority (GBA). The GBA emphasized the importance of compliance with GDPR Art. 15 and concluded that the data controller had responded neither within the original deadline nor within the extended two-month deadline. This could not be excused by the fact that the person responsible for the case was on long-term sick leave.


Access requests must be answered with information about which specific data processors you have shared data with – and in many cases also which of your own employees have received the data. Consequently, there was a violation of GDPR Art. 15(1), Art. 12(3) and Art. 12(4).


The decision can be found on the GBA's website here.

More big fines for "targeted ads"?

As is well known, retargeting and the use of personal data for advertising purposes have given rise to many large fines. Now the Irish Data Protection Authority (DPC) has notified Microsoft of an intended fine relating to LinkedIn. The case started with the DPC investigating complaints that LinkedIn ran unlawful targeted advertising. Microsoft has received an advance notice from the DPC with a tentative fine of just under NOK 5 billion for unauthorised retargeting.


The decision is not yet public, and Microsoft will of course dispute the claim. It is not yet clear when the final decision from the Irish Data Protection Authority will be available. At the same time, we see that both the practice of some large players (Google) and changes in regulations are moving in the direction of explicit consent for retargeting. Privacy is still a young, rapidly changing discipline, and the fact that the data protection authorities have spent some time (read: years) coordinating how the GDPR should be interpreted has probably led some players to settle on expansive interpretations. That room now seems to be narrowing, but it will take time and create many discussions. And if you use retargeting towards your customers – there is every reason to keep an eye on developments.


Read more about the Microsoft case here.

And while we're talking about Microsoft – they have challenges in the US too

Microsoft has agreed to pay USD 20 million in a settlement with the Federal Trade Commission (FTC) to resolve charges that it collected personal data about children without parental consent and in some cases kept it for years.


The conduct violates the Children's Online Privacy Protection Act (COPPA), which protects the privacy of children under the age of 13. COPPA requires companies that collect personal data from children to notify parents about the data being collected and to obtain their permission. In addition, personal data that is no longer necessary to retain must be deleted.


The FTC alleges that children creating accounts on Microsoft's Xbox had to provide a variety of personal data, with a pre-ticked checkbox allowing Microsoft to share the personal data with advertisers. Microsoft allegedly collected the data before consent was obtained and stored it even when parents did not complete the account creation.


Xbox head Dave McCarthy attributes the storage of the personal data to a technical error and says the data was never used or shared in any way.


The FTC now requires Microsoft to notify parents and obtain consent for the storage of data for all accounts created before May 2021. In addition, Microsoft must implement new systems to delete children's personal data where parental consent has not been obtained, and ensure that data is deleted when it is no longer necessary to store it.


There has been a lot of activity at the FTC lately. This was its third COPPA-related case in recent weeks, including one against Amazon, which kept recordings from the well-known Alexa platform despite parents' requests to delete them.


You can read more about the case here.

The EDPB has published its final guide for the assessment of fines for violations

The guide is intended to ensure that fines are calculated in (roughly?) the same way across the European countries. Fines are, of course, governed by Article 83, but the EDPB has created a method to be used in determining their size.


Three elements shall be taken into account: the nature of the violation, its seriousness, and the turnover of the violator. Although this may sound formulaic, discretion remains central – the EDPB itself writes that a discretionary assessment must be part of deciding the size of a fine. In addition, mitigating circumstances must be considered. This is nevertheless an important contribution to the framework for more effective cooperation between data protection authorities in cross-border cases.


Although the final guide is largely in line with the feedback from the public consultation, it changes how the size of an enterprise is weighted when determining the starting amount. Less serious violations can be given between 0 and 10% of the maximum penalty, medium violations between 10 and 20%, and serious violations between 20 and 100% of the maximum penalty.


It is clear that there will be discussions about which category a violation falls into – thus far, the data protection authorities have tended to regard violations of fundamental rights as serious. Account shall also be taken of a company's turnover (and ability to pay), and "starting levels" for fines have been introduced for different turnover brackets.
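To make the mechanics concrete, here is a minimal sketch in Python of how a starting amount could be derived under the guide. The severity bands follow the percentages above, while the legal maximum and the turnover adjustment factors are simplified assumptions for illustration, not the guide's actual figures.

```python
# Minimal sketch of the starting-amount logic described above.
# The severity bands follow the guide's percentages; the legal maximum
# and turnover factors below are simplified assumptions for illustration.

LEGAL_MAXIMUM_EUR = 20_000_000  # assumed Art. 83 maximum for this example

SEVERITY_BANDS = {
    "low": (0.00, 0.10),     # 0-10% of the maximum penalty
    "medium": (0.10, 0.20),  # 10-20%
    "serious": (0.20, 1.00), # 20-100%
}

def starting_amount(severity: str, annual_turnover_eur: float) -> float:
    """Pick the midpoint of the severity band, then scale by company size."""
    low, high = SEVERITY_BANDS[severity]
    base = LEGAL_MAXIMUM_EUR * (low + high) / 2

    # Hypothetical size adjustment: smaller companies start lower.
    if annual_turnover_eur < 2_000_000:
        factor = 0.02
    elif annual_turnover_eur < 50_000_000:
        factor = 0.10
    else:
        factor = 1.00
    return base * factor

# A medium-severity violation by a company with EUR 10 million turnover:
print(starting_amount("medium", 10_000_000))  # 300000.0
```

The point of the exercise is not the exact numbers but the sequence: category first, then a turnover-based starting level, then discretionary and mitigating adjustments on top.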


We will see a lot of discussion about this going forward in connection with imposed fines. The guide is available here.

Few are concerned about privacy, but those few ensure the privacy of the many

This has been said many times, and it is absolutely true. It is therefore useful that people lodge complaints about privacy violations.


Now the EDPB is helping people lodge complaints. It has developed a template complaint form as well as a template acknowledgement of receipt. The aim is to give complainants general information about what happens after a complaint is lodged and to inform them of the right to effective enforcement of the data protection authorities' decisions. The templates should also make it easier for the authorities to process complaints in cross-border cases.


The templates take into account differences in national legislation and allow the data protection authorities to adapt them to their own laws.


Read more here.

In Sweden, the IMY has given Spotify a large fine

When users of Spotify request access to their personal data, Spotify releases three types of information: profile information, history linked to the user's personal data, and specific information that a user requests to be provided. The data is provided in a technical format known as JSON.
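For readers who have not seen it: JSON is machine-readable but not exactly reader-friendly. A purely hypothetical fragment of what such an export could contain – the field names are invented for illustration and are not Spotify's actual schema – might look like this:

```python
import json

# Purely hypothetical sketch of a JSON-formatted access export.
# Field names are invented for illustration, not Spotify's actual schema.
export = {
    "profile": {"user_id": "a1b2c3", "country": "SE", "created": "2014-03-02"},
    "streaming_history": [
        {"ts": "2023-05-14T19:02:11Z", "track_id": "4uLU6hMC", "ms_played": 183000},
    ],
    "inferences": {"ad_segments": ["seg_1042", "seg_2210"]},
}
print(json.dumps(export, indent=2))
```

Technically precise, but hardly "clear and plain language" for an ordinary user – which is the nub of the criticism below.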


The Swedish Data Protection Authority, IMY, considers it sufficient that data is made available through a web-based solution. However, it is absolutely crucial that the data is presented in a way that fulfils the purpose of the access request: the user must be able to understand how the personal data is processed and to verify that the processing is in line with the law.


Spotify had not adapted the data to the user's specific situation, which made it difficult to review whether the processing of the personal data was lawful. In addition, the data was not readily available and was too vague and general when provided, making it difficult for ordinary users to understand.


IMY did not consider the three-way split of information a violation of Art. 15, because the user can obtain all the data at once by contacting customer service. The format of the data (JSON) was also generally sufficient. However, technical data was only provided in English. Art. 12 requires that data provided under Art. 15 be given in a concise, transparent and intelligible form, using clear and plain language. The Swedish Data Protection Authority therefore concluded that providing the data only in English was a violation of Art. 12.


The Swedish Data Protection Authority also added that if it is unclear which data an access request concerns, the data controller shall assume that the user wants access to all personal data.


The investigation ended with a fine of SEK 50 million.


The case is discussed by NOYB here.

Can tax information on US citizens be transferred to the US according to FATCA?

In April 2021, the EDPB asked the EU/EEA member states to assess, and possibly revise, bilateral agreements that require the transfer of personal data to third countries in connection with taxation. The Association of Accidental Americans (AAA) has pointed out in a letter to the EDPB that a very long time passed without anything happening in the member states. Until recently.


Several US/Belgian citizens living in Belgium were notified by their bank that it had to disclose their bank accounts, including balances and other assets, under the FATCA agreement. Among other things, this agreement obliges banks to inform local tax authorities about bank accounts that US citizens have opened abroad; the local authorities then report to the US. The Belgian Data Protection Authority believes the agreement violates the GDPR.


According to GDPR Art. 96, international agreements entered into before the GDPR entered into force are not affected by it. However, the Belgian Data Protection Authority held that, in exceptional cases, the rules of the GDPR must nevertheless prevail if applying Art. 96 would have disproportionate consequences for the complainants' rights.


The Belgian Data Protection Authority also held that the FATCA agreement does not state its purpose sufficiently clearly, so it is not possible to examine the extent to which the processing of personal data is necessary to fulfil that purpose. The agreement was also not in line with the principles of necessity and proportionality: it contains no protective measures and does not mention the privacy of those whose personal data is processed under it.


As data controller, the Belgian tax authorities had also not provided sufficient information about the processing of personal data as GDPR Art. 13 and Art. 14 prescribe, and they had not carried out a DPIA for the sharing of personal data with the US. Nor had they taken measures to ensure that the sharing of personal data was in accordance with the GDPR. Consequently, the Belgian Data Protection Authority imposed a ban on processing the complainants' personal data under the FATCA agreement.


Norway has also signed a FATCA agreement with the US, and there was some discussion about privacy when it was signed. So far, I have not seen discussions similar to the one in Belgium.


Read more here and here.

And again, a reminder that HR data must be treated wisely

The Capital Region's emergency services in Denmark discovered that their new archiving system (ESDH) gave all employees access to current and former employees' personal data, including names, social security numbers and addresses. This applied to a total of 2,000 employees. On closer investigation, the data controller discovered that six different users had used the access without it being relevant to the performance of their work.


The Danish Data Protection Authority chose to criticize the emergency service but did not issue a fine (fines are rarely issued in Denmark because they have to go through the police).


In other European countries, we see a lot of attention being paid to who has access to HR data. The rules are the same in Norway, but so far there are not many cases about this here. That may change, however.


It may be a good idea to check who has access to HR data, how long it is stored, and whether it sits in cloud services that could entail a transfer to the US – in which case a TIA (Transfer Impact Assessment) is needed. This applies even when the supplier is Scandinavian but uses an American subcontractor. I recently had a meeting with an HR manager who "suddenly understood" that his Norwegian contracting party had US subcontractors and that this triggered a third-country transfer. Lucky for him that it came up in a conversation with me rather than with the Norwegian Data Protection Authority. Help your own HR manager!


Read more about the Danish case here.

Status of the AI Act – soon finished and with fines almost twice as large as in the GDPR?

After the European Parliament proposed changes on 14 June to the original legal text drawn up by the Commission, trilogue negotiations are now under way on a final result. Those close to Brussels indicate slightly different timelines for when we will get a final regulation, but given the attention AI currently receives, it may happen quickly. It is also worth noting that the GDPR does not "disappear" with the AI Act: the processing of personal data must still have a legal basis under the GDPR, and the AI Act adds a number of formal requirements on top.


The proposed level of fines in the AI Act is significantly higher than the level in the GDPR; the following is proposed (a small arithmetic sketch follows the list):


  • Violation of the rules on prohibited AI practices can give rise to fines of up to 40 million euros or 7% of the company's annual turnover.
  • Violation of Art. 10 and Art. 13 on data governance and transparency can result in fines of up to 20 million euros or 4% of annual turnover.
  • Other violations of the AI Act can result in fines of up to 10 million euros or 2% of annual turnover.
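Since each ceiling is expressed as "up to X million euros or Y% of annual turnover", the binding cap for a large company is usually the turnover figure. A small sketch, assuming – as in the GDPR – that "or" means whichever amount is higher (the final text may settle this differently):

```python
# Proposed AI Act fine ceilings from the list above, as
# (fixed EUR amount, share of annual turnover). Assumes "or" means
# whichever is higher, as in the GDPR; the final text may differ.
TIERS_EUR = {
    "prohibited_practices": (40_000_000, 0.07),
    "art_10_and_13": (20_000_000, 0.04),
    "other": (10_000_000, 0.02),
}

def max_fine(tier: str, annual_turnover_eur: float) -> float:
    fixed, share = TIERS_EUR[tier]
    return max(fixed, share * annual_turnover_eur)

# A company with EUR 2 billion turnover: the ceiling for a prohibited
# practice would be max(40m, 140m) = EUR 140 million under this reading.
print(max_fine("prohibited_practices", 2_000_000_000))
```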

Parliament has made a number of changes to the regulation, some of the most important of which are:


  • The regulation now takes "generative" AI into account.
  • When generative AI is used, there is an obligation to disclose that generative AI has been used.
  • More AI systems are defined as "high risk" – especially those that recommend content to social media users.
  • Individuals have been given the right to complain to a supervisory authority, much as under the GDPR.
  • Special restrictions are introduced on contract terms that limit liability in agreements on high-risk AI.

There is a lot to discuss here, and it is relevant to many people. We will look into this in more detail in the autumn.

Fines are starting to come for AI that cannot be explained

As a reader of this newsletter, and thus well acquainted with the GDPR and technology law, you will be well aware of the challenges associated with explaining algorithms and AI. It is interesting that there are now cases where the data protection authorities do not accept that the reasons for a decision cannot be explained.


A new case comes from Germany, where a person had applied for a credit card and provided a number of personal data. The application was automatically rejected despite a good credit score and high income. The person then asked the bank to explain why the application had been refused, and the bank declined to provide this information. The German Data Protection Authority determined that when data controllers make automated decisions using algorithms, they are required to provide concrete information about the databases and factors that influence the decision. The case ended with a fine of EUR 300,000.
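The practical lesson is that a controller cannot explain what it never recorded. Here is a minimal sketch of one way to keep automated decisions explainable – persisting the per-factor contributions alongside the outcome. The factors and weights are invented for illustration; a real credit model would be far more complex:

```python
# Minimal sketch of logging the factors behind an automated decision so
# it can be explained afterwards. Factors and weights are invented for
# illustration; a real credit model would be far more complex.

WEIGHTS = {"credit_score": 0.5, "income": 0.3, "existing_debt": -0.4}

def decide(applicant: dict) -> dict:
    contributions = {k: WEIGHTS[k] * applicant[k] for k in WEIGHTS}
    score = sum(contributions.values())
    return {
        "approved": score >= 1.0,
        # Persisting the per-factor contributions is what makes the
        # decision explainable to the data subject later.
        "explanation": contributions,
    }

print(decide({"credit_score": 0.9, "income": 1.2, "existing_debt": 0.5}))
```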


The Uber case, which I wrote about in our previous newsletter, is also relevant: there, it was criticized that the AI produced proposed solutions that the human employee simply accepted without making an independent assessment of what the outcome should be. It is also clear that, AI Act or not, AI is not currently used in a legal vacuum – there are many rules that already apply.


Read about the case from Germany here.

Do you have any questions?