Newsletter

Privacy Corner

by Eva Jarbekk


There is hardly a meeting or a conference without ChatGPT being mentioned at the moment. Problems with copyright, problems for schools with cheating in exams, problems with privacy, the fact that the Italian Data Protection Authority stopped its use for a period – you name it. And these issues are important. My colleague Inge Brodersen has written an excellent article about this and the actions of the Italian Data Protection Authority here.

About ChatGPT

Seen from the perspective of this newsletter – what is the legal basis for processing personal data? When ChatGPT can write articles about both you and me – and make up and distribute false facts along the way – what legal basis can be used? Hardly consent. Does legitimate interest work? I have great doubts.


And at least one person has demanded access from OpenAI to the personal data the AI used when it wrote articles about them which (wrongly) stated that they had passed away. And what if a programmer sends code (with sensitive personal data) to ChatGPT to test whether it works – and the information later becomes part of ChatGPT's training material? This actually happened, and the case is discussed in The Economist Korea, see here. It should have an impact on how you use ChatGPT internally in companies.


Perhaps it is good that the EDPB has set up a separate task force on this. I'm guessing there will be some relatively principled viewpoints from them going forward. In any case, this will be another exciting field to follow! Below are some other matters I believe should be of interest. Happy reading!

EDPB's final guidelines on the right of access

There seems to be a trend towards increasingly in-depth access requests. For some controllers, it is quite a challenge to account for, e.g., which personal data is transferred to which countries and which security measures have been taken. It is therefore relevant that the EDPB has updated its guidelines on how such access requests should be understood and handled. The final updated guidelines were published on 17 April.


The guidelines address several points in this context. The key and most relevant ones are as follows:


  • The user does not need to give reasons for the access request, and it is not up to the controller to assess whether the request will actually contribute to verifying the lawfulness of the processing or the exercise of the rights.

  • The right of access includes three different components:
    1. confirmation of whether data about the person is processed or not,
    2. access to the personal data itself, and
    3. access to information about the processing, e.g., purpose, categories of data and its recipients, duration of the processing and the user's rights.
  • The ways to grant access may vary depending on the amount of data and the complexity of the processing carried out. Unless otherwise expressly stated, the request should be understood to apply to all personal data about the user. The controller may ask the user to specify the request if they have a large amount of information.
  • The controller must search for the data subject's personal data in all IT and archiving systems based on search criteria that reflect the way the information is structured, e.g., name and customer number. The provision of data and other information shall be concise, understandable and easily accessible.
  • As regards how a request for access is made, there are no specific requirements as to format. The controller shall ensure that there are appropriate communication channels that are easily accessible to the user. The user is nevertheless not required to use these channels and may instead send the request directly to other parts of the controller's organisation.

The guidelines can be found here.


In addition, I refer to my previous comments on recent court decisions holding that the right of access is broad – it usually covers which data processors have had access to the information and, in many cases, probably also which internal employees have had access.

Norway – The Norwegian Data Protection Authority recommends review of cookies and tracking mechanisms on websites

The Norwegian Data Protection Authority has recommended that all businesses review their websites for cookies or other tracking mechanisms. The recommendation came after, inter alia, Sveriges Radio revealed that over 100 pharmacies had shared customers' personal data with Facebook.


The Norwegian Data Protection Authority emphasises, inter alia, that it is concerned that several Norwegian websites use cookies or other tracking mechanisms without being aware of what kind of information the tracking technologies collect or with whom the information is shared. For the users of a website, this can be a significant privacy risk, while for the business it can be both a legal and a reputational risk. The Norwegian Data Protection Authority therefore emphasises that all businesses should be conscious of the data protection rules when they implement analytics and marketing solutions on their websites.


The Norwegian Data Protection Authority otherwise emphasises that the concern applies to both private and public websites. I do not find the recommendation surprising, and my assumption is that enforcement of the cookie rules will become much more important in Norway in the future.
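For a very first technical pass at such a review, one can script a check of which cookies a page sets in its HTTP responses. Below is a minimal sketch, with a placeholder URL; note that it will not catch cookies set by JavaScript (which is how most analytics and marketing tags work), so it is no substitute for a browser-based audit or a dedicated scanning tool.

```python
# Minimal sketch: list the cookies a page sets server-side.
# Placeholder URL; this does NOT capture cookies set by JavaScript
# (e.g., analytics/marketing tags), which need a browser-based audit.
import requests

def list_cookies(url: str) -> None:
    response = requests.get(url, timeout=10)
    for cookie in response.cookies:
        print(f"{cookie.name}: domain={cookie.domain}, "
              f"expires={cookie.expires}, secure={cookie.secure}")

if __name__ == "__main__":
    list_cookies("https://www.example.com")  # hypothetical URL
```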


You can read more about the case on the Norwegian Data Protection Authority's website here.

Briefly about third-country transfers and the new EU-US agreement

The EU Parliament is still critical of the new transfer agreement between the EU and the US and is asking for the draft agreement to be improved. However, the resolution adopted in April is not binding on the Commission.


Read more about this here.


The last I have heard about this case is that it is currently uncertain how the Commission will proceed. It is conceivable that there will be an adequacy decision for the US within a few months, but I have also heard informed sources say that it could take until the end of the year or even longer. As written before – it is worth having a good template for Transfer Impact Assessments, as they are unlikely to disappear anytime soon.

About Klarna

The Administrative Court in Sweden has recently handed down a judgment concluding that Klarna Bank AB has violated the GDPR. The consequence of the violation is a penalty fee of SEK 6,000,000 (approximately EUR 530,000).


Klarna was first ordered to pay a fee of SEK 7,500,000 (approximately EUR 660,000) by the Swedish Data Protection Authority, IMY. The company appealed the decision, claiming that the only thing it had breached was a set of non-binding guidelines from the EDPB. Many consider precisely that argument important when complying with a GDPR that is rough around the edges, given that the EDPB's guidelines do not go through the same democratic process as the GDPR itself. I will come back to the extent to which the EDPB's guidelines were decisive here, but it is clear that the court found that the duty to provide information had not been breached to the extent IMY believed.


However, the Administrative Court came to the same result as IMY but adjusted the fee down by one and a half million SEK. In its assessment, the court emphasised that Klarna had not given its online users sufficient information about how and for what purposes their personal data was processed, nor sufficient information about the rights the users had.


There is currently not much information available about the case, but it is discussed here.

Austria – case against newspaper

The Austrian Data Protection Authority, DSB, has now decided the "Pay or Okay" case (introduced in the March newsletter).


The case concerned the requirements for consent in GDPR Article 4. The Austrian newspaper Der Standard had offered users of its online newspaper a choice between consenting to cookies or buying a subscription for EUR 6 a month. NOYB argued that this was not privacy friendly, as it believed the user in practice did not have the opportunity to give free consent, the only alternative being to subscribe to the newspaper.


The DSB concluded that such a "pay or okay" arrangement was not legal under the consent requirements in GDPR Article 4(11). It justified this, inter alia, on the grounds that consent must be freely given, specific, informed and unambiguous. Furthermore, it emphasised that "the granularity of consent" is a key element in the assessment: consent must be specific to each processing purpose, which means that where there are several processing purposes, consent must be given for each of them. That was not the case here – Der Standard had bundled consent for marketing, analytics and "social media plug-ins".
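To illustrate what "granularity" means in practice – this is only a sketch, and the structure is hypothetical – consent would need to be recorded per purpose rather than as one bundled yes/no:

```python
# Illustrative sketch of granular consent: one recorded choice per
# processing purpose (the purposes are taken from the Der Standard
# case), instead of a single bundled "accept all" flag.
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    marketing: bool = False
    analytics: bool = False
    social_media_plugins: bool = False

# Granular: a user can accept analytics while refusing the rest.
consent = ConsentRecord(analytics=True)

# Bundled consent would collapse all three purposes into one flag,
# which is what the DSB found incompatible with Article 4(11).
```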


The principle of "Pay or OK" itself was not directly addressed, as the DSB chose to decide the case on the granularity of the consent. This is exactly the area we see many Data Protection Authorities currently focusing on.


Read more about the case here.

More about the role of the Data Protection Officer

The Lithuanian Data Protection Authority has published a report based on inspections of various Data Protection Officers. Among the findings are the following:


  • Conflict of interest: some companies appointed the CEO as Data Protection Officer (DPO), while in other cases the DPO had tasks that led them to determine the means and purposes of the processing.
  • Some companies had several DPOs – without it being determined what the responsibilities were. The Lithuanian Data Protection Authority emphasises that if there is a need to have more resources working with data protection, it is recommended to set up a group consisting of one DPO and a team – not more DPOs.
  • Where the DPO role was purchased as a DPO-as-a-service, the service was often limited to, e.g., 60 hours per year, without clarifying how continuity of the service was to be ensured.
  • Lack of clarity in how the DPO should report to senior management.
  • Little information was given to employees about the DPO's role. While contact information for the DPO is often published on a company's website, there was not always corresponding information available to employees.
  • When advice from the DPO was not followed, there was a lack of documentation of why, of who made such a decision, and how.
  • Few DPOs carried out internal controls in the company.

The report is available here, albeit in Lithuanian.


There is reason to believe that similar discoveries could be made elsewhere. The EDPB's coordinated investigation into the DPO role will probably focus on the same matters. In Norway, we are still waiting for the results of the Norwegian Data Protection Authority's investigations from quite some time ago. I believe it will be difficult for anyone other than fairly large companies and agencies to achieve full independence for the DPO role, because it requires significant resources. Many also wonder what to do with a DPO whom you cannot really ask for advice on how to proceed. Some pragmatic solutions will probably have to be adopted, but it is also clear that there will be a much stronger focus on the DPO's independence in the future. And – as some of us said when the GDPR came – if you are not required to have a DPO, you are probably better served by a "privacy officer" who is free to give advice and recommendations to the organisation without at the same time being required to audit them.

More data protection legislation in the US

At a seminar last week, I heard ICT Norway claim that "the EU regulates while the US operates". It was said with a sigh over the complicated GDPR that Europeans have to deal with. However, I believe this is an outdated view. The GDPR is complicated to understand and very detailed, but things are becoming at least as complicated in the US, where a seventh state is now introducing a data protection law.


A number of US privacy lawyers I know think that Europe is lucky to have only one set of regulations to deal with – they already have many – until one day (perhaps) they manage to agree on a comprehensive set of regulations for all states. But that seems likely to take a while longer.


On 13 April 2023, the Indiana Senate passed Senate Bill 5 (SB 5), a new consumer data protection law. Consumers shall receive an accessible, clear and informative privacy policy which, inter alia, contains information about what kind of personal data the business holds, the purpose behind the collection of the data, and the rights the consumer has. Furthermore, the law requires businesses to carry out and document data protection impact assessments for the various processing activities that involve personal data. An American privacy lawyer I know says with a wry smile that the requirements for a DPIA in the US are greater than in Europe. And that may well be true.


For those interested, there is more about SB 5 in Indiana here.


It is likely that even more states will follow suit and introduce data protection regulations. However, the difference between the regulations is considerable – and frustrating. We may have to come back to this, as it is relevant for businesses with activities in the US. A different set of regulations in each state is of course difficult for businesses that work across several states, a bit like it was in Europe before GDPR – except that we didn't have the stiff fines or particularly effective Data Protection Authorities at the time.


Read more about developments in the US here.


And here is an overview of the legislation in the US.

Practical advice for privacy by design from England?

England aims to have effective and practical data protection. As you know, they have proposed a wide range of changes to their version of the GDPR, and many of these have been criticised for weakening data protection. In any case, it is good to have a Data Protection Authority that is concerned with providing practical advice. Now the ICO has come up with both advice and illustrative videos for product development.


The ICO has created a new guide to help UX designers, product managers and software engineers embed data protection into their products and services from the start. The guide contains examples of good practice as well as practical steps that organisations can take to comply with data protection legislation when designing websites, apps or other technology products and services.


See summary of the guide and link to the videos here.


The ICO divides the product development process into six phases: kick-off, research, design, development, launch and post-launch. The guide provides concrete advice for each of the six phases on how the business should process personal data. This may actually be useful for others as well.


See the guide to the six phases here.

Tesla and built-in privacy

Perhaps Tesla could have benefited from reading the ICO's advice a little more carefully. This is certainly no news to readers of this newsletter, but it is nevertheless thought-provoking that even companies that should be extremely professional miss the mark on data protection so emphatically.


It has been world news that former Tesla employees claim there were several cases of employees sharing intimate and invasive videos/photos of Tesla owners in their cars. The videos included, inter alia, a naked man approaching a Tesla, car accidents and outbursts of rage. The videos were recorded via cameras built into the cars to assist Tesla's self-driving function/Autopilot – not for the entertainment of employees.


Unsurprisingly, Tesla is now being sued as a result. The claimant alleges, inter alia, that Tesla employees shared "highly invasive videos and images" from users' cars for their own "tasteless and tortious entertainment" and "the humiliation of those surreptitiously recorded."


Read more about the lawsuit here.


The Guardian writes about the case here.


We shall come back to how often lawsuits in the field of privacy pay off for the individuals.

Two new fines for TikTok – from England and from the Netherlands

TikTok has recently been fined by the ICO in the amount of GBP 12,700,000 for unlawfully processing the personal data of 1,400,000 children who were under the age of 13 and who used TikTok without parental consent.


The age limit for use of TikTok is, according to its own terms and conditions, 13 years. The fine is based on, inter alia, TikTok failing to prevent children under the age of 13 from accessing the platform, failing to delete children under the age of 13 from the platform, and failing to discover that children under the age of 13 were using the platform.


The case dates back to May 2018, and the fine was initially proposed at over GBP 25,000,000. Following the fine, TikTok has emphasised that it has changed its practices since the ICO began investigating the matter. The site now uses more than merely users' self-reported ages when trying to determine how old they are. This includes, inter alia, training moderators to identify underage accounts and providing tools for parents to request the deletion of their minor children's accounts.


Read more about the case here.


TikTok has also been fined EUR 750,000 by the Dutch Data Protection Authority, AP, for breaching GDPR Article 12(1). The breach consisted of TikTok providing and presenting its privacy policy only in English, with no Dutch alternative.


When a user creates a TikTok account, they are informed in Dutch that they accept TikTok's privacy policy. However, the Dutch Data Protection Authority's investigation revealed that, between 25 May 2018 and 28 July 2020, TikTok provided its privacy policy to Dutch users – including children – only in English. This applied both during the registration process and when a logged-in user wanted to consult the privacy policy in the TikTok app.


Read more about the case here.


The question of which language a privacy policy must be provided in comes up at regular intervals. Different countries in Europe have slightly different practices, and this must be checked locally. Some countries have rules that it must be given in the country's own language; others merely require that it is given in a language the users understand. In practice, I do not really see this as a big problem.

The French Data Protection Authority on geolocation

Nowadays, when rented cars, scooters and other mobile units are common, it is natural that Data Protection Authorities focus on whether such services process personal data lawfully.


The French Data Protection Authority, CNIL, has investigated the use of geolocation in such services. In this context, CNIL focused on the company CITYSCOOT, which rents out scooters. It looked particularly at the company's collection of data, and at whether information had been provided and consent obtained from users before data on their mobile phones was processed. It was revealed that during the rental of a scooter, the company collected the geolocation of the vehicle every 30 seconds. In addition, the company kept records of users' journeys over a long period of time.


The company justified the collection of geolocation data on the grounds that it was used for processing traffic offences, customer complaints, user support, and handling claims and thefts. However, the CNIL concluded that none of these purposes justified the collection of such detailed and extensive data. Furthermore, it was highlighted that the company could have offered an identical service without geolocating its customers. The CNIL also noted that three contracts entered into with CITYSCOOT's data processors did not contain all the information required by the GDPR.


The company had also used a reCAPTCHA mechanism, provided by Google, which collected hardware and software information (such as device and application data). The mechanism was activated when creating an account in the CITYSCOOT mobile application, when logging in, and when going through the forgotten-password procedure on the website. No prior consent was obtained, nor were users informed that the company had access to and stored their information. CITYSCOOT was fined EUR 125,000.


Read more about the case here.

Inspection of Moderaterna in Sweden regarding personalised videos in the election

The Swedish Data Protection Authority, IMY, is now launching an investigation into the political party Moderaterna. The investigation is linked to a number of complaints IMY has received about Moderaterna's use of personalised video greetings ahead of the 2022 election. The complaints submit, inter alia, that Moderaterna did not have sufficient information on their websites about how personal data was used during the election campaign, that the services used for the video greetings store data in the US, and that it was technically possible to gain access to other people's personalised video greetings. It is quite obvious that this type of personalisation – if it has occurred – requires both clear information and a clear legal basis. The fact that IMY also raises the issue of third-country transfers in the same case raises the stakes.


Read more about the case here.

Finally, a strict case from the Spanish Data Protection Authority

The Spanish Data Protection Authority, AEPD, has imposed a fine of EUR 5,000 on a controller for disclosing, without legal basis, the e-mail addresses of approx. 190 recipients who were put in CC rather than BCC. The Data Protection Authority considered this a violation of Article 5(1)(f) and Article 32(1) of the GDPR.


It has been difficult to get hold of the details of the matter, but it seems to have involved an invitation to a paid dinner/event. It is fair to say that the reaction seems somewhat strict – especially given that a transfer to a third country, for example the US, can often be allowed if the information in question is publicly known, e-mail addresses being one of the standard examples. It naturally matters what kind of e-mail addresses were involved, and I have no insight into that. But maybe I will think twice in the future before putting people in CC or BCC and pressing send.
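For what it is worth, the technical fix is simple: pass bulk recipients only in the SMTP envelope (effectively BCC), so that no recipient sees the others' addresses. A minimal sketch in Python, with hypothetical addresses and server name:

```python
# Minimal sketch: e-mail many recipients without disclosing their
# addresses to each other. Addresses and server name are hypothetical.
import smtplib
from email.message import EmailMessage

recipients = ["guest1@example.com", "guest2@example.com"]  # ~190 in the case

msg = EmailMessage()
msg["From"] = "host@example.com"
msg["To"] = "host@example.com"  # visible header; not the real list
msg["Subject"] = "Invitation to dinner"
msg.set_content("You are invited to our dinner event.")
# No "Cc"/"Bcc" headers are set: the recipient list is passed only in
# the SMTP envelope below, so it never appears in anyone's copy.

with smtplib.SMTP("smtp.example.com") as server:
    server.send_message(msg, to_addrs=recipients)
```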


Read more about the case here.

Do you have any questions?