by Eva Jarbekk
Wishing you all a great month of May!
There is hardly a meeting or a conference without ChatGPT being mentioned at the moment. Problems with copyright, problems for schools with cheating at exams, problems with privacy, the fact that the Italian Data Protection Authority stopped its use for a period – you name it. And it is important. My colleague Inge Brodersen has written an excellent article about this and the actions of the Italian Data Protection Authority here.
Seen from the perspective of this newsletter – what is the legal basis for processing personal data? When ChatGPT can write articles about both you and me – and make up and distribute false facts along the way – what legal basis can be used? Hardly consent. Does legitimate interest work? I have great doubts.
And (at least) one person has demanded that OpenAI disclose what personal data the AI used when it wrote articles about them which (wrongly) stated that they had passed away. And what if a programmer sends code (containing sensitive personal data) to ChatGPT to test whether it works – and the information later becomes part of ChatGPT's training material? This actually happened and is discussed in The Economist Korea, see https://www.cpomagazine.com/cyber-security/samsung-employees-fed-sensitive-data-to-chatgpt-while-using-it-to-check-code-create-presentations/. This should have an impact on how you use ChatGPT internally in companies.
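One internal precaution companies can take is to scrub obvious personal data from code or text before it is pasted into an external tool. A minimal sketch in Python, assuming simple illustrative regex patterns (a real deployment would need a far more complete list of identifiers):

```python
import re

# Illustrative patterns only - emails, Norwegian national ID numbers
# (11 digits) and a simple Norwegian phone format. Not a complete
# inventory of personal data.
PATTERNS = {
    "EMAIL": re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"),
    "NO_NATIONAL_ID": re.compile(r"\b\d{11}\b"),
    "PHONE": re.compile(r"\b(?:\+47\s?)?\d{8}\b"),
}

def redact(snippet: str) -> str:
    """Replace each match with a labelled placeholder before the
    snippet leaves the company, e.g. when pasted into ChatGPT."""
    for label, pattern in PATTERNS.items():
        snippet = pattern.sub(f"<{label}>", snippet)
    return snippet

if __name__ == "__main__":
    code = 'user = {"email": "ola.nordmann@example.com", "id": "12345678901"}'
    print(redact(code))
```

This is of course no substitute for a policy on what may be shared with external services at all, but it illustrates that a technical guardrail can sit between the developer and the tool.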
Perhaps it is good that the EDPB has set up a separate task force on this. I expect some fairly principled viewpoints from it going forward. In any case, this will be another exciting field to follow! Below are some other matters I believe should be of interest. Happy reading!
It looks like there is an increasing trend for more and more in-depth access requests. For some, it is quite a challenge when you have to account for, e.g., which personal data is transferred to which countries and which security measures have been taken. It is therefore relevant that the EDPB has updated its guidelines for how such access requests should be understood and handled. A final updated guideline was published on 17 April.
The guidelines address and review several points in this context. Key and relevant points are as follows:
The guidelines can be found here: https://edpb.europa.eu/system/files/2023-04/edpb_guidelines_202201_data_subject_rights_access_v2_en.pdf
In addition, I note previous comments about new court decisions holding that the right of access is broad – it usually covers which data processors have had access to information, and in many cases probably also which internal employees have had access.
The Norwegian Data Protection Authority has recommended that all businesses review their websites for cookies or other tracking mechanisms. The recommendation was due to, inter alia, the fact that Sveriges Radio recently revealed that over 100 pharmacies have shared customers' personal data with Facebook.
The Norwegian Data Protection Authority otherwise emphasizes that the concern applies to both private and public websites. I do not think the recommendation is surprising, and my assumption is that enforcement of cookie regulations will become much more important in Norway in the future.
You can read more about the case on the Norwegian Data Protection Authority's website here: https://www.datatilsynet.no/aktuelt/aktuelle-nyheter-2023/datatilsynet-oppfordrer-virksomheter-til-a-gjennomga-nettsidene-sine/
The EU Parliament remains critical of the new transfer agreement between the EU and the US and is asking for the draft adequacy decision to be improved. However, the resolution adopted in April is not binding on the Commission.
Read more about this here: https://www.euractiv.com/section/data-privacy/news/meps-to-call-for-renegotiation-of-eu-us-data-transfer-framework/
The latest I have heard about this case is that it is currently uncertain how the Commission will proceed. It is conceivable that there will be an adequacy decision on the US within a few months, but I have also heard informed sources say that it could take until the end of the year or even longer. As written before, it can be good to have a solid template for Transfer Impact Assessments, as they are unlikely to disappear anytime soon.
The Administrative Tribunal in Sweden has recently handed down a judgment concluding that Klarna Bank AB has violated the GDPR. The consequence of the violation is a penalty fee of SEK 6,000,000 (approximately EUR 530,000).
Klarna was originally ordered by the Swedish Data Protection Authority, IMY, to pay a fee of SEK 7,500,000 (approximately EUR 660,000). The company appealed the decision, arguing that the only thing it had breached was a set of non-binding guidelines from the EDPB. Many consider precisely that argument important for compliance with a GDPR that is rough around the edges, given that the EDPB's guidelines do not go through the same democratic process as the GDPR itself. I will come back to the extent to which the EDPB's guidelines were decisive here, but it is clear that the court found that the duty to provide information had not been breached to the extent IMY believed.
However, the Administrative Tribunal came to the same result as IMY but adjusted the fee down by one and a half million SEK. In the assessment, the court emphasised that Klarna had not given its online users sufficient information about how and for what purposes the personal data was processed. Nor was sufficient information given about what rights the users had.
There is currently not much information available about the case, but it is discussed here: https://www.dagensjuridik.se/nyheter/domstol-satter-ned-sanktionsavgift-for-klarna-bank-efter-dataskyddsmiss/
The Austrian Data Protection Authority, DSB, has commented on the "Pay or Okay" case (the introduction to this was mentioned in the March newsletter).
The case concerned the requirements for consent in GDPR Article 4. The Austrian newspaper Der Standard had offered users of its online newspaper a choice: either consent to cookies or buy a subscription for EUR 6 a month. NOYB argued that this was not privacy friendly, as it believed users in practice could not give free consent when the only alternative was to subscribe to the newspaper.
The DSB concluded that such a "pay or okay" model was not legal under the requirements for consent in GDPR Article 4(11). The DSB justified this, inter alia, by noting that consent must be free, specific, informed and unambiguous. Furthermore, it emphasised that "the granularity of consent" is a key element of the assessment: consent must be specific for each possible processing purpose, which means that where there are several processing operations, consent must be given for each of them. That was not the case here. Der Standard had bundled consent for marketing, analytics and "social media plug-ins".
The principle of "Pay or OK" itself was not directly commented upon, as the DSB chose instead to address granularity of consent. This is exactly the area many Data Protection Authorities are currently focusing on.
Read more about the case here: https://gdprhub.eu/index.php?title=DSB_(Austria)_-_2023-0.174.027&mtc=today
The Lithuanian Data Protection Authority has published a report based on inspections of various Data Protection Officers. Among its findings are the following:
The report is available here, albeit in Lithuanian: https://vdai.lrv.lt/uploads/vdai/documents/files/DAP%20tikrinimu%20apibendrinimas%202023-04.pdf
There is reason to believe that similar discoveries could be made elsewhere. The EDPB's coordinated investigation into the DPO role will probably focus on the same matters. In Norway, we are still waiting for the results of the Norwegian Data Protection Authority's investigations from quite some time ago. I believe it will be difficult for anyone other than fairly large companies and agencies to ensure full independence for the DPO role, because it requires significant resources. Many also wonder what to do with a DPO whom you cannot really ask for advice on how to proceed. Some pragmatic solutions will probably have to be adopted, but it is also clear that there will be a much stronger focus on the DPO's independence in the future. And – as some of us said when the GDPR came – if you are not required to have a DPO, you are probably better served by a "privacy officer" who is free to give advice and recommendations to the organisation without at the same time being required to audit them.
At a seminar last week, I heard ICT Norway claim that "the EU regulates while the US operates". It was said with a sigh over the complicated GDPR that Europeans have to deal with. However, I believe this is an outdated view. The GDPR is complicated to understand and very detailed, but things are becoming at least as complicated in the US, where a seventh state is now set to introduce a comprehensive data protection law.
A number of US privacy lawyers I know think that Europe is lucky to have only one set of regulations to deal with – they already have many – until one day (perhaps) they manage to agree on a comprehensive set of regulations for all states. But that seems likely to take a while longer.
For those interested, there is more about SB 5 in Indiana here: https://www.huntonprivacyblog.com/2023/04/18/indiana-likely-to-become-seventh-state-to-enact-a-comprehensive-state-privacy-law/
It is likely that even more states will follow suit and introduce data protection regulations. However, the difference between the regulations is considerable – and frustrating. We may have to come back to this, as it is relevant for businesses with activities in the US. A different set of regulations in each state is of course difficult for businesses that work across several states, a bit like it was in Europe before GDPR – except that we didn't have the stiff fines or particularly effective Data Protection Authorities at the time.
Read more about developments in the US here: https://iapp.org/news/a/2023-state-privacy-prospects-bring-new-paradigm/
And here is an overview of the legislation in the US: https://iapp.org/resources/article/us-state-privacy-legislation-tracker/
The UK, after all, aims to have effective and practical data protection. As you know, it has proposed a wide range of changes to its version of the GDPR, and many of these have been criticised for weakening data protection. In any case, it is good to have a Data Protection Authority that is concerned with providing practical advice. The ICO has now published both advice and illustrative videos for product development.
E.g., the ICO has created a new guide to help UX designers, product managers and software engineers embed data protection into their products and services from the start. The guide contains both examples of good practice as well as practical steps that organisations can take to comply with data protection legislation when designing websites, apps or other technology products and services.
See summary of the guide and link to the videos here: https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2023/03/ico-shares-resources-to-help-designers-embed-data-protection-by-default/
The ICO divides the product design lifecycle into six phases: kick-off, research, design, development, launch and post-launch. The guide provides concrete advice for each of the six phases on how the business should handle personal data. This may well be useful for others too.
See the guide to the six phases here: https://ico.org.uk/for-organisations/privacy-in-the-product-design-lifecycle/
Perhaps Tesla could have benefited from reading the advice from the ICO a little more carefully. This is certainly no news to readers of this newsletter, but it is nevertheless thought-provoking that even companies that should be extremely professional so emphatically miss the mark on data protection.
It has made world news that former Tesla employees claim there were several cases in which employees shared intimate and invasive videos and photos of Tesla owners in their cars. The material included, inter alia, a naked man approaching a Tesla, car accidents and outbursts of rage. The videos were recorded via cameras built into the cars, which were installed to assist Tesla's self-driving/autopilot function. Not for the purpose of entertaining employees.
Unsurprisingly, Tesla is now being sued as a result. The claimant alleges, inter alia, that Tesla employees shared "highly invasive videos and images" from users' cars for their own "tasteless and tortious entertainment" and "the humiliation of those surreptitiously recorded."
Read more about the lawsuit here: https://www.cbsnews.com/news/tesla-lawsuit-privacy-concerns-autopilot-reuters/
The Guardian writes about the case here: https://www.theguardian.com/technology/2023/apr/07/tesla-intimate-car-camera-images-shared
We shall come back to how often lawsuits in the field of privacy pay off for the individuals.
TikTok has recently been fined GBP 12,700,000 by the ICO for unlawfully processing the personal data of 1,400,000 children who were under the age of 13 and who used TikTok without parental consent.
The age limit for use of TikTok is, according to its own terms and conditions, 13 years. The fine is based on, inter alia, TikTok failing to prevent children under the age of 13 from accessing the platform, failing to delete under-13 users from the platform, and failing to identify that children under the age of 13 were using the platform.
The case dates back to May 2018, and the fine was initially proposed at over EUR 25,000,000. Following the fine, TikTok has emphasised that it has changed its practices since the ICO began investigating the matter. The platform now relies on more than merely users' self-reported ages when assessing how old they are. Measures include, inter alia, training moderators to identify minors' accounts and providing tools for parents to request the deletion of their minor children's accounts.
Read more about the case here: https://www.theguardian.com/technology/2023/apr/04/tiktok-fined-uk-data-protection-law-breaches
A related TikTok decision by the Dutch Data Protection Authority is discussed here: https://gdprhub.eu/index.php?title=AP_(The_Netherlands)_-_TikTok
Nowadays, when rental cars, scooters and other mobile devices are common, it is natural that Data Protection Authorities examine whether such agreements are entered into lawfully.
The French Data Protection Authority, CNIL, has investigated the use of geolocation for such services. In this context, CNIL focused on the company CITYSCOOT, which rents out scooters. It looked in particular at the company's data collection, and at whether users had been informed and had given consent before data on their mobile phones was processed. It was revealed that, during the rental of a scooter, the company collected the vehicle's geolocation every 30 seconds. In addition, the company kept track of users' journeys over a long period of time.
The company justified the collecting of the geolocation by the fact that it was used for processing traffic offences, customer complaints, user support, and handling claims and thefts. However, the CNIL concluded that none of these reasons justified the collection of such detailed and extensive data. Furthermore, it was highlighted that the company could have offered an identical service without geolocation of the customers. The CNIL also noted that three contracts entered into with CITYSCOOT's data processors did not contain all the information required by the GDPR.
The company had also used a reCAPTCHA mechanism, provided by Google, which collected hardware and software information (such as device and application data). The mechanism was activated when creating an account in the CITYSCOOT mobile application, when logging in, and when going through the forgotten-password procedure on the website. No prior consent was obtained, nor were users informed that the company had access to and stored their information. CITYSCOOT was given a fine of EUR 125,000.
Read more about the case here: https://www.cnil.fr/en/geolocation-rental-scooters-cityscoot-fined-eu125000
The Swedish Data Protection Authority, IMY, is now launching an investigation into the political party Moderaterna. The investigation is linked to a number of complaints IMY has received about Moderaterna's use of personalised video greetings ahead of the 2022 election. The complaints submit, inter alia, that Moderaterna do not provide sufficient information on their websites about how personal data was used during the election campaign, that the services used for the video greetings store data in the US, and that it has technically been possible to gain access to other people's personalised video greetings. It is quite obvious that this type of personalisation – if it has occurred – requires both clear information and a clear legal basis. The fact that IMY also raises the issue of third-country transfers in the same case raises the stakes.
Read more about the case here: https://www.imy.se/nyheter/imy-inleder-tillsyn-av-moderaterna/
The Spanish Data Protection Authority, AEPD, has imposed a fine of EUR 5,000 on a controller for sending an e-mail to approximately 190 recipients without a legal basis. The recipients were placed in CC rather than BCC, exposing their addresses to one another. The Data Protection Authority considered this a violation of Article 5(1)(f) and Article 32(1) of the GDPR.
It has been difficult to get hold of the details of the matter, but it seems to have involved an invitation to a paid dinner/event. It is fair to say that the reaction seems somewhat strict, especially given that a transfer to a third country, for example the US, can often be allowed if the information in question is publicly known – email addresses being one of the usual examples. It naturally matters what kind of e-mail addresses were involved, and I have no insight into this. But maybe I will think twice before putting people in CC or BCC in the future and pressing send.
Read more about the case here: https://gdprhub.eu/index.php?title=AEPD_(Spain)_-_EXP202102433
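For bulk mailings, the safe pattern is to keep recipient addresses out of the message headers entirely and pass them only as envelope recipients. A minimal sketch in Python (sender and guest addresses are made up for illustration):

```python
from email.message import EmailMessage

def build_invitation(sender: str, recipients: list[str]) -> EmailMessage:
    """Build a bulk invitation whose visible headers contain no
    recipient addresses. The guest list is handed to the mail
    server separately, so no recipient can see the others."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = sender  # address the mail to yourself; guests stay hidden
    msg["Subject"] = "Dinner invitation"
    msg.set_content("You are cordially invited ...")
    # Deliberately no "Cc" header with the guest list, and no "Bcc"
    # header at all: with smtplib, the list goes to
    # SMTP.send_message(msg, to_addrs=recipients) instead.
    return msg

recipients = ["guest1@example.com", "guest2@example.com"]
msg = build_invitation("events@example.org", recipients)
# No guest address appears anywhere in what each recipient receives:
assert all(addr not in msg.as_string() for addr in recipients)
```

The design point is that CC (and even a literal "Bcc" header, if a client serializes it) puts addresses into the message itself, while envelope recipients never reach the other recipients' inboxes.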