Newsletter

Privacy Corner

by Eva Jarbekk

Published:


Artificial intelligence – that’s what everyone’s still talking about. How can we introduce Copilot? What documentation do we need? Is the agreement with OpenAI balanced? Isn’t OpenAI DPF certified? Can we let AI train on our customer data – and what does it take? Meanwhile, NOYB has sued OpenAI for making “erroneous” claims about individuals and for being unable to correct the errors, as the GDPR requires. The EDPB has its ongoing task force on the legality of OpenAI, and it has – unusually – just published a preliminary report that leaves the door ajar for legitimate interest as a basis for scraping third-party data, provided a number of conditions are safeguarded – while at the same time writing that it has not concluded on which legal basis is correct. The EDPB also acknowledges that LLMs hallucinate, and writes that users must be informed accordingly. Perhaps the EDPB has, with that statement, effectively decided NOYB’s lawsuit. Although I’m not sure I believe that. It’s complicated. And fun – good reading! 

Is privacy important?

Privacy has become an area where services are provided on a large scale. A new report by Coherent Market Insights shows that the global GDPR service market was valued at over a billion dollars in 2023, and it is expected to grow by over 22% annually through 2030. It is a big industry!

The GDPR service market is mainly driven by the increased focus on data privacy regulations across different sectors in the EU. The purpose of the GDPR is to strengthen and ensure the protection of personal data in Europe. The fact that breaches of the GDPR are sanctioned with heavy fines has forced organizations across sectors to take action and invest in GDPR consulting and assessment services. This, along with increased use of cloud-based services, has increased – and continues to increase – demand for such services, because public cloud sharing increases the need for third-party access management and control. 

The use of AI and analytics tools is a clear trend. There is high demand for products that can automate consent processing, data subject rights management, and data access and classification. The development of and demand for GDPR services is a natural consequence of technological developments and of the need to comply with the GDPR. 

Anyone who reads this newsletter works with privacy. There is work for us in the future. You can read more about privacy as a service area here.

Congress in the United States prohibits the use of Copilot – what do you do?

The U.S. Congress has prohibited the use of AI language models like Copilot and ChatGPT, citing poor security. Microsoft launched Copilot in 2024, and we see many businesses adopting the AI language model in their daily work. It is unclear which terms and conditions Congress was presented with, but the concern seems to be the risk of data being shared with unapproved cloud providers. 

There is little public information about exactly what Congress has reacted to. In any case, implementing Copilot and other AI requires vigilance. In my opinion, there will often be a need to do a Data Protection Impact Assessment (DPIA). You also need to be very careful about what information is fed into the service. Many discourage letting such AI models loose on the entire existing Office 365 file library. You must also have good internal guidelines for using the service. These typically cover when the service can be used, but also what it should not be used for. For example, it should not be used to record and write minutes from performance reviews. Another detail that may be useful to know is that OpenAI, at the time of writing, is not DPF certified. That means it may be necessary to do a Transfer Impact Assessment (TIA). 

Quality is more important than ever. While technology streamlines work, it also enables shortcuts and half-baked solutions. Delivering a product of good quality has always been important, but in times like these it may have become even more important. The documentation obligation also requires some thought on how this should be done.

And apropos AI: you can find the EDPB report on OpenAI here – it’s worth reading if you’re developing AI. It lists the questions the task force has asked OpenAI – questions you would benefit from looking at for your own use of AI. 

Link. 

Read more here.

Enrichment of data in focus at the Data Protection Authority in Bavaria

Credit reporting agency CRIF Bürgel purchased personal data such as names, addresses and dates of birth for millions of Germans, including the complainant, from the data broker Acxiom, which had collected this data for direct marketing purposes. CRIF Bürgel used the personal data to assess the creditworthiness of individuals.

The complainant requested access to a copy of their data and information about the processing of their personal data. CRIF Bürgel responded with information about what personal data was available but did not provide the complainant with information about the exact date of receipt of data from Acxiom, the storage period, disclosure of data to specific recipients and the purposes of the transfer. 

The complainant also asked CRIF Bürgel to restrict the processing of his personal data in accordance with Article 18. CRIF Bürgel argued that the right to restrict processing only existed if the data in question were incorrect and therefore rejected the complainant’s request for restriction. CRIF Bürgel further claimed that they did not carry out a “new processing” as they already had the same personal data about the complainant when they received the personal data from Acxiom.

The complainant, assisted by NOYB, complained to the Bavarian Data Protection Authority, which rejected CRIF Bürgel’s argument that there was no “new processing”. The Bavarian Data Protection Authority explained that, based on Acxiom's data, CRIF Bürgel could conclude that the complainant was still resident at the address CRIF Bürgel had stored in its system. The authority therefore determined that the complainant should have been informed no later than one month after CRIF Bürgel received the data from Acxiom, in accordance with Article 14. This was a breach of the duty to inform. The complainant should also have received information about this date.

The Bavarian Data Protection Authority further determined that Acxiom’s processing for direct marketing purposes and CRIF Bürgel’s processing for the purpose of assessing the creditworthiness of individuals were not compatible. 

The case against Acxiom has been delayed because Acxiom went to court to prevent the complainant from accessing the case documents. Acxiom lost that case, and the Data Protection Authority will probably make a decision in that case as well. 

Many companies enrich data. This is likely a type of activity that will receive more attention going forward.

Read more here. 

Consent or Agreement?

It should be clear whether consent or agreement is the legal basis. Not everyone gets this right. 

The Swedish Data Protection Authority (“IMY”) initiated an inspection of Expressen Lifestyle AB (Expressen) in 2019 to check whether consent was being obtained properly. Expressen is a major newspaper publisher in Sweden. Following the introduction of the GDPR in 2018, Expressen relied mainly on contractual necessity and legitimate interest, rather than consent, for subscriptions. However, they forgot to update the registration form of one of the company's online stores, the Magazine Shop. The webshop had a check box on the website next to the text “I accept the Subscription Terms. By doing so, I consent to the processing of personal data within the Bonnier Group.” 

Expressen also did not update the Subscription Terms which set out the following: “When ordering, you agree that your personal data, including your email address, mobile phone number for calls and text messages and other digital addresses, may be stored and used within Bonnier for digital services, marketing and for statistical and analytical purposes.” Furthermore, information was provided about the right to withdraw consent.

After the supervisory authority commenced its inspection, Expressen took immediate action to correct the information provided in the registration process in its online store. Subscribers are now asked to accept the Subscription Terms (i.e. the purchase terms) and confirm that they have read the controller’s privacy policy, rather than being asked for consent.

IMY concluded that the original text next to the check box on the controller’s website gave the impression that the controller’s legal basis for processing personal data was consent. This impression was reinforced by the text of the Subscription Terms and the information provided about the right to withdraw consent. Since Expressen did not base its processing on consent, but on agreement and legitimate interest, IMY found that Expressen violated Article 13(1)(c) by stating an incorrect legal basis.

The supervisory authority found that the breaches constituted a minor infringement under recital 148, as the website was not the main page used by data subjects to subscribe. The number of affected persons was limited, and the breach did not have serious consequences for the data subjects. IMY therefore issued a reprimand and no fine. There is a pragmatic approach by IMY to the choice of sanctions here; not every country’s supervisory authority would do the same.

Many mix up agreement and consent, and the line between them can be razor-thin. If you use either of these as a legal basis for processing, take a look at the information you give your customers and see if the wording needs a refresh. 

Read more here. 

One of the cases against Klarna has become final

The Swedish Data Protection Authority (“IMY”) imposed a fine of SEK 7,300,000 on Klarna for not providing sufficient information to data subjects. Klarna appealed the decision to the Administrative Court in Stockholm, which partially overturned the decision and reduced the fine to SEK 6,000,000 because the infringements were not intentional and because Klarna had improved the information given to data subjects.

IMY appealed the decision to Kammarrätten (the Administrative Court of Appeal) in Stockholm, which considered key GDPR aspects of the case:

  • Kammarrätten disagreed with the Administrative Court on whether there was a violation of Article 13(1)(f) by not specifying the specific third countries. Kammarrätten concluded that the GDPR does not require the specification of third countries. Therefore, Kammarrätten found that Klarna did not violate the GDPR in this respect. 
  • Kammarrätten also disagreed with the Administrative Court on the information about data subject rights under Article 13(2)(b). It held that the GDPR does not require a detailed description of these rights. 
  • However, Klarna was found to have violated Article 13(1)(f) and Article 14(2)(g) by not providing information on security measures for transfers to third countries and on the use of a scoring model in automated decisions.
  • Kammarrätten also found that the information on automated decisions was not readily available as required by Article 12(1) GDPR, and that the information on the right to data portability and the right to restriction was unclearly formulated. 

Despite Klarna improving its privacy policy, Kammarrätten held that the severity of the violations, which affected a large number of data subjects, justified a fine of the amount IMY had originally imposed. The appeal was thus upheld and the fine restored to the original amount.

Read more here. 

And the case itself here (must be ordered).

Principle of the balance between privacy and other rights

Four French organizations, including La Quadrature du Net, sought the annulment of a provision of the French Intellectual Property Code under which the authority for the dissemination of works and the protection of rights on the internet (Hadopi) can request the identity, postal address, email address and telephone number of a person who has made protected works available for download on the internet. The purpose is to enable Hadopi to take action against the identified individual. The case ended up in the ECJ. It was claimed that the provision was contrary to Article 15 of the ePrivacy Directive and Articles 7, 8 and 11 of the Charter. The processing was considered to fall outside the GDPR because it concerns criminal prosecution, cf. GDPR Art 2(2)(d). 

The Court concluded in case C-470/21 that such access is permitted under national law, under certain conditions:

  1. Purpose and Limitation: Hadopi’s access to data must only serve to identify individuals suspected of copyright infringement and may not be used to monitor the individual’s online activity.
  2. Storage and access: ISPs must store the data in a way that ensures it is not possible to draw conclusions about a person’s private life by combining IP addresses with other personal data. Hadopi shall not have access to traffic data or location data. The personal data must only be stored for the period strictly necessary.
  3. Privacy and Security: Legislation must contain clear and precise storage and access rules, and provide effective guarantees against misuse and unlawful access to the personal data.
  4. Pre-approval: Before Hadopi can link a person’s civil identity to an IP address and send a warning, it must be approved by a court or an independent administrative body. Such review must be a specific assessment and therefore cannot be an automated process.

The judgment underlines that Hadopi’s access to personal data does not constitute a serious breach of privacy, provided the aforementioned conditions are met. The judgment contains complicated considerations on the use of IP addresses and how to ensure that no conclusions are drawn about anything other than the pure identification of a person. For now, the case is not much talked about, which is actually quite surprising. I think it also has relevance for other discussions about storing IP addresses and monitoring. I am sure this is a matter we will have to come back to several times. 

Link. 

The Advocate General addresses the data minimization principle and the general application of the GDPR in yet another Meta case

A Facebook user in the EU received targeted advertising based on his interests, including advertising targeting his sexual orientation. In this case, the user was Max Schrems himself. He claimed that Facebook had unlawfully processed his personal data, and the case ended up in court. The Austrian courts referred four questions to the ECJ for a preliminary ruling. Two of them were withdrawn after the judgment of July 4, 2023 in Meta Platforms and Others. The remaining two questions were:

  1. Does the principle of data minimization (Article 5(1)(c) GDPR) mean that any personal data held by a platform may be aggregated, analysed and processed for the purpose of targeted advertising without restriction?
  2. Does Article 5(1)(b) read in the context of Article 9(2)(e) GDPR mean that a statement in a panel discussion by an individual about his or her sexual orientation entitles a controller to process other data regarding his or her sexual orientation to offer them personalized advertising?

Advocate General Rantos published his opinion in the case on April 25, 2024. On the first question, the Advocate General concluded that the data minimization principle does not allow unlimited processing of personal data for targeted advertising. The processing must be proportionate and necessary in relation to the purpose for which the data was collected. Courts must assess whether the retention period and the scope of the data are justified in light of that purpose.

On the second question, the Advocate General concluded that a statement about a person’s sexual orientation during a public panel debate could make the information publicly available. This meets the requirement under Article 9(2)(e) GDPR that the personal data has been “manifestly made public” by the data subject. Nevertheless, this does not in itself allow the processing of this personal data for personalised advertising, as the other principles of the GDPR must still be followed.

The final judgment has not yet been delivered, but the Advocate General's opinions often carry great weight. 

Link. 

And there’s even more attention to marketing information retention time

A lot is happening in Italy when it comes to marketing information. A public transport company, Trasporto Passeggeri Emilia-Romagna S.p.A. (TPER), collected invalid consents for season ticket subscriptions. The form had been in use since 2016.

The consent covered specific purposes related to market research, satisfaction surveys, promotion and communication of information via telephone calls, as well as activation of an SMS notification service for strikes and planned service changes. The form stated that a lack of consent would make it impossible for TPER to process the data and thus impossible for the data subject to access the services. Data subjects could not give specific consent for each purpose, as the form only required a single signature from the user at the bottom. The form also claimed that providing this data was mandatory in order to issue TPER’s Personal Identification Card. 

Put simply, they did not succeed with this. Not surprisingly. There was a wide range of weaknesses in this setup. Today, I think few would choose the approach TPER did. But there was also an interesting assessment of storage time. 

TPER stated that the storage period for the data was 10 years after the end of the last season ticket, arguing that they are subject to various inspections from regional authorities that require documentation. They also referred to possible litigation with users. The Italian Data Protection Authority (Garante) was clear that marketing purposes cannot justify storage for 10 years and ordered them to follow the national guidelines on storage of such information. In Italy, 24 months applies for marketing purposes and 12 months for personal data used for profiling. 

The case is discussed here.

Another Italian case concerns Coop Italia Società Cooperativa (Coop Italia), one of the largest supermarket chains in Italy. The data subject had purchased an e-SIM from them. He received constant promotional messages from Coop Italia and objected to this under Article 21(2), and eventually complained to Garante, the Italian Data Protection Authority. 

Coop Italia admitted to sending two additional promotional text messages after receiving the objection, but said this was due to an unusual internal misunderstanding. Garante carried out an audit at Coop Italia focusing on marketing and profiling activities. 

The audit revealed that a wealth of personal data was collected as part of their phone service, including telephone and traffic data, internet browsing data, and location and/or geolocation data related to the use of the e-SIM card. Coop Italia argued that this data was necessary to meet its contractual obligations to provide the services. For direct marketing purposes, they only processed the name and contact details provided by the data subject in accordance with the consent – which, in their view, also covered market research and economic and statistical analysis. Coop Italia also collected photos, videos and audio recordings to promote its events and fairs. They kept these for a maximum of 5 years, which Garante considered too long.

The inspection also revealed that Coop Italia processed identification and contact data, photos and other personal information from its social platforms, based on consent for marketing purposes and to respond to users’ requests. The storage period of 5 years for such material was also considered too long.

Garante considered the case as follows: 

Right of Objection and Access: Coop Italia violated Articles 12(3), 15 and 21(2) GDPR by not helping the data subject exercise their right to object to the processing and by not responding to access requests within 30 days.

Processing of various data: Although Coop Italia collected various data to fulfil contractual obligations, Garante pointed out that statistical and economic analysis requires separate consent, and combining this with marketing consent was unlawful.

Storing images and video recordings: Garante rejected the objection about excessive storage time for images and video recordings where consent had been specifically and freely given.

Data from social platforms: The storage period for data from social platforms lacked clear criteria and was considered excessive for marketing purposes and invasive of users’ privacy.

Garante imposed a fine of €90,000. As I have said many times before – review the consents you are using and make sure the user has choices.

Link. 

In Sweden, IMY focuses on how easy it is to delete information

This case can have an impact on many, as it is not uncommon for businesses to impose special requirements before deleting information. In such situations, it is especially important to know what rights you have as a consumer. 

Seven data subjects contacted CDON AB and requested the deletion of their data. CDON responded that in order to process the requests, they needed the data subjects' date of birth, address and customer number, as well as information about recent purchases, such as the order number and payment method, including the last four digits of the card number for card payments. Several data subjects claimed they could not retrieve all the requested information because their purchases were so far back in time.

The data subjects filed complaints against the data controller. CDON argued that the names and email addresses of the data subjects were not sufficient to verify their identity. They stated that they took the complaints very seriously and have since reviewed and simplified the identification process, so that data subjects now only need to answer one of two security questions to confirm their identity.

Furthermore, they stated that they automatically delete customer profiles in line with consumer legislation in the different countries, for example after three years in Sweden. The data controller thus confirmed that all the personal data of the data subjects had been deleted.

IMY did not investigate two of the seven complaints, among other things because the date of receipt or processing could not be verified. For the remaining five complaints, IMY first considered whether the controller had reasonable grounds to doubt the identity of the data subjects. IMY pointed out that processing a request may require additional information if the data controller has reasonable grounds to doubt the identity of the data subject, but a proportionality assessment must first be carried out. A blanket requirement for additional identification violates the GDPR. 

IMY found that CDON had not adequately assessed whether the additional information was necessary and concluded that CDON violated Article 5(1)(c) and Article 12(6). Furthermore, IMY stated that the data controller used a burdensome verification method in asking the data subjects to provide the order number and price of their last order when that order was long ago. The controller thus failed to facilitate the exercise of the data subjects’ rights, violating Article 12(2) GDPR.

IMY concluded that the violations were less severe because CDON had taken steps to facilitate data subjects’ rights and changed its practices, because the violations occurred relatively long ago, and because CDON had not previously been subject to corrective measures for GDPR violations. IMY therefore issued a reprimand and no fine. I know of some businesses that would wish for such an assessment of extenuating circumstances to be made more often! 

Link.

Anonymization is not always easy

Transferring personal data from a company to a sister company can cause problems. That was the situation in a case where a company in the Czech Republic shared personal data collected from users of the company's antivirus program. 

The data controller passed personal data from the users of its antivirus software and browser extensions to its sister company without an adequate legal basis. The transfer concerned the personal data of approximately 100 million users and consisted of the users’ pseudonymised internet browsing history, with a unique identifier assigned to each user. It was also found that the data controller gave its users incorrect information about these transfers by stating that the transferred data was anonymised and only used for statistical trend analysis. The authorities concluded that internet browsing history, even if incomplete, may constitute personal data, as re-identification of at least some of the data subjects could occur. The decision is final after the appeal was rejected.

The data controller’s breach is even more serious considering that the company is one of the foremost cybersecurity experts offering data security and privacy tools to the general public. 

What can we learn from this? Pseudonymization is not anonymization. Many people know this well, but there are still many misconceptions about this.

Link. 

Moderators must have no more information than necessary

On November 2, 2022, Shanghai Moonton Technology Co. Ltd., a Chinese video game company (the controller), experienced a security breach. The breach affected 442 Spanish data subjects and involved the exposure of forum user names, user IDs, visit frequency, reported gender, IP addresses, email addresses and forum activity. The stolen data was published on a third-party website.

The Spanish Data Protection Authority (AEPD) was notified of the breach on 21 November 2022. During the investigation, the AEPD found that the company had violated several GDPR articles. The AEPD concluded that the company had shared more personal information with forum moderators than necessary, violating the principle of data minimization. It also found that the company had not met its security obligations, as shown by the publication of personal information on a third-party website and inadequate security measures for moderators. It would be interesting to see whether the authority made any more in-depth assessment here, but the decision is very long, and my Spanish and translation skills have not gotten me through all of it.

Furthermore, the company had not appointed a representative in the European Union, as required by the GDPR, and it did not notify the AEPD of the breach within the prescribed timeframe.

As a result of these violations, the AEPD initiated sanctions procedures against the company and recommended a fine of €90,000. 

Is there something to learn here? Moderators are like employees: make sure no one has access to more information than necessary!

Link.

Case on fingerprint use in Spain

There are strict requirements for the processing of biometric personal data. In 2022, Burgos Club de Fútbol implemented a biometric data collection system that required around 700 members of the supporters’ section to provide mandatory fingerprints to access the stadium. The fingerprint system collected names, national ID numbers, system ID numbers and fingerprint patterns, replacing the previous system based on ID cards.

The background was a decision by a government commission against violence, racism, xenophobia and intolerance in sports. 

Complaints were submitted to the Spanish Data Protection Authority (AEPD) the same year. It was claimed that the control was excessive and that the data subjects did not receive sufficient information. On February 15, 2023, the data controller stopped the mandatory collection of biometric personal data and gave members the option of using either their ID card or their fingerprint for access. On February 19, 2023, the AEPD determined that the commission’s requirements for biometric processing were not in compliance with the GDPR.

The AEPD found several breaches of the GDPR. Among other things, no risk assessment had been carried out prior to the implementation of the system, and there was no legal basis for the processing (which a risk assessment would probably have revealed). The processing also violated the principle of data minimization, as the security purposes could be achieved with the previous ID card system. Furthermore, a violation of Article 8 was found because biometric personal data was collected from minors without any age restriction. Nor was the duty to inform complied with.

The AEPD proposed a €200,000 sanction, but the controller ended up being fined €120,000 after acknowledging liability and paying the proposed fine. Practice regarding biometric data seems to differ between countries: in Denmark, facial recognition is allowed to exclude hooligans from football matches, and in Germany fingerprints are used in ID cards. What is certain is that if you are going to use biometrics, a number of assessments need to be made to ensure you are doing things legally.

Read more here.

No facial recognition in supermarkets in the Netherlands

Many people appreciate facial recognition for unlocking their mobile phone, but facial recognition can be used for much more. 

The Dutch Data Protection Authority has published a guide clarifying the use of facial recognition technology and the processing of biometric data. It stresses that introducing facial recognition in supermarkets would violate Dutch privacy law, as it is considered a significant infringement of visitors’ privacy. The guide also confirms that processing special categories of personal data, such as biometric data, for identification purposes via facial recognition remains prohibited. 

The Netherlands does, however, allow facial recognition for security purposes, such as protecting nuclear power plants. 

Link. 

Not always easy to tell if you are a data processor or data controller

There is a trend that many want to be data controllers because it gives greater control over the data. Although there may be some room for manoeuvre in determining roles, the reality of the processing will be decisive for the correct result. 

The data subject exercised their right of access against the administrator of a software platform for an app and website used by a healthcare professional (HCP). The administrator claimed to act only as a data processor, processing personal data on behalf of the HCP. The app and website allowed patients to review different medical practices, identify themselves, select the doctor they wanted an appointment with, select the desired time and confirm the appointment. The data subject believed that the administrator was in fact the data controller and therefore complained to the Belgian Data Protection Authority (GBA).

GBA concluded that the purpose of the app and website was to provide an online appointment system, exchange data with other apps, and make fully automated appointments. This purpose was determined autonomously by the healthcare professional; the administrator only provided an online calendar system to achieve it. The healthcare professional was therefore considered the data controller.

GBA believed that the data processing agreements showed that the processing activity was carried out exclusively in accordance with written instructions from the HCP. It was also expressly stated that the Administrator would not process personal data for purposes other than those specified by the HCP. The Privacy Policy also specified that the administrator was only responsible for the technical function of the Platform, while the healthcare professional was responsible for determining the purpose and content of the processing operations.

GBA also concluded that the data subject must exercise the right of access to the controller, not the data processor.

Link.

The Finnish Vastaamo case is coming to an end

A Finnish citizen has been sentenced to six years and three months in prison by the West Uusimaa District Court in Finland for hacking into the patient files of the psychotherapy centre Vastaamo and demanding NOK 400,000 in ransom. He was the perpetrator behind the so-called Vastaamo case, one of the worst hacking cases in the Nordic region. It is one of the few cases where imprisonment has been imposed in connection with GDPR violations. The psychotherapy centre’s director received a three-month suspended prison sentence last year for failing to adequately protect sensitive data. His appeal starts next year. 

The psychotherapy centre was hacked, and information about a large number of patients was published on the dark web. About 33,000 patient files were stolen. The psychotherapy centre was fined €608,000 for breaching the GDPR: it had not adequately secured the personal data, had ignored internal warnings that security was too poor, and had reported the data breach too late. The company that operated the psychotherapy centre eventually went bankrupt, and it is unlikely that the victims will receive compensation or that the fine will be paid. 

Read the DPA decision from January 2022 here.

Do you have any questions?