Newsletter

Privacy Corner

by Eva Jarbekk


I aim to broaden your perspective in these newsletters. Amidst a sea of minor issues, which ones have the greatest practical and principled impact? Currently, the media is filled with articles and comments of varying quality about DeepSeek. It’s evident that this will be important moving forward. If AI can be developed at a significantly lower cost than before, Europe might also be able to develop competitive services. DeepSeek is still in its early commercial phase, with several countries’ data protection authorities raising questions to the company. It will likely take some time to address the legal challenges. Similarly, it took time for data protection authorities to reach a conclusion about OpenAI, which has now happened (see the article below).

Another issue gaining attention is the transfer of data to third countries. This includes transfers to the USA and now also to China, and it wouldn’t be surprising if other countries outside the EEA also face increased scrutiny. Trump’s actions have not bolstered confidence that the DPF will withstand legal scrutiny. The clear advice is to remember that the SCCs apply, and we may need to use them more frequently in the future. This brings us to the next major topic: pseudonymization.

The EDPB has issued a guideline on pseudonymization. We are also awaiting an important ruling from the CJEU on the same topic (C-423/23 P, EDPS v SRB), with the Advocate General’s opinion expected later this week. Therefore, it might be prudent to wait before delving deeper into this. However, for those working with data transfers to third countries, these documents are crucial to follow. A notable concern is that the EDPB’s draft suggests yet another risk assessment, this time for the re-identification of pseudonymized information. Producing yet another risk assessment is likely not at the top of many people’s wish lists. However, if it enables the use of SaaS services from third countries and facilitates data transfers, it might be worth the effort.
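For readers who want to picture what is at stake technically, here is a minimal sketch of keyed pseudonymization (my own illustration, not taken from the EDPB draft; the key name and example data are hypothetical). The point is that only whoever holds the key can re-identify the data, which is precisely the risk such an assessment would examine.

```python
import hmac
import hashlib

# Hypothetical secret held by the data exporter and never shared with
# the third-country recipient. Re-identification is only feasible for
# whoever holds this key.
SECRET_KEY = b"exporter-held-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable pseudonym (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example record before export to a third-country SaaS provider.
record = {"email": "ola.nordmann@example.com", "viewing_minutes": 412}
record["email"] = pseudonymize(record["email"])
print(record)  # the pseudonym is stable, so analytics still work across records
```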

Below is a selection of important issues from the turn of the year, including reflections on the EU’s appointment of a new EDPS. This could have more significant implications than many realize.

Ensure your privacy policy is comprehensive and respond thoroughly to access requests!

Netflix users must create an account and provide their name, date of birth, email address, phone number, and bank account number. When they use the service, Netflix processes data about their viewing behaviour to recommend movies and series that may interest them.

Noyb requested access on behalf of two registered users, who were dissatisfied with the response they received. Acting as a representative for the users, noyb filed a complaint with the Austrian Data Protection Authority, which transferred the case to the Dutch Data Protection Authority, since Netflix has its European headquarters in the Netherlands. The Dutch Data Protection Authority fined Netflix 4,750,000 euros for the violations it found. This substantial fine was due to four main issues:

Legal basis and purposes for processing personal data

In communication with the authority, Netflix listed eight purposes, but these differed significantly from what was stated in the privacy policy and in the responses to access requests. The DPA found that the company did not clearly convey which data is used for “its offers, audience analysis, and fraud prevention.” Do you specify this in your privacy policy? Compliance here seems to vary.

Furthermore, Netflix did not disclose which personal data they receive from third parties.

Lack of specification of data processors 

Netflix uses service providers to assist them, but the privacy policy and the responses to access requests did not include the names of these recipients. However, this information was presented in the submissions to the authority.

Lack of precision on retention period

In the privacy policy and in the responses to access requests, Netflix stated that they would retain personal data as “permitted by laws and regulations,” without specifying the duration of such retention.

Lack of information about transfers to third countries 

Netflix did not specify the rights of data subjects when their personal data is transferred outside the EEA. There was no reference to specific countries outside the EEA, adequacy decisions, or appropriate safeguards.

Comment: There is A LOT to learn from this case. Although the decision has likely already been appealed, it shows that supervisory authorities are beginning to reject vague statements in privacy policies.

I have often told clients that the Data Protection Authority may not accept formulations like “we retain customer information as long as necessary,” even if “everyone” else does the same. I have been expecting stricter enforcement, and it seems to be happening now. The same applies to specifying which personal data is used for which purposes and much more.

So, review your privacy policy as noyb would. It may actually be taken seriously one day – and it doesn’t have to take much time to correct it either.

New fine for OpenAI from Italy – and some thoughts on one-stop-shop

In March 2023, a technical error in ChatGPT caused users to see the chat history of other users for a period. This included names, surnames, email addresses, and the last four digits and expiration dates of credit cards used to pay for the service. OpenAI confirmed the incident.

The Italian Data Protection Authority, Garante, initiated investigations that went far beyond the specific breach.

About One-Stop-Shop – and Who to Notify

It was also considered whether the one-stop-shop mechanism should apply. OpenAI’s European headquarters is in Ireland, like many other companies’. However, many believe that Ireland handles complaints too slowly and does not involve other EU countries enough, and it is not satisfactory for a data protection authority to be sidelined. I sometimes suspect that authorities in countries other than Ireland try to avoid having a case end up there. For companies under supervision, it is often advantageous to use the one-stop-shop mechanism. Otherwise, they risk simultaneous supervision and investigation in several EU countries, which increases the risk of large fines and results in a lot of administrative work and extra costs. Therefore, there is sometimes a discussion about whether the one-stop-shop mechanism applies or not.

In this case, Garante concluded that OpenAI had activity in the EU from November 30, 2022, but established an office in Ireland only on February 15, 2024. Therefore, Garante had authority over breaches before February 15, 2024, and the one-stop-shop mechanism did not apply.

OpenAI stated that they had reported the data breach to the Irish Data Protection Authority, as they were in the process of establishing their Irish registered office when the breach occurred. However, Garante concluded that when the breach occurred, the data controller was based in the USA, and thus the one-stop-shop mechanism did not apply. The case could not be transferred to the Irish Data Protection Authority, as the breaches occurred before the establishment in Ireland. This meant that OpenAI had not notified the correct authority in a timely manner. For foreign companies, it is therefore important to understand how the one-stop-shop mechanism works. My experience is that many foreign companies do not fully understand this.

Violations of Articles 5(2) and 6 GDPR

Garante believed that OpenAI had not identified the legal basis for training the model. Specifically, they held that the processing of personal data began at an earlier stage than the service itself. This is a reminder to be thorough when mapping out your processing activities, so you do not overlook central parts of a processing sequence. Even today, I see businesses “forget” that the legal basis for training AI is different from that for the actual use of the AI. This is not advisable. For OpenAI, this was considered a violation of Articles 5 and 6.

In reviewing the company’s privacy policy, Garante found two types of data processing: the ability to use the service itself, and the use of non-users’ data to train the models. They concluded that OpenAI had not complied with the information obligation for these. Non-users received no information. For users of the service, the privacy policy was difficult to access – and it was in English.

Furthermore, there were no mechanisms to verify the age of users, even though the terms required parental consent for users between 13 and 18 years old. Several other violations were also found, which we will not go into here.

Garante imposed a fine of 15,000,000 euros. It is likely to be appealed. Garante also required OpenAI to run an information campaign for six months to increase transparency in the processing of personal data. The campaign, to be broadcast on radio, TV, newspapers, and the internet, will focus on how ChatGPT uses personal data, especially regarding the collection of both user and non-user data for AI model training. The content of the campaign will be agreed upon with Garante and will aim to enhance individuals’ knowledge of their rights, including the right to object, correct, and delete personal data.

Another large fine – do you have the necessary information in your breach notification?

So far, we only have a press release about this case, and I look forward to the actual decision being published, as there are some interesting facts here. Among other things, I would like to know what the DPC believes Meta withheld in the breach notification, even though some of this is mentioned in the press release.

Meta had a breach that affected approximately 29 million Facebook accounts globally, of which about 3 million were in the EEA. Unauthorized persons were able to see the Facebook profile of the data subjects, and the information included names, email addresses, phone numbers, location, workplace, date of birth, religion, gender, posts on timelines, groups a user was a member of, and children's personal data. The breach was remedied shortly after.

The Irish Data Protection Authority, DPC, handled the case and imposed extensive fines. They coordinated with other countries' authorities, which (this time) had no objections to the DPC's proposed reaction.

Firstly, the DPC fined Meta 8,000,000 euros because the breach notification did not contain all the information it should have. This is a very large fine for not including everything in a breach notification. It will be interesting to see what was actually withheld.

The DPC also fined Meta 3,000,000 euros because the breach notification failed to document the facts relating to each breach and the steps taken to remedy them, and presented the information in a way that made it difficult for the DPC to verify compliance. Here too, it will be relevant to see what was missing.

The DPC further fined Meta 130,000,000 euros for lack of privacy by design. In addition, they fined Meta 110,000,000 euros for violating the minimization principle.

A comment in the press release from one of the DPC’s new directors makes it clear that a central element was that parts of this breach concerned sensitive personal data:

"This enforcement action highlights how the failure to build in data protection requirements throughout the design and development cycle can expose individuals to very serious risks and harms, including a risk to the fundamental rights and freedoms of individuals. Facebook profiles can, and often do, contain information about matters such as religious or political beliefs, sexual life or orientation, and similar matters that a user may wish to disclose only in particular circumstances. By allowing unauthorised exposure of profile information, the vulnerabilities behind this breach caused a grave risk of misuse of these types of data."

We will return to this case!

Another (strict) case about facial recognition

A football supporter in Spain complained to the Spanish Data Protection Authority (AEPD), claiming that the football club Club Atlético Osasuna had violated privacy regulations. The club had implemented a facial recognition system at several of the stadium entrances, but it was still possible to use entrances without this system. Fans could voluntarily register in the system by submitting a selfie and an ID card scan and consenting to the terms. The purpose of the system was to speed up entry, not to improve security. The complainant argued that the large-scale storage of biometric data was disproportionate – even though it was based on consent.

AEPD considered the consent to be valid and informed. There were no negative consequences for not consenting. However, they believed that the scope of the biometric data collected (ID card, photo, facial recognition procedure) was too extensive when QR codes or digital tickets could have been used to provide equally fast entry to the stadium.

This resulted in a fine of 200,000 euros. This case is significant because it shows that consent does not always take precedence. If you are going to use facial recognition, you must ensure you have the right justifications.

To ask for a title – or not…

It is no longer common to ask customers to indicate whether they are Mr. or Mrs., although it remains prevalent in many other countries. Personally, I feel like a forever young miss and would happily check that box! This matter highlights the importance of considering what is truly necessary for one’s customers.

In my opinion, this case demonstrates that the principle of data minimization stands on its own – perhaps even more firmly than the question of actual privacy harm. Although some may disagree, I do not believe that providing a title poses a significant privacy risk – but under the principle of data minimization, it is simply unnecessary.

The association Mousse filed a complaint against the French railway company SNCF with the French data protection authority (CNIL). SNCF required customers to provide their title (‘Monsieur’ or ‘Madame’) when purchasing transport tickets online. Mousse argued that this violated the principle of data minimization, as it is not necessary for the purchase of a railway ticket.

In 2021, CNIL dismissed the complaint (surprisingly, in my view) and found that this did not constitute a violation of GDPR. Mousse disagreed and took the case to the courts, which then referred it to the CJEU.

The CJEU ruled on the case in January 2025, stating that in accordance with the principle of data minimization, which reflects the principle of proportionality, the data collected must be adequate, relevant, and limited to what is necessary for the purposes for which they are processed. The court stated that for data processing to be considered necessary for the fulfilment of a contract, it must be objectively indispensable to enable the proper fulfilment of that contract. This is not surprising, but many do not have practices in line with this.

Furthermore, the court emphasized that personalization of commercial communication based on assumed gender identity is not objectively indispensable in a transport contract. SNCF could instead communicate using generic, inclusive terms, without reference to assumed gender identity – something the court considered both feasible and less intrusive.

IMY addresses misleading cookie banners

A complaint was filed against Aktiebolaget Trav och Galopp, one of Sweden's largest gambling companies, alleging that it was more difficult to refuse cookies than to accept them and that the colours, contrast, and links on the cookie banner were misleading.

The company responded that it was possible to refuse cookies and withdraw consent in a second layer of the banner. From October 2021, they had added a clear “refuse” button in place of a link to another layer where cookies could be refused. They also adjusted the colours and contrast of the buttons for acceptance and refusal.

The Data Protection Authority found that the cookie banner appeared immediately when a user visited the website. To accept cookies, the user could click on a green button. However, to withdraw consent, one had to navigate to the company's cookie policy in the footer. The authority believed that the process for withdrawing consent was much more complicated than accepting cookies. They also noted that the link to refuse cookies was less prominent than the green button to accept. The authority concluded that the consent was not an expression of a clear will.

The company received a reprimand instead of a fine. Nevertheless, it is time to review cookie banners everywhere.

Apart from some cases in Denmark, I believe this is the first strict case about cookies in Scandinavia. I doubt it will be the last.

A principled ruling on the right to complain

A recent ruling from the CJEU addresses “excessive” requests from data subjects, as per Article 57 of the GDPR. Article 57 states that excessive requests can be refused or that the supervisory authority can charge a fee. In case C-416/23, the CJEU was asked to provide an opinion on this in light of the Austrian Data Protection Authority (“DSB”) refusing to handle complaints from an individual due to the large number of requests submitted. Several of the complaints were about the individual’s access requests not being answered by the data controller within a month.

The data subject had sent 77 similar complaints to the DSB directed at different data controllers over approximately 20 months. The individual disagreed with the DSB’s refusal to handle the complaints, and the case eventually reached the CJEU, which had to clarify:

  • Does the exception for “requests” in Article 57 also apply to “complaints” under GDPR Article 77?

  • Is it sufficient for a request to be considered “excessive” if a data subject has sent a certain number of requests to a supervisory authority within a certain period, regardless of whether the facts are different and/or the requests concern different data controllers, or is an abusive intent from the data subject also necessary?

  • In the case of “manifestly unfounded” or “excessive” requests, can the supervisory authority choose between charging a reasonable fee or refusing to handle the request?

The decision is interesting from a legal standpoint and likely has significant practical implications for many data protection authorities. Additionally, it has some relevance to the similar provision that allows data controllers to refuse or charge for requests for access, for example.

The CJEU ruled that the term “request” should include complaints. Some may argue that this restricts the right to complain.

Regarding whether a numerical limit can be set for the number of complaints to be considered excessive, the CJEU answered no. The term “excessive” is not defined in the GDPR. However, the CJEU believes that a numerical limit could overly restrict individual rights and stated that the supervisory authority must also establish that the individual has an intent to abuse the right. A large number of complaints from the same person can indicate this if they are not objectively justified by privacy concerns.

Overall, the CJEU’s decision seems reasonable, although it may seem strange for an administrative institution to refuse to handle a complaint. Several of the factors considered will also be valuable for data controllers considering refusing requests for various rights under Article 12.

If faced with an excessive request, the CJEU believes that the supervisory authority can choose between charging a reasonable fee based on its administrative costs or refusing to handle these requests. However, the chosen option must be justified, appropriate, necessary, and proportionate in the relevant circumstances. The CJEU writes that the supervisory authority may consider first opting to charge a reasonable fee to stop abuse, as they believe this has a less negative impact on the GDPR rights of the individual. It can be questioned whether the individual will perceive it this way, as the actual cost is likely to be significant.

You can read more details about the case here.

AI Act – coming to a place near you!

On February 2nd, Article 5 of the AI Act came into force in the EU. It doesn’t apply to Norway, but for those with activities in EU countries, it is still relevant. Article 5 deals with prohibited AI, where some aspects are unlikely to be relevant in Western countries, while others actually are. The sanctions are significant – violations of Article 5 can result in fines of up to EUR 35,000,000 or 7% of global annual turnover, whichever is higher.
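As a quick worked illustration of how that ceiling is computed (the EUR 35,000,000 and 7% thresholds are from the Act; the company figure is my own hypothetical example):

```python
def article5_fine_ceiling(global_annual_turnover_eur: float) -> float:
    """Upper bound on an Article 5 fine: EUR 35,000,000 or 7% of global
    annual turnover, whichever is higher."""
    return max(35_000_000.0, 0.07 * global_annual_turnover_eur)

# A hypothetical company with EUR 1 billion in turnover: 7% is EUR 70 million,
# which exceeds EUR 35 million, so EUR 70 million sets the ceiling.
print(article5_fine_ceiling(1_000_000_000))  # 70000000.0
```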

Here is a reminder of some of what is considered prohibited AI, namely the use or marketing of:

  • AI systems for social scoring by both public and private actors
     
  • AI systems to infer emotions in the workplace and in educational institutions
     
  • AI systems that scrape images from the internet or CCTV to create databases for facial recognition
     
  • AI systems that assess the risk of a person committing a criminal act based on profiling or assessment of personality traits and characteristics
     
  • Biometric systems aimed at identifying race, political opinions, trade union membership, religious or philosophical beliefs, sex life, or sexual orientation.
     

There are some narrow exceptions to the prohibitions, including that biometric recognition may be allowed to protect a significant public interest. We are seeing more and more situations where facial recognition is allowed to prevent attacks.

As mentioned earlier, I believe it is in the workplace that most businesses may find this a relevant challenge. We see that AI offered in workplaces can easily provide analyses of the “mood” of employees based on, for example, email messages and the word choices in them. Something similar is seen in some recruitment systems. Copilot also had such a function for a while – something the Norwegian Data Protection Authority pointed out when NTNU’s use of Copilot was in its regulatory sandbox. The EU’s AI Office is expected to issue guidelines on what is to be understood as prohibited AI. Hopefully, these will come soon.

New EDPS: Who and what is it – and is it important?

The EDPS – European Data Protection Supervisor – is not the same as the EDPB – European Data Protection Board. While the EDPB, among other things, issues a number of guidelines that we all adhere to (more or less), the EDPS does not do the same. The EDPS is the EU’s own data protection supervisor and oversees how the EU’s institutions and bodies use personal data. It may seem like a limited task, but the EDPS has conducted many very thorough investigations – including on privacy in key services offered by Microsoft. See more about this below.

The role is also more central in the European data protection context than what may be known to many and is an important actor that provides input to new data protection regulations from the EU. In Norway, I often get the impression that people think the EDPS and the EDPB are the same. They are not.

It is now being decided who will hold the role for the next five years. It could be the incumbent, Wojciech Wiewiórowski, former head of the Polish Data Protection Authority. I believe many have experienced Wiewiórowski as a thorough data protection theorist, while some have probably thought he is too conservative. Not an easy role to have regardless – as is often the case for important roles. You can read more about his role here.

The role is so important that there is disagreement in European institutions about who will be the next EDPS. The Parliament has voted for Bruno Gencarelli, who is now in the Commission, while the countries' ambassadors have voted for Wojciech Wiewiórowski.

Gencarelli led the negotiations with the USA for the DPF – possibly indicating a more pragmatic and less theoretical approach. As I understand it, the final decision will be made behind closed doors.

A bit more about why the EDPS is important

The EU is no small organization. When the EDPS points out that the EU itself does not handle personal data in accordance with the data protection rules, it can have significant consequences. Above, it is mentioned that the EDPS has conducted several analyses of how Microsoft’s products are used. They have published several reports on this – and they are critical. You can find the reports on the EDPS website, see for example here.

They are particularly critical of Microsoft defining itself as the data controller for some of the information generated when, for example, Microsoft 365 is used.

This is an “old” discussion in privacy circles – but it also touches on labour law. Does an employer really have the right to give an IT provider the right to, for example, “own” information about when and where we use IT systems that the employer requires us to use? A similar issue arose in the Danish case of whether a school can give Google data controller responsibility for parts of the information created when a student uses a system provided by the school.

The EDPS has ordered the Commission to bring the use of MS365 and the contract with Microsoft into compliance with the regulations. The case illustrates the difference between Europe’s desire for autonomy and the actual dependence on American technology. “There are no known credible offers from European providers,” states an internal Commission document seen by Euractiv – and it is easy to agree that there is a lack of alternatives. French authorities express particular concern about “the potential risks associated with the use of US-based solutions.”

In a recent report from the EU’s Directorate for Digital Services, concerns are raised about “excessive power in the hands of a few non-European companies, risks associated with a single supplier (price increases, migration difficulties)” – although this is not said publicly. It also states that they are positive about EU countries developing alternatives to Microsoft, although they admit that “at this time, no functionally equivalent alternatives to a platform like Microsoft 365 have been identified.” This is easy to understand.

Somewhat surprisingly, it states that the EU has not approved MS365 for sensitive data – and, because there is no alternative, documents are sometimes not classified as sensitive when they should be. This does not sound good.

In March 2024, the EDPS ordered the Commission to review its current contract with Microsoft to ensure compliance with the EU’s institutional data protection rules (EUDPR). The EDPS believes the contract does not provide sufficient guarantees against illegal transfers to countries with inadequate privacy laws and against unauthorized disclosure of personal data. The Commission was not satisfied and responded by suing the EDPS, calling the order a “misinterpretation and misapplication” of the EUDPR. On December 6th, three days before the decision came into force, the Commission nevertheless sent documentation to the EDPS to show that they are in compliance. The EDPS is currently reviewing the documents but reiterated in a press release on December 10th that “the decision of March 8, 2024, remains fully in effect.”

What will be the solution to this? At the very least, the EDPS is leading the way in “forcing” Microsoft to change its practices. Whether they succeed remains to be seen. I doubt many others have the capacity or ability to get Microsoft to change its practices or contract terms. To be continued…

Euractiv has written an informative article on the matter, see here.

noyb – now also against China

As many know, noyb has not itself filed new cases regarding the transfer of personal data to the USA based on the DPF; so far, others have done so. It is therefore interesting that noyb is now taking aim at transfers of personal data to China. On January 16, noyb filed six complaints in various European countries over illegal transfers of personal data to China. noyb points out that there is no adequacy decision for China and that data importers in China cannot guarantee the same level of privacy as in the EU.

The companies complained about are TikTok (complaint filed in Greece), Xiaomi (complaint filed in Greece), Shein (complaint filed in Italy), AliExpress (complaint filed in Belgium), WeChat (complaint filed in the Netherlands), and Temu (complaint filed in Austria).

It is not clear to me why the complaints were filed in these specific countries. To my knowledge, the European establishment for many of these companies is in Ireland (some in other countries), but not where the complaints were filed. Possibly, it is a way to engage multiple countries’ authorities.

noyb also asks the data protection authorities to order an immediate halt to transfers to China and to impose fines on the companies.

All complaints are available on noyb’s website here.

A bit more about transfers to third countries

On December 3, 2024, the EDPB published draft guidelines on Article 48, addressing how a data controller should act when subject to a judgment or an administrative decision requiring the transfer or disclosure of personal data to a public authority in a third country.

It is hardly surprising, but it is good to establish that such transfers can only be carried out if they are based on an international agreement, such as a mutual legal assistance treaty, which is in force between the requesting third country and the EU or an EU member state.

The document clarifies that Article 48 is not a transfer mechanism in itself. Organizations using this provision must therefore still find both a legal basis under Article 6 and a basis for transfers in Chapter V. Depending on the international agreement in force with the requesting country, the potential legal basis could be, for example, legal obligation, public interest, or consent. It is interesting to see that the EDPB also writes that, given the necessary considerations are made, even legitimate interests can be used as a basis for processing.

The guidelines are open for consultation until the end of January 2025. You can find them here.

In Austria, there has been a somewhat peculiar case relevant to lawyers

A user complained about a gym, claiming that her personal data was unlawfully disclosed to a lawyer and used in a lawsuit between her and the gym. When she joined the gym, she had checked a box indicating that she did not consent to further processing of her data. The dispute arose when the gym increased membership fees without consent, and email correspondence between the data subject and the gym was used as evidence. The Austrian Data Protection Authority concluded that the gym had violated privacy by sharing unnecessary personal data. The gym appealed to the Austrian Federal Administrative Court (BVwG), arguing that the disclosure was crucial to prove the facts in court.

BVwG ruled that it was the gym's lawyer, acting as an independent data controller, not the gym, who had shared the data in the lawsuit. Thus, the DSB had placed the responsibility incorrectly. BVwG also stated that the data subject's lack of consent only applied to marketing purposes, not legal proceedings, and sent the case back to the DSB.
