Newsletter

Privacy Corner

by Eva Jarbekk and Trygve Karlstad


I am pleased to invite you to a seminar at Schjødt DIGITAL on November 13th, where we will discuss several pressing issues in the field of AI. The event will feature insights from experts from Norway and abroad. It promises to be an enlightening session – details of the programme and registration can be found here.

Recently, the European Data Protection Board ("EDPB") released a draft Guideline concerning legitimate interest and an Opinion regarding data processors. Additionally, on October 4th, the Court of Justice of the European Union ("CJEU") published multiple judgments with significant practical implications, which we explore below.

It is also worth mentioning that there are rumours that a new Electronic Communications Act, implementing the European Electronic Communications Code, may soon be passed by the Norwegian Parliament, potentially coming into effect at the start of next year. Should this occur, stricter cookie regulations will arrive in Norway much sooner than anticipated. This important update is likely to be the focus of our next newsletter in December.

For those using cookies on websites to analyse traffic, target customers or serve other purposes, it is advisable to begin preparations now to comply with the forthcoming changes. A simple first step is to take stock of which cookies your site actually sets, as sketched below.
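Purely as an illustration, here is a minimal sketch, using only Python's standard library, of how one might list the cookies a site sets in its initial server response. The URL is a placeholder, and the approach is deliberately naive: it captures only Set-Cookie headers on the first response, not cookies set later by JavaScript or by embedded third parties.

```python
from urllib.request import urlopen

URL = "https://example.com/"  # placeholder; replace with your own site

with urlopen(URL) as response:
    # get_all() returns every Set-Cookie header, or None if there are none.
    set_cookie_headers = response.headers.get_all("Set-Cookie") or []

print(f"{len(set_cookie_headers)} cookie(s) set by the initial response:")
for header in set_cookie_headers:
    # The cookie's name precedes the first '=' of the first attribute.
    name = header.split(";", 1)[0].split("=", 1)[0]
    print(" -", name)
```

A full cookie audit would also need to inspect pages in a real browser, since analytics and advertising cookies are typically set client-side.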

As always, enjoy your reading!

Use of data processors and sub-processors in light of GDPR Article 28

The EDPB has released an important statement clarifying the responsibilities of controllers when engaging data processors and sub-processors. The statement was requested by the Danish Data Protection Authority, which sought to address the complexities involved in managing data processors, especially in the context of cloud services. Cloud service providers frequently rely on a variety of subcontractors to deliver their services, complicating the documentation responsibilities of controllers. This complexity has led to uncertainty about compliance with Article 28 of the GDPR, which sets out the obligations of controllers when employing data processors. The Danish Data Protection Authority therefore emphasised the need for a unified approach to understanding and enforcing these rules across borders. The EDPB's statement highlights several critical areas:

  1. Identification of Data Processors and Sub-processors: Controllers are required to maintain accurate and comprehensive records of all data processors and sub-processors they use. The records should include names, addresses and contact details, and must be kept up to date irrespective of the risk associated with the processing. The requirement extends throughout the entire supply chain, encompassing suppliers at all levels, including sub-subcontractors. In practice, this information is often made available through regularly updated webpages listing all engaged suppliers, and controllers should be notified promptly of any changes. While the information need not be included in the privacy statement, it must be readily accessible to the controller. From a rights perspective, this is not surprising: the requirement ensures that controllers can accurately inform data subjects about who is processing their personal data. A minimal sketch of how such a register might look in code follows this list.
     
  2. Accountability in Selecting Data Processors: The accountability principle in Article 24 requires controllers to ensure that their choice of data processor does not compromise the rights and freedoms of data subjects. Controllers must not simply assume that their processors comply with the GDPR; they must actively verify and document that all data processors and sub-processors are fulfilling their data protection obligations. This may involve questionnaires, audits and other verification methods. The depth of this verification may vary with the technical and organisational measures in place and the associated risk level; for high-risk processing activities, controllers are advised to intensify their verification efforts. The EDPB notes that controllers are not obligated under the GDPR to systematically request sub-processor agreements to verify compliance down the supply chain, though this may be considered on a case-by-case basis.
     
  3. Transfers to Third Countries: Controllers must evaluate the risks associated with data transfers throughout the processing chain, including transfers initiated by processors to sub-processors rather than by the controllers themselves. While the EDPB recognises that processors also play a critical role in these transfers, ultimate responsibility lies with the controllers. Controllers may rely on the assurances provided by their processors, but must themselves assess the risk of transferring personal data to countries outside the EEA or countries not deemed to provide adequate protection, and must ensure that "adequate guarantees" are in place for such transfers.
     

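To make the record-keeping point concrete, here is a minimal, purely illustrative sketch of how a controller's register of processors and sub-processors might be structured in code. The field names and example companies are assumptions for illustration; neither the EDPB statement nor the GDPR prescribes any particular format.

```python
from __future__ import annotations

from dataclasses import dataclass, field

@dataclass
class ProcessorRecord:
    """One entry in a controller's register of (sub-)processors.

    The fields mirror the details the EDPB says the record should hold
    (name, address, contact details); the nesting captures the chain
    down to sub-subcontractors. All naming here is illustrative.
    """
    name: str
    address: str
    contact: str
    sub_processors: list[ProcessorRecord] = field(default_factory=list)

def full_chain(record: ProcessorRecord) -> list[ProcessorRecord]:
    """Return the processor together with its entire sub-processor chain."""
    chain = [record]
    for sub in record.sub_processors:
        chain.extend(full_chain(sub))
    return chain

# Hypothetical example: a cloud provider using a hosting subcontractor.
register = ProcessorRecord(
    "ExampleCloud Ltd", "1 Cloud Way, Dublin", "dpo@examplecloud.example",
    sub_processors=[
        ProcessorRecord("HostingSub GmbH", "2 Serverstrasse, Frankfurt",
                        "privacy@hostingsub.example"),
    ],
)

for entry in full_chain(register):
    print(entry.name, "-", entry.contact)
```

Whatever form the register takes, the point is that it must cover every level of the chain and stay current as suppliers change.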
What does this mean in practice? Controllers will probably increase their focus on the oversight of sub-processors, and processors should anticipate and prepare for heightened scrutiny and governance. This will likely mean greater emphasis on control, transparency and proactive management of privacy obligations across the processing chain. Additionally, the EDPB's statement may influence data subjects' right of access to information about the recipients of their personal data under Article 15 of the GDPR, ensuring that data subjects can be informed of the actual identities of those recipients.

New guideline for "legitimate interest"

The EDPB has developed a new guideline analysing how controllers can process personal data on the basis of legitimate interest. The guideline is open for public consultation until November 20, 2024, and it is possible to provide input before the final version is adopted. The initial impression and general perception suggest that the guideline is not particularly controversial, despite addressing a legal basis that is very important for many businesses. The lack of controversy is likely because it does not depart significantly from the existing guidance of the Article 29 Working Party (WP29).

To rely on legitimate interest, controllers must meet three cumulative conditions:

  1. Pursuit of a legitimate interest: The interest must be lawful, clearly and precisely formulated, and real and present rather than speculative.
     
  2. Necessity of the processing: Processing the personal data must be necessary to achieve the legitimate interest. If there are reasonable, equally effective but less intrusive alternatives for achieving the same goal, the processing cannot be considered necessary. The principle of data minimisation should also be considered here.
     
  3. Balancing against the individual's interests: The controller must ensure that its legitimate interest is not overridden by the individual's interests, fundamental rights or freedoms. In this balancing assessment, the controller must consider the individual's interests, the impact of the processing, the existence of additional safeguards that can limit that impact, and the data subject's reasonable expectations (a toy sketch of the three-step test follows this list).
     

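Because the three conditions are cumulative, failing any one of them is enough to rule out legitimate interest as a basis. The toy sketch below encodes only that structure; the class and field names are illustrative assumptions, and the substantive answer to each question of course requires a real legal assessment, not code.

```python
from dataclasses import dataclass

@dataclass
class LegitimateInterestAssessment:
    """Documented answers to the three cumulative conditions.

    This mirrors only the *structure* of the test in the draft
    guideline; filling in the booleans is a matter of legal judgement.
    """
    interest_is_lawful_clear_and_present: bool  # condition 1
    processing_is_necessary: bool               # condition 2: no less intrusive alternative
    interests_not_overridden: bool              # condition 3: balancing test

    def may_rely_on_legitimate_interest(self) -> bool:
        # All three conditions must hold; failing any one is fatal.
        return (self.interest_is_lawful_clear_and_present
                and self.processing_is_necessary
                and self.interests_not_overridden)

# Example: necessity fails because a less intrusive alternative exists.
assessment = LegitimateInterestAssessment(
    interest_is_lawful_clear_and_present=True,
    processing_is_necessary=False,
    interests_not_overridden=True,
)
print(assessment.may_rely_on_legitimate_interest())  # False
```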
The guideline also provides guidance on how this assessment should be performed in practice, with concrete examples such as fraud prevention, direct marketing and information security. The document also explains the relationship between this legal basis and several of the data subject's rights under the GDPR. The guidance appears to be in line with the recently issued judgment in C-621/22 KNLTB (discussed below).

Storage of personal data within LLM models

In a recent article, Lokke Moerel and Marijn Storm delve into the complexities of managing personal data within large language models (LLMs) in the context of GDPR compliance.

They challenge the prevailing notion that, since LLMs do not technically "store" personal data, data subjects cannot exercise their rights against the model itself. This interpretation significantly influences how privacy regulations are applied to such technologies: on this view, GDPR compliance is only required when personal data is actively processed within an AI system, placing the burden solely on deployers of AI systems rather than on the providers of the LLMs. It would also mean that even if an LLM was trained in ways that violate the GDPR, this does not necessarily affect the legality of using LLM-supported AI systems. This area raises several complex issues.

The article interestingly cites guidance from Denmark's data protection authority, which suggests that the content of AI models, being merely the output from data processing, does not constitute personal data in itself—similar to how statistical reports, which result from data processing, are not considered personal data.

However, the authors contend that this interpretation is flawed. They argue that even though LLMs do not "store" personal data in the conventional sense, the models should still be treated as containing personal data because of the way they are used: chatbots, for instance, produce personal data in response to inputs to the LLM. The scenario is similar to that of search engines, whose operators are regarded as data controllers for the personal data they process even though they merely index content created by others. This critical perspective underscores that privacy responsibility extends beyond mere data storage to how data is used and its implications for individuals' rights.

Moerel and Storm further argue that adhering to the guidance from the Hamburg data protection authority (which takes a similar view) and its Danish counterpart would release LLM providers from responsibility for inaccurate outputs about public figures, potentially creating a gap in privacy protection. They advocate a model in which effective and comprehensive protection of data subjects is possible only if LLM providers are treated as joint controllers for the use of their models and held accountable for responding to data subjects' requests.

This discussion is far from concluded, and it is likely that judicial clarifications will be necessary to fully resolve these issues.

What constitutes special categories of data?

Regarding the Case Lindenapotheke (C-21/23)

This case involved a German pharmacy chain selling over-the-counter medications online via Amazon. The pivotal question was whether personal data collected in connection with the online sale of these products should be classified as health data under the GDPR, and thereby as a special category of data. The EU Court determined that such information does indeed qualify as data concerning health, even for medications that do not require a prescription. The decision was contrary to the Advocate General's opinion and defied many experts' expectations: the Court adopted a broader interpretation of health data than anticipated.

The ruling is significant as it endorses a wide interpretation of sensitive information under the GDPR. It establishes that the purchase of non-prescription medications online can disclose information about an individual's health. The Court highlighted the link between the medication, its usage, and an identifiable individual. This decision could have profound implications for online commerce involving medical products, now requiring explicit consent to process such data.

Additionally, the judgment clarifies the intersection between data protection regulations and competition law, which could have substantial consequences. It confirms that the GDPR does not preclude national rules that allow competitors to bring lawsuits against businesses allegedly breaching the GDPR, based on prohibitions against unfair trade practices. This enhances data subjects' rights and ensures a high level of protection, but it may also escalate conflicts among competitors, as businesses could leverage GDPR violations to challenge rivals gaining market advantages.

However, it remains unclear to what extent the EU Court has contemplated the practical implications of this decision. The ruling likely necessitates stricter consent requirements for processing sensitive personal data in online commerce than currently practised, and it increases the risk of legal challenges from competitors over GDPR breaches. Moving forward, it will be crucial to assess whether the products offered could, directly or indirectly, reveal health information or other special categories of personal data, potentially expanding what is considered sensitive data under the GDPR. For example, purchasing a book about a religion could reveal one's religious beliefs. Another challenge is how online stores can practically obtain consent when the purchaser is not the end user of the medicines or other potentially sensitive products. While I do not have all the answers, it is evident that the definition of what constitutes a special category of personal data will continue to evolve. There may also be a case for narrowing the scope of the special categories, or for expanding the available legal bases, for example by making contract a potential processing ground under Article 9.

May commercial interest be "legitimate interest"?

Regarding the Case KNLTB (C-621/22)

In this case, the EU Court examined the practice of the Dutch tennis federation KNLTB of sharing members' personal data with external entities in return for payment, without obtaining consent. The central issue was whether such a practice could be justified under the legitimate interest provision in Article 6(1)(f) of the GDPR.

The court confirmed that commercial interests, including marketing activities, can constitute a legitimate interest. However, for the legitimate interest to be legally valid, three criteria must be satisfied. First, there must be a clearly defined legitimate interest; the EU Court indicated that a wide range of interests may qualify as legitimate, provided they are not contrary to law. Second, the processing of personal data must be necessary. This involves assessing whether the controller has less intrusive alternatives available that could achieve the same objectives, an evaluation that must respect the data minimisation principle in Article 5(1)(c) of the GDPR, which requires that processing be adequate, relevant and limited to what is necessary for its purposes. In this regard, the EU Court also considered whether members had reasonable expectations about how their data would be handled at the time of collection and whether they were informed about the potential sharing of their data. Third, a balance must be struck between the legitimate interest and the fundamental rights and freedoms of the data subjects. This balancing may take into account various factors, such as the data subjects' expectations, the extent and nature of the processing, and its impact on the data subjects.

Following this judgment, it is clear that legitimate interest can serve as a basis for processing for strictly commercial purposes. Nonetheless, this basis is subject to stringent limitations, as the three cumulative conditions above demonstrate. Businesses aiming to rely on legitimate interest for the commercial sharing of personal data therefore face significant challenges in ensuring GDPR compliance: they must rely on thorough assessments of the necessity of the processing and of the balance between their interests and the rights and freedoms of the data subjects, and they cannot be certain of their compliance until their practices have been evaluated by data protection authorities or through judicial review. The need for predictability in complying with the GDPR is especially pressing given the limited case law on legitimate interest as a basis for processing for commercial ends.

Targeted advertising and the processing of sensitive data

Regarding the Case Maximilian Schrems v Meta Platforms Ireland Ltd (C-446/21)

In this case involving Max Schrems and Meta, questions were raised about Meta's handling of personal data, particularly regarding targeted advertising and the processing of special categories of personal data. Schrems alleged that Meta unlawfully processed his personal data, including information about his sexual orientation. Notably, Schrems had not consented to targeted advertising, nor had he publicly disclosed his sexual orientation on Facebook. Meta had gathered information about his orientation from external websites using cookies and other tracking technologies.

The court's decision affirmed that while targeted advertising itself is not illegal, the GDPR's data minimisation principle imposes strict limits on the processing of personal data. The ruling emphasised that sensitive personal data, such as sexual orientation, requires data controllers to exercise caution. Such sensitive data cannot be used for advertising purposes without explicit consent and necessary restrictions on the duration and scope of the processing.

Furthermore, the judgment clarified that a user's public disclosure of sensitive information does not automatically grant platforms like Facebook the right to use this information for other purposes, such as personalised marketing. This sets an important precedent for how social media and other digital platforms must handle personal data. It also refocuses attention on the distinctions between first-party and third-party data.

For those who aggregate information from multiple sources (such as business registration numbers) – and many do – it is crucial to review whether the relevant consents are robust enough and whether personal data is retained no longer than strictly necessary.

Right to erasure?

Regarding the Case Agentsia po vpisvaniyata (C-200/23)

In Bulgaria, a legal dispute arose regarding the handling of personal data within a state corporate registry agency. A Bulgarian citizen, involved in establishing a joint-stock company, found their personal data included in the company's founding documents, which were subsequently made public by the Registry Agency. The individual demanded that their data be deleted, a request the Registry Agency refused, leading to a lawsuit.

The EU Court addressed several legal issues in this case. Firstly, it was determined that the Registry Agency must be considered both a data controller and a recipient of personal data.

Furthermore, the Court made clear that the data subject has the right to have personal data erased if it is processed without a legal basis. It was not considered relevant that the data subject could have ensured, under the Registry Agency's procedural rules, that the agency received a redacted version of the documents with the personal data omitted. The right to erasure applies unless the processing is necessary to comply with a legal obligation or to perform a task carried out in the public interest.

The third issue was whether a handwritten signature constitutes personal data. The Court affirmed that it does, which is hardly surprising given the definition of personal data in Article 4(1) of the GDPR: a signature is plainly information that can be linked to an individual.

The ruling is also noteworthy for its treatment of liability for damages under the GDPR. The EU Court reaffirmed that when a data subject loses control over their personal data, especially when it is made available on the internet, this can constitute non-material damage and may trigger claims for compensation. The data subject must, however, demonstrate a causal link between the violation and the adverse effects suffered; financial loss is not required, but merely demonstrating a breach of the GDPR is insufficient.

Additionally, the data controller's liability was not mitigated by the fact that the national data protection authority had stated that the Registry Agency was not responsible. This underscores the independent responsibility of data controllers to comply with the GDPR, regardless of opinions issued by national authorities.

Is an apology adequate compensation?

Regarding the Case Patērētāju tiesību aizsardzības centrs (C-507/23)

In a case from Latvia, a journalist became embroiled in a legal dispute with the Latvian Consumer Protection Agency after it parodied him in a video without his permission. The journalist demanded the removal of the video and sought compensation for the reputational damage he suffered as a result. The Agency refused to remove the video, denied any breach of privacy regulations, and consequently rejected the compensation claim. This case was quite unusual.

The matter escalated to the Latvian courts, which ruled that the publication of the video was unlawful, yet they declined to award financial compensation. Instead, the Agency was ordered to issue a public apology.

The case was then taken up by the EU Court to assess several issues. First, the EU Court found that a breach of the GDPR does not automatically result in "damage" qualifying for compensation under Article 82(1). To be eligible for compensation, there must be a breach of the GDPR, material or non-material damage must have occurred, and there must be an adequate causal link between the breach and the damage incurred.

Another issue was whether an apology could be considered adequate compensation. The EU Court emphasised that it is primarily up to national legislation to define what constitutes adequate compensation. However, an apology under the GDPR can be an appropriate form of compensation for non-material damage, especially when it is not possible to restore the original situation, provided the apology fully addresses the damage inflicted.

The final question was whether Article 82(1) on liability for compensation implies that the amount of compensation should be influenced by the data controller's intention or motivation – in other words, whether the degree of fault should be an aggravating or mitigating circumstance depending on the situation. The EU Court held that the controller's degree of fault should not be taken into account, explaining that the purpose of Article 82 is to compensate the data subject for the damage suffered, not to punish the data controller beyond this.

Recognition of transgender rights

Regarding the Case Mirin (C-4/23)

Although this decision does not directly concern the GDPR, it addresses adjacent issues. It is unsurprising that cases concerning transgender rights are emerging, and this judgment could enhance the acceptance of those rights. The EU Court has ruled that EU member states must recognise changes to first names and gender that have been legally effected in other EU countries. The case involved a British-Romanian citizen who legally changed their gender and first name in the United Kingdom and subsequently sought to have these changes officially recognised in their birth country, Romania.

Romania refused to update his birth certificate and issue new documents reflecting his new identity, arguing that he needed to undergo a national gender transition procedure. This led the individual to take legal action, and the matter was eventually referred to the EU Court to determine whether Romania's refusal complied with EU law.

The EU Court concluded that Romania's refusal violated the individual's rights under EU legislation, particularly the right to free movement and residence. The Court emphasised that it was irrelevant that the changes were requested after the United Kingdom had formally left the EU, as the changes were implemented while the United Kingdom was still a member.

The judgment is viewed as a significant victory for transgender rights in the EU, ensuring that legal identity changes are recognised across member states without further national procedures. This enhances the mobility and rights of EU citizens by streamlining the recognition of personal identity. Critics in countries such as Hungary and Slovakia have voiced opposition, arguing that the ruling overrides national legal principles. The case now returns to the Romanian courts for a final decision, but the EU Court's ruling establishes a binding precedent for handling such cases within the EU.

While the case may not have substantial implications in Norway, it holds significant moral importance.

Police access to mobile phones on suspicion of criminal activity

Regarding the Case Bezirkshauptmannschaft Landeck (C-548/21)

In this case, the EU Court dealt with police access to data stored on a mobile phone in connection with a criminal case in Austria. An Austrian citizen had his mobile phone confiscated after police found 85 grams of cannabis in a package addressed to him. The police tried to unlock the phone without the required approval from either the prosecution or a court, and did not inform the individual of the attempt. The Austrian court referred the case to the EU Court to determine whether Austrian law, which allows such police actions, is compatible with EU law. The national court noted that the owner was accused of an offence punishable by a maximum of one year in prison and thus a less serious crime.

The EU Court ruled that accessing information on a mobile phone can constitute a serious interference with an individual's privacy rights. It highlighted that the seriousness of the alleged crime is a crucial factor in assessing whether such an intrusion is proportionate, although such intrusions are not limited to the fight against serious crime alone. National laws must clearly define the criteria for such intrusions, including the types of offences that can justify them. In addition, access to the data must be proportionate and requires prior approval from a court or an independent authority, unless there is a clear and justifiable emergency.

The EU Court also noted that the affected individual must be informed of the basis for the approval as soon as this no longer risks compromising the investigation. This balances the need for effective crime fighting with the protection of individual rights and, in essence, reaffirms fundamental privacy and human rights principles: the police must operate within clear guidelines.

Are data protection authorities required to issue fines for GDPR breaches?

Regarding the Case Land Hessen (C-768/21)

This case, although published a week earlier than the others and perhaps not as exciting, may be more relevant to countries other than Norway. It is included here for completeness. The case established that supervisory authorities in the EEA are not automatically required to issue orders, corrective measures or fines for GDPR violations; authorities may decide whether such measures are necessary to remedy the breach or to ensure future compliance with privacy rules. Personally, I find it somewhat odd that this case was even brought; it does not seem logical that supervisory bodies should always impose fines for breaches.

The case stemmed from an incident in which a bank employee repeatedly accessed a customer's personal data without authorisation. The employer reported the data security breach to the data protection authority under Article 33 of the GDPR and disciplined the employee, who gave written assurance that she had not misused the information. The employer decided not to inform the affected customer under Article 34, believing the breach did not pose a high risk to the customer's rights and freedoms. The customer discovered the breach by chance and complained to the data protection authority, which found the employer's risk assessment adequate and therefore took no further action against the employer.

The affected customer challenged this decision in court, and the case eventually reached the EU Court to determine whether supervisory authorities must always use their powers under Article 58(2) of the GDPR when privacy regulations are breached. The EU Court ruled that authorities are not required to exercise corrective powers, such as imposing administrative fines, unless it is appropriate, necessary, and proportionate to address the deficiencies and ensure full enforcement of the GDPR. The Court emphasised that authorities have the discretion to select suitable and necessary measures based on each case's specific circumstances.

This judgment clarifies that supervisory authorities have the flexibility to evaluate each case individually and decide the most suitable actions to ensure GDPR compliance. In some cases, where measures have already been taken to stop and prevent further breaches, additional corrective actions may not be necessary.

Do you have any questions?