Newsletter

Privacy Corner

by Eva Jarbekk and Trygve Karlstad

Published:


As I write this introduction, I am in Stockholm, about to attend the Nordic Privacy Arena. Later today, I am scheduled to moderate a large panel on whether Europe has the right framework to remain competitive on a global scale.

High on the agenda is the Draghi report, released earlier in September, in which the former president of the European Central Bank sets out the regulatory changes he believes the EU needs going forward. The report includes a chapter on "computing and the AI sector". Interestingly, it suggests yet another piece of legislation to assist with this – the "EU Cloud and AI Development Act". The report is well worth skimming through.

What do our countries need to stay competitive in the future? We hardly need rules like those in China, which make it easy to train AI on vast amounts of data. Instead, a more balanced approach is necessary – but clearer guidance from regulatory authorities is essential, and it is more important than ever that these authorities have sufficient resources to perform their duties effectively. When these bodies issue "opinions" that in practice act as legislation, we must proceed with caution. More on this in the first item below.

Additionally, it's interesting to note that the majority of discussions in Stockholm are centered on AI. Moving forward, I am considering changing the name of our network from PrivacyTech to Privacy&Tech to better reflect our broader engagement with technology, not just privacy issues. As always, there are many topics to cover – happy reading!

Democratic failure at the EDPB?

While I support the EU, the institution is clearly not flawless, and it is important to identify and discuss areas where improvements are needed. One such area is the European Data Protection Board's (EDPB) use of Opinions and Guidelines. Opinions are typically adopted swiftly, within 8-14 weeks and without prior consultation, whereas Guidelines go through a more extended process and are subject to public consultation.

Many of the rules in the GDPR are discretionary, and data protection authorities often rely on EDPB guidance when interpreting them. One example is the EDPB's Opinion on "consent or pay". The EDPB itself believes that Opinions should be used for "targeted questions requiring a swift answer, while guidelines are better suited for matters with a larger scope".

What qualifies as a "targeted question" can of course be subjective, but the mentioned opinion outlines a number of criteria for when a service must be offered for free (yes – really), and it also suggests that these criteria should "typically apply to large online platforms, but not exclusively. Some of the considerations expressed in this opinion may prove useful more generally for the application of the concept of consent in the context of ‘consent or pay’ models."

This approach has been criticized for not being as "targeted" as claimed, given its potentially wide-reaching implications. Moreover, the opinion binds the data protection authorities' interpretation of the rules, leading to a situation where certain services must be provided for free – and this has neither been democratically voted upon in any parliament nor has it been subject to any consultation.

Just as the absence of privacy poses a risk to democracy, I find the EDPB's method concerning from a democratic perspective. The EDPB is now developing a new opinion on AI training, and it's uncertain whether this opinion will withstand scrutiny regarding its development process. It is understandable that many would want to challenge such "Opinions" from the EDPB, and I believe this could lead to increased litigation in the future.

Is Europe losing the AI battle?

You may have heard that Europe is falling behind in AI development, with Norway potentially being the slowest to progress. While I won't delve into the accuracy of this claim, it's essential to acknowledge the importance of a balanced legal framework for AI. As noted in our previous newsletter, the current regulatory environment has led to companies like Apple not enabling the latest AI features on their newest models in Europe, and Meta restricting the launch of some AI models within the region. This situation has prompted several leading AI companies to pen an open letter criticizing the EU's regulatory approach.

In this letter, major technology firms express a widespread concern that Europe's regulations are not only fragmented but also hinder innovation and economic growth. This sentiment is echoed by companies like Apple and Meta, which have had to navigate stringent European privacy regulations and, in some cases, have opted not to introduce certain products or services.

Conversely, the importance of maintaining robust privacy standards cannot be overstated. Without such regulations, the use of personal data to train AI models could result in significant violations of individual privacy and autonomy. This concern is particularly pertinent given the upcoming EDPB opinion on AI training, which will influence the future use of personal data in AI development.

There is certainly much to criticize about the approach to AI regulation in Europe, considering both the GDPR and the AI Act. It is crucial that the EDPB's process is thorough, taking into account the diverse opinions and concerns of both the tech industry and privacy advocates.

A thoughtful and well-informed stance from the EDPB will be key in determining which entities can develop AI models in Europe. The EDPB's opinion is significant not only from a privacy standpoint but also in determining whether Europe can compete globally without compromising its privacy values. Finding a balance that allows Europe to harness AI innovations while safeguarding individual rights in an increasingly data-driven world is essential.

Read more here.

Digital Security Act and Norway

Now, let's move on to something concrete and practical. Currently, there's significant attention on the NIS 2 Directive, which EU member states are required to implement by October 17 this year. However, this directive will not immediately impact Norwegian businesses. In Norway, we are slightly behind the curve compared to the rest of Europe, as we are still in the process of implementing the NIS 1 Directive through the Digital Security Act, though the exact date of enforcement remains uncertain.

The Digital Security Act targets providers of critical services in specific sectors, including energy, transport, health, water supply, banking, financial market infrastructure, and digital infrastructure. A recently proposed regulation sets out which entities are covered. There are three notable differences between the NIS 1 and NIS 2 Directives. First, the scope of NIS 2 is broader, encompassing more sectors than the original directive. Second, NIS 2 introduces stricter requirements for risk management and security measures, including managing risks associated with suppliers. Third, NIS 2 provides for stricter sanctions, both financial and in terms of the responsibilities of company management.

In the short term, it is crucial for most Norwegian businesses to become acquainted with the requirements set forth in the Digital Security Act. Over the longer term, preparing for the eventual adoption of NIS 2, which is likely to be integrated into Norwegian law at some future date, will be essential.

You can find the proposal for the regulation on who is covered here.

Facial recognition – Now also in Denmark

Recent editions of our newsletter have frequently discussed the increasing acceptance of facial recognition technology in various contexts, and now, Denmark has joined this trend. The Danish government has recently authorized the expansion of police powers to use facial recognition technology in investigating serious crimes such as murder, severe violence, and rape. This initiative is part of a broader strategy aimed at enhancing the efficiency of police operations and improving overall security.

However, this advancement has prompted concerns from the Danish Data Protection Authority, leading them to request detailed information from the police. They seek to understand the assessments and precautions being taken in the deployment of facial recognition technology. Specifically, the DPA has asked for details on how the police intend to conduct a data protection impact assessment and the extent of their engagement with the authority before implementing the technology.

While facial recognition technology offers substantial benefits for investigation and intelligence purposes, it also carries significant risks to individual privacy and rights. It is imperative that its implementation is accompanied by stringent guidelines and frameworks to safeguard citizens' personal data.

See more here.

More focus on the significance of automated decisions

A recent court case in Belgium tackled the issue of shadow banning on social media under the GDPR. The case involved Meta reducing the visibility of posts by Tom Vandendriessche, a right-wing Belgian politician, without his knowledge. The court was tasked with determining whether this action breached the GDPR provisions on automated individual decision-making in Article 22, as well as the information and transparency requirements in Articles 13 and 14.

The Belgian court highlighted that when content moderation is carried out in this way, the data controller must adopt measures to safeguard the rights of the data subject. This protection includes providing information about the automated processing, an explanation of the rationale behind the decision, and an opportunity for the data subject to challenge the decision. The court concluded that Meta's reduction of the visibility of Vandendriessche's posts and its shadow banning constituted an automated decision made without "meaningful human intervention", thus violating GDPR Articles 13 and 14.

The judgment is particularly relevant in light of the enormous development of AI models. Both the GDPR and the Digital Services Act (DSA) impose limitations on how automated processing can be used in cases like this. Legal precedents such as this one are crucial for establishing clear requirements that must be fulfilled before deploying AI technologies. The more automated models are used, the stricter the requirements for providing explanations and opportunities for the data subject to contest decisions will become.

Read more about the case here.

The Uber Case – What is it really about?

The Dutch DPA has imposed a fine of 290 million euros on Uber for violating the GDPR. Uber was found to have transferred personal data of European drivers to the USA without adequate safeguards to protect the data. The transferred data included sensitive information such as account details, taxi licenses, location data, photos, payment details, identity documents and, in some cases, criminal and medical records. The transfers went on for more than two years without Standard Contractual Clauses (SCCs) as the legal basis for the data transfer.

Uber had decided to stop using SCCs, believing they were unnecessary for its data transfers based on its interpretation of Article 3 of the GDPR. This article extends the GDPR's scope to processing outside the EU when it relates to offering goods or services to individuals within the EU. In addition, Uber relied on a statement from the EU Commission indicating that the current SCCs cannot be used for transfers to controllers or processors whose processing is directly subject to the GDPR. The Commission had furthermore stated that it was in the process of developing new SCCs for such scenarios.

However, the Dutch DPA disagreed with Uber's interpretation, criticizing the company for not continuing to use SCCs. The DPA contended that Uber could not have inferred from the Commission's statements that SCCs or other transfer mechanisms were unnecessary. The case highlights the complexities businesses face in interpreting GDPR requirements and underlines the need for caution in legal interpretations, particularly regarding international data transfers. Despite Uber's efforts to comply with the GDPR, the Dutch DPA showed little sympathy.

The case is intricate and underscores the complexity of regulatory compliance. The decision by the data protection authority seems formalistic, and it is not surprising that the case has been appealed. Many have criticized the decision, and it raises questions about whether such administrative practices genuinely advance privacy protection.

Read more about the case here.

New SCCs – It’s going to be exciting!

As previously mentioned, the European Commission is in the process of preparing a consultation on new Standard Contractual Clauses (SCCs) for data transfers to data processors outside the EU that are directly subject to GDPR. These new SCCs will coexist with the current ones, which address situations where the data importer is not governed by GDPR.

The recent Uber case has sparked a critical discussion: Are SCCs necessary when the data importer, located outside the EU, is already subject to GDPR? This question has ignited a debate over whether SCCs might lead to redundant obligations and create confusion for businesses striving to navigate overlapping legal frameworks.

The EDPB has taken the position that SCCs are indeed necessary, even when the importer falls under GDPR jurisdiction. The rationale is that SCCs help mitigate potential conflicts between foreign laws and EU regulations. This position has tangible implications, as evidenced by the substantial 290 million euro fine levied against Uber in the Netherlands.

The forthcoming SCCs are designed to clarify the responsibilities of importers in third countries directly subject to GDPR, aiming to streamline compliance and eliminate redundant requirements that can overburden businesses. The Commission is set to initiate a public consultation in the fourth quarter of 2024, with a draft expected by the second quarter of 2025.

Whether yet another set of SCCs will actually lead to better privacy protection, I am unsure. However, it will make the formalities easier to manage.

A bit more about cookie banners and the form and colour of buttons

In a case from the Belgian DPA, the design of cookie banners on various websites was evaluated. The DPA found that it was not acceptable that users had to click a button labeled "click more" in order to reject cookies; it required that a reject button be presented alongside the accept button so that there is a real choice regarding consent to cookies. Furthermore, the use of colour on the buttons was considered misleading, and withdrawing consent was graphically far more complicated than giving it. As a consumer, I completely agree.

Read more about the case here.

New decision from the CJEU – Sharing contact information – C-17/22 and C-18/22

The case concerns whether a group of investment companies could demand that a trust fund disclose contact information about its participants with indirect ownership stakes in investment funds. The defendant, the trust fund, opposed disclosing this information, maintaining that contractual clauses prohibited such data sharing with other shareholders. German courts were uncertain whether existing national legal practice, which mandates the defendant to share the data unless there is an abuse of rights, is compatible with the GDPR. Therefore, the case was referred to the EU Court of Justice.

The EU Court of Justice examined this issue in light of several provisions of the GDPR, particularly focusing on Article 6(1), which addresses the legality of data processing. The Court considered various legal bases for processing:

  1. Article 6(1)(a) GDPR - Consent: Since the data subjects had not given consent, the data could not be shared.
     
  2. Article 6(1)(b) GDPR - Necessary for the performance of a contract: After reviewing the contracts entered into by the data subjects, the Court found that these agreements explicitly prohibited disclosing information about their identities, so the disclosure could not be regarded as necessary for the performance of a contract.
     
  3. Article 6(1)(f) GDPR - Legitimate interests: While the Court acknowledged that there might be a legitimate interest in accessing the information, it noted that less intrusive methods, such as directly asking the data subjects if they wish to be contacted, should be considered. The Court expressed reservations about the sufficiency of this basis for legal processing.
     
  4. Article 6(1)(c) GDPR - Legal obligation: The Court observed that although there is no explicit statutory obligation, existing legal practice might potentially be considered a legal obligation. However, it emphasized that for legal practice to qualify as a legal obligation, it must be sufficiently clear and predictable. The determination of whether these criteria are met was left to the national courts.

Three new AG opinions

Insight into algorithms in automated processing – C-203/22

The case in question revolves around a mobile operator that declined to enter into a contract with an individual because of an alleged lack of creditworthiness, assessed using services from Dun & Bradstreet in Austria. The individual, seeking transparency, requested details about the algorithm used for this automated credit assessment, sparking a series of legal inquiries that led to a referral to the EU Court of Justice for a preliminary ruling.

Advocate General De La Tour recently issued his opinion on the matter. He stressed the necessity for the data subject to receive "meaningful information" about the logic involved in the automated processing, as stipulated in Article 15(1)(h) of the GDPR. He further emphasized the data subject's "right to an explanation" of the mechanisms employed in the automated decision-making process. This right is crucial not only for verifying the legality of the data processing but also for facilitating the exercise of other GDPR rights, particularly those associated with Article 22.

The Advocate General emphasized that the information provided must adhere to transparency requirements and should include detailed insights into the context and logic of the automated processing. The explanation should be clear and accessible and, if necessary, supplemented with additional details. However, he clarified that the explanation need not delve into the operational specifics of the algorithm, given its complexity. Instead, it should offer a comprehensible description of the logic applied, including the methods, criteria, and their respective weightings. The data subject should also be able to use this information to verify the accuracy of the data used and to assess whether the processing of those data is lawful. Once concluded, this case is expected to provide valuable clarification of the obligations to explain how algorithms work.
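
To make this concrete, here is a purely hypothetical sketch (not based on how Dun & Bradstreet's services actually work; all criteria, weights and thresholds are invented for illustration) of the kind of "meaningful information" described: the criteria, their weightings and the data subject's own values can be disclosed in plain language without exposing the algorithm's operational internals.

```python
# Purely hypothetical illustration of "meaningful information" about the
# logic of an automated credit assessment (GDPR Art. 15(1)(h)).
# The criteria, weights and threshold below are invented for the example.

CRITERIA_WEIGHTS = {
    "payment_history": 0.5,   # share of the score driven by past payment behaviour
    "income_stability": 0.3,  # share driven by documented stable income
    "existing_debt": 0.2,     # share driven by current debt load
}
APPROVAL_THRESHOLD = 0.6      # a contract is offered only above this score

def explain_decision(subject_values: dict[str, float]) -> str:
    """Build a plain-language explanation: criteria, weights, the data
    subject's own (verifiable) values, and the resulting outcome."""
    score = sum(CRITERIA_WEIGHTS[c] * subject_values[c] for c in CRITERIA_WEIGHTS)
    lines = [f"- {c}: weight {w:.0%}, your value {subject_values[c]:.2f}"
             for c, w in CRITERIA_WEIGHTS.items()]
    outcome = "offered" if score >= APPROVAL_THRESHOLD else "declined"
    lines.append(f"Overall score {score:.2f} (threshold {APPROVAL_THRESHOLD}): contract {outcome}.")
    return "\n".join(lines)

print(explain_decision({"payment_history": 0.4, "income_stability": 0.7, "existing_debt": 0.5}))
```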

GDPR, correction, and trans rights – C-247/23

A transgender person who was granted refugee status in Hungary filed a complaint against the Hungarian immigration authorities over an incorrect entry in the asylum registry. When applying for refugee status, the individual made clear that he identified as male and relied on his transgender identity as a basis for recognition as a refugee. The data controller, i.e. the immigration authorities, nevertheless registered the individual as female. The individual later requested that the entry in the asylum registry be corrected from female to male in terms of name and gender, but the request was denied on the grounds that he could not provide proof of completed gender-confirming treatment.

The case has since been escalated to the EU Court of Justice, focusing on the interpretation of Article 16 of the GDPR. Advocate General Collins has recommended that the court interpret Article 16, in conjunction with Article 5(1)(d), to mandate that a national authority managing a refugee registry must correct personal data concerning gender if inaccurately registered, upon the individual's request. Moreover, Collins argued that it should not be necessary for the individual to prove completion of gender-confirming surgery to amend their gender data. Article 16 does not stipulate any such proof requirement. The necessity for proof in data correction should be evaluated individually, but it should not be mandatory for the individual to demonstrate a specific interest in the correction or that the inaccuracy has caused harm.

Calculation of GDPR fines – C-383/23

In a recent opinion, Advocate General Medina clarified how fines for GDPR breaches should be calculated for data controllers or processors that are part of a larger enterprise. She emphasized that the maximum fine level must take into account the total annual turnover of the entire enterprise, including the parent company. This approach sets the ceiling on the potential size of fines and underscores that there is often substantial scope for imposing significant penalties. The point is particularly relevant given the recent trend of both hefty fines and numerous reprimands being issued.
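
To illustrate the practical effect, here is a minimal sketch of the Article 83(5) ceiling (the higher of EUR 20 million or 4% of total worldwide annual turnover of the preceding financial year), using purely hypothetical turnover figures; the point is simply how much the ceiling moves when the whole undertaking's turnover, rather than only the fined subsidiary's, is used as the basis.

```python
# Illustrative only: Article 83(5) GDPR caps the most serious fines at the
# higher of EUR 20 million or 4% of total worldwide annual turnover of the
# preceding financial year. The turnover figures below are hypothetical.

def fine_ceiling(annual_turnover_eur: float) -> float:
    """Return the Article 83(5) maximum fine for a given annual turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

subsidiary_turnover = 150_000_000   # hypothetical: the fined entity on its own
group_turnover = 5_000_000_000      # hypothetical: the whole undertaking, parent included

print(f"{fine_ceiling(subsidiary_turnover):,.0f}")  # 20,000,000 (4% would only be 6 million)
print(f"{fine_ceiling(group_turnover):,.0f}")       # 200,000,000
```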

However, it's crucial to understand that this is just one of many factors considered in determining the actual fine. Data protection authorities or courts must also conduct a detailed assessment based on additional criteria, such as the enterprise's decision-making authority, the severity of the breach, and the involvement of different units within the enterprise. This comprehensive evaluation ensures that fines are not only substantial but also appropriately tailored to the specific circumstances of each case.

Swedish confusion about consent, contracts, mergers and responsibility

The Swedish DPA (IMY) conducted an audit at Expressen Lifestyle AB and found that the company had processed personal data without stating the correct legal basis. Since the implementation of the GDPR in 2018, Expressen had primarily relied on contractual necessity and legitimate interest to process subscription data, rather than consent.

However, an inconsistency was found in the privacy statement of the company's online store, which incorrectly stated that the legal basis for processing was consent. This error was compounded by additional information about the right to withdraw consent, further suggesting that consent was the legal basis for processing. Once the inspection commenced, Expressen promptly took steps to amend the information on their online store.

The IMY concluded that the original wording in the privacy declaration misleadingly indicated that consent was the legal basis for processing personal data. Since Expressen's actual processing was based on contractual necessity and legitimate interest, the IMY determined that Expressen had breached Article 13(1)(c) of the GDPR by stating an incorrect legal basis. The authority deemed the violations to be less severe, noting that the website was not the primary platform used by subscribers to sign up for subscriptions, the number of affected individuals was limited, and the breach did not result in serious consequences for the registered individuals. Consequently, IMY issued a reprimand without imposing fines.

This case highlights the critical importance of ensuring that all communications regarding the legal basis for processing personal data are accurate and clear. It's common for there to be confusion between contracts and consent, and the distinction can often be subtle. If you rely on either of these as a basis for processing, it is advisable to review the information you provide to your customers and consider whether any updates are necessary to ensure clarity and compliance.

Read more here.

Some data protection authorities are also being sued

As many are aware, there has recently been particular focus on whether so-called "pay or okay" models are in line with the GDPR. "Pay or okay" involves giving users a choice between being tracked and profiled for marketing purposes or paying a fee to avoid (targeted) advertising.

The background is that in the summer of 2021, an individual filed a complaint against Der Spiegel's "pay or okay" banner. After nearly three years of evaluation, the data protection authority in Hamburg concluded that this is, in principle, permissible. However, the decision is highly controversial. The criticism points out that the facts of the case were treated superficially, and questions whether the users' consent can really be considered voluntary when over 99.9% of users accept tracking under such circumstances.

In response to these concerns, noyb has filed a lawsuit against the data protection authority regarding their handling of the "pay or okay" banner. The lawsuit particularly raises questions about the data protection authority's impartiality. It is highlighted, among other things, that the data protection authority had been in close dialogue with Der Spiegel throughout the process and had advised on proposed changes. Additionally, it is pointed out that the data protection authority charged Der Spiegel a fee for administrative costs that was significantly lower than typical legal advisory fees, further fueling concerns about the authority's neutrality in this matter.

Read more about the case here.

And noyb continues to complain about EU institutions

Noyb shows no signs of slowing down in its data protection advocacy efforts. In addition to two ongoing cases involving complaints against the European Data Protection Supervisor (EDPS), noyb has now filed a formal complaint against the EU Parliament itself.

The complaint follows a data breach at the EU Parliament. The specifics of when and how the breach occurred remain unclear, and it was only discovered several months after the fact. In response, noyb has complained to the EDPS, highlighting the EU Parliament's failure to adhere to key data protection principles, such as data minimization and storage limitation. The complaint emphasizes that many of the compromised details had been retained by the EU Parliament for a decade, significantly longer than necessary. Additionally, it points out that the EU Parliament had previously been aware of vulnerabilities in its data security systems and had experienced similar cyberattacks. Through this complaint, noyb seeks to have the EDPS enforce the EU Parliament's compliance with data protection regulations, safeguarding personal data against future breaches.

Read more about the complaint here.

Another case on access, including information in emails

This case involves a former employee and customer of the Swedish train operator SJ who requested access to their own personal data. SJ was initially unable to provide this due to technical difficulties with BankID verification on the company's customer portal. As a result, SJ requested additional information for manual identification, and the individual's identity was verified three months later using Mobile BankID.

Initially, the registered individual received information about the processing of personal data related to their employment relationship, but the information only covered the period from 2008 and excluded technical data. Subsequently, the registered individual received incomplete information about the customer relationship. SJ claimed that all necessary information was available on their website yet failed to guide the individual on how to access the missing data.

The individual subsequently filed a complaint with the Swedish DPA (IMY). The IMY determined that the timeline to respond to the data access request started with the individual's initial request, but this timeline was extended until SJ received the required identification information. Nevertheless, SJ did not provide access to the data within this adjusted deadline, nor did they inform the individual of any deadline extension, thereby violating Article 12(3) of the GDPR. Additionally, the IMY found that the incomplete information provided constituted a breach of GDPR Article 15. Given that these were considered minor breaches, SJ received only a reprimand from the IMY.

Read more here.

Public access law and GDPR – Not simple

The Norwegian Data Protection Authority has issued a reprimand to the Stavanger Labour Party for processing personal data on behalf of the majority parties in Stavanger. The decision followed complaints from several individuals who received political advertising via email on August 20, 2023, in the run-up to the municipal elections. The emails targeted parents of children in kindergartens and schools in Stavanger, using contact information obtained from the municipality under the Norwegian Freedom of Information Act.

The complaints raised concerns about the legality of how both the municipality and the political parties managed the personal data. Therefore, the Data Protection Authority requested a statement from the Stavanger Labour Party to clarify the purpose and legal basis for the email campaign.

In its decision, the DPA concluded that the Stavanger Labour Party failed to adequately assess the legal basis for processing the personal data and did not properly inform the affected individuals about this processing. Due to these shortcomings, the party received a reprimand. The Data Protection Authority has since closed the case.

Read more here.

Is the CSAM Regulation coming after all?

In the previous newsletter, I wrote that the CSAM proposal had been withdrawn. Now the Hungarian presidency has declared that it intends to move forward with the law, aiming for an agreement at the Council's meeting in October. Despite modifications to the methods for monitoring and searching within chats, these changes are unlikely to appease critics who argue that the measures are overly intrusive. For those interested in a deeper dive, an internal document detailing the process has been published by Politico and can be accessed here.

Fines are not so exciting in themselves, but some are interesting

The Dutch DPA has imposed a fine of 30.5 million euros on the American company Clearview AI for illegal data collection for facial recognition. Clearview AI has built a database containing over 30 billion images, including images of Dutch citizens, collected without their knowledge or consent. These images, scraped from the internet, have been converted into unique biometric codes. Clearview's services are widely used by intelligence and investigative agencies outside the EU, which upload an image that is then compared against Clearview's enormous database.

Two issues in particular constituted breaches of the GDPR. Firstly, the Dutch DPA deemed Clearview's facial recognition technology highly intrusive to the rights of the registered individuals. The database stored special categories of personal data, and the DPA found that none of the exceptions in Article 9 applied. Secondly, Clearview has not been sufficiently transparent about its use of the images and biometric data and has failed to cooperate with requests for data access.

This decision is part of a broader trend, as several other European data protection authorities have also fined Clearview for similar infringements. Despite these penalties, there has been little indication that Clearview intends to modify its practices, raising significant concerns about its ongoing approach to data privacy.

Read more here.

A high-profile case about data scraping has reached a conclusion

A settlement has now been reached in a high-profile case about data scraping. The State of Texas had sued Meta for illegally using facial recognition technology to scrape biometric data from millions of Texas residents without consent. According to Texas, Meta scraped this information from photos and videos that users uploaded to Facebook. The parties have now agreed on a settlement under which Meta will pay 1.4 billion dollars to Texas.

Read more here.

Do you have any questions?