Newsletter

Privacy Corner

by Eva Jarbekk and Jenny Nondal


Introduction

Merry December to you all! Privacy news never seems to end, and I've had to sift through a lot to bring you what I believe is worth your time. Below are some news highlights that caught my attention.


Changes in cookie practices and TCF 2.2

Many may have noticed that several media houses are changing their cookie practices and making them more detailed. The cookie rules in Norway have not changed, so why is this happening? The background is that many Consent Management Platforms (CMPs) for cookie usage are international and aim for a consistent setup across multiple countries. Many of these platforms are also members of IAB, an advertising industry association. IAB has its own code of conduct, the Transparency and Consent Framework (TCF), which has recently been updated to TCF 2.2. The reason for the change is a long story we won't delve into here, but in short, the Belgian data protection authority deemed the previous consents inadequate. TCF 2.2 is now very explicit about what is acceptable and what is not.


TCF 2.2 came into effect on November 20th this year and has had significant ripple effects, especially in requiring clear consents for cookie usage. TCF 2.2 also describes in detail what consents and legitimate interests can be used for, clarifying a wide range of processing operations. This is crucial for everyone involved in advertising.
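
For readers on the technical side, here is a minimal sketch of how a site script might consult a TCF v2.2 CMP before setting a cookie or firing a tag. The __tcfapi callback interface comes from IAB's CMP API specification; the vendor ID used below is a hypothetical placeholder.

```typescript
// Minimal sketch, assuming a CMP that implements the IAB TCF v2.2 CMP API.
// The vendor ID is a hypothetical placeholder, not a real Global Vendor List entry.
declare global {
  interface Window {
    __tcfapi?: (
      command: string,
      version: number,
      callback: (tcData: any, success: boolean) => void,
      parameter?: unknown
    ) => void;
  }
}

const PLACEHOLDER_VENDOR_ID = 123; // hypothetical vendor ID

window.__tcfapi?.("addEventListener", 2, (tcData, success) => {
  if (!success) return;
  // Act only once the CMP reports a settled consent state.
  if (tcData.eventStatus === "tcloaded" || tcData.eventStatus === "useractioncomplete") {
    const purposeOk = tcData.purpose?.consents?.[1] === true; // TCF Purpose 1: store/access info on a device
    const vendorOk = tcData.vendor?.consents?.[PLACEHOLDER_VENDOR_ID] === true;
    if (purposeOk && vendorOk) {
      // Safe to set the cookie / fire the tag for this vendor.
    }
  }
});

export {};
```

Note that TCF 2.2 removed the older getTCData command, so listening for CMP events, as above, is now the supported way to read the consent state.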

EDPB issues new guidelines on cookies and similar technologies

On November 15th, the European Data Protection Board (EDPB) published new guidelines on Article 5(3) of the EU's ePrivacy Directive. This provision concerns the storage of information on the user's device, and access to information already stored there – in other words, cookies. It also applies to pixels, beacons, and other technologies that collect information from the user's device. As is well known, consent is required to perform these actions. The objective of the guidelines is to determine whether new tracking technologies are also covered by the directive. Spoiler alert: yes, they often are.


The Guidelines provide a detailed analysis of the four key elements governing the application of Article 5(3): (i) 'information', (ii) 'terminal equipment of a subscriber or user', (iii) 'electronic communications network', and (iv) 'gaining of access' and 'stored information'/'storage'. They also give examples of when the EDPB considers Article 5(3) to be applicable. Below, we have summarized the key takeaways from the examples. Several of the examples reflect well-known and established interpretations of Article 5(3). However, the suggested application of Article 5(3) to the collection of identifiers from tracked URLs distributed over a public communication network is a broader interpretation than what was previously well established.


URL and pixel tracking


  • Collection of identifiers from tracking pixels and tracked URLs is considered a 'gaining of access', making Article 5(3) applicable (see the sketch below).
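
To illustrate the mechanics, here is a generic sketch of a tracking pixel – not any specific vendor's code, and tracker.example.com is a placeholder host. A pixel is simply a tiny image whose URL carries identifiers, so merely loading it transmits those identifiers off the device.

```typescript
// Generic sketch of a tracking pixel / tracked URL; placeholder host.
// Requesting the 1x1 image makes the browser transmit whatever identifiers
// are baked into the URL, which is why the EDPB treats this as 'gaining of access'.
function firePixel(userId: string, campaignId: string): void {
  const params = new URLSearchParams({
    uid: userId,     // identifier tied to the user's terminal
    cid: campaignId, // e.g. appended to a link in a marketing email
    ts: Date.now().toString(),
  });
  const pixel = new Image(1, 1);
  pixel.src = `https://tracker.example.com/p.gif?${params.toString()}`;
}
```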

Local processing


  • Technologies involving local processing are covered where software distributed on the user's terminal instructs the processing and makes the processed information available to selected actors through a client-side API, or sends it back over the network; this constitutes a 'gaining of access to information already stored', making Article 5(3) applicable (see the sketch after this list).
  • The fact that the information is produced locally does not preclude the application of Article 5(3).
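
A generic sketch of the local-processing scenario described above (analytics.example.com is a placeholder endpoint): the value is computed entirely on the terminal, yet the script still instructs the browser to send the result over the network, which is what triggers Article 5(3).

```typescript
// Generic sketch of the 'local processing' scenario; placeholder endpoint.
// The result is produced locally on the terminal, but the script still
// instructs the browser to ship it over the network.
async function reportLocalResult(): Promise<void> {
  // Information processed locally, derived from data available on the device.
  const localResult = {
    language: navigator.language,
    screen: `${screen.width}x${screen.height}`,
  };
  await fetch("https://analytics.example.com/collect", { // placeholder host
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(localResult),
  });
}
```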

Tracking based on IP only


  • Article 5(3) applies to advertising solutions that collect only the IP address to track users across domains, even where the entity instructing the collection is different from the one receiving the information, provided the information originates from the user's terminal equipment.

Intermittent and mediated IoT reporting


  • Article 5(3) applies to IoT devices connected to a public communications network when the device is instructed to send dynamically stored data to a remote server, as such an instruction constitutes a 'gaining of access' – even if the information is streamed, or is cached only for intermittent reporting.
  • If an IoT device is connected via a relay device, the transmission to the relay may fall outside Article 5(3) ePD, but Article 5(3) applies where the relay device is instructed to send the received information on to a remote server.

Unique Identifier


  • Article 5(3) applies to the collection on websites or in applications of unique identifiers that are derived from persistent personal data and hashed on the user's device, as this involves instructing the browser to send the information, which constitutes a 'gaining of access' (see the sketch below).
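
A generic sketch of that scenario (match.example.com is a placeholder host): persistent personal data, here an email address, is hashed on the device with the browser's built-in Web Crypto API before the digest is sent onward. The hashing is local, but the browser is still instructed to transmit the result.

```typescript
// Generic sketch of the 'unique identifier' scenario; placeholder endpoint.
// The email address is hashed locally (Web Crypto API), but the browser is
// still instructed to send the resulting digest onward.
async function sendHashedIdentifier(email: string): Promise<void> {
  const bytes = new TextEncoder().encode(email.trim().toLowerCase());
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  const hashHex = Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
  await fetch(`https://match.example.com/id?h=${hashHex}`); // placeholder host
}
```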

The Guidelines can be found here.

AI Act Approaching – maybe

There is significant activity surrounding the upcoming AI Act in the EU. As is well known, the proposal is in the final stages of the legislative process, but negotiations are currently at a standstill due to disagreements over the regulation of so-called "foundation models" in artificial intelligence. ChatGPT is a typical example of a service built on a foundation model trained on large amounts of data.


Several EU countries, including France, Germany, and Italy, oppose the regulation of these models in the AI Act and instead propose behavioral guidelines without sanction mechanisms. This creates tensions, as others advocate for more binding regulations to ensure the safer use of advanced AI.


The disagreement revolves particularly around the risk-based classification, under which the most powerful AI models would be subject to strict rules. Opponents of this model prefer self-regulation, which has contributed to the stagnation in the negotiations.


It's worth noting that France, Germany, and Italy have reportedly reached an agreement among themselves on the regulation of foundation models, emphasizing self-regulation. It's not uncommon for such disagreements to arise among member states, and I believe the AI Act will be adopted, perhaps with some changes. They might consider placing more of the detailed requirements in annexes, which would make future adjustments easier than amending the regulatory text itself. Read more about this in the links below.


Link 1
Link 2
Link 3
Link 4

Italy Focuses on AI Training with Open Data

Italian authorities have initiated an investigation into whether websites are taking sufficient measures to prevent AI platforms from collecting large amounts of personal information through web scraping. The Italian supervisory authority has a particular focus on AI. Depending on its findings, the authority says it may take urgent measures – emergency measures are becoming increasingly common in the privacy world. It will be interesting to see whether the authority uncovers practices it considers objectionable.


The case is discussed here.

Multiple class actions underway in the Netherlands

Two consumer organizations in the Netherlands, Consumentenbond and Privacy Protection Foundation, have filed a class action lawsuit against Google for privacy violations. They claim that Google conducts continuous surveillance and sharing of personal data through online advertising. 82,000 consumers have joined the lawsuit.


Another group, Stichting Data Bescherming Nederland (SDBN), has sued Amazon for tracking the activities of website visitors without their consent. This lawsuit represents around five million Amazon accounts in the Netherlands.


These lawsuits are part of a growing trend in which consumer groups seek compensation for alleged privacy breaches by large companies such as Amazon and Twitter. Previously, Facebook was found to have used personal data unlawfully in the Netherlands.


European legislation and court decisions in the Netherlands have put pressure on companies to change practices and respect privacy laws. These class actions aim to obtain compensation for affected consumers and force companies to change their practices to comply with privacy regulations.


It's worth noting that companies involved in so-called "litigation financing" are actively assessing whether GDPR claims are a viable investment – monitoring large companies for breaches that affect many users and considering financing lawsuits in exchange for a share of any winnings. I've always thought this an unusual business model, but I hear that some consider it a good investment strategy.


The aforementioned class actions are mentioned here:



Link 1
Link 2
Link 3

Data Act: New EU legislation for data sharing

The EU has recently adopted the Data Act, a law that simplifies the process of demanding access to data generated by connected products. The law facilitates broader access to and sharing of such information, with the expectation of increased value creation. Products and services must be designed so that others can easily access the information.


The law will change how businesses and consumers use and share data, and it is expected to enable more innovation and improve access to useful information. The Data Act sets standards for how data is shared and used. This data is not necessarily personal information; it could, for example, be data about how your washing machine communicates with the manufacturer's cloud service. Access to this information could make it easier to find alternative repair services if the machine breaks down.


If you work for a company that offers connected services or products to customers, this is a regulation you need to familiarize yourself with.


The law ensures, among other things:


  • Fair data sharing: Rules that ensure data is shared in a way that is fair for all parties. No one gains an unfair advantage from the information sharing.
  • Support for small and medium-sized enterprises: It aids smaller businesses by giving them easier access to essential data.
  • Prohibition of barriers to switching cloud providers: The law stipulates that businesses cannot set up unnecessary barriers that make it difficult for other businesses to switch to a different cloud provider. This means companies can easily change who they use for cloud services if necessary.

More information about the Data Act can be found here.

Companies try to escape the Digital Markets Act, which comes into force in spring 2024

Under the new Digital Markets Act (DMA), the EU has classified technology giants like Google, Amazon, Apple, TikTok (ByteDance), Meta, and Microsoft as "gatekeepers" due to their significant market influence. The DMA aims to level the playing field for smaller businesses against these industry giants. Not surprisingly, the companies are not thrilled about this. Meta, Apple, and TikTok have all taken legal steps by appealing their "gatekeeper" designations to the EU General Court.


Apple is challenging its status as a "gatekeeper" for the App Store, the iOS operating system, and the Safari browser. Meta accepts the "gatekeeper" status for Facebook, Instagram, and WhatsApp but disputes it for Messenger and Marketplace, arguing that these are part of Facebook and not separate services. TikTok contends it shouldn't be classified as a "gatekeeper" because it is a newcomer that struggles to compete against established companies like Meta and Google. TikTok also points out that it does not meet the DMA's threshold of EUR 7.5 billion in annual EU turnover (the law's alternative threshold being a market capitalisation of EUR 75 billion).

The Irish DPC clarifies what is required to verify an individual's identity

Airbnb was reprimanded by the DPC for violating the data minimisation and storage limitation principles. Airbnb had asked a customer for ID documents and a new photograph to confirm their identity at booking. The customer felt this was excessive and complained to the data protection authority in Berlin, which forwarded the case to the DPC.


The DPC ruled that there was no legal basis to demand ID in this manner, as there were other less intrusive methods to confirm the user's identity. The DPC also noted that Airbnb retained ID copies for too long, even after identity had been confirmed.


The DPC has reprimanded Airbnb five times in the past year for privacy violations. Although the DPC considers these breaches serious, it has opted for reprimands rather than fines. This may reflect the DPC's preference for prioritizing follow-up, correction, and guidance over stricter sanctions, especially where Airbnb shows a willingness to improve and comply following earlier reactions. Airbnb might be thankful that the Norwegian Data Protection Authority hasn't focused on them.


Read more about this here.

Swedish IMY fines company for incorrectly sending personal data

The Swedish Authority for Privacy Protection (IMY) has imposed a fine of 500,000 kronor on a company for accidentally emailing a large group of customers with a file containing thousands of other customers' financial data.


IMY's investigation found that the company had accidentally attached an Excel file containing personal data instead of a PDF report on fund performance. The mistakenly sent file included customers' names, personal identification numbers, banking information, email addresses, individual fund choices, and the latest values of their fund investments – in total, personal data of over 52,000 customers. Fortunately, the file did not contain information such as account numbers or login details.


IMY concluded that the company had processed personal data in violation of the General Data Protection Regulation (GDPR) by failing to ensure adequate protection of the data, and it therefore imposed a fine of 500,000 kronor for the breach of privacy legislation. IMY notes that such violations often result from human error and emphasises the importance of having technical and organisational security measures in place to reduce the risk of such incidents. While this is not new, it serves as yet another reminder of the need for caution.


The case is mentioned here.

The Danish Data Protection Authority has been active – two significant new cases

The Digitalization Agency processes superfluous personal data in administration of digital driving licenses


The Digitalization Agency has been criticized by the Data Protection Authority for processing too much personal data in the administration of the digital driving license. They store information on all citizens with a valid Danish driving license, nearly 4 million people, although only about 1.7 million have the digital driving license app.



The Data Protection Authority found (unsurprisingly) that the processing of personal data of over 2 million citizens who have not joined the digital driving license scheme was in violation of the data minimization principle, which requires that personal data is only processed as necessary. The Data Protection Authority emphasizes the importance of avoiding unnecessary accumulation of personal data and states that going forward, the Digitalization Agency can only process information about those who have actually chosen the digital solution.


Despite the Digitalization Agency's argument about limitations in the existing system as a reason for storing all the information, the Data Protection Authority believes this does not justify processing unnecessary information about over 2 million citizens.


Municipality's legal basis for an AI solution to identify citizens in need of maintenance training and rehabilitative efforts


The Data Protection Authority has evaluated the Copenhagen Municipality's legal basis for developing and operating an AI solution to identify citizens in need of maintenance training and rehabilitative measures. The municipality aims to use AI to support case workers by identifying citizens with such needs based on historical data.


The Data Protection Authority agrees that the general development, operation, and training of such an AI solution are in line with the General Data Protection Regulation (GDPR), but emphasizes the need for a supplementary national legal basis. The requirements for this basis will vary depending on whether it relates to the development, training, or operation of the AI solution.


According to the Danish Data Protection Authority, processing personal data to develop and train an AI solution can more easily comply with the legal framework than operating such a solution. In the operational phase, a clearer legal basis is required because of the large amount of sensitive information processed by AI solutions.


I have included this case because it's important to see that the development, training, and operation of a system are distinct processing operations – and they cannot always be justified under the same provision.


The cases are mentioned here:



Link 1
Link 2

Is it legal to block ad blockers?

YouTube has been actively developing technology to detect ad blockers. The platform can now determine whether an ad blocker is turned on, and if so, videos won't play; instead, users receive a message asking them to either accept the ads or subscribe to YouTube Premium.
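
We don't know what YouTube's actual implementation looks like, but one common detection technique is a "bait" element, sketched generically below: many ad blockers hide elements whose class names match their filter lists, so a script can insert such an element and check whether it was suppressed.

```typescript
// Generic sketch of the common "bait element" ad-blocker detection technique;
// this is an illustration, not YouTube's actual code. Blockers that apply
// element-hiding filter rules will suppress the bait, which the script detects.
function detectAdBlocker(): boolean {
  const bait = document.createElement("div");
  bait.className = "ad ads adsbox ad-banner"; // class names commonly targeted by filter lists
  bait.style.cssText = "position:absolute;left:-9999px;height:10px;";
  document.body.appendChild(bait);
  const blocked = bait.offsetHeight === 0 || getComputedStyle(bait).display === "none";
  bait.remove();
  return blocked;
}

if (detectAdBlocker()) {
  // e.g. pause playback and show the "allow ads or subscribe" prompt
}
```

Whether running such a probe in the user's browser itself requires consent under Article 5(3) is, in essence, the legal question now being raised.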


Privacy advocates have raised concerns about YouTube's tools for identifying ad blockers, arguing that they violate the ePrivacy Directive, specifically Article 5(3). This provision, as is well known, requires websites to obtain consent before storing or accessing information on a user's device. YouTube, on the other hand, argues that ad blockers violate the platform's terms of service and deprive creators of advertising revenue. The company therefore believes it needs a tool that can identify and stop ad blockers.


This issue is now being treated as a complaint case by the Irish Data Protection Commission (DPC) and is mentioned here.

NOYB continues to identify new cases

NOYB has filed a complaint with the European Data Protection Supervisor (EDPS) against the European Commission for using unlawful methods on Twitter to promote the controversial proposed chat control regulation. NOYB argues that the targeted messaging at Twitter users violated EU data protection rules and the established democratic procedures among EU institutions.


The proposed chat control law in the EU has been heavily criticized by various groups, including industry, society, and member states, fearing it could pave the way for extensive surveillance of online communication.


NOYB reacted in particular to the Commission's use of targeted advertising based on political and religious viewpoints, which it argues clearly breached data protection rules. It is also noteworthy that the Commission itself had previously warned against this type of advertising, describing it as a serious threat to a fair and democratic electoral process. NOYB has also requested that the EDPS impose fines in the matter.


This is a rather unusual case and will be interesting to follow. If EDPS imposes fines, I believe it would be the first time.


The case is mentioned here.

Privacy in the US differs from Europe, and it is quite different in Canada too

In Canada, the police are considering the use of technology that gives them access to home surveillance cameras, a practice already in use in several American cities. The system, known as Fusus, allows the police to monitor live video feeds from privately owned cameras at homes and businesses to oversee emergencies and potential criminal activity. The police believe that Fusus helps solve crimes faster and provides them with advance information about potential emergencies.


Fusus grants police direct access to private surveillance cameras, raising concerns among privacy researchers due to the possibility of surveillance without judicial approval. Access is supposed to be based on agreements with the owners of the private cameras.


The fact that Canada is even considering implementing this kind of technology indicates that their approach to privacy differs significantly from Europe, which was somewhat surprising to read. Generally, Canada has privacy legislation and views on privacy that are quite close to those in Europe.


You can find the case here.

GDPR to be evaluated

EU countries are set to evaluate the General Data Protection Regulation (GDPR) five years after it took effect, focusing on challenges related to its implementation, especially for small businesses and public bodies. While the law has enhanced trust and legal certainty regarding privacy, member states are still calling for additional guidance and guidelines. The evaluation will address issues such as the use of personal data in research and the need for clearer guidelines on the processing of minors' data. The Council also calls for improvements in the enforcement of the law and greater clarity on international data transfers. Overall, the GDPR is considered a success, and there is little indication of any drastic changes on the horizon.


More about this can be found here.

The Privacy Appeals Board affirms that the Data Protection Authority must consider privacy issues

The case involved a complaint from an individual, A, about the Data Protection Authority's decision to close a case without assessing whether the Personal Data Act had been violated. A believed his privacy had been infringed by the bankruptcy trustee of the company X AS following the company's bankruptcy: the trustee had sent reports to several recipients alleging A's involvement in illegal activities. According to A, this damaged his reputation and constituted a clear breach of his right to privacy and improper handling of his personal data. The Data Protection Authority closed the case without delving into the details or determining whether the Personal Data Act had been breached.


The Data Protection Authority justified its decision by noting that the bankruptcy estate had been deleted from the register, so there was no need for further investigation. The Privacy Appeals Board, however, reviewed the case and found that the Data Protection Authority could not close it without assessing whether the personal data had been unlawfully processed. The Board held that a more thorough evaluation was needed to determine the legality of the processing of A's personal data under the General Data Protection Regulation. The Data Protection Authority's decision was therefore overturned, and the case was referred back for a new assessment.


This case is practically significant as it highlights the Data Protection Authority's large caseload and the need to prioritize, while fundamentally asserting that the Authority cannot neglect to consider privacy issues.


The case can be found here.

Finally: attempts to challenge GDPR

TikTok has challenged aspects of the GDPR in a case before the EU Court of Justice. The case stems from a disagreement between the Irish and German data protection authorities regarding TikTok's breaches of GDPR rules and how they should be interpreted. The disagreement was referred to the European Data Protection Board (EDPB), which issued a so-called "Binding Decision" that binds the Irish Data Protection Authority. TikTok has taken the unusual step of arguing that the entire process of imposing "Binding Decisions" violates EU legislation. TikTok's position, that the GDPR violates EU rules, is somewhat ironic.


Similar cases have previously been dismissed by the EU Court. WhatsApp, for example, attempted to run a similar argument, which was rejected on the grounds that "Binding Decisions" are addressed to Ireland, not to the company itself. TikTok takes the argument a step further by challenging the GDPR itself, arguing that the process of issuing binding decisions may conflict with EU law, especially the rights to a fair trial and to an appeal. (The company has no right to appeal the EDPB's decisions.)


If TikTok wins, it could have significant implications for how EU privacy legislation is enforced, especially for companies operating from Ireland and looking to avoid harsh penalties from other European countries. On the other hand, it seems unlikely that they will prevail.


Success for TikTok would significantly impact the "One Stop Shop" mechanism. However, this might not necessarily benefit companies, as it would mean that regulatory authorities in all member states could start their own investigations and enforce violations independently. This would make the enforcement of regulations very costly and complicated. Overall, it might be preferable that violations are handled by a single regulatory authority.


These aspects are discussed here.

Do you have any questions?