Newsletter

Privacy Corner

by Eva Jarbekk


Advent, cookies and Copilot

It's Advent, mulled wine and cookies are everywhere, and now we have a new law on cookies in Norway! The new Electronic Communications Act has been passed, and although it has not yet come into effect, rumours suggest it will do so on 1 January 2025. I think the authorities need to confirm this as soon as possible; otherwise, implementation should be postponed until 1 July 2025. We'll see. In any case, it's crystal clear that cookies will receive more attention in this country going forward. And there are really no surprises in the new rules: we will soon have the same rules as the rest of Europe.

How difficult will it be to act according to these new rules? Well, it depends on how websites are set up and what data is shared with whom. Many Norwegian websites have not been adapted to the European regime because it has been advantageous to use the Norwegian rules, which were more lenient. 

Do your users have the option to decline tracking? I guess so, but is it on the first or second layer of the banner? Are only necessary cookies set if the user declines cookies? Are you sure that only data for which there is consent is collected? Does the tracker only collect data for one purpose, or could it possibly collect data for multiple purposes? Is the consent granular in the same way? Does the information in the cookie banner tell users which third parties you share data with? And if you share data with third parties, do you review how the recipient uses the data? And can you handle access requests that ask where this type of data has been shared? I'm just asking.
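
To make the first two questions concrete: a minimal sketch of purpose-gated consent could look like the TypeScript below. The purpose names and helper functions are hypothetical, not a real consent-management API; the point is that nothing non-essential runs unless the corresponding consent flag has been explicitly set.

```typescript
// Hypothetical consent purposes; a real banner may have more.
type Purpose = "necessary" | "analytics" | "marketing";

// Granular consent: one flag per purpose, nothing bundled.
type ConsentState = Record<Purpose, boolean>;

// Before the user interacts with the banner, only necessary cookies
// are allowed, so declining (or ignoring it) sets no trackers.
const defaultConsent: ConsentState = {
  necessary: true,
  analytics: false,
  marketing: false,
};

// Stub helpers standing in for real cookie/tracker loaders.
function setNecessaryCookies(): void { /* session, security, ... */ }
function loadAnalyticsScript(): void { /* e.g. inject analytics tag */ }
function loadMarketingPixels(): void { /* e.g. inject ad pixels */ }

function applyConsent(consent: ConsentState): void {
  setNecessaryCookies(); // strictly necessary: always permitted

  // Each non-essential purpose is gated by its own flag:
  // one consent, one purpose, never inferred from another.
  if (consent.analytics) loadAnalyticsScript();
  if (consent.marketing) loadMarketingPixels();
}

applyConsent(defaultConsent); // a decline sets only necessary cookies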

I have a well-founded suspicion that there is quite a bit to do here, and some of the changes required may come as a surprise. We'll have a chat about it in the network over Christmas. At the meeting now before Christmas, we will, among other things, talk about surveillance in the workplace. New solutions appear there constantly, and the question often arises: what do we really need, and what do we have to tell the employees? Additionally, we will discuss what internal rules one should have for recording meetings with Copilot.

Speaking of Copilot, the sandbox reports from the Data Protection Authority and NTNU on the implementation of Copilot are highly recommended Christmas reading. The Data Protection Authority does not give a clear "yes or no" as to whether a company can use Copilot, but they point out many aspects that must be assessed and documented. It is also interesting that they write that Copilot must be assessed against the rules in the email regulations as well, since one would not want to monitor employees unlawfully. I further note that there is considerable scepticism about consent as a legal basis for using Copilot, so there is much to consider going forward. My impression is that the requirements are so detailed that many will find it challenging to get everything right.

In Europe, the EDPB has organised so-called "stakeholder events" for specially invited guests. At these events, they have collected input for an Opinion on possible legal bases for LLMs and a new Guideline on "Pay or Consent". The discussions in both meetings were lively and somewhat polarised; there is a lot at stake. The results from the EDPB may come as an early Christmas present, at least the Opinion on legal bases for LLMs.

It has clearly been an eventful year in law and tech, and it seems that this trend will persist. I wish you a very happy December and happy reading about the selection of interesting cases below!

Boring, but important: access control

A recurring theme in the decisions from the Data Protection Authority (DPA) is access control. Access control is hardly the most exciting topic within privacy law, nor one that raises the most complex legal questions. Despite this, the DPA's practice shows that it is something many businesses and organisations should pay more attention to.

A recent decision imposing a penalty fee on the University of Agder (UiA) illustrates this point well. An employee at UiA discovered that several documents containing personal data had been openly accessible in multiple Teams channels. As a result, employees had access to information they had no need to know, including names, social security numbers, and information about exam accommodations and exam attempts. It is estimated that nearly 16,000 data subjects were affected by the lack of access control, including employees, students, refugees from Ukraine connected to the university, and other external persons.

The Data Protection Authority (unsurprisingly) concluded that this was a breach of UiA's duty to maintain the confidentiality of personal data, to ensure adequate internal control, and to ensure adequate security of personal data. The DPA's decision particularly highlighted the folder with personal data about the 64 Ukrainian refugees, which had been accessible to all students, as a specific breach of confidentiality.

It is also worth reminding ourselves that it is useful to have access logs. They are often the only evidence that can show that exposed information has not actually been accessed and misused.
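
A minimal sketch of what such a log can look like is shown below, in TypeScript. The field names and the in-memory store are illustrative assumptions; a real system would write to tamper-evident, centralised storage.

```typescript
// Illustrative append-only access log; field names are assumptions.
interface AccessLogEntry {
  timestamp: Date;
  userId: string;                      // who accessed
  resourceId: string;                  // what was accessed
  action: "read" | "write" | "delete"; // how it was accessed
}

const accessLog: AccessLogEntry[] = [];

function logAccess(
  userId: string,
  resourceId: string,
  action: AccessLogEntry["action"]
): void {
  accessLog.push({ timestamp: new Date(), userId, resourceId, action });
}

// After a breach: did anyone actually open the exposed resource?
// An empty result supports the claim that the data was never accessed.
function accessesTo(resourceId: string): AccessLogEntry[] {
  return accessLog.filter((entry) => entry.resourceId === resourceId);
}

logAccess("employee-42", "teams/hr-documents", "read");
console.log(accessesTo("teams/exposed-folder")); // [] -> no recorded access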

Going forward, it will likely become even more important to be conscious of access control. For example, this should be considered before using AI models like Copilot to ensure that one has control over which personal data Copilot has access to.

It may be wise to review how your access control is set up!

Read more about the fine to UiA here.

Not all fines become a reality

The Data Protection Authority has recently issued a reprimand to Disqus, an American company that provides comment-section solutions and advertising to websites. From mid-2018 to the end of 2019, the company provided a comment-section solution to certain Norwegian websites. Personal data was shared between Disqus and its parent company, Zeta Global, without the knowledge of the operators of the Norwegian websites. According to the DPA's assessment, Disqus had therefore disclosed personal data about data subjects in Norway without valid consent. The DPA originally gave notice of a penalty fee of NOK 25 million for violations of privacy regulations. More recently, however, the DPA dropped the fee entirely and issued only a reprimand instead.

Disqus disagreed with several of the factual circumstances on which the DPA based its notification. The Authority lacked documentation to substantiate its claims and chose to accept Disqus' version of the facts in its final assessment. The case also experienced a lengthy processing time at the DPA, which influenced the decision on the final sanction.

It should be noted that Schjødt represented Disqus in the case.

Read more about the case here.

Consent from employees for biometrics?

A recent decision from the Belgian Data Protection Authority indicates that obtaining valid consent for the processing of biometric data in an employment context will be challenging. Biometric data are personal data derived from the processing of physical, physiological, or behavioural characteristics that can uniquely identify a person, typically fingerprints or facial images used for facial recognition. These are sensitive personal data and are therefore subject to strict rules under the GDPR.

The case involved an employer who used a fingerprint-based time registration system provided by a subsidiary of a Japanese conglomerate. An employee was concerned about violations of the GDPR, particularly due to the risk that data could be transferred to a country outside the EU.

The Data Protection Authority confirmed that when processing sensitive personal data, the data controller must have a legal basis under Article 6 and an additional basis for special categories of personal data under Article 9. This is not surprising.

The Data Protection Authority found that the employee's consent had not been validly obtained, for several reasons. Firstly, the consent was not informed: information about the time registration system was only provided at the start of employment through a welcome brochure and later included in the employment regulations. Secondly, the consent did not meet the requirement of unambiguity. Although the workers signed the employment regulations and the welcome brochure, the Data Protection Authority held that this did not constitute a clear consent to the processing of their personal data. Thirdly, the workers' consent was not considered freely given, because refusing to consent had negative consequences, including sanctions for non-compliance.

In its conclusion, the Data Protection Authority also noted that the purposes of the time registration system were not always specified in the documentation, and that there were many less intrusive alternatives to biometric registration that could achieve the same objectives.

Read more about the decision here.

Joint Controllership

It can often be challenging to determine when there is joint controllership. Joint controllership arises when two or more separate controllers jointly determine the purposes and the essential means of the processing, but in practice it is often difficult to ascertain when this is the case.

This can occur both when businesses are closely linked to each other, for example in a supplier relationship, and between businesses that otherwise have little to do with each other.

A case that illustrates this is a recent decision from the Belgian Data Protection Authority involving the company Freedelity, which specialises in data collection via electronic ID cards (eID). Freedelity's services make it possible to store commercial benefits from various retailers in one place, namely Freedelity's platform. Freedelity thus offers its services both to retailers, which share their customer data, and directly to consumers. The Authority has ordered the company to change its practices in a number of respects.

Firstly, Freedelity must change its procedures for obtaining consent. Freedelity had set up its consent solution so that one had to accept Freedelity's general terms and conditions in order to participate in the loyalty programmes offered on its platform. The Authority found that Freedelity was not permitted to make access to commercial benefits conditional on consent to further, non-essential processing in this way. Furthermore, the company had to inform consumers about the purpose of each processing activity in a clear and transparent manner. Freedelity had set up a consent solution on its website where the consent request appeared at the same time as the user entered their eID to create an account, with a reference to a privacy statement. The Data Protection Authority found that it was not sufficiently clear whether the user was consenting only to the creation of the account or also to the subsequent processing of personal data. The company was therefore required to ensure that data subjects could give their consent unambiguously and specifically for each processing purpose.

In addition, Freedelity did not ensure that data subjects could easily withdraw their consent. Although the company offered a way to withdraw consent through "My Page" on its website, this was not found to be simple enough. The Data Protection Authority emphasised that the user had to go through several steps to withdraw consent, whereas it should be as easy to withdraw consent as it is to give it.
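
As a rough sketch of what per-purpose consent with one-step withdrawal can look like, consider the TypeScript below. The purpose names and the in-memory store are hypothetical; the point is that each purpose is consented to separately and can be withdrawn with a single action.

```typescript
// Illustrative per-purpose consent record; names are assumptions.
interface ConsentRecord {
  purpose: string;     // e.g. "account", "loyalty-data-sharing"
  grantedAt: Date;
  withdrawnAt?: Date;  // set once, by a single withdrawal action
}

const consents = new Map<string, ConsentRecord>();

// Consent is requested and stored separately for each purpose,
// never bundled into one acceptance of general terms.
function grantConsent(purpose: string): void {
  consents.set(purpose, { purpose, grantedAt: new Date() });
}

// Withdrawal is one call per purpose: as easy to withdraw as to
// give, with no multi-step navigation in between.
function withdrawConsent(purpose: string): void {
  const record = consents.get(purpose);
  if (record) record.withdrawnAt = new Date();
}

function hasConsent(purpose: string): boolean {
  const record = consents.get(purpose);
  return !!record && record.withdrawnAt === undefined;
}

grantConsent("account");
grantConsent("loyalty-data-sharing");
withdrawConsent("loyalty-data-sharing");
console.log(hasConsent("account"));              // true
console.log(hasConsent("loyalty-data-sharing")); // false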

The company was also ordered not to collect data from consumers' eIDs that is not necessary for the intended purposes, and to delete unnecessary data that had already been collected.

Freedelity also had to limit the retention period for personal data to a maximum of three years from the consumer's last activity and delete data that had already been stored for longer. Freedelity had originally set the retention period to eight years, which the Authority found disproportionately long.
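
In practice, such a rule can be enforced with a periodic retention sweep. Below is a minimal sketch under these assumptions; the record shape and store are hypothetical.

```typescript
// Hypothetical customer record keyed to the last activity date.
interface CustomerRecord {
  id: string;
  lastActivity: Date;
}

// Three years, approximated as 3 * 365 days (leap years ignored
// for illustration).
const THREE_YEARS_MS = 3 * 365 * 24 * 60 * 60 * 1000;

// Keeps only records still within the retention period; everything
// older than three years since last activity is dropped.
function applyRetention(
  records: CustomerRecord[],
  now: Date
): CustomerRecord[] {
  return records.filter(
    (r) => now.getTime() - r.lastActivity.getTime() <= THREE_YEARS_MS
  );
}

const store: CustomerRecord[] = [
  { id: "a", lastActivity: new Date("2024-06-01") },
  { id: "b", lastActivity: new Date("2019-01-15") }, // > 3 years: delete
];

console.log(applyRetention(store, new Date("2024-12-01"))); // keeps only "a"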

The decision also illustrates the roles and responsibilities of Freedelity as a data processor, data controller, and joint controller. These roles are crucial in determining the obligations one has under GDPR.

Freedelity was considered the data controller for its own "Freedelity file." This implies that Freedelity is the data controller when each individual user creates a personal account, which provides access to "My Page" on Freedelity's website. Furthermore, the Data Protection Authority states that when a retailer terminates its collaboration with Freedelity, Freedelity remains the data controller for future processing, even though the data were originally collected by the retailer that has ended its contractual relationship with Freedelity.

However, the Data Protection Authority also identified instances of joint controllership where the purpose of collecting and processing personal data was closely linked.

This was relevant for the personal data from the various retail chains that were shared with the Freedelity platform in order to identify consumers. Freedelity believed it was the sole data controller for this purpose as well, since the retailers did not make any decisions relating to the processing of personal data. However, the Authority emphasised that the retailers also had an interest in using Freedelity's platform to access and use customer data collected via eID.

The conclusion on joint controllership can be compared to the shared responsibility between Meta and the businesses that run Facebook pages for their enterprises. The same applies to placing advertisements on other people's websites. In general, I believe that using various web solutions, typically login solutions, often results in joint controllership. This is not difficult to manage, but it requires precise contractual arrangements around such a service.

Freedelity expresses strong dissatisfaction with the decision. In a statement, they claim to have been subject to a six-year-long process with the Authority, which has been characterised by unreasonable demands that have not considered the company's attempts to improve and adapt its systems. The company points out that its rapid growth may have been a triggering factor for the Authority's actions, and they question whether developing quickly and contributing innovative solutions is considered an infringement in Belgium. Freedelity's frustration is evident, but such sentiments are not uncommon in similar processes.

Read more here.

The statement from Freedelity can be found here.

DPF has been reviewed for the first time

The European Data Protection Board (EDPB) has released its first report and evaluation of the EU-US Data Privacy Framework (DPF). The EDPB has praised the efforts of the American authorities and the European Commission for the implementation of the DPF.

Regarding the use and enforcement of requirements for companies that are self-certified under the DPF, the EDPB notes that the US Department of Commerce has taken all the relevant steps to carry out the certification process. This includes developing a new website, updating procedures, collaborating with companies, and conducting awareness-raising activities. Furthermore, the US Department of Commerce has set up a complaint mechanism for EU citizens and has published comprehensive guidance on handling complaints on both the European and the American side. The EDPB writes that, given the low number of complaints so far, the American authorities cannot rely on complaints alone and must themselves ensure that certified companies adhere to the DPF principles.

The EDPB has further encouraged the American authorities to develop more guidance and would like to participate in developing such guides together with them. For example, the EDPB requests guidance that clarifies the requirements DPF-certified companies must comply with when transferring onward personal data they have received from European companies. Finally, the Board recommends that the next review of the adequacy decision take place within three years.

It will be interesting to see how this collaboration develops under President Trump, who has previously been critical of the surveillance-law regime on which the DPF is built. It should also be noted that not everyone believes the DPF actually holds up legally, and new complaints about it are possible. NOYB, for instance, is very clear in its view that the DPF does not hold up. At the moment, however, NOYB may well have more important things to prioritise than complaining about the DPF.

Read more here.

AI Office has started working

The European AI Office plays a crucial role in the implementation of the AI Act and has now begun its work. The purpose of the AI Office is to support the development and use of trustworthy AI systems, and it aims to work towards a unified European AI governance system.

As one of its first tasks, the AI Office launched a consultation on what constitutes an AI system and what should be considered prohibited use of AI systems under the AI Act. The consultation aims to gather practical examples and use cases from AI system providers, businesses, public authorities, academics, and other interested parties.

Furthermore, they will look closely at unlawful scraping of internet or CCTV material for facial recognition databases, and at emotion recognition in the workplace or in education. The consultation will also address some of the criticism the EU has received for the AI Act not sufficiently regulating the export of high-risk AI systems to countries outside the EU.

Read more here.

Do you have any questions?