Privacy Corner

by Eva Jarbekk, Sigurd Fjærtoft-Andersen and Katarina Foss-Solbrekk



In Norway, a lot of attention has recently been directed at new court cases about privacy decisions. Both Meta and Grindr have initiated legal action against the Norwegian State to have certain decisions set aside; the cases partly concern the legal basis for processing, requirements for consent and the size of fines. The Norwegian Data Protection Authority has, perhaps naturally, noted that these companies are players with the resources to obtain legal assistance to defend their business models, something the Authority's director says several authorities are critical of. In my opinion, the cases are highly different, but they both concern uncertainty in the interpretation of the GDPR, which in itself is not surprising. The degree of uncertainty also varies. Generally speaking, I am rooting for the data protection authorities, who do important work, but I also think it is good that we get legal clarifications in this area. That will be useful both for the authorities and for others who have to apply the legislation. And the Norwegian Data Protection Authority can hardly expect companies, no matter how big they are, to accept fines of millions without testing legal arguments that have not been tried before. I think we will see more lawsuits in the future – privacy lawyers should probably dust off their black robes.

However, a lot of other things have happened too; some of them are summarised below.

Lawsuit against the data privacy framework

The General Court of the European Union has heard a case on temporary measures to stop the implementation of the EU-U.S. Data Privacy Framework. The General Court's decision came in response to a complaint against the legality of the transfer agreement for personal data between the EU/EEA and the USA, which entered into force on 10 July this year. The complaint was filed by Philippe Latombe, a French member of the European Parliament. The General Court held that Latombe could not prove the individual and collective damage that he believes the new transfer rules entail.

Although Latombe's claim for a temporary halt to the implementation of the framework was rejected, there is reason to assume that his main claim, on whether the Data Privacy Framework as a whole is legal, will be heard later. The General Court's decision shows that the legality of the Data Privacy Framework remains a live issue, and there is reason to believe that the legality of the adequacy decision will be debated for a long time to come.

Who is responsible for having a data processing agreement in place? A Belgian decision clarifies this

The Belgian Data Protection Authority has issued a decision that addresses the question of who is responsible for having a data processing agreement in place between the data controller and the data processor.

On 4 September 2020, a Belgian data subject filed a complaint against both the data controller and the data processor with the Belgian Data Protection Authority for breach of GDPR Article 28(3), which requires the data controller to enter into a data processing agreement with any data processors. The complaint was prompted by the fact that the data subject had received a parking fine from the municipality on 20 May 2020. In connection with the municipality's documentation of the parking violation, the data subject requested, among other things, the data processing agreement between the municipality as data controller and the third-party data processor used when distributing and collecting the parking fine.

During the Belgian Data Protection Authority's processing of the complaint, it became clear that a data processing agreement between the municipality and the third-party processor was not entered into until 27 July 2020, and that this agreement contained a retroactivity clause. The key questions for the Belgian Data Protection Authority were thus whether such a retroactivity clause is permissible under GDPR Article 28(3), and which of the data controller and the data processor was responsible for the absence of a data processing agreement at the time of processing.

The Belgian Data Protection Authority concluded that the processing constituted a violation of GDPR Articles 28, 14 and 12. It affirmed that the existence of a retroactivity clause does not remedy the absence of a data processing agreement at the time of processing, reasoning that such clauses would provide a means to circumvent data protection regulations and GDPR Article 28(3). The Belgian Data Protection Authority also concluded that both the municipality and the third-party processor were responsible for the breaches of data protection legislation. As such, both the data controller and the data processor were reprimanded.

Though no fine was issued in this case, I think this is a situation that applies in quite a few instances, and it should be a motivation for putting data processing agreements in place. Read more about the decision here.

About the recording of customer calls

This is also a topic that is relevant to many. The Austrian Federal Administrative Court ("BVwG") has adjudicated a case about legitimate interest as a basis for recording customer calls. A data subject filed a complaint against a bank to the Austrian Data Protection Authority because the bank recorded customer calls without the customers having the opportunity to object to such recordings and the processing of personal data that this entailed. The Austrian Data Protection Authority upheld the data subject's complaint, and the bank appealed the case to the courts. In addition to asserting that the bank had a legitimate interest in processing personal data through the recording of customer calls, the bank stated that it was impossible not to record all incoming calls as a result of national laws imposing a duty on banking institutions to record calls relating to investment services.

The bank's claims were rejected by the Austrian court. On the question of legitimate interest, the court stated that the bank's interest in "ensuring quality" of its services was not sufficiently specific to constitute a valid legal basis for the processing. In response to the bank's argument that national law made it impossible not to record all incoming calls, the court clarified that it is the data controller's responsibility to establish internal procedures ensuring that compliance with national law is achieved in a way that does not conflict with data protection rules. The court added that the bank could have established procedures whereby the topic of an incoming call, and any obligation to record calls about investment services, was clarified at the beginning of the call, so that the bank would avoid recording calls where this was not required by law.

The BVwG then concluded that the bank's recording of customer calls violated data protection rules, as the processing of personal data neither had a basis in law, nor was it covered by the bank's legitimate interest under GDPR Article 6(1)(f).

I have to admit that I get a little annoyed every time I call somewhere and am told that the call is "being recorded to improve our customer service" with no option to decline. The last time this happened was when I called about a package on its way to me. It should not be necessary to record such calls. And where are the recordings stored? Has the call ever been listened to by anyone to improve the parcel service? I almost wonder whether I should submit an access request myself...

Read the case in its entirety here.

Groupe Canal+ in France has been fined for marketing activities

On 19 October 2023, the company Groupe CANAL+ was fined EUR 600,000 by the French Data Protection Authority (CNIL) for breach of several articles of the GDPR in the period from November 2019 to January 2021.

CNIL had received 31 complaints related to so-called "cold calls" from Groupe CANAL+ to potential customers who had not previously been in contact with the company. CNIL also received other complaints related to the lack of data protection rights for data subjects, security issues related to passwords for the company's employees, and the company's failure to notify CNIL of a personal data breach in 2020.

CNIL discovered that Groupe CANAL+ used electronic sales campaigns carried out by third-party suppliers to identify potential customers for the company's services. Personal data obtained by third-party suppliers was then used by the company in connection with cold calls to potential customers.

The company could not prove that the potential customers had given valid consent to receive such calls, as the customers had not been informed of the identity of the third-party supplier who carried out the electronic sales campaigns. Although the potential customers had consented electronically to receive electronic direct marketing, CNIL concluded that the consents were not valid because the customers were not informed on whose behalf consent was being obtained. This is a key element for many and aligns with earlier decisions. As the company could not document consent, CNIL concluded that it had breached GDPR Article 7(1) and Article 4(11). It is important to be able to document consent!

CNIL further found that the company had failed to inform data subjects that it had received their telephone numbers from a third party, meaning that the data subjects had not been informed of the purpose of the processing or of their rights in connection with it. CNIL also concluded that Groupe Canal+ breached GDPR Article 14, as well as its obligations under GDPR Articles 12, 28(3), 32 and 33.

As for the unreported breach, it concerned the names and addresses of approximately 10,000 customers, which for a period of 5 hours and 36 minutes had been accessible to other customers. That is thought-provoking in itself – is this a type of breach you would have reported?

The size of the fine was based on the finding that Groupe Canal+ breached the GDPR to a significant extent, and that some of the breaches were of a more serious nature. However, CNIL also emphasized certain mitigating circumstances, including the fact that the company had implemented extensive measures and thereby brought itself into compliance on several points.

Read the summary of the case here.

The Swedish data protection authority issues administrative fine against H&M for its marketing practices

IMY, the Swedish Authority for Privacy Protection, issued an administrative fine of SEK 350,000 against H&M for its marketing practices on 17 October 2023. Six individuals living in various European countries, such as Poland and Italy, submitted complaints objecting to the receipt of direct marketing from H&M. As H&M is headquartered in Sweden, IMY investigated the complaints.

IMY’s investigation concluded that H&M violated data protection law because H&M did not, without undue delay, cease processing the complainants’ personal data for marketing purposes despite the complaints objecting to such processing. Nor did H&M, according to IMY, have sufficient systems and routines in place to make it easier for complainants to exercise their right to object to direct marketing.

"It should be easy to avoid taking part in advertising and offers you are not interested in," said Albin Brunskog, IMY’s head of unit.

For more information, see here.

The privacy appeals board on rectification of information

On 15 September 2023, the Privacy Appeals Board issued a decision in case PVN-2023-04 on rectification of personal data at the Norwegian Labour and Welfare Administration (NAV). The data subject (a private individual) had appealed the Norwegian Data Protection Authority's decision of 20 December 2021, after the data subject had filed a complaint with the Authority demanding rectification of information in a decision from NAV on the right to unemployment benefits.

The case before the Privacy Appeals Board was prompted by the data subject receiving unemployment benefits from NAV in the autumn of 2015. After the data subject applied for support to establish a business, NAV made a decision on 2 October 2015 to suspend unemployment benefits, based on its understanding that the conditions for the right to unemployment benefits were not met. After further correspondence, NAV reassessed the case and granted the data subject unemployment benefits in a new decision on 1 December 2015. On 17 March 2016, NAV issued a further decision which reversed the decision of 2 October 2015 on the suspension of unemployment benefits. NAV concluded that the benefits should never have been suspended and granted the data subject unemployment benefits for the entire period during which payment had been suspended.

The data subject demanded that the decision on the suspension of unemployment benefits of 2 October be rectified by NAV in accordance with GDPR Article 16. NAV, for its part, claimed that the decision of 2 October 2015 had been rectified through the subsequent decisions of 1 December 2015 and 17 March 2016, and that a note, "See amendment decision of 17 March 2016", had been added to the decision of 2 October 2015.

The Norwegian Data Protection Authority concluded that the conditions for rectification under GDPR Article 16, first sentence, were not met, because the information requested rectified was not an "incorrect" representation of NAV's assessments at the time of the decision. The Privacy Appeals Board upheld the Data Protection Authority's decision on 15 September 2023 and emphasized that errors in previous decisions resulting from an incorrect understanding of factual or legal circumstances are not errors that can be rectified under GDPR Article 16. They can, however, be changed by the decision being appealed, set aside or replaced by new decisions. In addition, the Privacy Appeals Board pointed out that NAV's decision of 2 October 2015 correctly expresses the facts that NAV assumed when the decision was made, and that the three decisions taken together express that the data subject was entitled to unemployment benefits at the time of application and that the suspension was therefore unlawful. Against this background, it was concluded that, given the connection between the decisions, NAV had not registered incorrect information about the data subject, and the condition for rectification under GDPR Article 16 was therefore not met.

Read the Privacy Appeals Board's decision in its entirety here.

For discussion of the case in English.

The Danish data protection authority addresses cookies

The Danish Data Protection Authority reported Texas Andreas Petersen A/S to the police on 6 October 2023 for collecting and processing personal data about visits to its website without a legal basis. Upon reviewing the website, the Danish Data Protection Authority discovered that a number of cookies were being used to collect and disclose information about visitors to Google and Meta.

Because Texas Andreas Petersen A/S processed personal data without a legal basis, and the processing potentially involved a large number of website visitors, the Danish Data Protection Authority recommended a fine totalling DKK 200,000. The amount also reflects the fact that the Danish Data Protection Authority has provided guidance on how personal data concerning website visits should be processed since February 2020. The fine is not very high in a European context, but it is high in a Danish context, as the Danish DPA rarely issues fines anymore.

It is worth noting that Norway takes a different approach to our Danish neighbours. For the time being, consent to cookies can be given through the browser’s settings, and certain exceptions to the consent requirement and obligation to provide information apply under Norwegian law. This is, however, expected to change once the new electronic communications law takes effect. We will of course notify you when it does!


Are providers of co-location in data centres data processors?

The Danish Data Protection Authority has, following an inquiry from Region Midtjylland, considered whether providers of co-location of servers which are used for processing personal data should be considered data processors. In its inquiry, Region Midtjylland stated that co-location is a storage service provided by IT companies, and that the region places its own servers in a server cabinet with a company that provides the server cabinet.

The Danish Data Protection Authority generally concluded that a business, authority or other organisation that provides co-location of servers should not be considered a data processor for the organisations or businesses to which the co-location service is delivered. As a prerequisite for this conclusion, the Danish Data Protection Authority emphasized that the co-location provider must not have access to the personal data on the servers the customer stores there. As an example, the Danish Data Protection Authority cites situations where the customer has placed its own servers, with power and internet connections, in a locked server cabinet to which only the customer has access.

To justify its point of view, the Danish Data Protection Authority also emphasized that the provision of co-location for servers is primarily about the provision of a service other than the processing of personal data, through the provision of physical facilities, internet and power supply, and that the provider therefore does not initially have access to the information stored on the servers. The Danish Data Protection Authority emphasizes, however, that the statement only constitutes a starting point, and that other circumstances may lead to the co-location provider being considered a data processor.

As examples of circumstances that may lead to the provider being considered a data processor, the Danish Data Protection Authority highlights situations where the provider has access to the server cabinet, so that the personal data can be accessed. Other situations include where the provider can replace or otherwise process the hard drives that are stored or where the servers can be moved, turned off and on or otherwise handled. Another situation is where the provider offers additional services beyond just physical facilities, electricity and internet, for example services in the form of firewalls, back-up or other security measures that include the processing of personal data.

The Danish Data Protection Authority also noted that the co-location provider should, as a starting point, be the data controller for the processing of personal data that takes place as part of the physical security measures the provider has established, such as registration of visitors, logging of key tags and camera surveillance. The Danish Data Protection Authority further noted that it is the businesses that use co-location who are obliged to establish and carry out controls ensuring satisfactory processing security in line with GDPR Article 32. This means that the customer must be aware of the provider's security measures and assess whether these are sufficient in relation to the processing activities carried out on the servers.

Overall, the Danish Data Protection Authority's statements suggest that providers of server co-location should not be regarded as data processors, unless specific circumstances of the provider's service indicate that the provider is nevertheless involved with the personal data processed on the servers.

The decision can be read in its entirety here.


More attention to employee monitoring

The Norwegian Data Protection Authority has again turned its attention to how companies monitor employees, with a focus on the use of cameras in workplaces with young workers. As announced on 19 September 2023, it will carry out several camera inspections going forward, and it will do so unannounced. This is highly unusual and underlines how seriously the Norwegian Data Protection Authority takes the issue.

The announcement comes in the wake of an unannounced inspection at Fast Candy AS, a candy store in Oslo. The Norwegian Data Protection Authority discovered that large parts of the store were subject to camera surveillance, including the area behind the till and the warehouse/office area. The camera solution also allowed continuous remote access and audio recording. Based on these findings, the Norwegian Data Protection Authority issued several orders. Fast Candy must stop the camera surveillance behind the check-out section, as the Norwegian Data Protection Authority affirms that employees should not be monitored behind the till unless special circumstances render such monitoring necessary. The store must also adjust or remove the camera surveillance of the warehouse and office area, and stop all use of, and the possibility of, remote access and audio recording through the camera surveillance. Lastly, the company must provide employees with further information about the camera surveillance and establish internal controls for it.

Although the Norwegian Data Protection Authority recognises that affordable camera solutions are easy to implement, many of them have default settings that are not privacy friendly. Companies are responsible for using cameras in a lawful manner and cannot rely on these settings, the authority affirmed.

Employee monitoring is notably also on the agenda abroad.

The Information Commissioner's Office (ICO) has recently issued new guidance on employee monitoring. According to the ICO, monitoring may entail tracking phone calls, messages and keystrokes, as well as web camera footage, audio recordings, screenshots or the use of specialist monitoring software to track employee activity. Organisations that wish to monitor employees must take certain steps, including providing employees with information about the nature, extent and reasons for monitoring, ensuring that the monitoring has a clearly defined purpose, using the least intrusive measure available, and providing alternatives, such as swipe cards or PINs, for employees who do not wish to use biometric access controls for workspace access. Even though we often say that the UK allows more surveillance of employees than the Scandinavian countries, the guidance here is quite parallel to what the Scandinavian DPAs would write.

The Austrian data protection authority also recently held that a controller which had installed GPS-tracking devices in company vehicles used by its employees could rely neither on a legal obligation nor on legitimate interests as the lawful basis for processing. While it recognised certain legitimate interests of the controller, including using the GPS data to calculate hours worked and expenses incurred in order to adequately compensate employees, and locating employees so they could be sent to clients in need, the Austrian data protection authority found that the controller could access such information and carry out these tasks without the GPS-tracking devices, meaning the conditions for relying on legitimate interest were not satisfied. Nor could the controller rely on a legal obligation, as it had complied with Austrian working-time law before introducing the GPS-tracking devices, using less intrusive means. The controller was ordered to stop all processing of personal data via the GPS-tracking system.

For further information, see here, here and here.  

France publishes practical guidance on AI development

The CNIL, the French data protection authority, has shared practical guidance for the development of artificial intelligence ("AI") systems. Four things are particularly worth mentioning.

First, with regards to defining a purpose, the CNIL states that the processing operations carried out in the deployment phase and the development phase pursue, in principle, a single overall purpose when the operational use during the deployment phase is precisely identified from the development stage. The matter is less clearcut for general purpose AI systems. In these cases, the operational use may not be explicitly identified from the development phase, meaning the purpose of processing in this phase must include comprehensible information on the type of system, as well as its functionalities and capabilities.

Second, although controllers must find an appropriate legal basis for their AI system, the CNIL provides that consent, legitimate interests, public interest and performance of a contract may all be used as a legal basis, depending on the context.

Third, as developing AI systems requires significant data, including personal data, it comes as no surprise that the CNIL clarifies that data protection impact assessments for the development of AI systems must address specific AI risks. For example, the risk of sharing false content about real people. Fourth, controllers must continuously monitor and update data minimisation and data protection measures to ensure that those adopted during the data collection phase do not become obsolete.

The guidance is available here.

The scope of the Norwegian data protection authority's ban on Meta – can you advertise there now?

On 13 October 2023, the Norwegian Data Protection Authority announced that there is no general prohibition against posting advertisements on Facebook. The announcement came because the authority had received several inquiries from companies wondering whether they were affected by its temporary ban on behaviour-based marketing on Facebook and Instagram. Under the ban, Meta is prohibited from tailoring advertisements based on monitoring and profiling of Norwegian users. The ban currently awaits review by the European Data Protection Board, which is due to publish its decision by 27 October 2023.

Although companies are not directly affected by the ban, the Norwegian Data Protection Authority encourages them to exercise caution when assessing whether to post advertisements on Meta’s platforms. Each company is responsible for making the necessary assessments and must be aware that Meta is currently engaging in illegal behavioural advertising.


Data protection also applies in politics

Amid the municipal election in Norway, the majority parties in Stavanger, including the Labour Party and the Centre Party, sent an email containing political advertising to several parents whose children attend kindergarten and school in Stavanger municipality in August 2023. The parents’ contact information was not publicly available; the parties obtained it from Stavanger municipality itself under the Public Information Act. Even though personal data may be accessed under the Public Information Act, any further processing must comply with data protection law.

Unaware that their personal data was being processed and accessed in this manner, several parents submitted a complaint to the Norwegian Data Protection Authority. Said complaint questioned the legality of both the political parties’ and Stavanger municipality’s processing of their personal data.

Based on these complaints, the Norwegian Data Protection Authority sent a demand for an explanation to the Stavanger Labour Party on behalf of the majority parties. The letter asks for further information on who was responsible for the processing, what personal data was processed, the purpose of the processing, its legal basis, what information the data subjects received, as well as storage periods.

The deadline to respond was 29 September 2023. The Norwegian Data Protection Authority confirms that it has received a report and is now assessing it. More information will surely follow, and we eagerly await the outcome, which will no doubt have implications for other political parties wishing to send political advertisements.


Data breaches are expensive in the US

Following claims relating to a data breach which exposed sensitive data from 13,000 nonprofits in 2020, Blackbaud, a fundraising software company, agreed to pay a $49.5 million settlement. Data privacy breaches are, in other words, expensive in the US!

An unauthorised third party gained access to Blackbaud’s data, which Blackbaud publicly acknowledged on 16 July 2020. What Blackbaud allegedly failed to disclose, however, was the extent of the breach and the sensitive nature of the data that had been stolen. Over one million files were implicated. As a result, the attorneys general of 49 states and Washington, D.C. pursued data privacy claims against Blackbaud. Although Blackbaud admits no wrongdoing under the settlement agreement, it must pay the settlement amount and has agreed to strengthen its practices relating to data security and to notifying customers following a data breach. A third party will also assess Blackbaud’s compliance with the settlement agreement for seven years.


And finally, an overall and principled matter – the CSAM proposal

Recently, the EU voted on the so-called CSAM proposal, which is part of the EU's measures to combat child abuse material on the internet. The proposal can in practice make encrypted, secure messaging services that we know today, such as WhatsApp, iMessage and Signal, unlawful.

The proposal has been hotly debated, partly because it would open the way for mass monitoring of messaging services, which could affect source protection, the transfer of health data and privacy in general. Critics of the proposal claim that the rules will be easy for the criminals they are intended to stop to circumvent, and that the proposal may therefore have greater consequences for ordinary people than for criminals. The critics also note that the services may have to monitor all content that is sent in order to be sure that they comply with EU regulations.

Among other things, the EU has proposed making scanning for child abuse material mandatory by requiring service providers to use technology to track down, report and remove abuse material from their services. For its part, the Norwegian Data Protection Authority has stated that the EU's proposed regulations will be easy for criminals to circumvent, for example by encrypting abusive material before it is sent or uploaded. The Norwegian Data Protection Authority has also stated that they support efforts to ensure effective measures against sexual abuse of children online, but that the proposed CSAM regulations raise serious challenges for privacy. The Norwegian Data Protection Authority has many good points here, and punching holes in encryption solutions has a number of major privacy consequences.

If the EU decides to adopt the CSAM regulation, it will be adopted in 2024 at the earliest.

Read more about the case here.

Do you have any questions?