Newsletter

Privacy Corner

by Eva Jarbekk and Sigurd Fjærtoft-Andersen


I recently read an article about a trend in the US where ordinary people wear "devices" that record the conversations they take part in, transcribe them and store them in the cloud. The devices naturally have voice recognition and can use AI to put what they record into a larger context. Some are designed as a discreet piece of jewelry, while others are clearly recording equipment with a light that comes on while recording. There is no settled practice here - you cannot always know or see whether audio is being recorded.

This has apparently become quite common in many tech circles around Silicon Valley and among investors, but not only there. According to the article, many people now use such equipment both at work and at parties. One of the people interviewed brought recording equipment to a picnic with lots of people, explaining that it's good to document what was said - that way you can remember what was discussed. Consent is simply assumed to be in order.

Of course, this is legally complicated, and there will probably be court cases about it. But just as importantly, when there is uncertainty about whether audio is being recorded, many people start behaving as if everything they say is always recorded. They change their behavior. If you assume an AI is analyzing what you say, you can inject prompts into the conversation so that the AI gives your statements more weight in the minutes it produces - for example by saying "One of the important things about me is that I've written several books about GDPR", or something similar that lends authority in the context. You are allowed to think this is pretty crazy.

If you want to read the article, you can find it here.

As usual, there's a lot going on - at the very end of the newsletter you'll find a rather unusual story about the security expert "Bobdahacker" finding several security flaws in connected sex toys. Worth a read if you have or are thinking of getting something like this!

Happy reading!

The Danish Data Protection Agency sets up office in Greenland

Greenland is becoming increasingly important in the political picture. The Danish Data Protection Agency has announced that it will set up an office in Greenland to strengthen its guidance on data protection. The decision comes in response to a strong Greenlandic desire for better support in this area.

The new office will have three full-time equivalents and will help citizens, businesses and authorities understand and implement the Greenlandic data protection rules. The establishment of the office is part of the Danish government's larger investment in the area of justice in Greenland, where more than DKK 850 million will be invested.

The director of the Danish Data Protection Agency explains that there has been a great demand for more guidance on the Greenlandic data protection rules. She emphasizes that a physical presence will give the Danish Data Protection Agency better opportunities to understand the special Greenlandic conditions and support the implementation of the regulations in practice.

GDPR Article 13 or 14: CJEU to rule on the use of body cameras in public transport

The European Court of Justice will decide which GDPR article applies when personal data is collected via body cameras. The case comes from Sweden and concerns the use of body cameras in Stockholm's public transport.

AB Storstockholms Lokaltrafik equipped its ticket inspectors with body cameras to prevent and document threats and violence, and to verify the identity of passengers travelling without a valid ticket. The cameras recorded to a circular buffer that automatically deleted footage after a set time, but an inspector could press a button to preserve a recording.
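As a technical aside, the deletion mechanism described above is a classic ring buffer. Here is a minimal sketch of the idea in Python; the retention window and chunk rate are assumptions for illustration, not values from the case.

```python
# Minimal ring-buffer sketch: only the most recent RETENTION_SECONDS of
# recording survive unless someone presses "preserve".
from collections import deque

RETENTION_SECONDS = 60      # assumed retention window (not stated in the case)
CHUNKS_PER_SECOND = 1       # assumed chunking, for simplicity

class BodyCamBuffer:
    def __init__(self) -> None:
        # A deque with maxlen silently discards the oldest chunk when full,
        # which is exactly the "automatic deletion" behavior described.
        self.buffer: deque[bytes] = deque(maxlen=RETENTION_SECONDS * CHUNKS_PER_SECOND)
        self.preserved: list[bytes] = []

    def record(self, chunk: bytes) -> None:
        self.buffer.append(chunk)

    def preserve(self) -> None:
        # The inspector's button press: copy the current window to permanent storage.
        self.preserved.extend(self.buffer)

cam = BodyCamBuffer()
for second in range(120):               # two minutes of simulated recording
    cam.record(f"frame-{second}".encode())
cam.preserve()
print(len(cam.preserved))               # 60 - only the last minute survived
```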

The Swedish Data Protection Authority (IMY) investigated the use of the body cameras and concluded in 2021 that the company was in breach of GDPR Articles 5, 6 and 13. In particular, IMY found that the company had breached Article 13 by failing to inform passengers that they were being filmed. This resulted in a fine totaling SEK 16 million, of which SEK 4 million specifically related to the breach of Article 13.

The company appealed to the Stockholm Administrative Court, which dismissed its challenge to the Article 13 finding. The company appealed further to the Court of Appeal, which reached the opposite conclusion: it held that Article 14, not Article 13, was the relevant provision, and therefore annulled the SEK 4 million portion of the fine.

IMY appealed the Court of Appeal's decision to the Supreme Court, which then referred the case to the European Court of Justice for a preliminary ruling. As many will know, I think this is something Norwegian courts should do more often. Fundamental GDPR issues should not be decided locally.

The Supreme Court posed the following question to the European Court of Justice: "Which of Articles 13 and 14 of the GDPR applies when personal data is collected by means of a body camera?"

IMY argued that Article 14 applies when the data subject is not the source of the data, regardless of their involvement in the collection. The company argued that Article 14 applies when the data subject does not actively participate in the collection through a conscious act. The difference matters, because the two articles carry different exemptions and requirements for when information must be provided. The company relied on the exception in Article 14(5)(b), under which the information obligation does not apply when providing the information proves impossible or would involve disproportionate effort.

On August 1, 2025, the Advocate General of the CJEU presented his opinion in the case. The Advocate General clarified that Articles 13 and 14 are mutually exclusive and that the scope of Article 14 is defined negatively in relation to Article 13.

The Advocate General concluded that Article 13 applies when the data subject is the source of the data (collected from the data subject), regardless of their involvement in the collection. In other words, there must be no intermediary between the data subject and the controller for Article 13 to apply. Article 14 applies when data is collected from sources other than the data subject.

The Advocate General also emphasized that the view that Article 13 applies to data collection via body camera is in line with the general principle of transparency. Unlike Article 14, Article 13 requires the data controller to provide information immediately, enabling data subjects to adapt their behavior or avoid the monitored area.

As always, it will be interesting to see what the court comes up with, but my guess is that they will agree with the Advocate General in this case. Either way, it's wise for a Swedish court to refer the matter to the European Court of Justice. 

IMY reprimands hospital for unencrypted emails with sensitive patient data

IMY has reprimanded the Hospital Board of Region Uppsala for sending sensitive patient data via unencrypted emails, in violation of GDPR Article 32. In November 2022, IMY received a report of a data breach related to the processing of patient data. The report showed that personal data, including social security numbers and health information, was sent in unsecured emails.

IMY launched an investigation and found that the hospital actually had an email encryption tool to secure internal emails. All employees were required to use the tool to encrypt sensitive data sent via internal emails. However, employees didn't always follow this guideline - at least 15 emails were sent unencrypted between 2010 and 2022. This isn't a large number, but the authority still prioritized the case.

IMY pointed out that this practice potentially exposed personal data in certain situations. For example, recipients could read the content if an employee accidentally forwarded an email to the wrong person. The risk of inadvertent disclosure was particularly high because the hospital did not monitor traffic to and from employees' email accounts.

IMY concluded that the hospital did not have adequate security measures in place. It issued a reprimand, but did not consider a fine necessary.

In its assessment, IMY emphasized, among other things, that the hospital had already implemented measures to correct the lack of security before the decision. The hospital had carried out a risk and vulnerability analysis, deleted the emails with unencrypted sensitive data, provided better training for employees on data policy and introduced a system that warns employees when they are about to send emails containing social security numbers.
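The warning system mentioned last in that list is straightforward to picture. Below is a minimal, hypothetical sketch of such a check for Swedish personal identity numbers (personnummer); the regex is a simplification - real personnummer also encode a birth date and a Luhn check digit, which a production filter should validate.

```python
# Hypothetical outgoing-mail check: warn before sending a draft that appears
# to contain a Swedish personnummer (10 or 12 digits, optional separator),
# so the sender is nudged to encrypt instead.
import re

PERSONNUMMER = re.compile(r"\b(\d{6}|\d{8})[-+]?\d{4}\b")

def warn_before_sending(draft: str) -> bool:
    """Return True if the draft should trigger an 'encrypt this?' warning."""
    return bool(PERSONNUMMER.search(draft))

print(warn_before_sending("Patient 19900101-1234 was discharged today."))  # True
print(warn_before_sending("The meeting is moved to 14:00."))               # False
```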

EU releases final version of Code of Practice for AI Act - especially on copyright guidance

The EU's Code of Practice for general-purpose AI (GPAI) models under the AI Act has now been published, with a particular focus on how providers of general-purpose AI models should handle copyright issues.

The guidance consists of three main chapters: transparency, copyright, and security. The copyright chapter is of particular interest to general AI model providers because the AI Act requires them to have policies in place to comply with copyright law.

The copyright chapter contains five main measures that providers must implement.

First, they must create, update and implement a copyright policy.

Secondly, they must only reproduce and extract lawfully accessible content when crawling the internet. This requires that they do not circumvent technological measures designed to restrict unauthorized use, and that they do not crawl websites recognized by courts or public authorities in the EU as persistent infringers of copyright. In practice, this means providers must avoid retrieving information from, and crawling, websites that are generally known to host copyright-infringing content.

The third measure concerns identifying and complying with rights reservations when the models' crawlers traverse the internet. Providers must commit to complying with relevant machine-readable protocols used to express rights reservations; specifically, web crawlers must read and follow rights reservations in line with the Robots Exclusion Protocol (robots.txt). This third measure is thus aimed primarily at implementing and complying with technical measures that ensure specific online rights reservations are respected when using the models. It does not affect EU copyright law as it applies to content scraped or otherwise extracted by third parties and used by providers for text and data mining when training AI models.
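For the technically curious: honoring robots.txt is a solved problem in most languages. Here is a minimal sketch in Python using only the standard library; the crawler name and target URL are made up for illustration, and a real GPAI-scale crawler would also need to handle machine-readable rights-reservation protocols beyond robots.txt.

```python
# Minimal robots.txt compliance check before fetching a page
# (Robots Exclusion Protocol). Crawler identity and URL are hypothetical.
from urllib import robotparser
from urllib.parse import urlsplit
import urllib.request

USER_AGENT = "ExampleTrainingBot/1.0"   # hypothetical crawler identity

def fetch_if_allowed(url: str) -> bytes | None:
    parts = urlsplit(url)
    # Parse the site's robots.txt; an absent file means "no reservations".
    rp = robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()
    # Respect the rights reservation: skip URLs disallowed for this agent.
    if not rp.can_fetch(USER_AGENT, url):
        return None
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

page = fetch_if_allowed("https://example.com/articles/1")
print("fetched" if page is not None else "disallowed by robots.txt")
```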

Providers are also required to disclose information about which web crawlers and other related technologies are used for data collection, as well as other measures taken to ensure copyright compliance.

The fourth measure focuses on reducing the risk of copyright-infringing output. Providers must implement technical measures to prevent their models from generating infringing content, and must prohibit copyright-infringing use of the model in their terms of use. For models released under open-source licenses, providers must flag the prohibition on copyright-infringing use in the documentation accompanying the model.

The fifth and final measure requires providers to designate an electronic point of contact and facilitate complaints from copyright holders regarding non-compliance with copyright law and the guidelines included in the GPAI.

There is a lot to address when it comes to providers' practical implementation of copyright requirements in AI models!

More about the individual chapters of the GPAI can be found here. In addition to specific policies and guidance related to transparency, copyright and security requirements, you'll find practical guidance on what constitutes a "general purpose" AI model, which companies can be considered providers of AI models, and how companies can fulfill their GPAI obligations.

Carrefour fined a staggering €3.2 million for serious privacy and data security breaches

The Spanish Data Protection Agency (AEPD) has imposed a significant fine of €3.2 million on Carrefour S.A. following a series of serious breaches that affected almost 119,000 customer accounts. This is a high fine.

Between January and September 2023, Carrefour reported five separate data breaches to the AEPD. All of the breaches related to illegal access to customer accounts through so-called "credential stuffing" - a method where criminals use stolen usernames and passwords to gain access to accounts. The company was never able to identify the original source of the stolen credentials.
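As a hedged illustration of how such attacks are typically spotted: one classic signal is a single source IP attempting logins against many distinct accounts in a short time. The sketch below is a simplification (the threshold is invented and no time-window expiry is implemented), not a description of Carrefour's actual systems.

```python
# Simplified credential-stuffing detector: flag an IP once it has tried
# logging in to more than DISTINCT_ACCOUNT_LIMIT different accounts.
# Real systems also expire counts over time and combine many signals.
from collections import defaultdict

DISTINCT_ACCOUNT_LIMIT = 20                 # invented threshold for illustration

accounts_tried_by_ip: dict[str, set[str]] = defaultdict(set)

def looks_like_stuffing(ip: str, username: str) -> bool:
    """Record a login attempt and return True if this IP looks suspicious."""
    accounts_tried_by_ip[ip].add(username)
    return len(accounts_tried_by_ip[ip]) > DISTINCT_ACCOUNT_LIMIT

# Simulated burst: one IP cycling through leaked username/password pairs.
for i in range(25):
    flagged = looks_like_stuffing("203.0.113.7", f"user{i}@example.com")
print(flagged)   # True - the 25th distinct account tripped the threshold
```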

A particularly problematic aspect was that Carrefour was aware of the first data breach as early as October 2022, but waited until January 2023 to report it to the authorities. AEPD considered the delay to be a serious breach.

AEPD and Carrefour disagreed on the extent of the consequences. While AEPD concluded that nearly 119,000 accounts were affected, the company claimed that only 974 accounts were affected. The attackers gained access to personal data such as (from the English translation of the decision):

"Postal code, consumption frequency, date of birth, data of first-degree relatives, billing address, date of birth of children, name, geolocation information, home address, commercial communication interests and preferences, surname, marital status, ID number, nationality, foreign identification number (NIE), passport number, personal email address, telephone numbers, purchasing tendencies, gender"

The fact that such data could go astray created a high risk of identity theft and fraud. Carrefour's handling of communication with customers was also inadequate. After the third data breach, the company sent out emails that only informed about password changes, without mentioning that a data breach had occurred. Customers were told that the passwords were reset to "improve services", which gave a completely false impression of the situation.

The AEPD started its investigation in May 2023 and found several serious breaches of data protection regulations. Carrefour had not fulfilled its security obligations under Article 5 of the GDPR and had violated both Articles 24 and 32 by not having adequate security measures in place.

In particular, the AEPD criticized the company's passive approach to data security. Instead of introducing preventive measures, Carrefour waited until after several incidents had occurred before taking action. For example, it did not implement two-factor authentication until after the fifth data breach - a basic security measure that the AEPD felt should have been in place long before. It's pretty easy to agree with that.
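For context on just how basic that measure is: here is a minimal sketch of TOTP-based two-factor verification using the third-party pyotp library. This is a generic illustration, not Carrefour's implementation.

```python
# Minimal TOTP (time-based one-time password) second-factor check.
# Requires the third-party library pyotp: pip install pyotp
import pyotp

# In practice the secret is generated once per user during 2FA enrollment
# and stored server-side (and shown to the user as a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

def second_factor_ok(submitted_code: str) -> bool:
    # verify() checks the 6-digit code against the current 30-second window;
    # valid_window=1 also accepts the neighboring window to absorb clock drift.
    return totp.verify(submitted_code, valid_window=1)

print(second_factor_ok(totp.now()))   # True: the correct current code
print(second_factor_ok("000000"))     # almost certainly False
```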

Two factors made the situation particularly serious. First, the data breaches posed a significant security risk to data subjects, as the account information accessed could be used to create detailed profiles and increase the risk of identity theft. Secondly, Carrefour processes very large amounts of data on a daily basis and operates in a sector where trust and information security are crucial.

In addition, the AEPD pointed out that regular audits conducted at Carrefour had indeed identified security risks and recommended specific actions, yet the company had not taken action.

In addition to the security breaches, Carrefour also breached its duty to inform customers. The emails sent out did not state that a data breach had occurred, and did not mention its severity, the likely consequences for those affected, or the measures being taken.

Carrefour's arguments for reducing the fine were not accepted. The company argued that only 974 accounts were affected and that it had cooperated fully with the AEPD during the investigations. Both arguments were rejected: the AEPD's investigations showed a far higher number of affected accounts, and cooperating with the authority is a legal obligation in any event.

The total fine of EUR 3.2 million was distributed as follows: EUR 2 million for violation of Article 5, EUR 1 million for violation of Article 32, and EUR 200,000 for violation of Article 34. In addition, Carrefour was ordered to communicate the data breach to the affected customers in accordance with Article 34, with the threat of further fines for non-compliance.

The case shows that it is important to take security audits seriously and that incidents must be handled in line with the regulations.

The state of Nebraska sues General Motors for selling driving data to insurance companies

I have long wondered whether we will see more regulatory scrutiny of the automotive industry. Something seems to be happening now. Nebraska's Attorney General Mike Hilgers has sued General Motors and its subsidiary OnStar for violations of the state's consumer protection laws. The car manufacturer is accused of misleading customers about the collection and sale of driving data.

According to the lawsuit, General Motors has collected data through telematics systems installed in the vehicles, including speed, seat belt use, driving habits and location. The state claims that the company sold this data to third parties who used it to create "driving scores" that were sold to insurance companies. The insurance companies then used the information to increase premiums, deny coverage or cancel policies.
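To make the mechanism concrete, here is a purely hypothetical sketch of how a telematics "driving score" might be computed from the kinds of events named in the lawsuit. Neither GM's data pipeline nor the brokers' actual scoring models are public, so every variable and weight below is invented.

```python
# Purely hypothetical "driving score" from telematics events (hard braking,
# speeding, seat-belt use). All weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class Trip:
    hard_brakes: int            # count of hard-braking events
    speeding_seconds: int       # time spent above the posted limit
    seatbelt_unbuckled: bool    # driven unbuckled at any point
    miles: float

def driving_score(trips: list[Trip]) -> float:
    score = 100.0
    for t in trips:
        score -= 20.0 * t.hard_brakes / max(t.miles, 1.0)  # braking per mile
        score -= 0.05 * t.speeding_seconds                 # time speeding
        if t.seatbelt_unbuckled:
            score -= 5.0
    return max(0.0, min(100.0, score))                     # clamp to 0..100

trip = Trip(hard_brakes=3, speeding_seconds=40, seatbelt_unbuckled=False, miles=12.5)
print(driving_score([trip]))   # 93.2 with these invented weights
```

An insurer buying such a score could reprice a policy without the driver ever knowing which trips moved the number - which is part of the transparency problem the lawsuit targets.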

The Attorney General claims that General Motors obtained the data through misleading and illegal methods. The company allegedly overwhelmed car buyers with pages of misleading information, including privacy policies that he considers "false or misleading" because they never disclosed that General Motors would collect, analyze and sell consumers' driving data.

He has said: "This is wrong. Our office will hold companies that mislead Nebraskans accountable, no matter how big they are." The Attorney General is seeking both damages and injunctions. There are several similar ongoing cases from the Federal Trade Commission and from authorities in Texas and Arkansas. It's not just in Europe that we have strict privacy rules - many US states now have similar provisions. It will be very interesting to see how this develops. Given how connected vehicles have become, this is an important area for privacy.

Italian energy company fined 3 million euros for lack of control of data processor

The Italian data protection authority, Garante, has imposed a fine of 3 million euros on energy supplier Acea Energia S.p.a. for serious GDPR violations related to insufficient monitoring of data processors and sub-processors.

The case started when Acea Energia engaged several data processors for marketing purposes. The company entered into a data processing agreement with the sole proprietorship Stefanelli Federica, which in turn used the sub-processor MG Company.

MG Company used a series of illegal methods over a period of one and a half years. They contacted potential customers without consent and gave false information about non-existent billing problems to trick people into entering into contracts. In this way, they allegedly managed to enter into approximately 30,000 agreements.

The case received media attention in 2024 when a popular TV program exposed MG Company's method. The program creators reported their findings to the Italian data protection authority, which started an investigation.

Acea Energia tried to disclaim responsibility, claiming that it did not know MG Company was involved in its marketing activities and accusing Stefanelli Federica of having collaborated with MG Company without its knowledge. Garante rejected this argument, pointing out that Stefanelli Federica was a sole proprietorship with no registered employees and could not possibly have entered into tens of thousands of agreements on its own. The authority therefore concluded that Acea Energia knew or should have known that the marketing involved subcontractors.

Acea Energia was held liable for several GDPR violations, including those committed by MG Company. The violations included the lack of a legal basis for the processing and the failure to provide information. The company was also criticized for failing to verify that its data processors handled personal data correctly.

The case illustrates important lessons for all data controllers: It is not sufficient to trust that data processors follow the regulations. Data controllers have an active duty to monitor and control the entire data processing chain, including sub-processors. You can be held liable for violations committed by sub-processors with whom you don't even have a direct agreement.

Microsoft does not guarantee that French data will be protected from US authorities

Microsoft representatives were pushed hard during a hearing in the French parliament in June this year regarding the handling of sensitive information from French public authorities. In the end, the company's representatives stated that they could not provide absolute guarantees that personal data could never be accessed by the US authorities.

The hearing was attended by Anton Carniaux, Director of Public and Legal Affairs, and Pierre Lagarde, Chief Technical Officer for the French public sector. Committee Chair Dany Wattebled and other members expressed concern that French data, handled through public contracts with UGAP (France's government procurement agency), could potentially be disclosed to US authorities under the US Cloud Act.

When the committee asked how Microsoft can ensure that data is never transferred to the US, Carniaux replied that the company has established strict legal processes to resist unfounded requests. He emphasized that such requests are carefully analyzed and that Microsoft in most cases tries to have them rejected or redirected to the customer. Despite this, he could not give an absolute guarantee that disclosure would never happen pursuant to legally binding decisions from US courts. I think most readers here will recognize these arguments.

Regarding how Microsoft defines a "reasoned" request, Carniaux referred to case law that has evolved since the Obama administration. According to him, requests must now be narrowly defined, legally justified and clearly articulated. When asked directly whether he could assure that such information would never be transferred to the US authorities without France's explicit consent, Carniaux answered clearly: "No, I cannot guarantee that." That is unusually plain speaking.

He clarified that while Microsoft may in some cases contest requests from the US government that are deemed to be insufficiently justified, the US Cloud Act nevertheless gives federal authorities the right to demand access to data held by US companies - regardless of where that data is physically stored.

The parliament's investigation has focused in particular on Project Bleu, a cloud infrastructure collaboration between Microsoft, Orange and Capgemini. Particular attention has been paid to the French Health Data Hub, a medical research platform built on Microsoft's Azure cloud service. Several legislators have questioned whether there are sufficient organizational and technical distinctions between the Health Data Hub and Project Bleu, raising concerns that sensitive health data could potentially be exposed.

The case illustrates and underscores (once again) the complex legal challenges that arise when using US cloud services. I don't have a solution to this, and I don't think you should stop using US services. Without wanting to start a new debate on this, a somewhat more risk-based approach to this type of transfer would be beneficial.

More Microsoft – European Commission addresses Microsoft 365 issues after criticism

As you know, the EDPS criticized the European Commission in March 2024 for how it used Microsoft 365. The Commission has now corrected the problems and had the use approved as of July 11 this year.

Initially, the EDPS found several breaches of data protection rules in the European Commission's use of MS365. The main issues were that personal data was not adequately protected, and that it was unclear how the data was used and where it was sent.

The Commission has now specified exactly what personal data is collected and what it will be used for. Microsoft and its subcontractors may only use the data for the purposes they have been instructed to carry out, and only for public purposes.

Previously, it was also unclear where personal data could be processed. The Commission has now laid down clear rules on where data may be sent. As a general rule, the data must be processed within the EU/EEA. If it has to be sent to other countries, this may only be to countries with an approved (adequate) level of data protection, or in special cases of public interest.

Another problem was that US authorities could potentially demand access to European data through Microsoft. The Commission has now secured contractual terms that only European law can require Microsoft to hand over data processed in Europe. I find this particularly interesting in light of what was discussed in the section above on the Cloud Act. The EDPS writes the following to the Commission:

The Commission has implemented additional measures in the contract with Microsoft in relation to disclosures of personal data processed within and outside of the EEA and to notifications of requests for such disclosures. This complements the existing technical and organizational measures implemented by the Commission and Microsoft for personal data processed within and outside of the EEA. Personal data resulting from the Commission's use of Microsoft 365 will be transferred outside of the EEA either on the basis of an adequacy decision in line with Article 47, or exceptionally on the basis of a derogation under Article 50(1)(d) of Regulation (EU) 2018/1725.

I therefore conclude that the Commission has ensured that only EU or Member State law may prohibit notification to the Commission of a request for disclosure of personal data in the Commission's use of Microsoft 365 processed within the EEA, and essentially equivalent third country law for disclosure of personal data in the Commission's use of Microsoft 365 processed outside the EEA. I also conclude that the Commission has ensured that no disclosures by Microsoft or its sub-processors of personal data in the Commission's use of Microsoft 365 processed within the EEA take place, unless the disclosure is required by EU or Member State law to which Microsoft or its affiliates or sub-processors are subject, or, for data processed outside the EEA, essentially equivalent third country law.

The European Commission has reportedly made the new agreement terms available to other EU institutions that also use Microsoft 365. I have not been able to find them online and I am very curious about what wording they have negotiated. I'm sure we'll come back to this topic later.

The EDPS emphasizes that the approval only applies to the specific issues that were investigated - not all use of Microsoft 365 in general. You can find their coverage of the case here.

EU Commission believes Temu violates Digital Services Act

The European Commission has preliminarily concluded that the Chinese marketplace Temu violates the Digital Services Act (DSA) by failing to properly assess the risk of illegal products on its platform.

The Commission initiated the case against Temu on October 31, 2024. The investigation is being conducted in collaboration with national digital services coordinators, customs authorities and market surveillance authorities. They believe they have found an increase in unsafe, counterfeit and illegal products that may harm consumers' health and safety and the environment. Fair competition in the digital single market is also at issue.

The Commission carried out a mystery shopping exercise, which revealed that customers shopping on Temu are highly likely to encounter products that do not comply with regulations, particularly baby toys and small electronics. The Commission believes that Temu's October 2024 risk assessment relied on general industry information rather than specifics about its own marketplace, and that Temu has therefore failed to prevent the spread of illegal products.

The Commission is continuing to investigate the effectiveness of Temu's preventive measures, the company's use of manipulative design, and how transparent Temu's recommendations are about what they are based on.

The case is by no means closed. If Temu's arguments do not succeed, it risks fines under the DSA of up to 6% of its total annual worldwide turnover, orders to implement measures correcting the violations, and enhanced supervision going forward.

CJEU establishes that handwritten signatures are personal data

The European Court of Justice has decided a case about what counts as personal data under the data protection regulation. In judgment C-200/23 (Agentsia po vpisvaniyata), the court concluded that a person's handwritten signature is to be considered personal data under the GDPR. From time to time it's good to read decisions that address completely fundamental questions.

The case began when the Bulgarian authorities refused to delete personal data about a data subject, including the person's handwritten signature in a company agreement published in the national commercial registers. The specific question in the case was whether a handwritten signature falls within the GDPR's definition of personal data.

The European Court of Justice emphasized that the expression "any information" in the definition reflects the legislator's intention to give the concept of personal data a broad meaning. This potentially includes all types of information, both objective and subjective, in the form of opinions or assessments, provided that the information "relates to" the affected person. Information relates to an identified or identifiable natural person when it is linked to an identifiable person due to its content, purpose or effect.

Regarding the assessment of whether a person is identifiable, the court pointed out that all means that can reasonably be expected to be used by the data controller or others to identify the person directly or indirectly should be taken into account.

The court also emphasized that the broad definition of personal data includes not only information that is collected and stored, but also any information that is generated in connection with the processing of personal data and that relates to an identified or identifiable person. In this context, the court noted that handwritten text, according to previous case law, reveals information about the person who wrote the text.

Finally, the European Court of Justice found that a handwritten signature is generally used to identify persons and to give documents bearing the signature evidential value regarding their accuracy and validity, or to assume responsibility for them.

The European Court of Justice therefore concluded that a person's handwritten signature is covered by the concept of personal data. You can find the decision here.

Sex toy manufacturer leaks email addresses and lets hackers take over accounts

Security researcher "BobDaHacker" has revealed two major security flaws at the sex toy company Lovense that made it possible to obtain users' email addresses and take over their accounts.

Lovense makes internet-connected sex toys and has over 20 million users worldwide. They became known last year for being among the first to put ChatGPT into their products. Don't ask me exactly how. Anyway - as the case shows, connecting sex toys to the internet can create problems for users' privacy.

"BobDaHacker" found that by using simple IT tools, one could see other users' email addresses when they use the Lovense app. He could link usernames and email addresses in under a second. This is problematic for webcam models who share their usernames publicly, without users wanting their email addresses to become known.

Furthermore, he found that anyone could take over a Lovense account using an email address. This made it possible to remotely control the account as if one were the real user - which can be quite uncomfortable in light of the nature of the technology.

"BobDaHacker" reported the problems to Lovense in March and received $3,000 in reward. But after months of discussions about whether the bugs were fixed, the researcher went public when Lovense said they needed 14 months to solve the problems. Usually, security researchers give companies a maximum of three months before they publish such findings.

Lovense justified the long time by saying they didn't want to create problems for users of older products who had to upgrade their apps quickly. After the case became public, the company said that the account takeover problem is now solved, and that the email leak will be fixed "within the next week."

For users, it's a reminder that everything connected to the internet can have security risks - even the most private things.

Do you have any questions?