Newsletter by Eva Jarbekk, Partner, Oslo
Welcome to this year's second newsletter. We are only three or four weeks into the year, but there is already plenty to write about. I have focused on the decisions and cases I consider most important and have managed to keep the newsletter to fewer than 10 pages. Below you will find a discussion of (yet another) very important Meta case concerning what they – and others – must disclose in response to access requests. There are also several cases on how data processors can be held liable for security breaches. But it is not only Europe that enforces privacy rules: even Disney has been hit hard in the US for collecting information about children without asking their parents. And while we are on the subject of the US, negotiations are currently underway on an agreement that would give US authorities access to European biometric databases. What this means in practice is not entirely clear yet.
Happy reading!
This is perhaps one of the most important decisions in recent times. After a legal process that began in 2011, including two rounds before the CJEU, Austria's Supreme Court has finally handed Max Schrems and noyb an important and definitive victory.
Schrems demanded (among other things) access to all personal data Meta had about him, while Meta believed that he should only be given partial access. Meta refused to disclose information used for algorithmic processing and profiling. The company claimed that this would require a disproportionate amount of work and that some of the information was trade secrets.
The ruling states that Meta must give users access to absolutely all personal data the company has about them. This includes data used to train algorithms and build profiles, the sources of this data, the recipients who have had access to it (remember that the term "recipient" is interpreted broadly), and the purpose of each individual processing operation.
The court ruled that the right of access is fundamental and covers all personal data, including that used for automated decisions and profiling. Meta cannot prevent disclosure on the grounds of complexity or trade secrets. If the company processes the data, the user has the right to know about it. However, I believe that a slight reservation must be made here, as Meta may not have argued sufficiently that trade secrets were involved – perhaps the argument could be made differently.
Furthermore, the ruling states that sensitive personal data must be protected separately. The interesting question is what this means for the distinction between sensitive and non-sensitive data. Meta processes enormous amounts of data, and much of it can indirectly reveal sensitive information. The same applies to many other players. When everything is intertwined in algorithms and profiles, how can one distinguish between what is what? And how should this be presented in a way that actually makes sense to the user? How far-reaching the consequences are depends, however, on whether an organization chooses to protect "everything" as if it were sensitive personal data – or whether it distinguishes between ordinary and sensitive data. Often, the practical advice is to treat everything as if it were sensitive.
For companies that engage in profiling and automated decision-making, the message is clear: the right of access is not optional and it covers everything. It is not enough to provide access to the simpler data. Users have a right to know what happens to their information. I believe this case is a warning to many, as many will probably find it challenging to meet such access requirements.
Both the Norwegian and Swedish data protection authorities have dealt with cases concerning the division of responsibility between data processors and data controllers.
The Norwegian Data Protection Authority recently imposed a fine of NOK 250,000 on Timegrip, a company that was originally a data processor and supplier of a time recording system for employees of the retail chain Enklere Liv Retail AS. When the chain went bankrupt, an employee wanted access to his worked hours in order to file a claim against the bankruptcy estate. The bankruptcy estate also wanted the lists. Timegrip refused to grant access.
Interestingly, Timegrip believed that, as a data processor, it had no obligation to provide the complainant with the data. They believed that a data processor can only process personal data based on instructions from the data controller – and the data controller was bankrupt and no longer existed. The Data Protection Authority, on the other hand, believed that there should always be someone responsible and that there should not be a situation where there is only a data processor and no data controller. When Timegrip continued to store the information while deciding who should have access to it, they themselves became the controller. The next step in the Data Protection Authority's argument is that Timegrip then unlawfully refused to disclose the information – and was therefore fined for the violation.
In Sweden, IMY (the Swedish Data Protection Authority) has fined the data processor SportAdmin SEK 6 million for lack of data security. The company supplies systems for sports clubs and had implemented weak security measures, with vulnerabilities in the system that allowed unauthorized persons to access personal data.
The background to the case was a hacker attack in January 2025, in which the attacker stole information about more than 2.1 million people and published it on the "Darknet". This included information about children, their names and contact details, social security numbers, and the sports and clubs they were active in. It also included sensitive health information and specially protected personal data. Obviously not good.
IMY said that although hacking can never be completely ruled out, you must have a level of security appropriate to the personal data you handle, and in this case SportAdmin had been passive in the face of known risks. I think it is relevant to look at the specific shortcomings that motivate such a fine. IMY writes the following about what happened:
In connection with a change to the login procedure for association websites on June 28, 2022, a code change was implemented whereby a special variable was introduced on one of SportAdmin's web pages. When the variable was added, SportAdmin overlooked applying its existing security method for protection against SQL injection. Since the change did not introduce a new variable into the systems, but rather involved the reuse of a variable that SportAdmin considered secure and which does not normally require additional security methods, the flaw was not detected. The unprotected variable was then used directly in communication with the database, which increased the risk of intrusion by means of SQL injection. It was against this variable that the SQL injection in question was performed, and this probably caused the incident. [..]
And further:
"The code change implemented in June 2022 was classified as a high-risk change and was therefore reviewed by additional personnel. Despite this, the lack of protection against SQL injection was not detected, due to deficiencies in the company's review procedures for more complex code changes. The company has identified the following deficiencies:
The risk classification was too one-sided, because it was mainly based on the impact on the login procedure and the risk of unauthorized access to user data. This meant that other types of attack, such as SQL injection, were not sufficiently taken into account.
The combination of a technically complex environment, where older (legacy) code was mixed with newer implementations, contributed to the vulnerability in question not being identified during the review.
There was a lack of additional review procedures for code changes that posed a high risk in combination with a particular dependence on older code. These could, for example, have included mandatory review by several people. Automated security reviews of the code could also have identified vulnerabilities.
Code review was subjective and unclear.
Since code review was not a mandatory step for code changes at the time, and the risk assessment criteria were subjective, the decision to perform a review was based on an individual assessment in each case. SportAdmin's monitoring system did not raise an alert for suspicious activity in connection with the intrusion on January 14, 2025, or thereafter."
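For readers who do not write code, the flaw IMY describes is easy to illustrate. Below is a minimal sketch in Python – with entirely hypothetical table and column names, not anything from SportAdmin's actual system – showing the difference between a variable that is concatenated directly into a database query and one that is passed as a bound parameter:

```python
import sqlite3

def members_unsafe(conn: sqlite3.Connection, club_id: str):
    # VULNERABLE: the variable is pasted straight into the SQL text, so input
    # such as "1 UNION SELECT ssn, email FROM members" changes the query itself.
    query = f"SELECT name, email FROM members WHERE club_id = {club_id}"
    return conn.execute(query).fetchall()

def members_safe(conn: sqlite3.Connection, club_id: str):
    # SAFE: a parameterized query sends the value separately from the SQL text,
    # so the database never interprets the input as code.
    query = "SELECT name, email FROM members WHERE club_id = ?"
    return conn.execute(query, (club_id,)).fetchall()
```

The point IMY makes is that SportAdmin already had a protection method of this kind in place – it was simply not applied to the reused variable, and the review process did not catch the omission.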
I will not comment much on how to quality-assure code, but changing login routines appears to be a relatively common type of change in a system, and changes can introduce security risks. How information security is safeguarded is a management responsibility – even if management does not handle it personally, systems must be in place to ensure that someone does.
It is also important to note that IMY did not go after the sports clubs, which were the data controllers, but directly after the data processor. IMY correctly held that data processors have an independent obligation to implement adequate security measures under GDPR Article 32. This responsibility follows from the regulation itself, not just from the data processing agreement.
Consequences of these cases? The Norwegian Data Protection Authority says that data processing agreements can usefully regulate what should happen in a bankruptcy situation. This may be ambitious, but for data processors with many similar agreements, it may be a good idea to incorporate this.
Together, these cases send a clear message: the line between data processor and data controller is not as clear-cut as many have believed. Responsibility can shift, and data processors in particular must be prepared for this.
The French data protection authority CNIL continues its active enforcement practice. The telecommunications company Free has been fined €15 million after a data breach allowed a third party to access the personal data of over 24 million customers. CNIL found that Free had not implemented an adequate authorization process for its employees' VPN connections. The consequences of this weakness were so serious that CNIL considered it a violation of GDPR Article 32.
For Free Mobile, the case was different: CNIL imposed a fine of €27 million for a breach of GDPR Article 5(1)(e). This time, the company had failed to sort and delete personal data appropriately. The case concerned data on former subscribers, where Free Mobile retained more than could be justified by the need to keep accounting records.
I have a hunch that this may apply to more companies than Free Mobile. The fine imposed on Free Mobile may motivate others to focus more on data deletion. The company was also ordered to sort and delete data relating to former users within six months.
It is easy to believe that privacy is something that only applies in Europe, but Disney has just been reminded that the US authorities also take this seriously. The company has reached a settlement with the Federal Trade Commission (FTC) and is paying $10 million for violating COPPA – the US law that protects children's privacy online.
Disney collected personal information from children under the age of 13 without first obtaining parental consent. This applied, among other things, to Disney's various apps and websites, where children could register and use the services without their parents being involved.
COPPA is quite clear on this point: if you collect information from children under the age of 13, you must first obtain parental consent. Disney did not have this in place, and the FTC responded. Children need extra protection, and parents must be involved when information about them is collected.
The case shows that even large, resourceful companies can get the basics wrong. The case is a good reminder that some principles are universal. Regardless of whether you operate under GDPR, COPPA, or other regulations, it is about protecting vulnerable groups and ensuring that processing is lawful and fair.
You can read more about the case here.
The European Court of Justice has ruled on how long public transport companies can store information obtained via body-worn cameras. The case concerns the Swedish public transport company Storstockholms Lokaltrafik (SL).
SL's ticket inspectors had cameras on their uniforms that recorded both video and audio of passengers. The cameras were set to automatically delete all recordings that did not relate to the issuing of fines or other incidents. SL argued that the cameras were intended to protect the inspectors and to ensure that passengers who were to be fined could be identified afterwards.
One issue in the case was whether the information should be considered to have been collected from the data subjects themselves or from third parties, i.e. whether Article 13 or Article 14 applies. The court ruled that the information should be considered to have been collected from the data subjects themselves, even though they were not aware of this. It emphasized that Article 13 of the GDPR specifically expresses the data subject's right to be informed at the time of collection, while Article 14 of the GDPR is designed for situations where data is collected indirectly from third parties, which makes immediate information impractical. Applying Article 14 of the GDPR to body camera recordings would undermine the principle of transparency in Articles 5 and 12. Applying Article 14 of the GDPR in such circumstances would also enable covert surveillance, as data subjects would not necessarily be informed at the time their data is collected. Good reasons from the court here, then.
The distinction is therefore decisive for the information obligation, which was not complied with in this case. The problem was not the cameras themselves, but the fact that passengers did not know they were being filmed. SL could probably have avoided the fine of SEK 4 million through simple measures, such as putting up warning signs in the vehicles.
You can read more about the case here.
The Lithuanian Data Protection Authority has issued a reminder that data controllers must be transparent about which data processors they use.
The case concerns a company that did not provide users with sufficient information about who actually processed their personal data. In this case, the company had a generic privacy statement that said they "could use data processors," but did not provide any specific information about who these were.
The Lithuanian Data Protection Authority considered this to be insufficient. Users have the right to know who actually has access to their data, not just that "someone" may have it.
GDPR Articles 13 and 14 require controllers to inform data subjects about who processes their data. This also includes processors, i.e. companies that process data on behalf of the controller.
This case is a good reminder that transparency is not optional. It is not enough to have a long and generic privacy policy that covers all eventualities. The information must be specific and understandable.
For businesses, this means that you must keep track of which data processors you use and inform your users about this. And yes, this may mean that your privacy policy needs to be updated when you change suppliers.
And for those who think that "no one reads the privacy policy anyway," that's not the point. The point is that the information should be available to those who want to know. Transparency is about giving users the opportunity to make informed choices.
You can read more about the case here.
In 2022, CNIL was notified of a data breach at the company DEEZER. Information about their users had been published on the darknet, and this involved their former data processor MOBIUS SOLUTIONS LTD (Mobius). Mobius had assisted with personalized campaigns.
CNIL found that Mobius had not deleted the data after the contract had ended. Mobius had retained a copy of the information of over 46 million DEEZER users after the contractual relationship had ended. This was despite their obligation to delete it.
Mobius stated that the data had been copied by three of its employees without the company being informed. However, the CNIL considered that the company was responsible for the actions of its employees, as the data was stored together with data from other customers.
Mobius also did not have a proper record of the processing activities it had carried out, which constituted a breach of GDPR Article 30.
Mobius was fined €1 million for multiple violations.
The decision shows how important it is to have procedures in place to ensure that all data is deleted when contracts are terminated. Deleting what you say you will delete may sound simple, but in practice it can be complicated.
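As an illustration only – with entirely hypothetical table names, and deliberately ignoring the hard parts such as backups, exports and copies on employees' machines – an end-of-contract deletion routine could look something like this in Python:

```python
import sqlite3
from datetime import datetime, timezone

def purge_customer_data(conn: sqlite3.Connection, customer_id: int) -> int:
    """Delete all personal data held for a terminated customer and log the purge."""
    now = datetime.now(timezone.utc).isoformat()
    # Delete the personal data itself from the primary store...
    deleted = conn.execute(
        "DELETE FROM campaign_recipients WHERE customer_id = ?",
        (customer_id,),
    ).rowcount
    # ...and record what was deleted and when, so compliance can be evidenced.
    conn.execute(
        "INSERT INTO deletion_log (customer_id, rows_deleted, deleted_at) VALUES (?, ?, ?)",
        (customer_id, deleted, now),
    )
    conn.commit()
    return deleted
```

What went wrong at Mobius was precisely what a sketch like this glosses over: copies of the data living outside the primary database – in this case, according to Mobius, made by individual employees.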
You can read more about the case here.
The Danish Data Protection Agency has given DSB Service and Retail permission to continue camera surveillance of employees, despite protests from the employees. The case shows that employers can, in certain cases, monitor employees even when the employees object.
DSB had installed cameras in the workplace to prevent and investigate theft by both customers and employees, and to ensure safety. The employees believed this was a disproportionate invasion of their privacy and complained to the Data Protection Agency.
The Agency assessed the case and concluded that the surveillance was lawful. The reasoning was that DSB had a legitimate interest in protecting its property and ensuring a safe working environment, and that this interest outweighed the employees' privacy interests.
The Agency emphasized that the cameras did not monitor areas where employees had a particular expectation of privacy, such as changing rooms or toilets. The surveillance was also limited to what was necessary to achieve the purpose.
The case illustrates that surveillance of employees is not automatically illegal. The decisive factors are whether there is a legitimate interest, whether the surveillance is proportionate, and whether the employees' rights are adequately safeguarded. Employers may introduce surveillance if there are good reasons for doing so and these can be documented.
You can read more about the case here.
Another case from Denmark concerns an app that could track the geographical position of public transport passengers – and the question of whether this is legal. When the Danish Data Protection Agency consulted data protection authorities in other countries about how they assess such cases, it received quite different answers. The case is so fundamental that the Agency will therefore ask the EDPB for an opinion on how the law should be interpreted.
The background is the Agency's investigation of the Rejsekort app for public transport. The Rejsekort app allows users to pay for public transport by checking in and out with their mobile phone. For this to work, the app needs to know where the user is located. The problem was that tracking could continue even after the journey had ended if passengers had forgotten to activate the app's automatic functions.
The question is fundamental: How much geographic tracking is necessary to provide a public transport service? What is in line with the principles of data minimization and privacy by design?
It is not particularly surprising that these are difficult questions. On the one hand, geographic tracking is necessary for the app to function. On the other hand, detailed movement patterns are recorded, which can be very intrusive.
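What data minimization and privacy by design could mean in practice is easiest to see in a sketch. The following is deliberately simplified – hypothetical names, not the Rejsekort app's actual logic – but it shows the core idea: collect position data only while a journey is active, and stop the moment the user checks out:

```python
from datetime import datetime, timezone

class JourneyTracker:
    """Collects position data only while a journey is active."""

    def __init__(self) -> None:
        self.active = False
        self.points: list[tuple[float, float, str]] = []

    def check_in(self) -> None:
        self.active = True

    def on_location_update(self, lat: float, lon: float) -> None:
        # Data minimization: discard positions when no journey is in progress.
        if not self.active:
            return
        self.points.append((lat, lon, datetime.now(timezone.utc).isoformat()))

    def check_out(self) -> list[tuple[float, float, str]]:
        # Stop collecting and hand over only what fare calculation needs.
        self.active = False
        journey, self.points = self.points, []
        return journey
```

The difficult question in the Rejsekort case is, in effect, what should happen when check_out is never called – and that is where the data protection authorities disagree.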
It will be interesting to see what the EDPB will say. A clarification here will be important for everyone who offers services that require geolocation tracking, not only public transport, but also taxi apps, delivery services, and much more.
For the time being, the advice to companies engaged in geographic tracking is that they must be prepared for the rules to become stricter.
You can read more about the case here.
The US has introduced a new requirement for countries participating in the Visa Waiver Program: they must enter into a bilateral "Enhanced Border Security Partnership" (EBSP) with the US Department of Homeland Security (DHS).
This will give the DHS access to national biometric registries for immigration control and security assessments. The US expects agreements to be in place by the end of 2026.
The EU is currently negotiating a framework agreement that will provide overall terms for member states' bilateral information exchange with the US. The exchange will apply to information, including biometric data, stored in national databases in member states.
It will be up to each member state to negotiate with the US to determine which national databases and data the US will have access to. So this is not a massive transfer of entire databases, but a system where information exchange will be used for "screening and verifying the identity of travelers to determine whether their entry or stay would pose a risk to public safety or security."
The framework agreement will contain clear limitations on the purposes for which exchanged data may be used, specific triggers for information exchange, and security measures to prevent mass transfers of data. It is also intended to be in line with the EU Charter of Fundamental Rights, the GDPR, the Law Enforcement Directive, and the AI Act.
Since Norway is not a member of the EU, this framework agreement does not apply directly to us. Norway already has a visa waiver agreement with the US (although we must have an entry permit), so it is unclear whether we will have to enter into a separate EBSP agreement or whether the US considers our existing arrangement to be sufficient.
This is an issue that deserves attention, and there is some skepticism about allowing the US access to the information.
You can read more about the matter here.