by Eva Jarbekk and Sofie Axelsson
This month's Privacy Corner brings together a set of cases and developments that, taken together, paint a consistent picture: privacy obligations are being enforced more seriously, more broadly, and with less tolerance for the argument that compliance can come after the contract is signed. And as the Omnibus proposals are still in progress and changing, I have focused on other developments this month.
The case that stands out most is the Danish Datatilsynet's final ruling against 51 municipalities over their use of Google Workspace in primary schools. All 51 received formal serious criticism, alongside two warnings. The core issue was the absence of transfer impact assessments for data flowing through Google's sub-processor chain to countries without EU adequacy decisions — a reminder that a contract, however well drafted, cannot override what a foreign government may demand.
Across the Atlantic, California is moving to introduce a whistleblower scheme under the CCPA framework, allowing employees and contractors to report privacy violations directly to the California Privacy Protection Agency — and receive a reward of between 15% and 33% of any fines collected. For any organisation with a California footprint, this changes the compliance calculus considerably: the risk is no longer just a distant regulator, but potentially someone already inside the business.
Elsewhere, the newsletter covers Amazon's unlawful recording of workers' most sensitive personal data in Italy, the EU's adequacy decision for Brazil, WhatsApp's landmark court victory on EDPB oversight, and a series of enforcement actions on erasure rights, loyalty programme data misuse, and breach notification failures. There is much to consider this month, again. Happy reading.
What happens when the people most likely to know about a company's privacy violations are the ones sitting right inside it?
California is about to find out. A new bill proposed under the CCPA framework would introduce a formal whistleblower scheme, allowing employees, contractors and others to submit complaints directly to the California Privacy Protection Agency — and get paid for it. We are talking about a reward of between 15% and 33% of any fines collected as a result of a successful enforcement action. That is a serious financial incentive given that CCPA fines can run into the millions.
For companies with operations in California, which covers a remarkably large slice of global business, this shifts the risk profile of non-compliance considerably. It is one thing to weigh the abstract possibility of a regulator investigation. It is quite another when your own staff may have a direct financial stake in reporting gaps in your privacy programme. I think many compliance teams will need to take a hard look at whether their internal processes are truly robust, or whether they have simply been relying on the fact that enforcement has historically been slow and under-resourced.
The bill also includes anti-retaliation protections for employees who come forward. Taken together, the incentive structure here is not unlike what the US Securities and Exchange Commission built into financial regulation, and that program has been remarkably effective at surfacing misconduct.
Practical recommendations
If you or your clients operate with a California footprint, now is a good moment to revisit CCPA compliance in earnest. Whistleblower schemes have a way of concentrating minds. The question is no longer just whether the regulator will come knocking; it is whether someone already inside the building might help them do so.
You can read more about the matter here.
If you run HR systems, attendance platforms or any form of workforce monitoring, this case should land on your desk immediately.
The Italian data protection authority, the Garante, has issued an emergency prohibition against Amazon Italia Logistica, ordering the company to stop processing the personal data of more than 1,800 workers at its Passo Corese logistics facility. The order took effect immediately, following joint inspections carried out in February 2026 together with the National Labour Inspectorate and the Guardia di Finanza's specialist privacy unit.
What was actually going on?
Through a platform connected to the attendance system, accessible to numerous managers, Amazon had been systematically recording detailed notes on workers following interviews conducted when they returned from absence. The data collected went far beyond what any employment relationship could justify: specific medical diagnoses such as Crohn's disease and herniated discs, participation in strikes and trade union activities, and deeply personal family circumstances, including terminally ill relatives. This information was retained for up to ten years after the employment relationship ended. Four cameras positioned near toilets and rest areas added another layer of unlawful surveillance.
What does this mean for your business?
The legal point is straightforward: under GDPR, employers cannot process data that is irrelevant to assessing an employee's professional aptitude. Health conditions, union involvement and family situations are special categories of personal data, requiring explicit and specific legal grounds that standard employment necessity simply cannot provide. I wonder how many companies running similar attendance or return-to-work interview processes have genuinely audited what their managers are actually recording on those platforms. The gap between policy and practice in day-to-day HR operations may be wider than compliance teams realise.
The Garante has also extended the prohibition to any other Amazon logistics sites in Italy using the same platform in a comparable way. That systemic framing matters. It signals that authorities are increasingly looking beyond the individual incident and asking whether a flawed practice is embedded across an organisation.
For businesses with large shift-based or logistics workforces, where attendance management is intensive and manager discretion is broad, this is a timely prompt to review not just your policies, but what is actually being captured in your systems.
You can read more about the matter here.
For most businesses operating in Europe, transferring personal data outside the EU is a familiar nuisance of contractual clauses, transfer impact assessments and adequacy decisions. It is tempting to treat it all as background noise, but it is not. That is why the European Commission's latest adequacy decision deserves attention: Brazil has just joined the shortlist of countries to which personal data can flow freely from Europe.
The European Commission has adopted a formal adequacy decision under Article 45 GDPR recognising Brazil’s data protection framework as providing an essentially equivalent level of protection. Brazil’s data protection authority has adopted a corresponding recognition of the EU framework under the LGPD, enabling reciprocal data flows.
In practical terms, personal data can now be transferred from the EU to Brazil without the need for additional transfer mechanisms such as Standard Contractual Clauses or transfer impact assessments.
Together, the EU and Brazil represent a combined population of approximately 670 million people. This creates one of the largest jurisdictions globally in which personal data can circulate between two major economies without supplementary transfer safeguards.
Why Brazil, and why now?
The legal groundwork has been building for some time. Brazil adopted its General Data Protection Law, the LGPD, in 2018, closely modelled on the GDPR. It subsequently established an independent data protection authority, the National Data Protection Authority. The Commission has now concluded that the LGPD offers a high degree of convergence with the GDPR in terms of scope, individual rights, obligations on organisations, oversight and enforcement. It is worth noting that this decision also arrives in the wake of the EU-Mercosur trade agreements signed in January 2026, so the timing carries a deliberate geopolitical message about multilateralism and rules-based cooperation.
For European companies already operating in Brazil (many within manufacturing, agri-business, financial services and technology) this removes a layer of compliance friction that has long been a background irritant. Data sharing between group entities, transfers to Brazilian service providers, and cross-border HR data flows all become considerably simpler. The same applies in reverse for Brazilian businesses expanding into the EU market.
I think it is also worth pausing to consider the broader signal this sends. The EU's adequacy framework has historically moved slowly, and the list of recognised countries has grown cautiously. Adding Brazil, a major emerging economy with a GDPR-aligned framework, suggests that the Commission is willing to use adequacy decisions as a tool of strategic partnership, not just a technical rubber stamp. Whether that approach will extend to other jurisdictions in Latin America or beyond remains to be seen. The Commission will review the decision after four years. For the time being, the path is clear.
You can read more about the matter here.
When a major cloud platform is procured, the focus tends to be on functionality, price and contractual terms. The question of where personal data ends up travelling often comes later, if it comes at all. The Danish Datatilsynet's latest decision in the long-running Chromebook case is a reminder that "later" is not good enough.
Background
The case concerns 51 Danish municipalities' use of Google Workspace for Education and Google Chrome Education in primary schools. The Datatilsynet has now issued its final ruling. The verdict is formal serious criticism of all 51 municipalities, accompanied by two warnings. The decision comes after years of investigation and follows an October 2024 opinion from the European Data Protection Board (EDPB) on exactly how far a data controller's responsibility extends when using processors and sub-processors. The short answer, as the EDPB made clear: further than most organisations have assumed.
The sub-processor problem
The crux of the matter is third-country transfers. Specifically, what happens when Google, as the municipalities' data processor, passes data onwards to its own sub-processors, many of whom are located outside the EU? Data flows from Google Cloud EMEA in Ireland to Google LLC in the US, and from there to a range of further sub-processors in countries including India, Mexico, Taiwan, Chile, Singapore and Hong Kong.
The municipalities had relied primarily on the EU-US Data Privacy Framework (DPF) to justify transfers to the US, and on Google's contractual chain to cover onward transfers. The Datatilsynet accepts that the DPF provides a valid transfer basis for transfers to Google LLC itself. But it does not accept that this is sufficient for the entire sub-processor chain. For transfers to countries without an adequacy decision, the so-called "unsafe third countries", the municipalities should have carried out transfer impact assessments (TIAs) to verify whether the four European Essential Guarantees established by the Schrems II ruling were actually met in practice. They had not done so. There was no documentation showing that anyone had genuinely interrogated whether contractual protections alone could withstand governmental access regimes in countries like India or Taiwan.
The Datatilsynet is direct on this point: a contract between a sub-processor in a safe third country and a sub-processor in an unsafe one does not, by itself, guarantee an equivalent level of protection. No contract can control what a government in that country may demand.
It is especially interesting that the decision states:
"The Data Protection Authority finds that the following countries in the sub-processor chain should have been subject to an investigation before the municipalities entered into the contract with Google:
What does this mean for your business?
I think the most striking passage in the entire decision is the closing remark from the Datatilsynet, where it states plainly that all of this could, and should, have been avoided if the relevant data protection assessments had been completed before the product was procured and put into use. That is not a subtle message. It is a direct challenge to the way procurement has worked in both the public and private sectors for years: choose the product, sign the contract, and sort out the compliance questions afterwards.
The decision also makes clear that if you choose a product where processing activities or contractual terms change frequently, you must be capable of continuously documenting that processing remains lawful after each change. If you cannot, you must either stop using the product or switch supplier. That is a significant operational implication for any organisation heavily reliant on US-based technology providers.
Practical recommendations
If your organisation uses Google Workspace, Microsoft 365, or any comparable platform that relies on a global sub-processor network, this decision may warrant a concrete response. Some actions to consider:
The Datatilsynet has signalled that it will take a harder line on sanctions in future cases where these principles are not followed, particularly where users have limited choice of supplier. That warning applies well beyond Danish municipalities.
You can read more about the matter here.
Still on the topic of tech platforms in schools. The Austrian data protection authority has ruled that Microsoft unlawfully placed tracking cookies on the device of a minor using Microsoft 365 Education. No consent was obtained. The school did not know. The Ministry of Education did not know. According to Microsoft's own documentation, those cookies were analysing user behaviour, collecting browser data and serving advertising. Microsoft has been ordered to stop within four weeks.
This is the second Austrian decision against Microsoft in this set of proceedings. The first, in October 2025, concerned a violation of the right of access. There was also a jurisdictional dimension worth noting: Microsoft argued that its Irish subsidiary should be treated as the relevant controller, neatly routing the case towards a jurisdiction where enforcement has historically been slower. The Austrian authority rejected that argument and held that Microsoft in the US is making the relevant decisions.
I find it difficult to read this decision as anything other than a warning to every organisation running Microsoft 365, not just schools. The Education and standard business suites share the same underlying infrastructure. German data protection authorities have previously raised concerns about Microsoft 365's GDPR compliance, and this ruling adds weight to that picture. If your organisation has not yet audited what tracking is actually active in your Microsoft 365 environment, and on what legal basis, that conversation is overdue.
You can read more about the matter here, and access the decision in full here.
When a landlord turns down a rental application, what happens to all the documents that were handed over: identity papers, bank statements, proof of income? Once the purpose of collecting that data has fallen away, the right to retain it falls away with it. It is simple, and yet it is hard. When personal data is no longer needed, it must be deleted. Especially so when the data subject is actively demanding erasure and the controller cannot point to any legal basis for holding on to it.
A family of three applied to rent a property in 2023, providing the landlord with a range of personal documents as part of the application process. When the landlord declined to enter into a lease agreement, the family requested erasure of their data under Article 17 GDPR. The landlord did not respond. Not to the erasure request, not to a subsequent mediation attempt. The family eventually filed a formal complaint with the data protection authority, which ordered the landlord to delete the data within 30 days and reminded them that, under Article 12 GDPR, controllers are also obliged to inform individuals of the measures taken in response to such requests.
The point is widely overlooked, particularly in property rental, recruitment and financial services, where large volumes of sensitive personal data are routinely collected from people who ultimately do not become customers or tenants. I wonder how many letting agencies and recruiters have a clear process in place for handling erasure requests from unsuccessful applicants. This case suggests the answer is: not enough. A rejection is not just the end of a commercial conversation. It triggers an obligation to act promptly, and with documentation.
You can read more about the matter here.
Nearly eight years into the GDPR, a sweeping EU-wide investigation has confirmed what many privacy professionals have long suspected: a significant number of organisations are still not handling erasure requests correctly.
At its February 2026 plenary, the EDPB adopted a report on controllers' compliance with the right to erasure under Article 17 GDPR. The findings come from a coordinated enforcement exercise involving data protection authorities from 33 EU and EEA countries, with 764 responses collected from public bodies and private companies alike.
The failures are consistent and, frankly, basic. Across participating jurisdictions, authorities identified:
Neither of the last two issues has a simple fix, but organisations that have not explicitly addressed them in their erasure procedures have a compliance gap that regulators may now be actively looking for.
The CEF exercise for 2026 has already been announced, and the topic will be compliance with information obligations. If you have not recently audited your privacy notices and transparency documentation, now is the time.
You can read more about the matter here.
It might sound like a procedural technicality. In practice, it is a meaningful clarification of how the GDPR's enforcement architecture works, and who can challenge what, before whom.
Background
The case has its roots in a €225 million fine imposed on WhatsApp Ireland by the Irish Data Protection Commission in August 2021, following a binding decision by the EDPB under the GDPR's dispute-resolution mechanism. WhatsApp challenged that binding decision before the EU General Court, which dismissed the action as inadmissible in December 2022. The reasoning was that the EDPB's decision was not a challengeable act and that WhatsApp was not directly concerned by it. WhatsApp appealed to the CJEU.
The CJEU set aside the General Court's order. A binding EDPB decision that determines whether a controller has infringed the GDPR, and amends the corrective measures to be imposed, is an act open to challenge before the EU courts. It produces binding legal effects and definitively settles the EDPB's position; it is not a merely preparatory step. Where it alters a controller's legal position without leaving discretion to the national supervisory authority, the controller has standing to bring an annulment action. The case has been referred back to the General Court for examination of the merits.
What does this mean in practice?
Until now, companies subject to EDPB binding decisions faced an awkward procedural position. They could challenge a national DPA's implementing decision before a national court, but had limited ability to directly attack the EDPB decision underpinning it. That route is now open. EDPB decisions are not insulated from judicial scrutiny simply because they form part of a cooperative regulatory process.
Some LinkedIn discussions in response to this ruling raise a broader question worth pondering: does this ruling mean that big tech companies, particularly dominant social media operators, can now be held to account more quickly? More power to the EDPB, faster decisions, faster consequences?
The EDPB's binding decision mechanism was designed precisely to prevent lead supervisory authorities from issuing decisions that are too lenient, a concern that has historically been associated with Ireland's role as the de facto regulator for much of big tech in Europe. Strengthening judicial oversight of the EDPB itself could, in theory, sharpen the mechanism rather than blunt it.
That said, I wonder whether the optimism is fully warranted. The WhatsApp case has been in litigation since 2021. The admissibility question alone has taken years to resolve. The substantive merits have not even been examined yet. Large operators have the resources to litigate every procedural step, and they do. Judicial accountability for the EDPB is a good thing. Whether it translates into faster outcomes is a different question entirely.
You can read more about the matter here and here, and find some commentary here.
Loyalty programmes collect a great deal of personal data. That data is valuable, not just to the company running the programme, but to third parties willing to pay for access to it. This case is a reminder that collecting data for one purpose and quietly using it for another is not a grey area. It is a straightforward GDPR violation.
What happened?
The French CNIL investigated a company operating a loyalty programme through which members were invited to consent to receiving offers by SMS and email. What members were not told was that their data would also be transmitted to a social media platform, which matched it against user profiles to serve targeted advertising. Approximately 10.8 million clients were affected, of whom 1.6 million received targeted advertising as a result.
The violations were multiple and, taken together, paint a picture of a compliance framework that was more cosmetic than substantive.
On consent and lawfulness, the membership form contained no reference to data transmission to a social media platform or to targeted advertising. Consent was therefore neither informed nor specific, rendering the processing unlawful under Article 6 GDPR.
On transparency, the relevant information was technically available, but buried. Data subjects had to scroll to the bottom of the page, follow links placed far from the consent button, and consult two separate documents to piece together a complete picture. Retention periods were not stated. The targeted advertising purpose was not visible at the point of consent. That is not transparency; it is the appearance of it.
On security, the company's password policy and data storage conditions fell below the standard required under Article 32 GDPR.
On data protection impact assessments, the scale of the processing, millions of records cross-referenced with a social media database, clearly triggered the obligation to conduct a DPIA under Article 35 GDPR. No such assessment had been carried out.
Finally, the company's website placed optional cookies on users' devices without consent, in violation of French cookie law implementing the ePrivacy Directive.
The outcome
The CNIL issued two fines: €2,500,000 for the GDPR violations and €1,000,000 for the cookie breach. A total of €3,500,000.
Practical consequences
For any organisation running a loyalty programme, a customer account system, or any other data collection mechanism that feeds into third-party marketing arrangements, this decision is directly relevant. The key question is not whether consent was obtained, but whether that consent was genuinely informed about every purpose for which the data would be used, including onward transmission to third parties. If the answer is anything less than an unambiguous yes, the legal basis for that processing is in doubt.
The transparency findings are equally instructive. Scattering required information across multiple documents, placed far from the point of consent, is a pattern regulators across Europe are increasingly unwilling to tolerate. Accessibility and clarity are not optional refinements; they are substantive legal requirements.
You can read more about the matter here.
See also: CNIL's decision
A German administrative court has clarified what Article 15 GDPR actually requires, and what it does not.
A public health authority carried out a home visit following a third-party report raising concerns about a woman's welfare, and subsequently created internal medical notes and administrative records. The data subject requested access to all records held about her. The authority complied but redacted information relating to its staff and the reporting third party. The data subject sought full, unredacted disclosure.
The court dismissed the claim. Drawing on the CJEU's judgment in CRIF (C-487/21), it confirmed that the right to a copy under Article 15(3) GDPR is a right to a faithful reproduction of the data subject's own personal data, not a right to receive entire documents in unredacted form. Provided the disclosed information enables the data subject to understand their data and exercise their rights effectively, redacting third-party information from the same documents is entirely lawful.
The court also made clear that Article 15 GDPR is not the appropriate vehicle for contesting the factual accuracy of administrative or medical assessments. That is a matter for other legal remedies.
Redacting third-party data before disclosing records is not only permissible, in some cases it may be required. The key obligation is to ensure that the data subject's own information remains complete and intelligible after redaction.
You can read more about the matter here.
A DPO cannot effectively oversee compliance with decisions they are themselves responsible for making. The Polish DPA has fined Poczta Polska €232,000 for getting this wrong.
While investigating a data breach notification from the Polish national postal operator, the UODO discovered that the company's DPO simultaneously held several senior management roles, including Director of an organisational unit responsible for information protection and proxy for the information security management system. In practice, this meant the DPO was tasked with monitoring compliance with data protection measures that he had himself designed and implemented. The company had not documented any assessment of the conflict, nor put in place any safeguards to address it.
The DPA found a violation of Article 38(6) GDPR, which prohibits the appointment of a DPO whose other tasks and duties would result in a conflict of interest. The controller's argument, that direct reporting to the Management Board and staff support were sufficient, was rejected. Structural independence matters, and it cannot be substituted by organisational proximity to senior leadership alone.
The controller reorganised its DPO function during the investigation, removing the conflict. This did not, however, eliminate liability for the period of infringement.
The DPO role requires genuine independence. Any organisation where the DPO also holds responsibility for determining the means or purposes of data processing should treat that as a red flag, document a conflict of interest assessment, and consider whether the roles can legitimately coexist.
You can read more about the matter here.
An employer cannot require staff to hand over their private mobile numbers simply because a client's platform demands it. The Spanish DPA has made that clear with an €80,000 fine.
A customer support company providing services for a Chinese client asked its employees, during training sessions, to write down their personal mobile phone numbers on a blank sheet of paper. The numbers were needed to set up two-factor authentication on the client's platform. No specific privacy notice was provided. Employees subsequently received SMS messages directly from the client on their personal phones without having consented to this. The company's own DPO had advised against the practice, but was overruled.
The DPA found a violation of Article 6(1) GDPR. Contractual necessity, one of the lawful bases relied upon, must be interpreted strictly. Processing must be objectively necessary, not merely convenient. Requiring employees to use their personal devices as an authentication tool in an employment context does not meet that threshold, particularly where less intrusive alternatives existed. The fact that the controller ignored its own DPO's advice was treated as an aggravating factor.
What does this mean for businesses?
Employers must provide the tools needed to perform work. Where a client's system requires employee authentication, the employer bears responsibility for ensuring this happens through lawful means, including by pushing back on clients whose technical requirements would otherwise compromise staff privacy.
In the case at hand, the controller acknowledged responsibility and paid a reduced fine of €48,000 following voluntary payment.
You can read more about the matter here.
A cyberattack is one thing. How a company handles the aftermath is quite another. The French CNIL's decision against Free Mobile is a reminder that regulators are not just asking whether a breach occurred. They are asking whether it could have been made harder, whether the data exposed was even supposed to still be there, and whether affected individuals were told what they needed to know to protect themselves.
In September 2024, an attacker infiltrated the information systems of Free Mobile, a subsidiary of the ILIAD group and one of France's largest mobile operators, with around 15.5 million subscribers. The breach exposed personal data from 24 million subscriber contracts, including IBAN details. The company became aware of the breach in October 2024, notified the CNIL, and emailed affected customers. Over 2,500 complaints followed, prompting a full CNIL investigation.
What did the CNIL find?
Three separate violations emerged, each telling its own story.
On data retention, the CNIL found that Free Mobile had been holding data on former subscribers well beyond any legitimate purpose. Millions of records were retained without justification, in breach of the storage limitation principle under Article 5(1)(e) GDPR. The uncomfortable reality is that a significant portion of the 24 million affected contracts likely included people who were no longer even customers. Data that should have been deleted years earlier was sitting in the system when the attacker came knocking.
On security, the CNIL found that basic measures had simply not been implemented. Authentication for the company's VPN connections, used for remote working, was insufficiently robust. Detection systems for abnormal behaviour on the network were ineffective. The CNIL was careful to note, drawing on the CJEU's ruling in VB v Natsionalna agentsia za prihodite, that the goal is not to eliminate all risk, but to reduce its probability and limit its severity when it materialises. Free Mobile fell short of that standard.
On breach notification, the company did send an email to affected individuals and set up a toll-free number. But the email omitted key information: what remedial measures the company had taken, what the likely consequences of the breach were for those affected, and what steps individuals could take to protect themselves. Under Article 34 GDPR, such information is not optional. The point of breach notification is to enable people to act.
What does this mean?
The €27 million fine reflects the scale of the breach and the financial capacity of the company. But the underlying failures are not unique to a telecoms giant. Excessive data retention, weak VPN authentication and incomplete breach notifications are compliance gaps that appear across sectors and company sizes.
The Free Mobile case is also a pointed reminder that a data breach is not just an IT incident; it is a moment when the entire data governance framework is put under scrutiny at once. Retention policies, security architecture and breach response procedures are all examined simultaneously. Organisations that have been treating any one of these as a lower priority should reconsider that approach before a breach forces the issue.
You can read more about the matter here.