Newsletter

Privacy Corner

by Eva Jarbekk


I recently spoke at the major privacy conference in Sandefjord, where I had been assigned the topic of how to "handle the Data Protection Authority". I had to laugh a bit when I saw the title, as it is a rather delicate subject. I will return to it in a later newsletter, because I believe I have some experiences that may be useful to others as well.

It is impressive to see what an arena the Sandefjord seminar has become; this year there were nearly 450 participants! That is a great many people setting aside a day and a half to discuss privacy. The presentation by the Data Protection Authority's Line Coll focused strongly on the fact that the authority is struggling to find the resources to process all its cases, and it has actually adjusted which breaches it believes should be reported.

The Authority's web page on what should be reported was updated on 16 October. It is worth noting that the sentence saying that, if you are uncertain whether a breach should be reported, it is better to report it to the Data Protection Authority to be on the safe side, has been removed. Completely gone. And in the list of typical security breaches, it no longer says that they "shall" be reported - now it says they "may" be reported. At the same time, the category "dispatch errors" has been moved from being the first example to being the last. The threshold for the reporting obligation is thus being raised somewhat. I do not believe privacy suffers from this - quite the contrary.

I also take this opportunity to tell you about a new Schjødt DIGITAL. We have an exciting AI-focused programme with panel discussions featuring Norwegian and international companies and authorities working with AI, along with a few short, pointed presentations. It will be useful, practical and interesting. It takes place in person in Oslo on 25 November from 1 pm. We do not have room for 450 participants, but we do have room for 100, and it may fill up quickly. Send me an email at eva.jarbekk@schjodt.com if you do not usually receive invitations from the tech group at Schjødt and would like to attend.

GDPR as an Economic Winner – Yes, You Read That Correctly!

We must begin with some good news. When someone mentions GDPR, jubilation and confetti are rarely the reaction. But what if GDPR has actually saved Europe billions of euros? Not just in theory, but in real money that has not disappeared from the pockets of ordinary people and businesses.

A new study initiated by the French data protection authority, CNIL, has estimated the impact that GDPR's provision requiring data subjects to be notified of personal data security breaches has had on identity theft. The figures are actually quite impressive.

Without such a requirement to notify about security breaches, the damage falls asymmetrically. Customers bear the consequences (often identity theft) without knowing which company was responsible for the leak, and the responsible company's reputation does not suffer. Such a situation can lead to underinvestment in cybersecurity, because part of the damage does not affect the company itself.

Let us look more closely at cybersecurity economics, where GDPR suddenly appears as the hero we did not know we needed. If you want to read the report, you can find it here.

When Numbers Become Difficult - But Important

Empirical studies in cybersecurity face challenges due to a lack of available data. In France, estimates of the cost of cybercrime vary enormously, ranging from 2 billion to 119 billion euros. That is a difference of 117 billion euros - not exactly small change. But these have been rough estimates. The EU Commission has therefore established an E-crime project to produce better figures, and CNIL has published a summary of it. The researchers chose to focus on identity theft.

Researchers sent questionnaires to 6,394 people in various European countries and asked about both reported and unreported cybercrime. They asked actual people about their experiences, not just what was reported to the police.

They distinguished four main types of identity theft: theft where criminals gain access to bank accounts, credit card information, PayPal accounts or online shopping accounts.

The Numbers That Make You Raise Your Eyebrows

Based on the surveys, they estimate the cost of identity theft in France at between 1 and 3.4 billion euros over four years. For the EU as a whole, the figure is between 6 and 15 billion euros. That is a lot of money disappearing into the pockets of cybercriminals.

But here comes the good news: They have found that GDPR has helped avoid between 54 and 132 million euros in losses related to direct costs of identity theft in France, and between 405 and 988 million euros in losses at EU level. Nearly a billion euros saved - just in the EU, and just from a single GDPR provision!

What is also interesting is who actually saves this money. On average, 70% of losses caused by identity theft are compensated by businesses. In France, individuals avoided between 16 and 40 million euros in losses, and businesses between 39.5 and 96 million euros. At EU level, individuals avoided between 105 and 257 million euros in losses, and businesses between 164 and 402 million euros.

So both you and I as consumers, and the businesses that must handle the mess when something goes wrong, come out better because of this notification requirement. It is win-win.

But it gets better. Direct losses are only the tip of the iceberg. Indirect costs make up an important part of the costs caused by cybercrime, and they affect society as a whole: weakened trust in online activities where sensitive data must be shared translates into reduced business revenue.

Think of it this way: if you have been subjected to credit card fraud, how likely are you to shop as much online afterwards? A Belgian study estimates that 5.9% of the population have limited their use of online banking, and 10.4% their use of e-commerce, due to cybersecurity risk. In one survey, 29% of respondents said they were hesitant to shop online due to payment security concerns. Surveys also show that the average number of weekly online transactions falls from 1.32 to 1.02 after identity theft.

People simply shop less online when they are afraid. That is bad for e-commerce, bad for the economy, and bad for everyone involved in e-commerce.

The indirect cost of credit card identity theft is estimated at between 206 and 412 euros, or between 52% and 105% of the direct monetary impact of cybercrime. In other words: for every euro stolen directly, almost as much disappears in lost trust and reduced commerce.
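For those who like to check the arithmetic, the two ways of stating the indirect cost fit together neatly. Here is a minimal back-of-the-envelope sketch in Python - my own illustration, not a calculation from the report, and it assumes the 206 euro figure pairs with the 52% ratio, the 412 euro figure with the 105% ratio, and that both are per-incident figures:

```python
# Back-of-the-envelope check of the per-incident figures quoted above.
# Assumption (mine, not the report's): 206 euros pairs with the 52%
# ratio, 412 euros with the 105% ratio, and both are per incident.

indirect_low, indirect_high = 206.0, 412.0  # indirect cost per incident, euros
ratio_low, ratio_high = 0.52, 1.05          # indirect cost as share of direct loss

# Implied direct loss per incident from each endpoint
direct_low = indirect_low / ratio_low    # ~396 euros
direct_high = indirect_high / ratio_high # ~392 euros

# Both endpoints imply a direct loss of roughly 395 euros per incident,
# so the total damage lands around 600-800 euros per stolen card.
print(f"implied direct loss: {direct_low:.0f} and {direct_high:.0f} euros")
print(f"total per incident:  {direct_low + indirect_low:.0f} to "
      f"{direct_high + indirect_high:.0f} euros")
```

Read that way, a single stolen credit card ends up costing roughly 600-800 euros in total once lost trust and reduced commerce are priced in.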

With indirect costs included, it is estimated that GDPR has helped avoid between 90 and 219 million euros in losses related to total costs of identity theft in France, and between 585 and 1,427 million euros in losses at EU level. We are now talking about up to 1.4 billion euros saved at EU level. From one single article in GDPR.

It Is Not Only Notification of Data Subjects That Counts

Looking beyond the effect of breach notifications, IBM's "Cost of a Data Breach 2023" report shows that encryption reduces the average cost of a security breach by 5%. GDPR Article 32 concerns precisely such security measures. Other GDPR principles also come into play here, such as data minimisation and storage limitation. Their effects have not been quantified, but they too will have significance.

What Does This Actually Mean?

Economic research makes it possible to view cybersecurity as an investment decision for businesses. GDPR encourages businesses to invest more in cybersecurity to limit the impact of cybercrime. GDPR is not just a burdensome regulatory framework - it is actually a way to get businesses to invest in something that benefits everyone.

Next time someone complains about GDPR and all the effort it entails, you can remind them of this: GDPR has already saved Europe hundreds of millions of euros, perhaps over a billion, just by reducing identity theft. And that is before we begin to calculate all the other benefits.

Yes, GDPR requires work. Yes, it costs money to implement. But it pays off. And in this case, the return is measured in fewer stolen identities, safer online shopping and increased trust in the digital society.

Paris Prosecutor's Office Investigating Siri

The Paris prosecutor's office has opened an investigation into Apple following a complaint from the French human rights organisation Ligue des droits de l'Homme.

The case is based on testimony from a former employee of an Apple subcontractor, who in 2019 analysed thousands of Siri recordings that could reveal both intimate moments and confidential information. The former employee went to the French prosecutor's office after neither the French data protection authority CNIL nor the Irish data protection authority DPC pursued the case. Rather surprising, actually.

The complaint has also led to a class action lawsuit in France, inspired by a similar case in the USA where Apple agreed in December 2024 to pay 95 million dollars in settlement.

Are Loyalty Programmes Really Free When You Pay with Data?

A German consumer organisation sued Lidl because they marketed their loyalty programme "Lidl Plus" as free, even though customers had to share personal data to participate.

The consumer organisation believed that personal data constituted payment, and that Lidl therefore breached the rules about stating the total price of the service.

The court did not agree. It ruled that "price" in consumer protection legislation covers only payment in money or digital representations of value, not personal data.

Since Lidl had explained the data collection clearly in the terms and conditions, and consumers did not pay money, it was lawful to call participation free. The court reasoned that anyone who read the terms and conditions would receive detailed information about which data are collected and processed. The case shows that personal data and money are treated differently across legal frameworks - even though both have value.

A Controversial Judgment - VG Düsseldorf - 29 K 6375/25

A German court has reached a rather sensational conclusion about the deadline in GDPR Article 12(3). The court believes that the one-month deadline for responding to requests only applies to providing a status report on measures that have been implemented - not to actually fulfilling the access request. In the case, the data controller had responded within the deadline that they had received the request and were working on the case, and that was sufficient according to the court.

However, the judgment's interpretation is highly controversial. Existing case law from several countries shows that the data controller must actually fulfil the access request in full within one month, possibly with a two-month extension if necessary.

The case was dismissed as inadmissible. It will be interesting to see whether this interpretation holds water going forward, or whether it will be contradicted by other courts. For now, it stands quite alone, and I think I would be cautious about advising clients to apply it uncritically when responding to an access request.

When Deletion Is Not Enough - Dutch Judgment on Notification Obligation

A Dutch organisation working with the protection of children and young people, RBV, received an important lesson about GDPR when they deleted a person's file upon request. The problem? They forgot to notify others who had received the data that it had been deleted, as Article 19 of GDPR requires. The court ordered such notification.

The judgment emphasises that data controllers must have an overview of whom they have shared data with. It is therefore not sufficient just to press "delete" - you must also remember to pass that information on to everyone who has received the data.
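For readers who build or specify systems, the practical takeaway is that an erasure routine needs a record of disclosures wired into it. Here is a minimal sketch in Python of what that could look like - entirely my own illustration with hypothetical names, not anything drawn from the judgment or from RBV's systems:

```python
# Minimal sketch of an erasure routine that honours GDPR Article 19:
# log every disclosure, and let an erasure trigger notification of
# every recipient. All names here are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class RecipientRegistry:
    # data subject id -> the recipients who have received that subject's data
    disclosures: dict[str, set[str]] = field(default_factory=dict)

    def record_disclosure(self, subject_id: str, recipient: str) -> None:
        self.disclosures.setdefault(subject_id, set()).add(recipient)

    def erase(self, subject_id: str, notify) -> None:
        # Remove the record, then notify everyone the data was shared with
        recipients = self.disclosures.pop(subject_id, set())
        for recipient in sorted(recipients):
            notify(recipient, subject_id)

registry = RecipientRegistry()
registry.record_disclosure("subject-42", "partner-organisation")
registry.record_disclosure("subject-42", "municipal-agency")

# On an erasure request, both recipients are informed of the deletion
registry.erase("subject-42",
               notify=lambda rec, subj: print(f"notify {rec}: data on {subj} erased"))
```

The design point is simply that deletion and notification live in the same routine, so the one cannot happen without the other.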

When AI Meets Job Seekers - The Boundary Between AI Support and Automated Decisions

An Austrian court has ruled that an AI model developed by the public employment service to predict job seekers' future chances in the labour market had a valid legal basis under national law. The model, called AMAS, uses personal characteristics such as age, gender, education and previous career to predict integration chances, but advisers were trained to use AMAS only as a second opinion and had to explain its conclusions to job seekers.

The data protection authority had originally prohibited the use because it believed the model lacked a legal basis and constituted automated decision-making under GDPR Article 22. But the court concluded that it was not automated decision-making, because the advisers had a real role in the decision-making process - unlike in the SCHUFA case, where there was no significant human involvement.

Finally, the court ruled that there was no breach of GDPR, since the data controller could rely on national legal provisions as legal basis. I think this is an important decision about where the boundary lies between AI support and automated decisions.

Training Data for AI Models - New Taxonomy from OECD

As is well known, there is a multitude of mechanisms used to collect training data for AI. The performance and reliability of AI models are closely linked to the quality and diversity of the data used for training. It is not just about feeding in as much data as possible - it is about which data, where they come from, and how they are collected. In practice, AI developers often use several data collection mechanisms simultaneously to build comprehensive training datasets.

There is often limited insight into the datasets used as training data, which makes it difficult to assess their quality. This can probably be explained by the legal risks companies incur when they disclose the datasets used, as well as concerns about proprietary information.

OECD has now developed its own taxonomy based on which sources developers obtain data from: i) directly from individuals and organisations; and ii) from third-party suppliers.

It would take us too far to go into the details of the taxonomy here, but it provides a useful starting point for a common terminology when discussing specific cases. I believe, for example, that it will be useful when working on DPIAs and FRIAs for AI.

Moreover, it gives decision-makers and stakeholders a structured approach for policy discussions about privacy, data governance and trustworthy AI development. That is quite useful in itself. You can find it here.

Consultation Response on Disclosure of IP Data for Prevention Purposes

The Government wants to give the police and PST access to IP data to prevent serious crime, not just to solve crimes that have already occurred.

The Data Protection Authority has submitted a consultation response and is not impressed. Director Line Coll calls it a change of tack that will significantly expand the group of people affected and give the authorities far greater access to information.

The entry criterion is highly discretionary, which increases the risk of mission creep and a chilling effect on lawful expression and activity online. The Data Protection Authority is not convinced of the proposal's proportionality and believes it may be contrary to the ECHR and EEA law.

The Authority recommends that disclosure of IP data should be approved by a court in advance, not just reviewed by supervisory bodies afterwards. It also calls for more principled and comprehensive assessments of both the need for the proposal and its consequences. The Data Protection Authority believes the legislative proposal gives the Government a good opportunity to assess state authorities' instruments, legal bases and practice more holistically. Prevention is important, but not at the expense of fundamental rights.

I believe there will be considerable discussion about this proposal.

Data Protection Authority Puts DR on Data Diet - But Accepts Mandatory Login

The Danish Data Protection Authority has concluded its investigation of DR (Danmarks Radio)'s requirement for mandatory login on DRTV, and the conclusion is something of a mixture: The login requirement itself gets the green light, but the way DR collected names receives sharp criticism.

When DR introduced a login requirement in autumn 2024 to watch streaming content, complaints poured into the Data Protection Authority, which therefore started an investigation in spring 2025. DR argued that they have an obligation to deliver a modern public service that matches users' expectations, and that a streaming service without login is simply not modern enough to fulfil their statutory duties.

The Data Protection Authority concluded that DR can use Article 6(1)(e) of GDPR - that is, processing necessary to perform a task in the public interest - as the basis for the data collection, since DR's public service obligation is not unambiguously defined and lies outside the Authority's core area.

But then came the criticism: Right up until 5 September 2025, DR breached the principle of data minimisation by asking users to provide their actual name with the text "Fortæl os dit navn" (Tell us your name), even though DR itself admitted that they did not need this information.

Now the text has been changed to encourage users to choose a profile name - that is, a nickname - and to specify that one does not need to use one's real name. The moral? You can require login, but do not be greedy with the data. I suspect there are quite a few out there asking for names when it is not actually necessary.

The Austrian Data Protection Authority Is Forced into Austerity

The Austrian data protection authority (DSB) has announced that they must drastically cut their activities due to significant budget cuts, despite the fact that the workload is only increasing. With 53 employees and 19 administrative interns, they are to enforce the privacy rights of 9 million people - something that was already almost impossible.

Now the budget for 2026 is being cut, and most of the interns cannot be replaced. The consequence? The Data Protection Authority will in future only provide opinions on legislative proposals in exceptional cases, and they will no longer investigate companies on their own initiative unless someone tips them off about serious breaches.

Germany, for example, spends approximately twice as much per capita on its data protection authorities as Austria, and according to GDPR Article 52(4), Austria is actually obliged to provide adequate funding.

Max Schrems of noyb warns that fundamental rights become worthless if they only exist on paper, and points out that if the DSB had imposed proper fines, it would have been a source of revenue for Austria - a single fine against Google could have covered Austria's 6 billion share of the Brenner tunnel. Whether that is in itself a legitimate consideration when imposing fines is questionable.

Now the organisations epicenter.works and noyb have complained about Austria to the EU Commission, which can start proceedings against the country. Case processing already takes far longer than the statutory deadline of six months, and many cases drag on for years.

This is a sad example of how demanding it has become to enforce GDPR and how many resources are required.

Spanish Data Protection Authority Fines Business Data Company €1.8 Million

The Spanish data protection authority (AEPD) has fined Informa D&B €1.8 million for GDPR breaches in its processing of personal data about business owners. The company processed personal data on over 1.6 million self-employed persons under a data agreement with CAMERDATA, including names, tax numbers, addresses, telephone numbers and business codes. CAMERDATA had received the data from the Chamber of Commerce, which in turn had received them from the tax authorities under strict confidentiality requirements - the data were only to be used to create the public business register and for administrative purposes, not for commercial exploitation by third parties.

AEPD ruled that purchasing data from apparently legitimate sources offers no protection if the original collection lacked a proper legal basis. The Authority concluded that Informa had not documented any balancing of interests that could legitimise the processing of data about self-employed persons for commercial purposes.

AEPD also found breaches of the information obligation under GDPR Article 14 - the company claimed that it would be a disproportionate effort to inform 1.5 million business owners individually, but failed to implement alternative ways of notifying them. The Authority rejected this justification, pointing out that merely claiming a disproportionate effort does not automatically exempt a controller from the obligation to provide information.

The total fine of €1.8 million was split into €900,000 for lack of legal basis under Article 6(1) and €900,000 for failure to provide information under Article 14.

Equally important, AEPD also ordered Informa D&B to delete all the affected personal data within three months.

This is yet another important reminder to everyone who purchases data from others: you must actually check that the supplier has permission to share the information!

Deepfake App That Undresses People: Italian Data Protection Authority Says Stop

The Italian data protection authority has put its foot down and ordered an immediate halt to the Clothoff app's processing of Italian users' personal data.

The app offers an AI service that creates "deep nude" images - that is, fake nude images of real people. Anyone, including minors, can upload images and create such content, even of children. There is no system to check whether the person in the image has consented, and nothing to show that the images are artificially generated.

The Authority has launched a broader investigation into all "nudifying" apps and believes such services pose a serious risk to fundamental rights, human dignity and privacy - particularly when minors are involved. Several incidents covered in the Italian media show how misuse of such tools creates public alarm.

Venice Fined for Overzealous Data Collection

I know that many readers have attended the major privacy conference in Venice. Venice has now received a fine of €10,000 from the Italian data protection authority for collecting too much data in connection with its so-called "tourist tax". The city introduced a fee for access to certain areas during high season, with exceptions for, among others, residents, commuters and students.

The problem? The municipality required people who were exempt to log in to a website, provide personal data, download a QR code, show it to inspectors along with a self-declaration - and then have the data checked afterwards by the tax office. Yet most of the information was never used for anything. Certain groups, such as students, could simply have shown their student card.

The Data Protection Authority concluded that the system was unlawful, excessive and contrary to the principle of privacy by design. In addition, the municipality had set up payment kiosks with poor security. Altogether, this resulted in the €10,000 fine for breaches of GDPR.

This is particularly interesting considering that one of the largest GDPR conferences is held in Venice!

United Kingdom Will (Probably) Continue as an Approved Privacy Country

Brexit has created plenty of headaches in many areas, and privacy is definitely one of them. Since the United Kingdom left the EU, the country has had so-called adequacy status - that is, an approval from the EU Commission stating that British privacy legislation is good enough that personal data can be transferred there without additional safeguards. A completely ordinary data processing agreement is sufficient to use British IT suppliers.

But this approval expires on 27 December 2025, and the EU Commission has therefore proposed extending it until December 2031. On 20 October 2025, the EDPB gave its opinion on the Commission's proposal; the EDPB advises the Commission before it makes a final decision. Generally, the opinion is positive, but there are several areas that concern the EDPB and that it believes the Commission must monitor closely going forward.

One of the biggest concerns relates to how the United Kingdom now assesses whether other countries have adequate privacy protection: the question is only whether the level of protection in the other country is not "materially lower" than in the United Kingdom. Another thing that concerns the EDPB is that the British Secretary of State has been given extended powers to amend privacy rules through so-called secondary legislation, which does not require as thorough treatment in Parliament as ordinary laws.

The Secretary of State can make changes in areas such as international data transfers, automated decisions and the governance of the British data protection authority (ICO). The EDPB believes this may lead to the United Kingdom moving away from EU standards.

Now the EU Commission will make the final decision. For the time being, it appears that the United Kingdom will continue in the privacy club - and given the political situation, anything else would be surprising now.

Do you have any questions?