Newsletter

Privacy Corner

by Eva Jarbekk


The Norwegian Data Protection Authority has presented an overview of recently adopted regulations from the EU. There is the AI Act, of course, but also the Data Governance Act, the Digital Markets Act, the NIS 2 Directive, the General Product Safety Regulation, the Digital Services Act, the Critical Entities Resilience Directive and the Digital Operational Resilience Act (DORA). New legislation expected shortly includes the AI Liability Act, the Consumer Credit Directive, the European Health Data Space, the European Media Freedom Act and as many as nine other acts.


From the ECJ there were 32 decisions on privacy in 2023, and a search on the EDPB's pages shows that it published 117 documents in 2023. You could get dizzy from less! Fortunately, not everything is equally important.


In any case, going forward we will have to relate to much more than the GDPR – privacy creeps into many other laws as well. To find relevant GDPR material, I often use GDPRhub, which has a fairly complete overview of European decisions from both courts and data protection authorities. It is a good, easily searchable website created at the initiative of NOYB.


Below are a few cases from the turn of the year that have caught my interest. Happy reading!

Deletion of customer data in loyalty programs

The Finnish Data Protection Authority has presented a practical decision on storage periods in retail. It concerns Kesko, Finland's largest retail chain, and its loyalty program. Kesko stored customer data and purchase data for as long as the customer relationship lasted. The Finnish Data Protection Authority determined that tying storage to the customer relationship could mean keeping data for a customer's entire lifetime, and that this was too long. It also highlighted that purchase history can be used to derive special categories of data. Kesko should, moreover, have ensured that customers could choose how much data was collected about them. Kesko was fined, and there is every reason to believe that storage periods and deletion will receive attention in the future. The case is discussed here.
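One practical way to operationalise the decision is to tie retention to a defined period rather than to the lifetime of the customer relationship. Below is a minimal, purely illustrative sketch of a scheduled deletion job in Python – the table name, column names and the two-year period are my own assumptions, not anything taken from the decision:

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical retention period for purchase history. The Kesko decision
# does not prescribe a specific period; two years is an assumed example.
RETENTION_DAYS = 2 * 365

def purge_old_purchase_history(db_path: str) -> int:
    """Delete purchase rows older than the retention period.

    Assumes a table purchases(customer_id, purchased_at, ...) where
    purchased_at is stored as an ISO 8601 timestamp.
    """
    cutoff = (datetime.now() - timedelta(days=RETENTION_DAYS)).isoformat()
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "DELETE FROM purchases WHERE purchased_at < ?", (cutoff,)
        )
        return cur.rowcount  # number of rows removed

if __name__ == "__main__":
    removed = purge_old_purchase_history("loyalty.db")
    print(f"Purged {removed} purchase records past retention")
```

A job like this would typically run on a schedule, so that deletion happens continuously rather than depending on whether the customer relationship ever formally ends.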

What are adequate security measures?

When wondering what security measures you need to implement, it can be useful to consult a new guide from the Danish Data Protection Authority. The authority receives several reports of personal data breaches every week, many of which concern accidental access to or forwarding of personal data. In its view, these breaches could have been avoided with appropriate security measures. It has therefore identified 10 typical breaches and provides advice on relevant security measures to reduce the risk of these incidents.


The guide is aimed in particular at employees who influence an organisation's rules, procedures, training and technical configurations for protecting against personal data breaches. It will probably be quite useful reading for many. Considering the attention NAV has recently received in Norway for inadequate access control, it is interesting that the first type of breach addressed is precisely "access rights as needed".

You can find the overview here.

More about what good access control is

The Danish Data Protection Authority has also published a guide specifically on access management. It writes that managing user access is a fundamental part of data security, but that many find it difficult. Inadequate access management increases the risk of personal data breaches – everything from unauthorized access by employees and abuse by former employees to targeted hacking and ransomware attacks. The guide is available here.
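To make the idea concrete, a recurring, automated access review is one common way to catch both of the risks mentioned above: lingering access for former employees and rights beyond what a role requires. Here is a minimal, purely illustrative sketch in Python – the user registry, field names and role model are my own assumptions, not taken from the guide:

```python
from dataclasses import dataclass, field

# Illustrative model of a user registry for a periodic access review.
@dataclass
class User:
    username: str
    active: bool                      # False for former employees
    roles: set[str] = field(default_factory=set)

def review_access(users: list[User], needed_roles: dict[str, set[str]]) -> list[str]:
    """Return findings: former employees who still have access, and
    roles a user holds beyond what their job function requires."""
    findings = []
    for user in users:
        if not user.active and user.roles:
            findings.append(f"{user.username}: former employee still holds {sorted(user.roles)}")
            continue
        excess = user.roles - needed_roles.get(user.username, set())
        if excess:
            findings.append(f"{user.username}: roles beyond need: {sorted(excess)}")
    return findings

if __name__ == "__main__":
    users = [
        User("anna", active=True, roles={"hr", "payroll"}),
        User("bjorn", active=False, roles={"admin"}),
    ]
    for finding in review_access(users, {"anna": {"hr"}}):
        print(finding)
```

In practice the registry would come from a directory service and the findings would feed into revocation – the point is simply that the review runs regularly and is documented.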

Difficult with consent in hospitals

At the end of November, a slightly unusual case came from the Danish Data Protection Authority. Aarhus University Hospital published some patient photos on Instagram – based on consent. It probably sounds perfectly OK to many.


However, one patient approached the Danish Data Protection Authority and asked whether this was allowed. The authority opened a case and concluded that consent could not serve as the legal basis because there is no balance between the patient and the hospital. The hospital is in a position of power, even though patients who do not consent will of course not receive inferior medical treatment.


Balance between the parties is important in consent situations and is known to be difficult in employment relationships. I am perhaps somewhat surprised by the decision, but it will of course depend on the circumstances under which the hospital asked for permission to publish.

You can find the decision here.

Are your developers using JavaScript?

On 20 January 2022, the Danish Data Protection Authority received an inquiry from a citizen regarding the Danish Agency for Digital Government's use of JavaScript in MitID, the Danish digital ID. The person claimed that JavaScript is outdated and not safe to use for authentication. This has been a topic in the security industry for a long time.


The Danish Agency for Digital Government explained that risk assessments of MitID had been carried out, including of code quality. However, it had not considered the possible risks to the data subjects arising from the use of JavaScript.


The Danish Data Protection Authority noted that the GDPR requires the data controller to manage the risk to the rights and freedoms of the data subjects before processing begins. It also noted that, under Article 5(1)(f) of the GDPR, personal data shall be processed in a way that ensures appropriate security, and pointed out that both Article 5(2) and Article 24(1) require a data controller to be able to demonstrate compliance with the GDPR. For a supervisory authority to be able to assess whether an appropriate level of security has been ensured, data controllers must document the identified risks and the measures that have been implemented.


Not surprisingly, the Danish Data Protection Authority concluded that the risk to the data subjects should have been assessed and that a separate risk assessment of JavaScript should have been carried out, since it is publicly known that the programming language has weaknesses. The fact that this concerned national infrastructure made matters worse. The Danish Agency for Digital Government's reference to an older assessment of JavaScript, carried out several years earlier for "NemID", was deemed inadequate.


What can we learn from this? That documentation is of course important, even if it is time-consuming.
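For readers wondering what a concrete technical measure against script risk might look like in such an assessment: one widely used mitigation is Subresource Integrity (SRI), where the page pins a cryptographic hash of each script so the browser refuses to run a copy that has been tampered with. The case itself does not say which measures should have been chosen; the sketch below is purely illustrative and simply computes the integrity value for a local script file:

```python
import base64
import hashlib

def sri_hash(path: str) -> str:
    """Compute a Subresource Integrity value (sha384) for a script file."""
    with open(path, "rb") as f:
        digest = hashlib.sha384(f.read()).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

if __name__ == "__main__":
    # "auth.js" is a hypothetical login script; the output can be pasted
    # into the integrity attribute of the corresponding <script> tag.
    value = sri_hash("auth.js")
    print(f'<script src="auth.js" integrity="{value}" crossorigin="anonymous"></script>')
```

Whether such a measure is appropriate in a given system is exactly the kind of question the risk assessment – and its documentation – is meant to answer.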

Is an agreement required for joint controllers?

In ECJ case C-683/21, the question was whether a health institute that had outsourced the development of a Covid-19 infection tracking app to a private actor was a data controller. The health institute did not own the app and did not itself process personal data. However, the ECJ ruled that when an entity actually contributes to determining the purposes and means of the processing of personal data, it is considered a data controller. In the case at hand, the ECJ found the parties to be joint controllers. It was not decisive that they had not entered into any agreement on joint controllership – the court made an assessment based on the facts.


Presumably, more controllers are joint controllers than you might think. I suspect this applies in some corporate groups and also in some development situations. It doesn't take much – and the consequences are actually quite big. Whether you are a data controller or a data processor matters for your responsibilities. A very capable English privacy expert, Christopher Millard, wrote an article about this as early as 2019, titled "At this rate, everyone will be a joint controller of personal data!". It is well worth reading, and you can find it here.

The SCHUFA cases – important for more than just credit reporting!

SCHUFA is a German company that operates a credit reference business: it scores how good people's finances are. A bank customer had her loan application rejected because her SCHUFA score was too low. When she requested access to the data SCHUFA had relied on in her case, she was given access to the data, but not to the algorithm that calculated the score, because SCHUFA considered it a trade secret. Many legal questions arise here, including whether Article 22 of the GDPR applies. Should SCHUFA be considered to have made a decision with such a large impact on the individual that the very strict framework of Article 22 must be applied? If so, as is well known, only consent, contract or law can serve as the legal basis for the processing/scoring.


The ECJ concluded in case C-634/21 that even though SCHUFA did not itself make the decision to reject the loan application, the score played a decisive role in the customer not getting a loan. The court held this was sufficient for Article 22 to apply to SCHUFA's own activity. In addition, SCHUFA could fulfil the duty of disclosure towards the data subject more easily than the bank, as the bank had no knowledge of how SCHUFA's automated processes worked.


In Norway, we have a separate law for credit reporting companies, so for them the matter is probably not of great importance. Many other European countries do not have this, and there this decision takes on greater significance. However, it may still have an effect on many other companies in Norway.


A broad interpretation of what constitutes an automated decision under Article 22 expands the scope of the provision. In practice, all companies that prepare and sell analyses that other businesses rely on will now have to take this judgment into account. Presumably, this will affect sectors such as employment, healthcare and insurance – areas where it is not uncommon for companies to use algorithms as a basis for decision-making.


The judgment also emphasizes the importance of businesses providing clear and comprehensible information about the methods of data processing. Here, too, the AI Act will come in with very similar rules in a fairly short time.


Two other SCHUFA cases came at the same time, cases C-26/22 and C-64/22. They concerned storage periods for credit information, but I don't think these are that relevant in Norway as we have separate legislation on this.

Compensation for pain and suffering? It depends!

When the GDPR came, many were unsure what the scope would be for compensation for non-economic loss. We know a lot more about that now, because there are many cases on the issue.


In a fairly recent case from a court in Cologne, it was held that referring to discomfort, anxiety and fear is not sufficient to claim compensation under Article 82 of the GDPR. The case concerned data that went astray after a breach at Facebook in 2019. Even though the claimant actually lost control of the data after it was published on the dark web, this was not sufficient to constitute relevant damage. It was not enough for the claimant to point to a "negative consequence" and an abstract loss of control; a concrete damage should have been identified.


The Court said at the same time that this does not mean that there is a minimum threshold for compensation, but that the damage must at least be determined objectively. Similarly, the court found that the claimant failed to show how he suffered harm from the spam e-mails and SMS he received after the breach, or how he spent time and effort dealing with the loss of control over his data.


Here it is useful to take a look at CJEU case C-300/21 concerning the Austrian Postal Service (Österreichische Post), from 4 May 2023, which is referred to in the judgment above. The Österreichische Post case was one of the first on this issue. The Austrian Postal Service made assumptions about the political affiliation of the population based on socio-demographic criteria. One person complained: he had not consented to this and felt violated. The claimant argued that the postal service's storage of data about his supposed political opinions caused him great upset, loss of confidence and a sense of exposure.


The case ended up before several Austrian courts, which consistently rejected the compensation claim. In the process, an Austrian court asked the ECJ to rule on several questions, and the following was established:


For the right to compensation to arise, three cumulative conditions must be fulfilled: there must have been a breach of the GDPR, there must be material or non-material damage as a result of this breach, and there must be a causal link between the breach and the damage. A mere breach of a GDPR provision is not sufficient to give a right to compensation; the claimant must show that he or she has suffered damage and that the breach in question actually caused it.


The ECJ nevertheless held that member states cannot require that a threshold of "seriousness" be met for a right to compensation for non-material damage to arise. The Court ruled that the term "damage" should be interpreted broadly, and stated that a different result would mean that compensation claims could have different outcomes in different countries, which would not be in line with the GDPR.


Finally, the ECJ pointed out that the GDPR does not contain any rules for determining the amount of compensation to be paid. Hence, national courts in the EU can apply national rules when determining the amount of compensation. This means that the level of compensation will vary. You can find the judgment discussed here.


What was established in the Österreichische Post case has in many ways become the gold standard for how these assessments should be made. It is therefore rare for businesses to be met with such compensation claims.

Here, I will also mention the ECJ's new decision of 14 December 2023 (C-340/21), which contains several other principled clarifications.


After a cyber-attack against the Bulgarian National Tax Service, in which information was leaked, an affected individual complained and claimed compensation. The person claimed to have suffered non-material damage: fear that the personal data could be misused in the future, or that he or she could be pressured, attacked or even kidnapped.

The ECJ held that even if there was a breach of the GDPR, this did not necessarily mean that the data controller had neglected to implement appropriate technical and organisational measures. The ECJ said the EU legislator's intention was to "reduce" the risk of privacy breaches, without requiring the risk to be completely eliminated. This is good news for many. The measures that have been implemented must be assessed concretely in each case.


The ECJ went on to say that even if a GDPR breach is caused by third parties (hackers), the data controller is not exempt from liability. One must look at whether the implemented technical and organisational measures were appropriate. Note, however, that it is the data controller who has to be able to document that adequate measures have been taken.


Next, the ECJ referred to the above-mentioned C-300/21 (the postal service case) and wrote that an individual's fear of possible misuse of his or her personal data can constitute relevant damage. The ECJ added that it is up to the national court to verify whether such fear can be considered well-founded. If you are to pursue such a case, you should perhaps involve psychological expertise that can say something about real fear. You can find the judgment discussed here.

Do you have any questions?