Privacy Corner

by Eva Jarbekk


Those of us who work with privacy have a tribal language. I recently heard myself say, "Because OpenAI is not yet on the DPF list, due to the SCCs you have to create a TIA in addition to the DPIA, but I think you can skip a FRIA." And many in the room understood what I said. But not everyone – even though they work in privacy. It is thought-provoking that privacy has become so complicated. 

I believe it is wise to require that those who will work with privacy in a company have a few years of experience – or get help and training from someone with experience. And here I can share a little personal good news: the basic textbook on privacy that I wrote with Simen Sommerfeldt came out in a new edition last week. When we occasionally get a new employee who will work with privacy, I actually ask them to read it. Many say that it is useful. It explains DPF, TIAs, DPIAs and much more. And it feels great to hold a book that you have written. 

Another thing I've been thinking about lately is the extent to which "case law" is important for privacy. It's actually quite foreign to how we usually interpret rules in Norway. We are used to (often) being able to read a Norwegian legal text and understand how it should be applied. It's usually not like that with the GDPR. The texts are so broadly formulated that, in practice, it is court decisions that best describe how a provision should be understood. This makes it harder for many to grasp how a rule should be applied – which is why those of us who work with this need to stay updated. Below is a selection of the last month's important decisions. Happy reading!

Is the threshold for what constitutes automated decisions being lowered?

There is, at least, increasing attention to what constitutes an automated decision.

A user of platform X (formerly known as Twitter) had their account temporarily restricted (shadowbanned) after posting a message containing a sensitive term. The user took the case to court and requested, among other things, that X explain how its automated decision-making works. X argued that no automated decision-making was involved because the system's parameters were determined by humans. The court rejected the argument and concluded that the decisive factor for what counts as "an automated decision" is whether there is human intervention in the actual decision-making process – and here there was not. This could prove to be a quite practical decision with implications for many.

Link

What constitutes a relevant breach is always important

Here is a new decision from Spain that (again) shows that it doesn't always take much for something to be considered a breach. 

GEOPOST ESPAÑA, S.L. was fined 45,000 euros for a breach of the GDPR. The case began with a complaint from a customer who discovered that the delivery company had left a note containing the individual's personal data on a mailbox at the wrong address – where the note was accessible to several unauthorised persons. The note had been left by the postman, who believed he was at the correct address and that the recipient was not home.

The AEPD concluded that this was a breach of confidentiality under GDPR articles 5 and 32. The company argued that the breach of article 32 subsumed the breach of article 5, but the AEPD found that both articles had been violated. It was also established that the company could not escape responsibility by pointing to an employee's negligence.

This is an example showing that one cannot point to employee negligence – and probably not to breaches of internal routines either – to avoid responsibility.

The case is described here.

Not all deletion requests must be complied with

The Belgian Data Protection Authority has handled a case where the question was whether a data subject had the right to have their personal data deleted. The deletion request was justified on the grounds that the information could have a significant negative impact on the person concerned and could be defamatory. The controller declined the request because the information was necessary for an overriding public interest. The refusal was then appealed to the Belgian Data Protection Authority.

The supervisory authority found that the refusal of the deletion request was justified, as it was considered "reasonably plausible" that the information would be needed in court. As I understand the case, the police had shown interest in the information in connection with a case, and there was reason to believe it would become relevant in legal proceedings. The information's relevance to the potential case was given weight in the decision that it should not be deleted.

The case is mentioned here.

Interested in creating an AI chatbot?

It might be wise to know what you're doing. The Dutch Data Protection Authority has issued a warning about the increased use of AI-powered chatbots following several leaks of sensitive data. The supervisory authority points out that although digital assistants like ChatGPT can save time, they also pose a risk in connection with data protection.

The warning comes after several cases in the Netherlands where sensitive personal data was leaked. In one of the cases, a medical assistant entered private information about patients into a ChatGPT-based program, where the information was then stored on the technology company's servers and potentially used to train the software.

The supervisory authority emphasizes that companies must have clear guidelines for employees on the use of AI-powered chatbots to ensure that privacy is protected. Personally, I believe many companies have such guidelines, but I question whether they are actually adhered to.

Link

Data Protection Officers need not be named

The German Federal Court of Justice (BGH) has issued an important ruling clarifying the requirements for information about data protection officers. The case had an extensive set of facts, with several questions revolving around what the data subject should be given access to.

The individual wanted access to all the information a bank had about them: all notes, assessments, algorithms used by the bank, all data processors, and names of all persons and institutions that had access to the data.

Parts of the case ended up before the German Federal Court of Justice, which concluded, among other things, that the bank did not have to disclose the name of its data protection officer. The court held that under GDPR Article 13(1)(b), it is sufficient to provide the data protection officer's contact information, without necessarily including the name.

Link

Loyalty programs are becoming more common – and aren't always easy to get right

The Greek supermarket chain Alfa Vita has a loyalty card program called "AB Plus" that collects personal data from its customers. NOYB claims that the program does not comply with basic GDPR rules and refers to a case where a customer requested access to the personal data the company had registered about her but only received insight into a few details. For profiling purposes, AB processes customers' purchasing habits, the frequency of their visits to AB stores, their use of offers communicated to them, their home address, and the total cost of their purchases. The customer who requested access received only a list of purchases made and her contact information. It is quite clear that the company holds more information than that.

In addition, the supermarket chain requires customers to upgrade to "AB Plus Unique" and consent to data sharing with third parties in order to find out how much money they have saved in the program. This seems not only challenging under the GDPR but also like quite poor customer service.

In light of this, NOYB has filed a complaint with the Greek Data Protection Authority. I guess the supermarket chain will clean this up quite quickly.

NOYB writes about the case here.

Consent to facial recognition at gym was valid – if...

The Danish Data Protection Agency (DDPA) recently addressed a complaint regarding the use of facial recognition to gain entry to a gym. A practical form of access control.

A user complained to the DDPA, claiming that the consent to facial recognition was invalid because no alternatives existed. It turned out that alternatives did exist, but they had not been communicated.

Users of the gym who did not wish to consent to facial recognition could be let in by the reception during staffed hours, and outside of staffed hours, they could contact a 24-hour support service that could either remotely open the door or generate a code for the door.

The DDPA concluded that the gym could obtain valid consent for the use of facial recognition, provided that the consent was informed and properly obtained. This means that members must receive clear and explicit information on how their biometric data will be used, and that they voluntarily give their consent without pressure.

The gym received criticism for informing the complainant that there were no alternatives to facial recognition when consent was attempted to be obtained, as this meant that any consent was obtained under pressure.

The gym was also required to offer an access alternative for those who did not wish to use the technology.

If you are considering using facial recognition, it is wise to read the decision, which is available here.

Information shared as a private individual or as a data controller?

The Italian Data Protection Authority recently dealt with a case where a mayor was accused of illegally sharing information.

A member of parliament had received several speeding tickets from a municipality and considered them "unfair." The member raised the issue in a parliamentary session and discussed it in the media. A newspaper interviewed the mayor of the municipality that had issued the fines, and the mayor confirmed the number of tickets and that the parliament member's driver's license had not been revoked. The parliament member did not appreciate this and complained to the data protection authority, arguing that the mayor's disclosure of the information was illegal.

The mayor claimed that the information was shared as a private individual and not as a representative of the municipality, and that the parliament member had also made the information public himself, so no new information was really involved.

However, the supervisory authority concluded that the mayor had acted as a representative of the municipality, which therefore had to be considered the data controller. It stated that Italian privacy law only allows public administrators to share information with third parties when the sharing is authorized by law. It was therefore irrelevant whether the information had already been published. The authority found that several GDPR articles had been breached.

A somewhat unusual case, in many ways, but it could become important for those representing the public sector.

Link

Another huge fine

The Dutch Data Protection Authority has recently imposed a fine of 290 million euros on Uber, after finding that Uber had transferred and stored sensitive data about European taxi drivers in the USA without complying with the GDPR's requirements for securing such transfers. The personal data about the drivers included payment information, identification information, and in some cases also information about criminal records and health data. The correct transfer tools were not used for a period of two years.

The investigation was initiated after more than 170 French drivers submitted complaints to the French human rights organization Ligue des droits de l’Homme, which then contacted the French Data Protection Authority. Because Uber's headquarters are in the Netherlands, it was the Dutch Data Protection Authority that imposed the sanction, but the investigation was carried out in close cooperation with the French Data Protection Authority and in coordination with other supervisory authorities in Europe.

Uber has already announced that it will appeal. On the one hand, it is thought-provoking that the GDPR repeatedly leads to very large fines for activities that have already concluded. On the other hand, the deterrent effect would of course be diminished if violations committed by large entities were not sanctioned significantly.

Link

Recordings must be deleted when they should be deleted

The Danish Data Protection Agency (DDPA) has issued serious criticism of a nightclub for failing to provide access to video surveillance recordings and for deleting them. The case began when a citizen complained that the nightclub did not give him access to the recordings in which he appeared. The nightclub denied access because it believed access could compromise security and crime prevention at the venue, and because it could not anonymize other individuals in the recordings without also anonymizing the complainant, as people were standing close together. Before the case was resolved, the nightclub had also deleted the video due to human error.

The DDPA concluded that the nightclub could not refuse access by referring to public interests without backing from the police. The nightclub was also criticized for deleting the recordings before the DDPA had finished processing the case, as this violated the principles of lawfulness, fairness, and transparency.

Link

Discussions about the legal basis for AI training

No one has missed the discussions about Meta's AI training earlier this year. It was therefore only to be expected that attention would also turn to other actors who have actually done what Meta wanted to do.

The Irish Data Protection Commission (DPC) has taken legal action against X (Twitter) due to their practice of using personal data from EU users to train AI models. The DPC claims that the practice breaches GDPR.

According to the DPC, X has neither provided sufficient information about how the data is used nor obtained the necessary consent.

Despite X implementing some remedial measures, such as a mechanism to opt out of such data processing, the DPC argues that many users still have not received adequate protection. Therefore, the DPC has asked the court to order X to stop, limit, or prohibit the processing of personal data for developing, training or improving AI systems. There is reason to believe that this will be an ongoing case for a long time to come.

Link

In connection with the case in Ireland, the privacy organization NOYB has also filed several complaints against X in Austria, Belgium, France, Greece, Italy, the Netherlands, Spain, and Portugal. The complaints, as in Ireland, allege that X illegally uses personal data from over 60 million users to train its AI technologies. NOYB believes consent should have been obtained and hopes that the involvement of several European data protection authorities will increase the pressure on the Irish DPC and X to comply with EU legislation.

Link

Employees on their way out? The email account should be deleted

The Belgian Data Protection Authority (APD/GBA) has imposed a significant fine on a company for several serious breaches of the GDPR regulations.

An employee of a company responsible for managing several residential properties was dismissed. After the dismissal, the employer kept the employee's email address active, with the justification that it was necessary to ensure a seamless transfer of work tasks to a new employee. The employer referred to its legitimate interest. They neither complied with nor responded to the employee's request to delete the account.

The APD/GBA determined that the former employee's email address constitutes personal data, that the account must be closed when the individual leaves the job, and that an automatic reply message should be set up. It emphasized that exceptions can be made for certain roles, but in this case the inbox had remained active for over five months, which violated the GDPR. The employer could not rely on legitimate interest as a legal basis and also breached the GDPR by not responding to the deletion request.

In Norway, we have special rules about deleting email accounts when someone leaves, but it is interesting to see that supervisory authorities come to roughly the same conclusion without such special rules. This can be important for companies that operate in many different European countries. 

Link

We may not be completely done with transfer issues just yet

The French data protection authority (CNIL) has expressed concern because the current European certification scheme for cloud services (EUCS) no longer allows providers to demonstrate that they protect stored data from access by foreign authorities.

According to CNIL, this leads to an increased risk that data stored by cloud service providers with non-European parent companies could be disclosed to foreign authorities. It is emphasized that stricter security measures are necessary to maintain a high level of data protection for European citizens. They recommend introducing "immunity criteria" in the EUCS certification, to ensure that sensitive data is not subjected to legal pressure from non-European countries. Now, GDPR requirements will always apply in the background as a safety net, and if a provider claims to be "compliant" with other cloud service requirements, one must check that this actually aligns with the requirements of the GDPR.

Link 1

Link 2

GDPR is unlikely to be amended anytime soon, even though the regulation is not perfect

GDPR is not perfect. Far from it, in fact, and many agree on that. Nevertheless, the European Commission has decided not to amend the GDPR, but rather to focus on strengthening the enforcement of the regulations. The decision follows a report from July 2024 that described significant challenges with the enforcement of GDPR. The report highlighted that many member states struggle to implement and enforce the rules effectively, leading to inconsistent protection of personal data across the EU. No surprise there.

To strengthen the enforcement of the rules, it is proposed to increase the resources of national data protection authorities and improve cooperation between them. The Commission will also focus on improving training and awareness of GDPR among citizens and businesses. This includes campaigns to increase understanding of privacy rights and obligations, as well as guidance for businesses on how to comply with the regulations.

The Commission emphasizes that although GDPR has been effective in setting standards for data protection globally, there is still a need for improvement and adaptation to meet new challenges in the digital age.

The report is somewhat interesting reading, even though the issues are quite high-level. But for those who like the big picture, it can be found here.

Do you have any questions?