Newsletter

Privacy Corner

by Eva Jarbekk


Since the last newsletter, I have been in court with a case of great fundamental importance – the Grindr case. The case is partly about how consent is obtained, but also about whether the mere fact that someone has an account on the Grindr app constitutes a special category of personal data. The latter question is probably the most interesting. The Grindr app offers around 20 different categories of sexual orientation, and even if a user chooses one of them, that choice is not shared with ad partners. The Privacy Appeals Board's decision is very unclear as to why and how simply having an account there is sensitive data as defined in the GDPR. I will not go through the arguments here, but if the judgment upholds the view that this is a special category of personal data, it will be of great importance – not just for Grindr, but also for many other businesses aimed at minorities. It will become more difficult to conduct business aimed at minorities, both because all activities must then be based solely on consent and because security requirements will make everything more expensive. This case reflects something that is becoming increasingly clear: privacy affects business models and interferes with businesses' ability to compete. I think we will see more trials in the time to come. These cases have become too important to leave administrative decisions untested. One of the issues that will be tried is how much weight can be attached to guidelines from the EDPB, given that they are not prepared with the same thoroughness as ordinary legislation.


Otherwise there is, of course, a lot going on. Once again, there is increased focus on compliance for the big data providers – see the story about Microsoft below. We need to pay more and more attention to cookies and marketing. Joint data controllership is a recurring theme at the moment – note that it also extends to responsibility for incidents and breaches.

A case of fundamental importance – is Microsoft compliant?

Many will remember that the Dutch Data Protection Authority has investigated whether Microsoft provides a GDPR-compliant service. Many will also recall that the EDPS has conducted similar investigations into the Microsoft services used by the Commission. Large parts of these investigations are publicly available, and one of the main findings is that European authorities consider that Microsoft assumes too broad a role with respect to some of the user data generated during use – often referred to as metadata. The EDPS writes the following in a recent press release:


The EDPS has found that the Commission has infringed several provisions of Regulation (EU) 2018/1725, the EU’s data protection law for EU institutions, bodies, offices and agencies (EUIs), including those on transfers of personal data outside the EU/European Economic Area (EEA). In particular, the Commission has failed to provide appropriate safeguards to ensure that personal data transferred outside the EU/EEA are afforded an essentially equivalent level of protection as guaranteed in the EU/EEA. Furthermore, in its contract with Microsoft, the Commission did not sufficiently specify what types of personal data are to be collected and for which explicit and specified purposes when using Microsoft 365. The Commission’s infringements as data controller also relate to data processing, including transfers of personal data, carried out on its behalf.


These are big words. But we have heard similar statements from authorities for many years without any major changes really taking place. That now looks set to change. The EDPS has ordered the Commission to rectify these matters by 9 December 2024. If it does not, the Commission must "suspend all data flows resulting from its use of Microsoft 365 to Microsoft [...]".


There is good reason to believe that Microsoft wants to keep the Commission as a customer – and it is conceivable that any changes the Commission manages to obtain will benefit other customers as well. In any case, there is every reason to pay more attention to this going forward.


The press release from the EDPS was published some time before Easter and is available here.


Here we must also look at what the Danish Data Protection Authority is doing. In the wake of the so-called Chromebook case (transfer of personal data to the US via Google), it has received questions from the Region of Southern Denmark about its planned migration to Microsoft 365. It is hardly surprising that the Authority sees major similarities between these cases. At an overall level, it is fair to say that supervisory authorities often use big words in these cases, but that so far it has been relatively rare for the strictest interpretations of the law to actually be enforced, even though they could be.


A statement from the Danish Data Protection Authority can be read here.

More about online stores and customer data – new case from Finland

The Finnish Data Protection Authority has issued its largest fine to date. Verkkokauppa.com, a major online store, received a fine of EUR 856,000 in March – close to ten million Norwegian kroner. The reason was, among other things, that it had not set any retention period for customer data. The company argued that it was necessary to retain the data for a long time because some products have a long lifespan and the information was useful if a customer later wanted to make a complaint. It also argued that there was a parallel between the duration of the customer relationship and the retention period, and claimed that it deleted data associated with inactive customers – defined as customers who had not logged in for six years. None of this was accepted. The authority is clear that retention periods must be defined.
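
The practical takeaway is that a retention period must not only be defined but also enforced in practice. As a minimal, hypothetical sketch (the five-year period and field names are illustrative, not taken from the decision), a periodic clean-up job could flag records that fall outside the defined period:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention rule: the period itself must be justified and
# documented by the controller; five years is purely illustrative.
RETENTION_PERIOD = timedelta(days=5 * 365)

def records_due_for_deletion(customers, now=None):
    """Yield customer records whose last activity is older than the retention period.

    `customers` is assumed to be an iterable of dicts with a timezone-aware
    `last_activity` datetime field; adapt to your own data model.
    """
    now = now or datetime.now(timezone.utc)
    for customer in customers:
        if now - customer["last_activity"] > RETENTION_PERIOD:
            yield customer

# Example usage with illustrative data:
customers = [
    {"id": 1, "last_activity": datetime(2017, 1, 15, tzinfo=timezone.utc)},
    {"id": 2, "last_activity": datetime(2024, 3, 1, tzinfo=timezone.utc)},
]
print([c["id"] for c in records_due_for_deletion(customers)])  # [1]
```

The exact period will depend on the justification the controller documents; the point is that some defined period exists and is routinely applied.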


In addition, the company had made it mandatory to create a customer account to shop with them. The authority held, not surprisingly, that it is not necessary to create an account to make a purchase.


Like most major privacy cases, it has been appealed and is not final. However, it is worth noting what the authority has stated and adjusting one's own practices if they are not in line with this. I think it is likely that the authority's opinion on this matter will stand.


The case is referenced here in English.

Be careful what data your customers can see – CaixaBank fined EUR 5 000 000

CaixaBank has been fined EUR 5 million for a breach of privacy rules after one bank customer was able to see transfers made by another customer. The fine is disputed, and CaixaBank has signalled that it will challenge the sanction as disproportionate, since this was an exceptional circumstance. From what I have found, the breach involved a single instance in which a customer could see another customer's data relating to a money transfer – the sender and recipient as well as some other details of the transfer. With this in mind, it is understandable that they consider the sanction severe.


The decision has not yet been published in English, but is discussed here.

Report breaches in time – and cooperate with the Data Protection Authority

On 24 April 2023, a company notified the Austrian Data Protection Authority of a personal data breach. On 6 March, it had been hit by a ransomware attack in which data was encrypted. It was unclear whether any data had been stolen, but salary information for 55 employees was affected. Steps were taken to remedy the breach, including disconnecting the entire network from the internet, reinstalling and securing systems, and securely erasing encrypted hard drives. The Data Protection Authority initiated investigations but received inadequate responses from the company: at times it did not respond at all, and at times it simply repeated the initial breach notification. In the end, the company was fined EUR 5,900. For context, the GDPR requires a breach to be notified to the supervisory authority without undue delay and, where feasible, within 72 hours of becoming aware of it. I do not really believe that many Norwegian companies would behave this way, but there is increased focus on reporting breaches in a timely manner and cooperating with the Data Protection Authority.


Link

And then there is the matter of cookies

This case is not entirely new, but from late last year. The French Data Protection Authority (CNIL) fined Yahoo no less than EUR 10 000 000 for not having cookie consents in place. The CNIL also emphasised the fact that if a user wanted to withdraw their cookie consent, they would lose access to their email. Again, it is interesting to note that the one-stop-shop mechanism in the GDPR does not apply here, since the supervisory authorities refer to national electronic communications rules. So far, these issues have not been in focus in Norway, but we will probably get new rules on cookies shortly, and then there will be every reason to be careful with how cookies and trackers are used.

The case is discussed here and here.

More about cookies from the Danish Data Protection Authority

In 2023, the Danish Data Protection Authority considered two cases concerning GulogGratis' and JFM's use of cookies to prepare statistics. The Authority held that the preparation of statistics was not a necessary part of the services the companies offer. The companies have since asked for the cases to be reopened, and as I understand it, they argue that they need statistics in order to tell advertisers whether it is worthwhile to buy banner ads on their websites. The Data Protection Authority appears to regard this as a marketing purpose, which is something other than pure statistics. What constitutes statistics, and for which purposes statistics may be used, is not always a simple matter. Either way, we can expect more focus on this from many data protection authorities in the time to come.


The cases are discussed here.

Answer when (if) the Data Protection Authority asks!

It is well known that in 2023, the EDPB conducted a coordinated inquiry into the role of data protection officers. Greece also participated and sent questionnaires to 31 data protection officers in the public sector, one of whom did not respond. The Greek Data Protection Authority then resent the questionnaire, but the data protection officer still did not respond. After the deadline had expired, the officer attempted to respond, but by then the link had been deactivated by the Data Protection Authority. The officer contacted the Authority, which reactivated the link, but once again the response did not arrive in time. The officer claimed that the late response was due to technical problems with the website hosting the questionnaire, which prevented submission even after several attempts. The Authority did not accept this and imposed a fine of EUR 5,000 on the company.

New coordinated enforcement action from the EU

After the EDPB looked into the role of data protection officers in 2022–2023, the focus in 2024 is on how businesses safeguard data subjects' right of access.


Personally, I believe that this varies quite a bit. At the same time, it is fundamental to privacy that this right is properly safeguarded: if we cannot ask, and get answers about, how our personal data is actually used, it is difficult to enforce any rights at all. And if you work in an organisation that receives such requests – make sure to answer in time.


Read more about this issue here.

BYOD and two-factor authentication

From Spain comes a somewhat unusual case in which an employer could not require employees' private mobile phone numbers in order to implement two-factor authentication when working from home. The case appears to turn on the fact that the employer had not made mobile phones available to its employees, but instead relied on the use of private phones. In Norway, most employers provide mobile phones, so the issue is unlikely to come to a head here. But it is interesting that such strict limits are set on what an employer can ask for.
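
For technical context only: two-factor authentication does not, in itself, require collecting a private phone number. A standard time-based one-time password (TOTP, RFC 6238) needs nothing more than a shared secret held in an authenticator app or hardware token. A minimal sketch, using only the Python standard library and an illustrative secret:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # 30-second time step
    msg = struct.pack(">Q", counter)                 # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Illustrative secret only; in practice it is generated per user and
# provisioned to an authenticator app, e.g. via a QR code.
print(totp("JBSWY3DPEHPK3PXP"))
```

The legal question in the Spanish case is therefore not whether two-factor authentication is possible, but what the employer may demand from employees in order to achieve it.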


The case is discussed here.

More about advertising, TCF and joint data controllership

Many of you have heard me (and others) talk about the IAB's Transparency and Consent Framework (TCF) over the past two years. The framework is highly relevant to digital marketing and especially important for publishers (newspapers and media houses) and advertising platforms. Last month, the ECJ issued a new decision regarding IAB Europe and the TCF. The decision is complex, but it establishes, among other things, that the consent string – the piece of information recording that a consumer has consented to a specific purpose – is personal data. This is perhaps not very surprising. However, the judgment also states that the organisation IAB Europe is a joint data controller with the companies using the framework. This means that specific agreements must be entered into, specifying which actor is responsible for what.
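
To make the consent string concrete: conceptually, it is a compact record of which purposes and vendors a user has accepted, tied to an identifier such as a cookie ID. The sketch below is a deliberately simplified, hypothetical structure – not the actual IAB TC String encoding – but it illustrates why such a record, once linked to an identifier, says something about an identifiable person:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Simplified, hypothetical stand-in for a TC String (not the real IAB encoding)."""
    device_id: str                                                # cookie or device identifier the record is tied to
    consented_purposes: set[int] = field(default_factory=set)     # e.g. {1, 3, 4}
    consented_vendors: set[int] = field(default_factory=set)      # vendor IDs

    def allows(self, purpose: int, vendor: int) -> bool:
        """May this vendor process data for this purpose?"""
        return purpose in self.consented_purposes and vendor in self.consented_vendors

record = ConsentRecord("cookie-1234", consented_purposes={1, 3}, consented_vendors={42})
print(record.allows(purpose=1, vendor=42))   # True
print(record.allows(purpose=4, vendor=42))   # False
```

Because such a record is created and shared across the advertising chain according to rules set by IAB Europe, responsibility for it also becomes a shared question – which is presumably why joint controllership arises.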


The decision can be found here and is discussed on multiple sites, including here (a long article) and here (a shorter article).

More on joint data controllership – now for breaches

In a serious case from Slovakia concerning the murder of a journalist and his fiancée, extensive investigations were carried out. Among other things, several mobile phones belonging to a suspect were examined. The investigations were conducted partly by Europol and partly by Slovak police investigators. The messages on the phones had been sent via encrypted services, but Europol managed to decrypt (parts of) the material, which it passed on to the Slovak investigators.


Later, media outlets published content from the suspect's messages, including intimate communication between him and his girlfriend. The suspect complained to Europol that the information had become public and claimed EUR 100 000 in compensation. He considered himself entitled to half of this amount because he had wrongfully been claimed to be on a "mafia list", and to the other half because intimate communication with his girlfriend had become public. Europol disputed that it had any responsibility and argued that it was not a joint controller with Slovakia.


The European Court of Justice ruled that it is not necessary to decide which of the two entities – Europol or the Member State – caused the breach. For joint liability to arise, the individual affected need only demonstrate that, in connection with the cooperation between Europol and the Member State concerned, unlawful data processing has been carried out which caused him or her damage. This statement is, in my opinion, important: it means that several parties risk liability for damage that occurs at another party.


Regarding the leak of the so-called "mafia list", the European Court of Justice found that the data subject had failed to show that the "mafia lists" on which his name had allegedly appeared had been drawn up and kept by Europol. The claim for compensation for having been placed on the "mafia list" was therefore not considered further.


Europol's argument that it had implemented appropriate technical and organisational measures to protect personal data was not accepted. The Court noted that data of such an intimate nature calls for particularly strict protection, and since unauthorised access had in fact taken place, this constituted a sufficiently serious breach. Without having read the original text on this point, this appears to be a rather strict interpretation.


In addition, the Court ruled that the European Union may incur liability as a result of the publication of the data subject's intimate communication.


The result was that Europol and the Slovak Republic are jointly liable for the unlawful data processing that caused the data subject to suffer non-material damage.


As for the amount, the Court emphasised that only transcripts of the communication had been published, not photos. For this, the person in question was awarded EUR 2,000 in compensation. The amount is not high.


The case is discussed here.

More from the ECJ

In March, the ECJ ruled in the Endemol Shine case (case no. C-740/22) concerning whether oral disclosure of information could be considered processing of personal data.


The case originated in an oral request from Endemol Shine to a Finnish court for information about criminal proceedings against a person taking part in a competition the company organised. The court refused to disclose the information, holding that it had no legal basis for searching for that information in its systems. Endemol Shine appealed the decision. The court of appeal asked the ECJ (1) whether oral disclosure of personal data should be considered data processing, and (2) whether data relating to the criminal conviction of a natural person may be disclosed orally to anyone for the purpose of securing public access to official documents.


The ECJ was of the opinion that the term "processing" under the GDPR should be interpreted broadly and includes oral disclosure. They wrote that "the possibility of circumventing the application of [the GDPR] by disclosing personal data orally rather than in writing would be manifestly incompatible with [its objectives]".


At the same time, the Court says that oral disclosure of personal data can still fall outside the scope of the GDPR when the processing is manual and the data does not form part of a filing system. However, the CJEU held that this does not apply to the Endemol Shine case, as the personal data requested by the company is held in a court database, which is a filing system.


With regard to the relationship between the GDPR and the right of public access to official documents on criminal convictions, the CJEU considered that it is not compatible with the GDPR to disclose such data to anyone who requests it without requiring them to establish a specific interest.


The case is discussed here.

US decision of European significance

The FTC (Federal Trade Commission) primarily makes decisions directed at US companies. However, they have now reached a settlement with a company with connections to the Czech Republic.


On 22 February 2024, the FTC announced that Avast Limited will pay USD 16.5 million and will be prohibited from selling or licensing personal data for advertising purposes. This is the result of allegations that Avast and its subsidiaries sold such data to third parties after having promised that their products would protect consumers from online tracking.


Through their Czech subsidiary, they collected consumers' browsing history, stored it indefinitely, and sold it without adequate notice and without consent. The FTC also claimed that the company misled users by claiming that the software would protect consumer privacy by blocking third-party tracking, and that all sharing would be in "anonymous and aggregated form". The FTC claimed that the company had sold data to more than 100 third parties.


Going forward, Avast will base any sale of such data on consent, inform consumers whose data has been sold, and adopt a comprehensive privacy program.


The case is discussed here.

Italy's largest fine to date

The Italian data protection authority (Garante) has fined the energy company Enel Energia SpA EUR 79 000 000 for illegal advertising.


Enel had obtained lists of potential customers from four other companies, including their addresses, phone numbers, municipalities of residence and current energy suppliers.


Garante emphasised, among other things, that Enel Energia violated the GDPR by not carrying out an adequate risk assessment of its customer relationship management (CRM) system, and by failing to ensure that the other companies procuring new contracts for it complied with the rules. Among other things, the content of the required data processor agreements was inadequate.


Much as the FTC often requires, Enel Energia has been ordered to inform affected individuals about the outcome of the case and to implement a number of other measures to improve privacy.


Did I hear someone say that they are focusing on vendor audits? That would be wise.


The case is discussed on many sites, including here.

AI and voice recognition

The use of audio recordings and AI can create many unpleasant situations. There have already been reports of "fake calls" in which someone has been tricked into believing that a family member or a manager is asking for a money transfer. This will affect how we deal with audio recordings in the future. A small French company has built its business idea around anonymising, or possibly watermarking, voice recordings.


We are going to see a lot of new technology going forward. Read more about AI and the use of voices here.

Finally – do you want your house and the number plate of the car parked outside to be easily recognisable on Google Maps?

It is always good fun to roam around online before writing these newsletters. I often find exciting legal issues, but every once in a while something a little different turns up as well. Google Maps is convenient, but you may not want everyone to be able to see exactly what was outside your house when Google drove by – or who was standing in the doorway. Here is a nice little article on how to get Google to blur the image that is open for everyone to see. Perhaps it would be a good idea to check what Google Maps shows of the surroundings of your house?


Read how to proceed here.

Do you have any questions?