Newsletter

Privacy Corner

by Eva Jarbekk



It's always interesting to write newsletters. This time I'm surprised that there are so many stories about biometrics. There seems to be a trend towards allowing biometrics for various control measures and authentication functions. We have long said that it's difficult to use biometrics for control, but I believe that the scope for this is opening up. There are several stories on this topic below.

In Norway, there is also much debate about the legal basis for training AI. Much of this relates to the fact that Meta wants to use legitimate interest to train Llama, its AI model. Llama is used by many others besides Meta as well, including Norwegian public enterprises. More guidance will come from the data protection authorities and the EDPB on which legal basis is correct, but the EDPB's task force on ChatGPT does leave room for using legitimate interest. Personally, I don't think it's a good idea to have consent as the only possible criterion, but if legitimate interest is used, there must be a simple opt-out option. I will return to this topic later.

I wish you a wonderful summer – and don't miss the last newsletter story: Employers must make sure that their data protection officer is not overworked. Have a relaxing holiday! 

Facial recognition on the streets of Germany (!) and at airports

In Germany, they have started testing a biometric surveillance system (facial recognition) that will make it easier to identify and apprehend suspects. This has caused major protests. The protests are not surprising, but what is surprising is that this initiative is being taken in Germany, a country that has traditionally had a strong focus on privacy. This has led to a public debate about the fine line between citizens' security and their individual freedoms.

German police wish to use high-resolution cameras and real-time facial recognition. It has recently emerged that the German federal criminal police (BKA) have used images of approximately three million people to test facial recognition systems. The BKA reportedly extracted almost five million facial images from the police's central information system in 2019 to evaluate the accuracy of different manufacturers' solutions. 

Internal communication between the BKA and the federal data protection authority shows that they have sought to base this on legal grounds related to scientific research. However, doubts have been raised as to whether this is legal, and there have been demands for clearer legal regulation of how security authorities may test software.

The case is discussed here.

European Data Protection Board (EDPB) guidelines on facial recognition at airports

The EDPB has expressed concerns regarding the use of facial recognition technology at airports. More specifically, this applies to the use of biometrics in certain types of situations such as security checks, baggage drop, boarding, and access to passenger areas. They recognize that implementing such methods will greatly contribute to airport security. However, once again, such technology brings the tension between security and individuals' privacy and freedoms to a head.

As the EDPB often does, they outline different scenarios and comment on them. They make practical assessments of the types of processing that can be considered legal, and place much emphasis on whether the individuals have control over their own data or not. It is considered less intrusive if the airport does not store the biometric data about the individual. They also describe some intermediate solutions where the data is stored centrally, but the individual controls a key to it. When the data is stored centrally, they emphasise a short storage time, for example 48 hours. This is as expected. Interestingly, the example they are most critical of is the one where the central storage is cloud-based.
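To make the "individual controls a key" scenario more concrete, here is a minimal sketch of how such a setup could be pictured (my own illustration, not taken from the guidelines; the use of Python's cryptography package and the boarding-pass QR code are assumptions): the airport stores only an encrypted biometric template, and the ciphertext is useless until the traveller presents the key.

```python
from cryptography.fernet import Fernet

# The traveller generates and keeps the key (e.g. encoded in a QR code
# on their boarding pass); the airport never stores it.
traveller_key = Fernet.generate_key()

# Enrolment: the airport encrypts the biometric template and stores
# only the ciphertext centrally.
template = b"<biometric face template bytes>"  # placeholder for real template data
stored_ciphertext = Fernet(traveller_key).encrypt(template)

# At the gate: the traveller presents the key, the template is decrypted
# in memory, matched against the live capture, and then discarded.
decrypted = Fernet(traveller_key).decrypt(stored_ciphertext)
assert decrypted == template
```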

Although there are good reasons for using such technology, it must constantly be balanced against individuals' right to privacy and their freedoms. However, it seems as though there is an increasing opening for these types of systems.

You can find the guidelines here.

More biometric surveillance also in Sweden

In Sweden, a committee has examined the possibilities for the police to use automatic facial recognition technology, as well as whether the existing use of technology for automatic licence plate registration can be expanded and made more efficient. They have also looked at the police's possibilities to gain access to camera surveillance material from others.

The proposals in their report include:

  • The police should be able to conduct camera surveillance on roads, streets, squares and other public places that are used for motorised traffic without having to document the balance of interests before the surveillance starts. Material gathered may only be processed for the purpose of preventing, obstructing, detecting, investigating, or prosecuting criminal offences punishable by three years' imprisonment or more, and may be stored for six months after it has been collected.
  • Technology for real-time biometric remote identification may be used in public places for crime-fighting purposes to the extent permitted by the EU's AI Regulation. However, the technology may only be used in individual cases, and only with permission from the authorities.
  • A clarifying rule should be introduced for what applies when the police request access to surveillance material from others. Material may be disclosed when it is necessary to investigate a committed criminal offence punishable by imprisonment, or to prevent, obstruct or detect criminal activity involving criminal offences punishable by imprisonment.
  • The Swedish Transport Agency (corresponding to the Norwegian Public Roads Administration) should disclose material from road, toll, and civil infrastructure cameras to the police if the data is necessary in an emergency to prevent or investigate a criminal offence punishable by three years' imprisonment or more, or a criminal attempt or preparation, or to detect or obstruct such crime.
  • Personal data collected through camera surveillance may be made available internally in the police to specified officers who need the data to maintain public peace, order and security, or to prevent, obstruct, detect, investigate or prosecute criminal offences punishable by three years' imprisonment or more. The data may not be stored for longer than six months after it has been collected.

The report proposes that the rules should come into force on 1 January 2025, and – from what I hear – the frustration over gang-related crime in Sweden is so great that this is actually expected to become a reality.

Ryanair faces complaints for requiring biometric identification

The travel organization EU Travel Tech has complained about Ryanair to the French and Belgian data protection authorities. EU Travel Tech's members include well-known online travel agencies such as Airbnb, Booking.com and Expedia Group, as well as Amadeus. The complaint challenges Ryanair's new rules for biometric customer identification, introduced in December 2023. Customers without a Ryanair membership must submit photos of themselves, their signature or their passport to book and check in online. This applies even if the booking is made through other online travel agencies (OTAs).

EU Travel Tech believes that this procedure is in violation of the GDPR. They are asking for an urgent investigation into Ryanair's practices under Article 66 of the regulation and demanding temporary measures to suspend the biometric verification process.

In many ways, we see that privacy rules are being used in a purely commercial context. It is reasonable to assume that Ryanair, with this more onerous registration process, is making it more difficult for anyone other than its own registered customers.

Ever more cases about what can be required for identification

In Spain, the data protection authority has fined a data controller EUR 20,000 for overly strict identification requirements.

The company Mouro Producciones organises concerts and events. They required parents or guardians to present copies of both their own and their children's ID cards for the children to be allowed access to the events. It was considered unlawful to require an ID copy and not to provide information about how long the copy would be stored.

The company acknowledged responsibility, and thus received a reduced fine according to Spanish rules. The fine was set at EUR 12,000.

Read more about the case here.

Cloud services under scrutiny (again) in France

The public interest group "Plateforme des données de santé" in France wanted to create a database of health data on Microsoft servers. They applied for permission from the CNIL (the French data protection authority), and were granted this on 21 December 2023.

Several companies and organizations challenged this permission, and appealed the case all the way up to the highest French administrative court (Conseil d'Etat) in March 2024. They argued that the permission entailed too great a risk that US authorities could misuse the personal data in the database that Microsoft controlled, and wanted to stop the project by invoking a special French provision on emergency decisions in situations of great and imminent danger.

The court did not agree that there was great danger. They seem to emphasise that no alternative solution without a similar transfer risk had been found. Furthermore, they emphasised that this was pseudonymised health data, where social security number and date of birth were not included, and that the data was to be stored on servers in France. The court emphasised that the complainants had not provided evidence that US authorities could use the CLOUD Act to gain access to the data. They acknowledged that such access could not be ruled out, but found that it seemed hypothetical at the time of the assessment. Based on this, they concluded that there was no great and imminent danger to privacy, while the purpose of the project – to promote research in the field of health – weighed heavily.

This decision will probably receive much attention in relation to third-country transfers and will definitely be important in the time to come.

The case is discussed here.

The Danish data protection authority is (as usual) on the offensive

This case involves a list of things that are often done wrong. At the same time, it illustrates that recent case law on processing responsibility and legal basis for digital advertising is actually enforced. 

The Danish data protection authority received a complaint from a citizen who complained that Telmore A/S disclosed his personal data to Meta Ireland. The complaint contained four points:

  1. Telmore's disclosure of the complainant's personal data to Meta Ireland.
  2. Telmore's transfer of the complainant's personal data to the United States.
  3. Telmore's failure to comply with its duty of disclosure.
  4. Telmore's failure to respond to the complainant's request for access.

The person's email address was used as a key into Facebook's "Custom Audience". By sharing the email address, Telmore wanted to avoid the person in question receiving targeted advertising on Facebook. In other words, this was not done to send advertising – but to prevent the person in question from receiving advertising. 

The company claimed that they did this based on their legitimate interest. The email address was hashed. Telmore considered that Meta Ireland was acting as data processor for the company. Telmore also argued that there had been no transfer to the United States, because the contracting party was Meta Ireland. 

Lesson 1: The data protection authority did not attach importance to the fact that the data had been hashed. Hashed personal data is considered personal data and must follow the rules of the GDPR.
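To see why hashing does not take the data outside the GDPR, here is a minimal sketch (my own illustration, not taken from the decision; the SHA-256 normalisation shown is an assumption about how such audience matching typically works): because a hash is deterministic, anyone who already holds the email address can compute the same digest and thereby single out the individual.

```python
import hashlib

def hash_email(email: str) -> str:
    # Normalise (trim, lower-case) and hash the address, roughly the way
    # audience-matching tools expect identifiers to be prepared.
    normalised = email.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# The same address always produces the same digest, so any party that
# already holds the address can compute the digest and match the person.
print(hash_email("Jane.Doe@example.com"))
print(hash_email("  jane.doe@example.com "))  # identical digest

# Hashing is therefore pseudonymisation, not anonymisation: the digest
# still singles out one individual and remains personal data.
```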

Lesson 2: Legitimate interest is difficult to use as a processing basis for advertising and for sharing personal data with third parties. To do so, you must, among other things, make sure that information is provided. The processing was not described in the company's privacy policy.

The data protection authority nevertheless acknowledged that Telmore could have a legitimate interest in conducting marketing and made some interesting assessments. They write:

The data protection authority is of the opinion that the data subject generally has a reasonable expectation that personal data provided as part of a customer relationship will not be disclosed to other data controllers for use in direct marketing, including to social media providers.

This does not mean that a company is obliged to obtain the data subject's consent. The processing of the data in question may be based on a balancing of interests. However, this requires that the company takes additional measures to ensure that the data subject is made specifically aware that the data will be used for direct marketing purposes, that the data used for direct marketing purposes is disclosed to social media providers, what data is involved, and that the data subject always has the opportunity to object to the processing in question.

After an overall assessment, the data protection authority finds that disclosure and processing of the complainant's email address could not take place within the framework of Article 6(1)(f) of the GDPR.

It is the data protection authority's assessment that Telmore has pursued a legitimate and necessary interest in carrying out direct marketing and a legitimate interest in ensuring that the company's customers are not exposed to unnecessary advertising messages.

On the other side is the consideration for the complainant, as the processing of personal data has an impact on the data subject. This impact, which can be both positive and negative, can also include purely emotional impacts, such as irritation, fear and anxiety, that can arise if the data subject loses control of their personal data.

The data protection authority has carried out a specific balancing of the stated (opposing) interests, and finds, on this basis, that the complainant's interest in their email address not being disclosed and processed outweighs Telmore's legitimate interest in carrying out the above-mentioned marketing activities. In this connection, the data protection authority has placed particular emphasis on the fact that in a situation such as this, the complainant must be assumed to have a legitimate expectation that data provided as part of the establishment of a customer relationship will generally not be disclosed to third parties for marketing purposes. Furthermore, the data protection authority has emphasised that Telmore does not appear to have implemented any further measures to ensure e.g. increased transparency, which could have led to the balancing of interests falling out in favour of the company.

The data protection authority concluded that Telmore's processing of the complainant's personal data could not be based on legitimate interest in this situation. 

Lesson 3: Case law from the CJEU on joint processing responsibility applies. The data protection authority determined that there was joint responsibility for the data processing, and thus an agreement on joint processing responsibility was missing. This also plays a role in the assessment of legitimate interest: there is less control over the data, which makes it harder to rely on legitimate interest.

The data protection authority found that it was unnecessary to continue investigating the other points of complaint in the case, as it is a key prerequisite for complying with the privacy rules that one's own role of responsibility is correctly identified. The data protection authority also emphasised that Telmore was no longer sharing the complainant's email address with Meta Ireland.

You can read the entire decision here – it is worth reading.

The Danish data protection authority has also criticised a high school for using software to prevent cheating during exams.

The school was criticised for a lack of risk assessment when using the exam monitoring software "ExamCookie". The software was used to monitor students during exams to prevent cheating. The school believed that processing of screenshots, copied text, active programs and URLs was necessary to detect cheating. They chose not to use functions that were not considered necessary, such as monitoring active processes and network activity. However, the school was criticised for failing to implement adequate technical measures to protect the students' personal data, particularly against the risk of inadvertently collecting sensitive data. Among other things, the risk of privacy breaches had not been sufficiently communicated to the students. The data protection authority concluded that the school generally complied with the personal data rules but encouraged the school to carry out a new risk assessment.

This means that while you can probably often use this type of software, you must ensure that the necessary assessments have been made and that information is provided.

The case is discussed here.

Help with impact assessments from the Danish data protection authority

In May this year, the Danish data protection authority published two impact assessment templates. One is quite general, while the other is specifically aimed at the development, operation and use of AI solutions. The template for AI solutions is quite extensive, but that is actually beneficial, as it forces users to go through a series of questions they might otherwise forget to ask.

I strongly recommend that you take a closer look at these templates if you need to conduct impact assessments, especially if you are currently in the process of making your first actual assessments of using AI in your organisation. One practical aspect is that they provide examples of scenarios where things go wrong. For example, a very serious incident is described as follows: 

Data subjects can experience significant consequences that can only be overcome with considerable effort and consequences for the individual (financial consequences, incorrect accounting of funds, blacklisting or downgrading of credit possibilities, physical damage to assets, impact on work situation, lawsuit, poorer health and the like).

It is very useful to get some help with these things. 

You can find the templates here.

The AI Act is important – but the Council of Europe also wants in

Long before comprehensive privacy regulations were introduced by the EU, the Council of Europe had conventions on fundamental principles of privacy. These applied to Norway, and some of us worked on them, but now they are mostly for those with a particular interest. The Council of Europe is nevertheless an important body, and now it has adopted a convention on AI as well. The convention is called the Council of Europe Framework Convention on Artificial Intelligence and Human Rights, Democracy and the Rule of Law.

The name of the convention indicates that it is to ensure respect for human rights, the rule of law, and democratic standards in the use of AI systems. It covers the entire life cycle of AI systems and seeks to balance potential risks with responsible innovation. The agreement can be ratified by both European and non-European countries. As such, it meets the need for an international legal standard for several continents and could become just as important as the AI Act.

The convention is the result of two years of work carried out by a group consisting of 46 member states and 11 non-member states. It addresses the use of AI systems in both the public and private sectors, incorporating requirements for transparency, oversight, and identification of content generated by AI systems. It mandates measures to identify, assess, prevent, and limit possible risks, as well as to assess the need for temporary suspension, prohibition, or other appropriate measures when the use of AI systems may be incompatible with human rights standards. Parties must also ensure accountability and responsibility for adverse effects, and that AI systems respect equality, including gender equality and non-discrimination, as well as privacy rights. The convention prohibits the use of AI systems to undermine democratic institutions and processes, including the principle of separation of powers, respect for judicial independence and access to justice.

The convention excludes activities related to national security interests, but members have a duty to ensure that such activities comply with international law and democratic institutions and processes. Members shall establish an independent oversight mechanism to monitor compliance with the agreement. The convention will be opened for signature in Vilnius on 5 September during a conference of ministers of justice.

Storing customer data in Finland

Many of the cases I write about are complicated. Fortunately, this one is a bit more straightforward but relevant nonetheless: 

The Finnish data protection authority was notified that a telecommunications operator (data controller) had refused to delete the personal data of a data subject who had been a customer of theirs. The authority requested the data controller to clarify why they had refused the deletion request and for how long they stored their customers' personal data.

In response, the data controller explained that they could not accommodate the deletion request because the storage was necessary under Article 17(3)(e), citing the general limitation period of three years. They acknowledged that customer data older than three years should have been deleted from their systems through an automated deletion process, which had not been carried out due to a technical error.

The Finnish data protection authority acknowledged the data controller’s right to store the data subject's personal data for three years after the end of the customer relationship. They accepted that if the data controller deleted the data subject's personal data, they would not be able to defend themselves against possible invoice claims from customers or other creditors.

However, the authority concluded that the data controller had violated Article 17(3)(e) by failing to delete personal data that should have been removed from their systems before the deletion request was made. As a result, the data protection authority issued a reprimand against the data controller.

Dark patterns and the design of cookie consent?

There is a lot of talk about dark patterns at the moment. A recent German case addressed several issues involving Meta, with the judgment containing a pragmatic statement about user interface and manipulative design. There are two sentences in the decision specifically about the design of Meta's cookie banner. In the judgment, it was noted that the "Accept cookies" button was displayed in blue. It is not clear what shape or colour the decline button had, as there is no description or picture of it in the judgment, but I assume it may have been white with blue text. The consent button appears to have been blue with white text.

The two relevant sentences read as follows (machine translation of paragraph 62):

The fact that the "Allow all cookies" button is coloured blue does not constitute a violation of Article 25(2) of the General Data Protection Regulation (GDPR) (privacy-friendly default setting). This is not a "default" setting, but a common and permissible visual highlighting that does not affect the user's active decision-making ability. 

This suggests that you can have different colours on the consent and decline buttons. From what I understand, Meta argues that the colour blue is common for them to use, as it is the colour they use on Facebook.

However, while different colours are acceptable, this probably does not mean that you can use red and green, or black together with light grey. You must be wary of colours that influence the user's decisions in a certain direction.

However, this shows that there can be many ways to design legal consent buttons. The buttons do not necessarily have to be identical. 

For those who read German, the judgment is available here.

"Pay or OK" – new decision from Spain

In May 2023, a complaint was lodged with the AEPD (the Spanish data protection authority) by a user of the website Motorsport Network España (data controller). The complaint alleged that the website was using an illegal "consent or pay" solution in the cookie banner. The AEPD conducted an investigation and found that the website was using non-technical cookies, which require consent, without obtaining prior consent from users upon their first visit to the website.

After the cookies had been set, the website asked for consent through a cookie banner that provided two options in the first layer. Users could accept the cookies and use the website for free. Then the website continued to use the same cookies as before consent was requested or given. 

The alternative was more complicated. The user could click a box labelled "Show the options", which led to another banner. There, all use of cookies was initially set to "off", except for analytics cookies, which were marked as "on". If a user wished to reject all cookies by clicking the "Confirm my preferences" button, the website would continue to use the same cookies as before consent was requested. Subsequently, a new pop-up would appear, prompting registered individuals to either subscribe for a monthly fee and access the website ad-free, or to accept all cookies.

The AEPD also considered the setup for users who wanted to withdraw consent. There was a link to "Manage preferences" at the bottom of the website. There, users could access the cookie control panel and manually deactivate each cookie. However, once they confirmed their preferences, they were still confronted with the same choice of either accepting all cookies or starting a paid subscription.

The AEPD concluded that the setup allowing the use of non-technical cookies without consent was illegal. The lack of an option to easily reject consent was also illegal, and it was too difficult to withdraw consent.

However, the AEPD considered this a minor violation and imposed a fine of EUR 5,000.

The case is discussed here.

Interesting case about the right of complaint from the Data Protection Board

In April of this year, the Norwegian Data Protection Board dismissed a complaint from a private individual who alleged that Statistics Norway (SSB) processed his personal data illegally.

The man filed a complaint with the Data Protection Board after the Data Protection Authority decided not to pursue his complaint that Statistics Norway had violated the GDPR. The starting point is that the Data Protection Authority must adjudicate complaints alleging GDPR violations, provided that the complainant sufficiently specifies and substantiates the alleged violations – criteria that, in this case, the man had not met. In the Board's view, the lack of specification and clarification meant that the Data Protection Authority was not obliged to decide on the complaint.

In other words, in the Data Protection Board's opinion, complaints must meet certain qualitative requirements for the Data Protection Authority to be obliged to decide on alleged violations in a specific case.

You can find the case here.

Overworked data protection officer?

We end this round by thinking about vacation. Many data protection officers have a lot to do and desperately need the upcoming summer vacation. 

In a recent case from the Belgian data protection authority, dated June 3, it was revealed that a data protection officer only worked three days a week. The person in question was the only one with access to the email address to which the authority had sent an inquiry. The inquiry went unanswered, partly because the person in question was overworked. The inquiry stemmed from a failure to respond to a complaint regarding direct marketing.

The data controller claimed that they were unaware of how much work the officer had, but they were not given the opportunity to address this issue. The responsibility lay with them. They were fined EUR 170,000.

So – dear privacy colleague – let someone know if you have a lot to do. And remember to take a vacation too.

The decision (in French) can be found here.

Do you have any questions?