Privacy Corner

by Eva Jarbekk and Paal-André Storesund



As always, there are many issues to write about within data protection. The Norwegian Data Protection Authority is attracting international attention with banning decisions against both the American Meta and the Russian-linked taxi app Yango. It is fascinating that a data protection authority in a small country can make decisions that halt processing reaching far beyond its own national borders and affecting large companies. I would not be surprised to see more of this in the future, including more decisions in which data protection authorities order ongoing processing to stop.

There is a lot of talk about AI these days. It is worth noting that the GDPR will apply in addition to the upcoming AI Act – the use of personal data in AI must have a legal basis in the GDPR. The AI Act contains a special legal basis for the use of personal data in regulatory sandboxes, but it is narrow. The GDPR remains the "backbone" for personal data, including when AI is used. We will return to this later. For AI, it is also relevant that last year's judgment holding that pseudonymised personal data will not always be considered personal data has been appealed. At the same time, data protection authorities from several nations have come out against unlawful data scraping. AI does not operate in a lawless space today, but many new rules are coming to ensure safe AI and a good society. More about this and much more below.

Zoom and AI

Zoom is a well-known name and tool for many. In August, Zoom made headlines and sparked discussion in various media. The reason is that the company had recently updated its terms of use. The change gave the company a perpetual right to use content that Zoom users create or upload to the service to produce and develop services, for machine learning and artificial intelligence.

In other words, the update gave the company the right to use the user's video calls, uploaded files, user behaviour, etc. to train artificial intelligence. The company eventually clarified that audio recordings, videos and conversations would not be used to train artificial intelligence without consent. This is nevertheless not unproblematic where the meeting host has accepted that the company may use video and audio recordings. The participants' only alternative is then to leave the meeting.

It did not take long before lawyers and others spoke out against the change. There was, quite simply, a hail of criticism. This led Zoom to change its terms of use again. Under the new terms, Zoom no longer has the right to use audio recordings, video, conversations, shared screens, attachments or other communications to train artificial intelligence. The company has nevertheless retained rights to service-generated data, which is typically diagnostic data, information about product use, etc.

AI is certainly here to stay, and a new AI Act is coming soon. However, having the right legal basis for the use of training data is not something that can be overlooked, and that basis will mostly have to be found in the GDPR. I predict there will be much discussion of this in the years to come, probably in much the same way that the legal basis for marketing is discussed today.


Joint statement about data scraping

Several countries' data protection authorities have made a joint statement on the importance of preventing unlawful data scraping. An example of data scraping is Clearview AI, which built a database of billions of facial images of ordinary people obtained directly from the open web. Scraping can mean that individuals lose control over their own personal data when something they published in one context is used for completely different purposes by unknown players. The GDPR applies even if personal data is openly available online.

There have also been cases where online information has been used by AI as training data, something we have previously written about.

The statement from the data protection authorities has been sent directly to companies such as Alphabet, Meta, Microsoft, X (Twitter) and others, and it refers to a number of technical measures that such companies can take to prevent personal data being scraped from pages they operate. The statement is six pages long and worth reading – it is available at the link below.

The statement
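As an aside, one of the classic technical measures in this area is rate limiting, which caps how fast any single client can request pages and thereby blunts bulk harvesting of profiles. Here is a minimal sketch of a token-bucket limiter in Python – my own illustration with hypothetical limits, not an implementation prescribed by the statement:

```python
import time

class TokenBucket:
    """Per-client token bucket: allows `rate` requests per second with
    bursts of up to `capacity`; further requests are rejected until the
    bucket refills. Limits like these make large-scale scraping slow."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate               # tokens added back per second
        self.capacity = capacity       # maximum burst size
        self.tokens = float(capacity)  # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill proportionally to the time elapsed, then try to spend one token.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Hypothetical policy: 1 request/second sustained, bursts of up to 5.
bucket = TokenBucket(rate=1, capacity=5)
burst = [bucket.allow() for _ in range(10)]  # rapid-fire requests
```

In a real deployment one bucket would be kept per client (by API key or IP), typically in a shared store, and rejected requests would receive an HTTP 429 response.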

EDPS appeals the "SRB case"

The so-called SRB case, decided in April, has been appealed by the EDPS (the European Data Protection Supervisor). The EDPS was criticized for not having assessed whether the recipient of the personal data possessed sufficient information to identify the persons behind the pseudonymised information themselves. I have discussed and written about the judgment before. Many have thought it was wrong and quite sensational – while others applauded and said that they finally got a more practical angle on what constitutes personal data.

One thing is what assessments the EDPS had made; another is that the court relied on an older decision (the Breyer case), from which it follows that one must assess not only what the data controller itself can re-identify, but also what the data controller can get a third party to re-identify. This opens for a more subjective understanding of the concept of personal data, rather than simply stating unconditionally that pseudonymised personal data is also personal data. Making such assessments of the possibility of re-identification will be complicated.
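To make the re-identification question concrete, here is a minimal sketch – my own illustration, not drawn from the case – of what pseudonymisation often looks like in practice: a direct identifier is replaced with a keyed token. Whether the tokens are "personal data" for a given recipient then turns on exactly the question the court raised, namely whether that recipient can obtain the means (here, the key) to re-link them:

```python
import hashlib
import hmac

# Hypothetical secret held only by the data controller.
SECRET_KEY = b"controller-only-secret"

def pseudonymise(direct_identifier: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 token.
    The controller, holding SECRET_KEY, can recompute the token and
    re-link it to the person; a recipient who only sees tokens cannot
    reverse them without access to the key."""
    return hmac.new(SECRET_KEY, direct_identifier.encode(),
                    hashlib.sha256).hexdigest()

token = pseudonymise("01017012345")  # hypothetical national ID number
# Re-linking is trivial for whoever holds the key:
assert pseudonymise("01017012345") == token
```

For the key holder the dataset remains personal data; for a recipient who gets only the tokens, the answer depends on whether re-identification by some means is realistically possible – which is precisely the assessment the judgment calls for.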

Now the case goes to the ECJ. This is not surprising, and it will be very exciting to see the outcome there. The appeal from the EDPS is short – it simply states that the lower court has misunderstood basic data protection definitions. The appeal can be found at the link below.

See the appeal here. 

NOYB sues Ryanair

NOYB has taken legal action against the company Ryanair. The background for the legal action is Ryanair's use of facial recognition technology in its customer verification process. The problem was discovered when a customer booked a trip via the online travel agency eDreams. Ryanair states that the verification process is meant to verify the customer's information. The customer was given the choice of either completing the verification or showing up at the airport at least two hours before departure to check in there. In addition, the customer was asked to pay a fee for the verification process.

However, NOYB has not been convinced by this reasoning from Ryanair. NOYB has stated: "They already have your contact details to send you the link to the 'verification' process. A verification of contact details via biometrics also doesn't make a lot of sense: Your email address is not printed on your face or in your passport. Ryanair's verification process looks like another attempt to make the lives of travelers and competitors more complicated to increase profits." Another argument put forward by NOYB is that no such type of verification is required for customers who book directly with Ryanair.

NOYB writes that two things are problematic with this practice. Firstly, facial recognition technology involves the processing of special categories of personal data, namely biometric data. The processing of such biometric data may involve an unacceptably high risk for the data subject. Secondly, the consent solution is problematic. NOYB's objection is that Ryanair provides too little information about the purpose of processing the biometric personal data. The data subject is then unable to give informed consent, and the consent is therefore invalid.

Such solutions are not yet common, and I think the data protection authorities will set strict requirements for facial recognition of this kind. Under the upcoming AI Act, facial recognition systems are considered potentially very intrusive and are in some cases completely prohibited.

Booking a Ryanair flight through an online travel agent might hold a nasty surprise

The Norwegian Data Protection Authority intervenes against Yango/Yandex

The Norwegian Data Protection Authority is reacting to a new Russian bill which appears to give Russian security authorities unlimited access to the personal data that the Russia-connected taxi company Yango holds on Norwegian residents.

The Norwegian Data Protection Authority has made an urgent decision – in cooperation with the Finnish Data Protection Authority. The data protection authorities find the situation so urgent that the conditions for imposing a temporary ban under Article 66 of the GDPR are considered fulfilled. In other words, the Norwegian Data Protection Authority believes that the situation is a special case where there is an urgent need to take measures to protect the rights and freedoms of Norwegian citizens.

The letter to Yango is public and the Norwegian Data Protection Authority writes that they will take the following measures:

  • Order for a temporary halt in Yango's transfer of personal data from Norway to Russia; and
  • Ban on processing personal data for users when the purpose is to transfer this personal data from Norway to Russia.

The ban and the order will enter into force on 1 September 2023 and last for three months, i.e., until 30 November 2023. Article 66 is very rarely used, but this is the second time that the Norwegian Data Protection Authority has issued a temporary ban under the provision. The first time was earlier this summer and concerned the temporary ban on Meta's behaviour-based marketing on Facebook and Instagram. Below is a link to the Norwegian Data Protection Authority's description of the case.

Datatilsynet intervenes against Yango's transfer of personal data to Russia | Datatilsynet

India's new Digital Personal Data Protection Bill

The latest news from India is that the Indian Parliament is now considering a new proposed Digital Personal Data Protection Bill. The legislation will apply to all India-based businesses that process personal data, in addition to international businesses that process personal data about Indian citizens. This is not the first time that a data protection bill has been presented to the Parliament; the last time was in 2019. Previous bills have generally been considered far too complex and have been rejected for that reason – one argument being that the legislation would have been difficult for, among others, start-up companies in the country to comply with. Indian Minister Rajeev Chandrasekhar has stated that the new bill should be easier to comply with than its predecessors.

There has nevertheless been some criticism of the proposed bill. One focus of the drafting has been to give the state a number of exemptions from the legislation, along with considerable power and control. The organisation Internet Freedom Foundation has expressed concern that the legislation contains several exemptions for state businesses and can therefore facilitate increased state surveillance.

The legislation also contains exemptions relating to children's personal data, including the requirement to obtain consent from the child's parent. If a business can prove that it is "verifiably safe", it will be exempt from the bill's rules on processing children's personal data and obtaining parental consent.

From a European perspective, the question is whether the legislation provides sufficient data protection for India to get a stamp of approval from the EU on transfers of data there. At first glance, it looks like that could be challenging.

See link here.

Does the purpose behind access requests play a role?

A court in Germany recently concluded that a data controller could reject an access request because the request was considered "excessive". The GDPR allows a data controller to reject an access request if the request from the data subject is manifestly unfounded or excessive.

The reasoning was that the purpose of the access request was not related to data protection. The request arose from a question about the legality of certain adjustments to an insurance policy: the data subject's purpose was to obtain information about the basis for the changes. Although the court recognized that the information covered by the access request was personal data, it was decisive that the purpose of the request was not related to data protection.

One can then ask whether the data controller may ask the person requesting access what the purpose of the access is. I think such an idea will be rejected – so the outcome in this case is in a way somewhat surprising. However, the purpose in this case was probably easy to identify because the access request was made in connection with a court case between the parties regarding insurance terms. The case is summarised on GDPRhub, see link below.

OLG Hamm – 20 U 146/22 – GDPRhub

Can access requests be sent to anyone?

The Data Protection Authority in Berlin has made a decision in which a company is reprimanded for not having corrected personal data despite the fact that the data subject requested this. The data subject had informed the company via e-mail that the e-mail address stored by the company was incorrect.

The data subject had not ordered anything from the company, nor asked to be included in its newsletter. She nevertheless received an order confirmation and a newsletter – as well as personal data belonging to the actual customer. The data subject assumed this was because someone had given the company the wrong e-mail address; the company later said the incident was due to an error on its part.

The data subject sent an e-mail to the company asking for the e-mail address to be deleted, but was told to log in to her user account and submit the request there via a form – overlooking the fact that she did not have an account. The company did not comply with the deletion request, because it believed the address was necessary in order to carry out the order and payment. It also argued that the data subject had contacted customer service rather than the company's data protection officer.

The Data Protection Authority in Berlin specified that the company had no legal basis for processing the e‑mail address. Furthermore, they could not place the responsibility on the data subject by pointing out that the request was sent to customer service instead of the data protection officer.

The decision shows the importance of good routines for how data protection requests should be handled. You can find the decision at the link below.

The decision

IMY's Director General leaves after 5 years

Lena Lindgren Schelin has resigned as Director General of the Swedish Data Protection Authority, IMY (Integritetsskyddsmyndigheten). Lena came from a position as Chief Legal Officer in the Swedish Economic Crime Authority (Ekobrottsmyndigheten) and has worked as Director General in IMY for just over five years. A new Director General is to be appointed and until then Karin Lönnheden is acting in the position. She is currently Chief of Staff and responsible for innovation work at IMY.

One reflection in this regard is that data protection has become a complicated area with a large international dimension. It takes considerable time to become skilled in the area and to get to know the international environment, so continuity among those who work in the field is an advantage. It will be exciting to see what expertise the next Director General brings to the role – let us hope it is someone who is good at data protection.

IMY's Director General moves to Kustbevakningen (the Swedish Coast Guard)

Do you have any questions?