Newsletter

Privacy Corner

by Eva Jarbekk

Published:

Security camera surveillance
GDPR has just turned four. As on its previous birthdays, the occasion has prompted a number of comments from authorities and activists. Many feel that things are moving too slowly; others feel that a lot of progress has been made.

NOYB believes that there is still a culture of non-compliance; see its statement here: https://noyb.eu/en/statement-4-years-gdpr. It is obviously true that much remains to be done for many organisations, but there have nevertheless been major improvements. That data protection authorities face challenges in their internal cooperation is perhaps to be expected, but the one-stop-shop scheme creates difficulties when it comes to enforcement against the large companies.


Regardless – there is still a steady flow of new cases – and this newsletter addresses those I think you need to be aware of. Note in particular that Schrems III may become a reality, which is a good reason to sign up for Schjødt's seminar on confidential computing and the CISPE Code of Conduct, to be held on 8 June. Confidential computing and the CISPE Code of Conduct are tools that can help secure third-country transfers for some, Schrems III or not. AWS will present its own solutions at the seminar. The invitation can be found here.

Guidelines from the EDPB

Even after these four years, EU legislative and regulatory authorities have not been idle:


a) The European Data Protection Board (EDPB) has issued guidelines on the calculation of administrative fines. They are of course relevant to anyone at risk of being issued such fines, even though the content largely reflects the calculation practice that was already in place before the guidelines were issued.


Link


b) In mid-May, the EDPB also presented guidelines on police authorities' use of facial recognition technology for public consultation. It is characteristic of the present time that the use of AI and detection technology is now being regulated.


This is a good thing from a privacy perspective. Several police authorities in Europe use, or intend to use, facial recognition technology. The technology can of course be used for many different purposes, such as identifying people on watch lists or monitoring a person's movements in public spaces. It is good to take a critical look at such use, which can be very intrusive.


Link


c) In mid-May, the European Commission also published its Q&A on the 2021 Standard Contractual Clauses (the SCCs). It consists of no fewer than 44 questions and useful answers. Perhaps the most relevant question is whether the SCCs can be used to regulate transfers of data to controllers or processors where the importer of personal data is already subject to the GDPR. The Commission's answer is simply "no". This has quite far-reaching consequences, because the GDPR to a large extent actually has effect outside the EU/EEA.


One consequence is that the Commission is now drafting standardised contractual clauses that can be used for such transfers. In the meantime, it will be exciting to see how national data protection authorities handle this.


Link


d) In the second half of May, the Council of the European Union and the European Parliament agreed on measures to ensure a high common level of cybersecurity across all Member States. The measures will be codified in a new directive, commonly referred to as "NIS2", which will replace the current NIS Directive (on the security of network and information systems).


The new directive will set a number of new and/or strengthened requirements for particularly important undertakings, such as banks, energy companies, telecoms and transport, to invest heavily in IT security systems to prevent hacking and computer crime. Public enterprises will also be required to make similar investments and implement similar measures.


Link

Schrems has not been idle either – announces Schrems III

Even before the wording of the agreement intended to ensure lawful data flows between Europe and the US has been drafted, privacy advocates are challenging the content of the new "Trans-Atlantic Data Privacy Framework". In an open letter to the key politicians at the negotiating table for a new data exchange agreement between the EU and the US, NOYB and privacy activist Max Schrems have announced that they are prepared to challenge the agreement in court. Schrems criticises several aspects of what is known so far, such as the reliance on "executive orders" in the US. Executive orders are presidential orders that can be changed by any president at any time, and in Schrems' view this creates little predictability. There is reason to listen to Schrems' arguments; his legal analyses have proven sound before and have been upheld in court.

Latest news from Europe

Google is being sued in the UK

In the UK, Google is in trouble again, this time in connection with the AI part of the company, DeepMind. It is the subject of a lawsuit after the Information Commissioner's Office (the ICO, the UK's data protection authority) found that Google had received health data on 1.6 million patients without a valid legal basis.


The health data was sent to Google in 2015 by local health authorities for use in the testing and development of a health app, but without the patients being informed of the data sharing. In return for disclosing the data, patients and health authorities would be able to obtain the app at a reduced price.


It is interesting to note that the ICO in this case chose not to take any action against the health authorities that shared the information, as it believed they did not have sufficient competence to understand that what they were doing was wrong. Private actors can nevertheless sue the recipient of the data.


See more here and here.

Deleting data prematurely can also be wrong…

A decision by the Spanish Data Protection Authority has attracted interest. The case demonstrates that deleting data prematurely can also be wrong.


A customer was injured while shopping in the supermarket chain Mercadona. The customer wanted to claim compensation for the injury and contacted the chain's head office via an online form, asking for a copy of the CCTV footage that had presumably captured the incident. The customer received an automatic confirmation that the inquiry had been received, along with a case number, but then heard nothing more.


A month later, the customer's lawyer sent a reminder to the supermarket chain. It then turned out that, due to human error, the request had never been forwarded to the right person in the organisation. The video recording had been deleted, as required by local law. The customer complained to the Spanish Data Protection Authority, but before the authority began to process the case, the customer and the supermarket chain had reached an amicable settlement involving financial compensation to the customer.


The Spanish Data Protection Authority nevertheless chose, on its own initiative, to process the case, as it believed that the customer's most basic rights had been violated. It fined the supermarket chain EUR 170,000 for breach of Articles 12 and 15 of the GDPR, because the company had not provided a copy of personal data (the video recording) when requested, and for breach of Article 6 for deleting a video recording without a lawful basis.


Link

Be aware when using AI

The Data Protection Authority in Hungary (NAIH) has fined a bank EUR 700,000 for, inter alia, unlawful use of AI.


The bank used AI to analyse telephone conversations between customers and bank officers with the aim of identifying customers who were dissatisfied with the bank. The AI analysis focused on periods of silence, voice pitch, several voices speaking simultaneously (arguing), speaking speed, volume, and so on. The AI technology then decided whether the individual customer should be followed up or not, and in cases where follow-up was warranted, the information was forwarded to a bank employee. The employee would then call the customer back and try to find out what the customer was unhappy with and how the bank could improve matters.


Everyone calling the bank was told that the telephone calls could be recorded and that the recordings would be used for quality control, to prevent complaints or losses for the customer, and to increase efficiency. However, they were not told that AI technology would be used.


When confronted with the possibility that this was unlawful, the bank invoked legitimate interest as its legal basis. NAIH pointed out that it could not see that a sufficient balancing of interests had been carried out to support such a basis, and that the use of AI in this case had to be regarded as a form of profiling.


Equally interesting is that NAIH's decision also rests, to some extent, on the joint statement by the EDPB and the EDPS, in which they express the view that AI used to infer emotions and states of mind in natural persons is undesirable and should be banned.


Link

Once again, news from the Danish Data Protection Authority…

This time, they have quite simply decided that there is a limit to how much information a person can request to have disclosed about themselves.


A former employee of a Danish municipality had asked the municipality to hand over all e-mails, memos and letters that the employee had sent or signed during his many years of employment. The municipality asked the person in question to clarify in more detail what type of information he was actually looking for, but received no answer. The municipality refused to hand over all the information, and a complaint against the municipality was lodged with the Data Protection Authority.


The Danish Data Protection Authority found in favour of the municipality – there is a difference between information created by a person in a role and information about the person who has held the role/function.


Link

Here at home, NAV is in trouble

Here at home, the Norwegian Data Protection Authority has announced that it will fine NAV NOK 5,000,000 for breach of the data protection rules. The case concerns all jobseekers' CVs being freely available to employers who advertised for new employees through arbeidplassen.no.


All jobseekers' CVs were available for a period of 20 years to employers who were logged in to the service. It was NAV itself that discovered the breach and reported it to the Norwegian Data Protection Authority in February 2021.


Link

Norwegians' personal data is shared 340 times a day in the advertising market

A recent report from the Irish Council for Civil Liberties (ICCL) shows the vast scale of data sharing in the adtech industry: the data of an average Norwegian internet user is shared 340 times per day. ICCL is behind a lawsuit against the Interactive Advertising Bureau (IAB) over this practice, claiming that it is not based on valid consents and therefore constitutes the largest data theft of all time. In Norway, too, the adtech industry is under close scrutiny from both the Norwegian Consumer Council and the Norwegian Data Protection Authority.


See more here and here.

Finally: Biometrics and payment services

Biometrics related to payment services is a huge market. The market for contactless payment services using biometric identifiers is estimated to be worth over USD 18 billion by the end of 2026.


Amazon already offers biometric payment in its stores, using handprint readers. This has been criticised by US politicians for not protecting privacy in a satisfactory way, but is (so far) not illegal.


Mastercard has now also entered this arena. It is developing technology that will make it possible to pay simply by smiling or waving a hand.


Biometrics as an identification and authentication tool is undeniably on its way. The challenge will be to ensure that such data about us is adequately protected.


Link

Do you have any questions?