Newsletter

Privacy Corner

by Eva Jarbekk and Sigurd Fjærtoft-Andersen



The EDPB has issued its "Opinion" on "Consent or pay", triggered by Meta's introduction of a payment option as an alternative to targeted advertising. I have written two articles on this, which you can find here and here, and in which I am somewhat critical of the role the EDPB now plays in European legal development. In reality, the EDPB's decisions work in the same way as legislation, but the process that creates them is quite different and far less transparent. This is a consequence of the GDPR that is difficult to do anything about. Some have started calling the EU's data protection authorities "the fourth branch of government", and that is not entirely wrong.


As regards "Consent or pay", the EDPB now says that some operators are obliged to offer their services free of charge. Businesses, however, need to make money. If a consumer is given three choices – 1) pay for the service, 2) pay for the service by receiving targeted advertising, or 3) get an equivalent service for free – the majority will probably choose the latter. That hardly promotes new services; no one can provide services for free. One of the key questions is who is required to provide services free of charge – is it just Meta? No. But the EDPB is quite unclear about who is affected, and there is scope to apply the guidelines to services that are central within their field, even if the service is not very large. Many profoundly disagree with this. We will see how it plays out.


Otherwise, things are moving at full speed in the privacy field. NOYB has filed a complaint against OpenAI because ChatGPT produces incorrect results when it "hallucinates", arguing that the absence of any way to remedy this is incompatible with the GDPR. Here we see a collision between the rules of the GDPR and how AI works. To me, it is unclear how this technology can be made to follow all the rules of the GDPR, but I also do not think the rules are going to change. The EDPB is working on this in a dedicated task force – it will be interesting to see what they conclude in due course.


This time I will not go into the legal basis for targeted advertising and the IAB cases that we have discussed several times in the network, but for those who are particularly interested, there is a new decision from the ECJ of 7 March this year (C-604/22). The decision confirms that the information in consent strings is personal data and that IAB Europe is a joint controller together with those who use the IAB system (perhaps the most surprising part). For us in Norway, it is probably more important that a new proposal for the Electronic Communications Act has been put forward, which sets clear requirements that the use of cookies must be based on consent, not legitimate interest. It is interesting to see that NKOM is proposed to decide what constitutes necessary cookies, while the Data Protection Authority is to decide what constitutes good enough consent. There could be a lot of action ahead. It is unclear when the bill will pass Parliament; it could happen before the summer, but most likely in the fall.
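
For the technically curious: a consent string (TC string) is just a compact, base64url-encoded bit field. Below is a minimal sketch of reading a few of its header fields under the published TCF v2 layout – the sample string and field selection are illustrative, not a full decoder:

import base64
from datetime import datetime, timezone

def decode_tc_core(tc_string: str) -> dict:
    """Read a few header fields from the core segment of a TCF v2 TC string."""
    core = tc_string.split(".")[0]              # the core segment comes first
    padded = core + "=" * (-len(core) % 4)      # restore stripped base64 padding
    raw = base64.urlsafe_b64decode(padded)
    bits = "".join(f"{byte:08b}" for byte in raw)
    return {
        "version": int(bits[0:6], 2),           # spec version (6 bits)
        # creation time, stored as deciseconds since the Unix epoch (36 bits)
        "created": datetime.fromtimestamp(int(bits[6:42], 2) / 10, tz=timezone.utc),
        "cmp_id": int(bits[78:90], 2),          # ID of the consent tool (CMP) used
    }

# Example (hypothetical string): decode_tc_core("CPz4...") would reveal when
# consent was recorded and by which consent tool - metadata that, combined with
# an identifier such as an IP address, can single out an individual.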


Below are some of the other important stories from the past month. I am afraid it has become a long newsletter, but I know many of you appreciate it all the same. Happy reading, and remember to enjoy the spring sun too.


Domino's in trouble for using AI system for phone orders

The ever-increasing use of AI systems regularly presents new challenges for large, worldwide companies taking advantage of technological developments. In a proposed class action lawsuit in the US state of Illinois, three customers have accused pizza chain Domino's and the developer of the company's voice recognition system of collecting and storing customers' voice prints, names, addresses, phone numbers and credit card numbers without informing customers. A voice print is an individual, distinctive pattern of the characteristics of a person's voice, produced spectrographically, and usually constitutes biometric data that enjoys special protection as a special category of personal data. The voice recognition system was developed by ConverseNow Technologies and is reportedly used at 57 different Domino's locations in Illinois to improve customer service and increase sales. According to the proposed lawsuit, the voice recognition system has been in use since 2020 without customers being informed of such use and storage. Such voice data carries significant potential for abuse.


The class action raises questions about Domino's approach to customer privacy and is based on the claim that the collection of voice prints violates the Illinois Biometric Information Privacy Act (BIPA). The lawsuit may result in financial sanctions and reputational damage, and once again illustrates the importance of complying with data protection regulations when companies adopt new technology. For those of you implementing new AI-based tools in your company – keep an eye on the data protection rules' requirements for information and consent!


Link


Italian Prime Minister Giorgia Meloni demands €100,000 in compensation for the uploading of fake pornographic videos

Technological developments also present challenges that go beyond the more typical regulatory issues in privacy and AI. According to the Italian prime minister, several fake pornographic videos were produced and uploaded to the internet in 2022 in which her face was placed over someone else's body – so-called deepfakes. The videos have, among other things, raised widespread concern about the use of deepfakes in Italian politics and elsewhere, in connection with the production of propaganda and disinformation. The case also shows how easy it is to create and share deepfakes in ways that violate individuals' privacy, damage reputations and cause emotional harm. Police have stated that, by tracking the mobile devices used to post the videos, they have identified a 40-year-old man and his 73-year-old father as the perpetrators, and Meloni will testify against them before a court in Sardinia on July 2, 2024. If Meloni's claim for €100,000 in compensation succeeds, she will donate the money to a fund supporting women who have been subjected to male violence. It is fair to say that deepfakes are quite frightening, and it is welcome that the AI Act regulates them strictly.


Link



The Danish Data Protection Authority allows – surprisingly enough – facial recognition

Four and a half years ago, the Danish football club Brøndby IF became the first private operator in Denmark to receive permission from the Danish Data Protection Authority to use automatic facial recognition to identify banned spectators. For the past four seasons, spectators have been filmed and matched using Brøndby Stadium's facial recognition technology – all to identify and stop fans who have been banned and should not be allowed into the stadium. A facial recognition system is now being considered at national level by the Danish Divisional Association, which is drawing up rules and a framework for the use of the technology. Although the use of the technology has yielded good results for Brøndby IF, the launch of the system in 2019 was – as is increasingly the case elsewhere in the privacy world – met with scepticism in Denmark. The head of the IT Policy Association, Jesper Lund, stated at the time that facial recognition is "the most invasive surveillance technology available", and that he hoped "there will be protests and that football fans who attend Brøndby's matches send complaints to the Danish Data Protection Authority about this processing of personal data". Despite such scepticism, the Danish Data Protection Authority reports that it has not received any complaints about Brøndby IF's use of facial recognition.


Although it is in many ways surprising that the Danish Data Protection Authority has allowed facial recognition, the case illustrates that the potential benefits of such technology can be so great that the assessment tilts in the direction of permitting more intrusive measures – and that intrusive measures may, after an overall assessment, be acceptable.


Link



Study shows (surprisingly) harmful fabrications in OpenAI's speech-to-text algorithm

The tendency of AI-powered chatbots to occasionally invent things, or hallucinate, is by now well documented. A new study from Cornell University has found that AI transcription models hallucinate too. That may not be surprising, but the consequences can be huge. According to the study, OpenAI's Whisper, an AI model trained to transcribe audio to text, fabricated content in around 1.4% of the audio data tested. In addition, a large share (around 40%) of the fabricated sentences contained offensive or potentially harmful content, for example related to violence, sexual relationships and demographic stereotypes. This is particularly unfortunate where transcription tools are used by, for example, doctors or other healthcare professionals to write patient notes or medical records. One can easily imagine that the challenges of using AI-based transcription tools become even greater when users take it for granted that the tool transcribes what is actually said. The fact that transcription tools can hallucinate is a strong incentive to treat such tools with caution, especially where the correctness of the generated text matters greatly to the individual whom the text concerns.
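
As an aside for those who use such tools: below is a minimal sketch of a transcription call with the open-source whisper package (the model size and file name are illustrative assumptions), mostly to make the point that the output is plain text with no warning when a sentence has been invented:

# Minimal sketch: transcription with the open-source whisper package
# (pip install openai-whisper; "consultation.mp3" is an illustrative file name).
import whisper

model = whisper.load_model("base")          # smaller models tend to err more
result = model.transcribe("consultation.mp3")

# The output is plain text: a fabricated sentence looks exactly like a real
# one, so human review remains essential in high-stakes settings.
print(result["text"])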


Link



One must use lawful training data when developing AI models

In the wake of a new lawsuit against Amazon in California, questions about the use of copyrighted material when training AI models are once again being raised. In the lawsuit, a former Amazon employee alleges that, following OpenAI's launch of GPT-4, Amazon was so desperate to keep up with the competition in generative AI that it was willing to violate copyright rules when developing and training its AI models. The allegation is made in a case in which a formerly employed AI researcher was demoted and then dismissed after Amazon discovered she was pregnant. In addition to claims of wrongful and discriminatory termination, the woman alleges that she was dismissed because she complained when Amazon allegedly violated its own rules on the use of copyrighted material in AI development. The lawsuit alleges that representatives of Amazon's top legal team asked her to ignore Amazon's violations of its own internal copyright rules, justifying the instruction on the grounds that all the other major players were doing the same. The lawsuit illustrates that copyright issues in the development and training of language models remain a highly relevant topic.


Link



New complaint forms for transfers to the United States

When the European Commission decided last year that personal data can safely be transferred to companies in the USA registered under the DPF, a prerequisite was that private individuals should be able to complain if they believe the conditions are not met. There are many places such a complaint can be directed, but the EDPB has now created a form for complaining to the local data protection authority that a recipient in the US is not complying with the DPF's many requirements. There is a separate form for complaints that the US intelligence services are not processing data correctly. The Danish Data Protection Authority has made the forms available here.


CJEU issues new decision on the use of fingerprints on European identity cards

The European Court of Justice recently issued a decision that highlights the intersection between the EU regulation on identity cards and the fundamental right to respect for private life. The background to the case was that a German citizen had challenged a municipality's refusal to issue an identity card without fingerprints, arguing that the refusal violated fundamental rights to respect for privacy, and the case ended up before the European Court of Justice. The Court recognized that the EU regulation on identity cards, which requires the storage of fingerprints on the card, limits, among other things, the right to protection of personal data. However, it found that the purposes underlying the storage of fingerprints are justifiable and legitimate, and that these purposes had to take precedence in the specific case. Among the legitimate objectives the Court highlighted were that a reliable identification system promotes citizens' right to free movement and residence in the EU, and that the use of fingerprints helps combat the production of fake identity cards. The case illustrates that privacy considerations are constantly weighed against other legitimate considerations, and that data minimisation (through other, less privacy-invasive measures) is not always sufficient to safeguard the legitimate purposes that justify interfering with individuals' privacy.


Link



Developments in the automotive industry create new privacy-related challenges

The trend in the automotive industry is that more and more cars are equipped with internet access, and it is estimated that by 2030 more than 95% of passenger cars sold will have internet connectivity. This allows automakers to offer safety-related features, but it also allows companies to collect, share or sell data about driving habits and other personal information that individuals may not want shared. Recently, several privacy advocates have voiced concerns about the use of personal data from vehicles, including reports showing that car companies share driver data with insurance companies. The concerns also relate to the fact that, as with other consumer technologies, the ability to opt out of data sharing with car companies may soon be buried in settings and menus that are difficult to find. If these developments continue, the privacy community fears that cars will become the worst product category that has ever existed in terms of privacy. I myself have long expected the authorities to pay more attention to cars' collection of personal data, but so far this has not happened to any great extent.


Link



New decision by the Portuguese Data Protection Authority related to Worldcoin's storage of biometric data

The Portuguese Data Protection Authority (CNPD) has recently ordered the company Worldcoin, which encourages customers to have their faces scanned in exchange for a digital ID and free cryptocurrency, to stop all storage of biometric data for 90 days. The CNPD has learned that more than 300,000 people in Portugal have provided Worldcoin with biometric data, including iris scans, and it has received several complaints about unauthorized collection of personal data from minors, inadequate information, and the lack of any way for individuals to delete data and withdraw consent. Worldcoin, for its part, has stated that it aims to build an identity and financial network, and that the processing of biometric data is necessary for people to prove they are human in a world dominated by artificial intelligence. Worldcoin is currently under investigation in several countries and has been criticized by privacy experts over its collection and storage of personal data. The Spanish Data Protection Authority has imposed a three-month suspension on Worldcoin after receiving similar complaints. The case illustrates several well-known privacy law issues, and it will be interesting to see how the data protection authorities handle it – particularly as regards Worldcoin's allegedly inadequate information about the processing of biometric data – which could have major consequences for both the data subjects and Worldcoin.

Link



The Finnish Data Protection Authority announces new decision related to the use of social security numbers in SMS messages to patients

The Finnish Data Protection Authority recently ruled against a data controller for sending test results to its patients by SMS with the patient's social security number included in the message. In response to the Finnish Data Protection Authority's request for an explanation of the purpose of including social security numbers in text messages to patients, the controller stated that using the social security number ensured that a patient's information was not accidentally shared with other persons of the same name. The controller further argued that when the patient's social security number is sent as a text message to the patient's own mobile phone, the risk associated with processing it must be regarded as low. While the Data Protection Authority acknowledged that Article 87 of the GDPR allows member states to lay down specific conditions for the processing of national identification numbers, it emphasised that the personal identification number is a unique and permanent identifier, and that third-party access to such data may cause the data subject extensive harm, including identity theft. On this basis, the Finnish Data Protection Authority concluded that the number should not be used in SMS messages to patients. From a Norwegian perspective, such processing of social security numbers must be regarded as remarkable, and I do not think many would do the same here.

Link



Decision from Germany regarding the data subject's right of access to all documents containing personal data

The Administrative Court in Berlin has considered a case concerning the data subject's right to demand access to, and copies of, all personal data that the controller processes about him or her. The data subject requested a copy of all documents containing personal data. The controller gave the data subject access to general information about the types of personal data the company held in its IT system – so-called "Master Data". The controller argued that the request would involve a disproportionate effort and was therefore unreasonable – since complying would require a review of over 5,000 pages – and that the request for copies of all documents constituted an abuse of the data subject's rights. The Court held that the purpose of the right of access under Article 15 of the GDPR is, among other things, to enable the data subject to verify the lawfulness of the processing of personal data. Given the importance of this purpose, the court concluded that the controller can object to providing such access on grounds of disproportionate effort only in strictly limited cases. Since the data subject's request was based on a desire to understand how, and with which third parties, the controller shared personal data, the court concluded that the controller had to provide the data subject with copies of all the personal data, and that the request did not constitute an abuse of rights under the GDPR.


The case is a further reminder of how strong the right of access is, and that it can only be limited in exceptional cases.

Link



The Icelandic Data Protection Authority decides on the data subject's right of access to minutes of meetings where an association (data controller) discussed the data subject

An association (the data controller) held a meeting at which a complaint from the data subject about bullying and violence in the workplace was discussed. The data subject requested access to the minutes of the meeting. When the association refused access, the data subject complained to the Icelandic Data Protection Authority. The Icelandic Data Protection Authority concluded that Article 15 of the GDPR gives the data subject a right of access to information from meetings where he or she is discussed, as well as to the names of those who attended the meeting. However, it stressed that the interests of other data subjects must also be protected, and that the data subject therefore could not access the minutes in their entirety. An interesting question is whether the outcome would have been different in Norway, given the exemption for internal documents in section 16(e) of the Personal Data Act – and it is in any case interesting that practice appears to vary on this point within the EEA.

Link



Circumstances count when assessing what is acceptable use of personal data

The Spanish Data Protection Authority (AEPD) has ruled on whether a medical centre's (the data controller's) use of a device for measuring patients' temperature in reception and waiting areas constituted acceptable use of personal data. As the use of the device in these areas meant that temperature data could be seen by third parties in the waiting rooms, the AEPD concluded that the controller lacked adequate measures to protect the personal data against observation by third parties. The controller was fined EUR 30,000 for violating the principles of security and confidentiality in Articles 5 and 32 of the GDPR. There are probably some reception and waiting rooms in the health sector that face a challenge here – even where patients' temperatures are not taken in public, there are many places where patients must explain why they have turned up.

Link



European Parliament votes to strengthen enforcement of GDPR

Members of the European Parliament voted on Wednesday 10 April on changes to strengthen enforcement of the GDPR – with some members calling for further improvements to the GDPR itself. The amendments to the enforcement rules aim to strengthen the rights of complainants and to address procedural shortcomings in the rules. Among other things, the amendments change the role of the supervisory authorities and remove certain obligations related to the sharing of preliminary findings. A significant change allows national supervisory authorities to request urgent decisions from the EDPB in procedural disputes, and if a lead supervisory authority is unable to meet a deadline due to complex investigations, an extension of up to nine months may be requested. In addition, supervisory authorities can now open ex officio investigations when they suspect a potential GDPR breach affecting data subjects; such investigations let the authorities pursue suspected unlawful processing of personal data on their own initiative, independently of complaints from data subjects. The amendments are not without opposition, however, and relevant objections can be read via the link below. In any case, it seems clear that the changes will mean greater focus on, and closer follow-up of, compliance with the data protection rules.

Read more here.


The U.S. may finally get a federal privacy law that rivals Europe's GDPR

Work has begun on a new, comprehensive privacy law in the United States that would provide far broader protection than today's rules for medical data and children's data. The bill, formally set to be introduced at the end of April, is called the American Privacy Rights Act (APRA). The preliminary rules reportedly resemble the European rules in the GDPR and would, among other things, allow US data subjects to opt out of targeted advertising and to minimize the personal data companies process about them. In addition, the proposal envisages that data subjects would be able to ask the data controller for access to the personal data being processed, require that personal data be corrected or deleted, and demand a downloadable version of the data the companies hold. It is also proposed that companies should not be able to share sensitive personal data without the data subject's explicit consent. Although many of these rights are already available to Americans in some states, the new rules could help strengthen and harmonize privacy regulation across the United States. It will be very interesting to see what the rules look like once adopted, after what is likely to be an extensive political process.

Read more here.


An important case of principle concerning the data protection authorities' duty to act

The Data Protection Authority in Hessen, Germany (HBDI) had decided a case related to the supervisory authority's duty to act. The case was prompted by a data controller notifying HBDI of a data breach after one of the controller's employees had unlawfully gained access to personal data about the controller's customers. After HBDI informed the data subject that no action would be taken against the controller, the data subject appealed the decision to the administrative court in Wiesbaden and requested that HBDI take action against the controller. The court then referred to the European Court of Justice the question of whether a supervisory authority that finds personal data has been processed in breach of the data subject's rights must always take action in accordance with Article 58(2) of the GDPR. In principle, the European Court of Justice answered the question in the affirmative, concluding that when a supervisory authority becomes aware of processing of personal data that has infringed the data subject's rights, it must take action under Article 58(2) of the GDPR to the extent necessary to ensure full compliance with the GDPR. The decision is one of principle and is highly interesting, especially given that such an extensive duty to act may be difficult for the data protection authorities to fulfil amid an ever-growing caseload.

Read more here.


A practically important case concerning the storage of personal data in recruitment processes

The Finnish Data Protection Authority has taken a position on a data subject's request for deletion of personal data in connection with a recruitment process. The Finnish Data Protection Authority had been made aware that a company (the data controller) had refused to delete the data subject's personal data. Asked by the Data Protection Authority why it refused to delete the personal data, the company replied that storing the information was necessary for the company to defend itself against potential complaints of discrimination in the hiring process. The company's explanation was accepted, and the Finnish Data Protection Authority concluded that the company did not have to comply with the data subject's request for deletion, as storage was necessary for the company to defend itself against such potential discrimination complaints – which under the Finnish Penal Code must be brought within two years of the appointment process. It is not obvious to me that this would be the result elsewhere, since it has previously been stated, for example, that a company cannot store customer information in order to handle later complaints – but here I think practice differs between countries.

Read more here.


Decision of the Irish Data Protection Authority in a case against Airbnb

We have previously discussed a case between the Irish Data Protection Authority and Airbnb concerning the legal basis for Airbnb's requirement that data subjects submit ID when it processes their requests for deletion of personal data. The Irish Data Protection Authority has now concluded that Airbnb lacked a legal basis – and breached the data minimisation obligation – when it requested the data subject's ID in connection with the deletion request. Against this background, the Irish Data Protection Authority issued a reprimand to Airbnb without imposing a fine.

Read more here.


Do you have any questions?