Newsletter

Privacy Corner

by Eva Jarbekk


A major current issue on the privacy scene is the EDPB's upcoming opinion on Meta's new "pay or OK" model. Following a decision from the ECJ last year, Meta turned to consent as the basis for targeted advertising and introduced a solution where the user either consents to such advertising or pays a fee to avoid it. The questions then arise: May Meta do this? May they charge a fee at all? The alternative could be to force Meta to offer its service for free, without making revenue from targeted advertising. Or is the amount Meta charges simply too high? There are many important questions to be discussed here, and they matter not only for Meta but for many other companies as well.


I believe these questions are highly important, and my primary worry is that the EDPB has only eight weeks, with a possible extension of six weeks, to deliver its opinion. This timeline is set by the GDPR itself, but it is frighteningly little time to handle a fundamental question about how enterprises may finance their services. See commentary on the matter here.


On other topics, much of the focus in the privacy and tech area still centers on AI. The latest status of the AI Act is described below. There is also an interesting case from Denmark on the lack of audits of data processors, likely important for many. Another interesting case is the continuation of the Google Chromebooks matter in Denmark, but with a new twist: it is no longer international transfers and documentation that are in focus, but the controllership of data. As always: happy reading!

Status of the AI Act

Finally, the European Union member countries have unanimously reached a deal on the bloc’s Artificial Intelligence Act, overcoming last-minute fears that the rulebook would stifle European innovation.


After agreeing on basic principles before Christmas, but without a precise text to show for it, we finally have a consolidated text of the new AI Act. It is, however, still a draft undergoing linguistic work. It will most likely pass the Council and the Parliament in the weeks or months to come and is expected to be published in the Official Journal in May/June. Once published, the provisions on prohibited AI practices take effect after only six months; if publication indeed happens around June, the bans would thus apply from around the end of 2024. The rules on generative AI follow after 12 months, while the obligations for high-risk AI apply after 24 months (systems defined in Annex III) and 36 months (systems defined in Annex II).


Something that caught my interest is that several of the articles in the new draft are substantially different from the drafts before Christmas. Among the more interesting changes is that the fines have been lowered. Not dramatically, but noticeably. I will return with a more in-depth analysis of the new wording. From what I hear, no further substantive changes are to be expected; only a linguistic clean-up remains.

IAPP link
Politico link

Italian data protection authority says ChatGPT breaches privacy rules

Italy's data protection authority has told OpenAI that its artificial intelligence chatbot ChatGPT breaches data protection rules. The service was reactivated after OpenAI addressed the authority's earlier concerns, among other things regarding users' right to refuse consent to the use of their personal data to train algorithms. Without providing further detail, the authority has now concluded that certain elements indicate one or more potential data privacy violations. We do not know what those issues are.


OpenAI, on the other hand, has stated that it believes its practices are aligned with the EU's privacy laws. OpenAI has 30 days to present its defense.


Italy was the first Western European country to curb ChatGPT, whose rapid development has attracted attention from lawmakers and regulators. It is a pity that more information is not available on this matter, as the lawfulness of AI systems is a particularly hot topic. But I am sure we will learn more about this case before too long.

Reuters link

Use of AI - are our "searches" saved?

Don't type anything into Gemini, Google's family of GenAI apps, that is incriminating, or that you wouldn't want someone else to see. In a support document, Google outlines the ways in which it collects data from users of its Gemini chatbot apps for the web, Android and iOS. Google notes that human reviewers routinely read and process conversations with Gemini to improve the service. These conversations are retained for up to three years, along with "related data" like the languages and devices the user used and their location. The user may, however, switch off Gemini Apps Activity in Google's My Activity dashboard, which prevents future conversations with Gemini from being saved to a Google Account for review.


However, even when Gemini Apps Activity is off, Gemini conversations will be saved to a Google Account for up to 72 hours to "maintain the safety and security of Gemini apps and improve Gemini apps."


Google’s GenAI data collection and retention policies don’t differ all that much from those of its rivals. OpenAI, for example, saves all chats with ChatGPT for 30 days regardless of whether ChatGPT’s conversation history feature is switched off, except in cases where a user has subscribed to an enterprise-level plan with a custom data retention policy. As GenAI tools proliferate, organizations are growing increasingly wary of the privacy risks.


OpenAI, Microsoft, Amazon, Google and others offer GenAI products geared toward enterprises that explicitly don't retain data for any length of time, whether for model training or any other purpose. Consumers, though, as is often the case, get the short end of the stick.

TechCrunch link

Penalty for failure to conduct supplier audit

The Danish Data Protection Authority has reported a data controller to the police for failing to supervise its data processors.


The Authority's investigation showed that the controller had not supervised its data processors; the first audit was only conducted once the Authority began its investigation. The Authority found that the controller had not acted in accordance with the accountability requirement. It emphasized that there had been no supervision for several years, that the processing involved a large number of data subjects, and that sensitive personal data was processed.


Audits of data processors have become increasingly common in recent years. This was not the case in 2018, but it is a natural development. In addition to the police report, the Authority recommended a fine of no less than DKK 1,500,000.


As far as we know, this is the first decision in which a party is fined for not having audited its data processors. Have you sent any questions at all to your processors? If not, it is time to start.

Datatilsynet link

The story of Chromebooks in Danish schools was not over

On 30 January, the Danish Data Protection Authority issued a startling decision on the use of Google Chromebooks and Workspace for Education in schools. The decision concludes that the processing of personal data in certain parts of the services is unlawful, as it has no legal basis under the GDPR and national law.


Therefore, the Data Protection Authority issued an order for 53 Danish municipalities to bring the processing in line with the rules and to ensure that all personal data processed in Google services has a sufficient legal basis.


The municipalities state in the submitted material that personal data is transferred to Google, which Google uses for its own purposes, not for the purposes of the schools. Google's purposes are said to be improving its services and developing new ones. The DPA's point is that the schools must have a legal basis for disclosing students' personal data to Google for such purposes, and it cannot find one.


This issue is very common in many situations, not only in schools. You can imagine the same issues in employment: what is the legal basis for an employer to hand employees' personal data to the vendors of the services it uses? I believe we will see the same discussion in this area in the not so distant future. It is also quite parallel to the EDPS's earlier investigations into how Microsoft processes information about the users of its products, something we have addressed in this newsletter before.


The Danish Data Protection Authority has assessed the legality of these disclosures to Google and concluded that it cannot find the necessary legal basis. It has therefore ordered the municipalities to bring the processing in line with the rules by ensuring that there is a legal basis for all the processing that takes place. The municipalities must comply with the order from 1 August 2024, but must indicate how they intend to comply by 1 March at the latest.

Datatilsynet link


See also how the Norwegian Data Protection Authority comments on the case. It seems they believe the same situation could exist in Norway, but that they have not yet completed an assessment.

Personvernbloggen link

Reprimand for asking for too much information when handling an access request

We have talked about it many times - you must be sure that you give information to the right person, but you cannot ask for more identifying information than necessary.


This is demonstrated by a case handled by the Cypriot DPA, which reprimanded a controller for unlawfully requesting a data subject's ID when answering an access request and for failing to inform its users properly about a data breach.


In this case, a data subject requested a copy of their personal data under Article 15(3) GDPR after learning from the media that the controller (which operates a website) had suffered a data breach. The data subject denied ever creating an account on the website and claimed that someone else must have done so using their e-mail address.


The controller replied that the only data it processed was the data subject's IP address and username, collected upon registration, and that the data subject needed to provide an ID for the controller to identify them and proceed with the access request. The data subject answered that an ID was not necessary, as they were writing from the registered e-mail address, and consequently filed a complaint against the controller. The controller argued that it required an ID for access requests in order to protect users from unauthorized disclosures, in line with Recital 64 GDPR.


The DPA took its final decision on 4 October 2023 and held that the controller had violated Article 5(1)(c) GDPR by collecting IDs solely to comply with data subjects' access requests, as this was excessive. It is not always easy to assess what level of certainty is required, but as a general rule it should suffice to use the e-mail address that the controller already has available.
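
As an aside, a proportionate alternative is often to verify control of the e-mail address already on file. Below is a minimal, hypothetical sketch of such a flow (the names and the in-memory store are my own illustration, not from the decision): a single-use confirmation link is sent to the registered address, and a valid click is treated as sufficient identification for the access request.

```python
import secrets
import time

# Hypothetical sketch: verify a data subject through the e-mail address
# the controller already holds, instead of demanding a copy of an ID.
# In production the token would be e-mailed and stored persistently.

PENDING = {}          # token -> (email on file, expiry timestamp)
TOKEN_TTL = 15 * 60   # confirmation links expire after 15 minutes

def start_verification(email_on_file: str) -> str:
    """Issue a single-use token and send a confirmation link to the
    address already associated with the account."""
    token = secrets.token_urlsafe(32)
    PENDING[token] = (email_on_file, time.time() + TOKEN_TTL)
    print(f"(pretend) e-mailing confirmation link to {email_on_file}")
    return token

def confirm(token: str) -> str | None:
    """Return the verified address if the token is valid and unexpired.
    A valid click proves control of the mailbox on file, which should
    normally be proportionate identification for an access request."""
    entry = PENDING.pop(token, None)   # single use: pop, don't get
    if entry is None:
        return None
    email, expiry = entry
    return email if time.time() <= expiry else None

# Usage sketch
token = start_verification("subject@example.com")
assert confirm(token) == "subject@example.com"
```

The design point is simply that the check matches the channel the controller already uses: no new identity documents are collected, so no excessive data is processed.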

GDPRhub link

Technical limitations are no excuse - Klarna AB

The Swedish Authority for Privacy Protection (IMY) has opened supervision of Klarna Bank AB ("Klarna") following a complaint from a German citizen. The complaint was handed over to IMY as the lead supervisory authority under Article 56 GDPR.


The complainant has a Klarna card and requested correction of the e-mail address linked to the card. In its feedback to the complainant, Klarna stated that it is not technically possible to change the e-mail address linked to the card and that the complainant therefore needs to order a new card to have the e-mail address changed. A new Klarna card would, however, affect the complainant's creditworthiness.


IMY notes that the investigation shows that Klarna processed personal data in violation of Article 12(2) GDPR by not facilitating the complainant's exercise of the right under Article 16 to have the e-mail address corrected. IMY issued Klarna a reprimand under Article 58(2)(b) GDPR for violations of Articles 12(2) and 16; failing to correct the e-mail address is itself a breach of Article 16. Here we see IMY interfering with how Klarna has set up its internal systems. I believe this is a trend: enforcing the GDPR increasingly also implies opinions on how businesses (and sometimes business models) are set up.

IMY link

Employee monitoring

On 27 December 2023, the French Data Protection Authority (CNIL) fined AMAZON FRANCE LOGISTIQUE €32 million for setting up an excessively intrusive system for monitoring employee activity and performance. The company was also fined for video surveillance without adequate information and without sufficient security.


The company manages the AMAZON group's large warehouses in France, where it receives and stores items and then prepares parcels for delivery to customers. As part of its activities, each warehouse employee is given a scanner to document the performance of assigned tasks in real time. Each scan results in a recording of data, which is stored and used to calculate indicators on the quality, productivity and periods of inactivity of each employee.


CNIL considered that the system for monitoring employee activity and performance was excessive. It ruled that it was illegal to set up a system measuring work interruptions with such accuracy, potentially requiring employees to justify every break. CNIL also ruled that the system for measuring the speed at which items were scanned was excessive. More generally, CNIL considered it excessive to keep all the data collected by the system, as well as the resulting statistical indicators, for all employees and temporary workers for a period of 31 days.
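
To see why such indicators are so intrusive, consider a deliberately simplified, hypothetical sketch (not Amazon's actual system) of how scanner timestamps can be turned into an inactivity indicator: every gap between two consecutive scans above a threshold becomes a recorded interruption that the employee may later be asked to justify.

```python
from datetime import datetime, timedelta

# Hypothetical sketch: each scan event carries a timestamp; gaps between
# consecutive scans above a threshold are flagged as "interruptions".
IDLE_THRESHOLD = timedelta(minutes=10)

def interruptions(scan_times: list[datetime]) -> list[timedelta]:
    """Return every gap between consecutive scans that exceeds the
    threshold - effectively a log of one employee's breaks."""
    ordered = sorted(scan_times)
    gaps = (later - earlier for earlier, later in zip(ordered, ordered[1:]))
    return [gap for gap in gaps if gap > IDLE_THRESHOLD]

# One employee's morning: a 15-minute pause between scans is flagged.
scans = [
    datetime(2023, 5, 4, 9, 0),
    datetime(2023, 5, 4, 9, 3),
    datetime(2023, 5, 4, 9, 18),
    datetime(2023, 5, 4, 9, 20),
]
print(interruptions(scans))  # [datetime.timedelta(seconds=900)]
```

Because every scan is recorded anyway, the exhaustiveness CNIL objected to comes for free: the same event stream yields speed, quality and idle-time metrics per employee with no extra data collection.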


In determining the amount of the penalty, CNIL's restricted committee took into account in particular that processing employee data through the scanners differed from traditional activity monitoring in its scale, both in its exhaustiveness and its permanence, and led to very close and detailed monitoring of employees' work.

CNIL link

Uber does not protect its drivers' privacy - gets a big fine in the Netherlands

The Dutch data protection authority (DPA) has fined Uber EUR 10 million for infringing privacy regulations regarding its drivers' personal data.


The DPA found that Uber had not specified in its terms and conditions how long it retained drivers' personal data, or how it secured the data when sending it to unnamed entities in countries outside the EEA.


Uber also obstructed its drivers' efforts to exercise their right to privacy by making personal data access requests unnecessarily complicated.


The fine was imposed after more than 170 French drivers complained to a French human rights organization, which lodged a complaint with the French data protection authority. As Uber has its European headquarters in the Netherlands, the complaint was forwarded to the Dutch DPA.

Reuters link

Still rough weather for Facebook

The Dutch government is considering withdrawing from Facebook altogether over serious concerns about how the social media platform handles data security.


The government has been worried for years about how Facebook handles privacy-sensitive data, and a DPIA conducted by the government revealed serious shortcomings. The government has discussed this with Meta, but that has not resulted in satisfactory commitments or improvements for the state. Apparently, Meta does not agree with the findings in the DPIA.


In November, the government asked the Dutch data protection authority for advice on whether it is safe to keep using Facebook. That advice is expected soon. The indications are, however, that the government expects to ban Facebook and is already making preparations.

NL Times link

Privacy in cars

A consumer group has found that the "Connected Services" feature built into new Toyota cars can send personal and vehicle data to third parties. If drivers have the feature removed, they risk voiding their warranty.


Toyota insists that it takes customer privacy "extremely seriously" but has acknowledged that the "Connected Services" feature can only be disabled, not removed, from its cars; removal could void the warranty and render Bluetooth and the speakers non-functional.


Toyota's "Connected Services" feature has been found to collect information such as vehicle location, driving data, fuel levels, and even phone numbers and e-mail addresses. If you do not opt out of the service, it will collect and use personal and vehicle data for research, product development and data analysis purposes. Under some circumstances it may also share the data with third parties, such as debt collectors or insurance companies.


I believe privacy in cars has not yet really caught the eye of regulators, although there are a few cases. When it does, there are clearly matters that need to be addressed.

The Guardian link

EDPB clarifies notion of main establishment

The EDPB has adopted an Opinion on the notion of main establishment and the criteria for applying the One-Stop-Shop mechanism. The Opinion clarifies the notion of a controller's "main establishment" in the EU, in particular for cases where decisions on the processing are taken outside the EU.


The Opinion clarifies when a controller may have no main establishment in the Union, in which case the One-Stop-Shop does not apply. The main points to note:


  • For the One-Stop-Shop to apply, an establishment of the controller must take decisions on the purposes and means of processing within the EU.
  • If decisions on the purposes and means, and the power to have them implemented, are exercised outside the EU, the One-Stop-Shop will not apply.
  • The burden of proof falls on the controller: regional HQs, ROPAs and privacy policy disclosures are all "relevant elements", but they are not determinative, and DPAs can investigate and challenge the reality of decision-making.
  • Main establishment is assessed per processing operation: having an EU main establishment for some processing does not mean it is the main establishment for all processing.
  • The Opinion specifically cautions that the GDPR does not permit "forum shopping" in the identification of the main establishment.

This Opinion is the latest in a series of concrete actions taken by the EDPB following its Vienna Statement on cross-border enforcement, aiming to streamline enforcement and cooperation among DPAs.

EDPB link

NOYB has investigated the state of privacy in Europe

In November 2023, NOYB conducted an online survey to gain reliable insight into the practical implementation of the GDPR. Some may say that NOYB's findings are likely to be seen as too privacy-focused, but the report is actually quite a good analysis of what the GDPR's objectives were and where things stand today.


The survey included, inter alia, questions about companies’ GDPR compliance, the difficulty of convincing other departments or employees within a company of the importance of GDPR compliance, and questions about the most relevant factors that influence GDPR compliance. The survey focused on data protection officers (DPOs) and professionals working in the field of GDPR compliance.


The report sheds light on the prevalent issue of non-compliance with the GDPR across various sectors within the EU. NOYB identified significant gaps and deficiencies in GDPR implementation, highlighting the urgent need for stronger enforcement mechanisms and enhanced accountability measures.


The report reveals widespread non-compliance with GDPR provisions among organizations. Many entities fail to adequately protect individuals' privacy rights and secure their personal data in accordance with GDPR. Further, they found a lack of transparency and accountability in data processing practices, with organizations often failing to provide clear and accessible information about their data collection, storage, and usage policies. They also found that many organizations lack robust data security measures to safeguard against data breaches and unauthorized access. This puts individuals' personal data at risk of exploitation and misuse.


Also, they noted that consent management processes are often flawed, with organizations relying on ambiguous or coercive consent mechanisms that do not meet GDPR standards for informed and freely given consent. There is also a general lack of awareness among individuals about their rights under GDPR, and organizations frequently fail to facilitate the exercise of data subject rights in a timely and transparent manner.


The report makes recommendations for improvement:

  • Regulatory authorities must strengthen enforcement efforts to hold non-compliant organizations accountable and deter future violations through fines and sanctions.
  • Organizations should prioritize investments in robust data protection measures, including encryption, access controls and regular security audits, to mitigate the risk of data breaches and enhance overall data security.
  • Organizations must also prioritize transparency and clarity in their data processing practices, providing individuals with clear and accessible information about how their personal data is collected, used and shared, and ensuring that consent mechanisms meet GDPR standards for informed and freely given consent.
  • Efforts should be made to raise awareness among individuals about their rights under GDPR and to empower them to assert greater control over their personal data through the exercise of data subject rights.


In conclusion, the report underlines the urgent need for concerted efforts from regulatory authorities, organizations, and individuals alike to foster a culture of compliance with GDPR principles and uphold individuals' fundamental rights to data protection within the EU.

NOYB link

Do you have any questions?