Newsletter

Privacy Corner

by Eva Jarbekk

There is no point writing that "a lot has happened lately" – that is how it always is with data protection. There are constantly new decisions and interpretations to consider, and this is a challenge. But the GDPR is only 5 years old. It is only natural that it takes some time to get a regulatory framework of this size "in place" when all countries in Europe are to apply it uniformly. I believe we probably need another 5 years for things to become more predictable. In any case, data protection has improved considerably over these years. We are now seeing a trend where data protection is becoming more important in acquisition situations as well, and this will be a strong driver towards better data protection in the future. Meta's historically high fine is also important. Read more on this and other key issues below. Happy 5th anniversary and happy reading!

Meta fined EUR 1.2 billion

Meta has been issued the largest fine in GDPR history. The violation consisted of the company transferring data from European users to the United States in breach of Chapter V of the General Data Protection Regulation. The breach was considered very serious because Meta had unlawfully transferred information in a systematic and repetitive manner, and because the volume of the transferred information was massive. The fine amounts to approximately NOK 14 billion.


The European Data Protection Board (EDPB) has stated that the fine must be seen as a strong signal to companies that serious breaches of the regulation's provisions will have far-reaching consequences. This is an important message, and I have noticed that many are now more cautious about entering into agreements involving the transfer of personal data to the United States.


Meta has been given until October to rectify the situation or stop the transfers. It is unclear whether the new framework for transfers of personal data from the EU to the United States will be in place in time for Meta to continue the transfers. It is conceivable that it will be adopted before Meta's suspension deadline expires, but Meta itself notes in its reports that this is not certain, and that this could be problematic for the company. Many of the people I talk to do not believe that the new transfer agreement will be in place in time.


At the same time, many ask whether the fine is strict enough. An alternative approach could have been to require the company to immediately stop transfers between the EU and the United States, to require immediate deletion of personal data from US servers – and to impose a daily fine on Meta for each day that passed without the requirements being met. It has been argued that such sanctions would have been a greater incentive for Meta to bring its operations into compliance with the GDPR and to ensure that similar breaches did not take place again. A high daily fine would also have encouraged companies other than Meta to ensure that their own processing was GDPR compliant.

What does the Schrems II verdict mean for the big tech companies?

The Schrems II judgment came in 2020. As you know, it became more difficult to transfer personal data to the United States – and one question is what this means for US companies. For a long time, many have said that the verdict does not really entail any major changes. That is not necessarily the case. It is interesting to see how this is now being discussed in a number of the companies' annual and quarterly reports.


Alphabet writes in its annual report for 2022 that until a new data protection framework between the United States and the EU has been adopted by the EU, there is considerable uncertainty surrounding the use of the Commission's standard contractual clauses as a basis for transferring personal data between the United States and the EU. Moreover, the company writes that regulatory sanctioning of this type of transfer could harm its ability to offer, and its customers' ability to make use of, certain features that would otherwise be available.


Like Alphabet, Telefonica Deutschland states in its 2022 annual report that the invalidation of the Privacy Shield framework is likely to create uncertainty in various contractual relationships. Furthermore, it argues that there does not appear to be a consensus among European supervisory authorities on how the legal status of the exchange of personal data is to be understood, which increases the risk of companies being found guilty of violations. Salesforce writes that its costs, and the complexity of providing services in "certain markets", are increasing. Microsoft's quarterly report states that the Schrems II verdict continues to create uncertainty regarding the legal requirements for transfers of information from the EU to the United States. It also states that the verdict has led certain supervisory authorities in the EU to block the use of US-based services involving the transfer of personal data to the United States. New and stricter rules for data transfers between the EU and the United States may increase the costs and complexity of delivering the company's services and products.


In summary, the companies write that the Schrems II case actually does have an impact. This is hardly surprising, but it is refreshing that it is stated clearly in writing. I found this discussed in a post on LinkedIn.

The Irish Data Protection Authority is out of step with data protection authorities in other EU countries

Although the Irish Data Protection Authority (DPC) has now issued the largest fine in GDPR history, there has often been disagreement between the DPC and other countries' data protection authorities about the level of sanctions. A report by the Irish Council for Civil Liberties (ICCL) shows that three-quarters of the Irish Data Protection Authority's decisions over a period of five years have been overruled by the European Data Protection Board (EDPB).


In these cases, the Board demanded stricter enforcement than what the Irish Data Protection Authority had concluded in its decisions. An overrule rate of 75 percent is obviously very high. The report states that the Irish Data Protection Authority tends to use its discretion under Irish law to land on an "amicable resolution" rather than exercising enforcement measures against the business subject to a complaint. The report therefore argues that Ireland is the "bottleneck of enforcement" for cross-border cases in Europe.


Since a large share of the complaints against the tech giants have been met with mere reprimands, the enforcement mechanism has largely been paralysed. Last summer, the Irish government announced that two additional commissioners would be hired and that Helen Dixon would be promoted to chairwoman to better deal with the ever-increasing workload of the Data Protection Authority. However, it is still uncertain whether the DPC will change its practices. My guess is that it will continue to be more lenient than other supervisory authorities, that those authorities will then object to the EDPB, and that the DPC will consequently have to increase its fines, so things will simply take a little longer than if the DPC had changed its practices itself.


The report is discussed here.

Case against Uber and Ola Cabs

In a case before the Amsterdam Court of Appeal, a group of drivers succeeded with their claim that Uber and Ola Cabs had violated the GDPR by using non-transparent algorithms as a basis for fining and dismissing the drivers.


One part of Uber's defense was that the algorithms' decisions had been reviewed and evaluated by humans – a common method of using algorithms in case processing. It is therefore particularly interesting that the appellate court rejected this, saying that the human evaluations could not be considered to be more than just a symbolic act. The algorithmic decisions thus had to be considered to have been made automatically.


Under Article 22 of the GDPR, the drivers had the right not to be subject to automated decision-making processes when it came to decisions that significantly affected them. Consequently, there was a breach of Article 22 in this case. In addition, the drivers' right to information pursuant to Articles 13-15 of the GDPR also had to be considered to have been violated.


With regard to the drivers' right to an explanation of the algorithmic decisions, the companies argued that they had to be entitled to withhold information pertaining to the functionality of the algorithms used to detect fraud. The basis was that such information had to be regarded as trade secrets and that sharing information could enable circumvention of these processes. The appellate court countered this by saying that the withholding of information about the functionality of the algorithms was not proportionate to the negative effects experienced by the drivers.


The verdict is important because it shows the need for transparent algorithmic solutions. When drivers are not given information about the decision-making process, it is practically impossible for them to assess the fairness of the algorithms. At the same time, one wonders how and if transparency is possible at all – that will depend on the types of algorithms used. In any case, this is clearly an important decision for anyone planning to use AI. The verdict is discussed here.

Can Google Analytics be legal after all?

The Norwegian Data Protection Authority recently wrote in its updated guide on third-country transfers that it is not necessary after all to implement additional measures for transfers to countries where the law is problematic, if there is no reason to believe that the problematic legislation will apply to the specific transfer. In assessing whether the problematic legislation will apply to the transfer, the Authority stated that emphasis may be placed on the data importer's practical experience as well as the experience of similar players in the same industry.


In a blog post from January 2022, Google wrote that it has had no requests for access from US authorities over a period of fifteen years prior to January 2022. The statements in the blog post represent experience that Google has gained over a long period of time. The company was not obliged to make the statement, and information from high-ranking employees of the company must be classified as credible.


The Norwegian Data Protection Authority will announce new guidelines for the use of Google Analytics in Norway shortly. It will be interesting to see whether the Authority takes Google's statements in the blog post about their experience into account – something that would be in line with the information they present in their own guide. I wrote an opinion piece about this in Digi together with some colleagues, which can be found here.

Marketing consents

If personal data is processed for direct marketing purposes, the data subject has the right to object to the processing of personal data concerning him or her. If the data subject objects, the personal data shall no longer be processed for such purposes, see Article 21(2) and (3) GDPR. In its annual report for 2022, the data protection authority of the German state of Hesse addressed a special situation concerning this issue, which may be relevant for many.


Let's say that a customer has bought something in an online store and opted out of advertising there. The customer then buys something through the company's app – can ads then be sent based on this purchase? In this specific case, the company did just that. Their reasoning was that the customer data from the app and from the online store was stored in different places.


The data protection authority in Hesse concluded that this was not legal, because an opt-out from advertising does not depend on the sales channel or the underlying technical solutions. The customer's objection to the processing had to be considered valid until withdrawn by him or her.


The report further states that if it is not possible for the company to have a uniform customer information database for online store and app customers, the company must develop other technical and organisational solutions to ensure that customers' data protection rights are actually respected. This is a reminder that you should think in terms of data protection by design from day 1 when developing new services.
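The channel-independent opt-out the Hesse authority describes can be illustrated with a minimal sketch. All names here (`ConsentStore`, `may_send_marketing`, the customer ID) are hypothetical illustrations, not taken from the report; the point is simply that the objection is stored once, centrally, and that the check deliberately ignores which sales channel a purchase came through.

```python
# Hypothetical sketch: one shared consent record per customer, so an opt-out
# registered in the online store also blocks marketing via the app.
from dataclasses import dataclass, field


@dataclass
class ConsentStore:
    # One set of opted-out customer IDs, shared by every sales channel.
    _opted_out: set = field(default_factory=set)

    def object_to_marketing(self, customer_id: str) -> None:
        # Article 21(3): once the customer objects, direct marketing must stop.
        self._opted_out.add(customer_id)

    def withdraw_objection(self, customer_id: str) -> None:
        self._opted_out.discard(customer_id)

    def may_send_marketing(self, customer_id: str, channel: str) -> bool:
        # The channel argument is deliberately ignored: per the Hesse
        # authority's reasoning, the objection is not tied to where the
        # purchase was made or how the data happens to be stored.
        return customer_id not in self._opted_out


store = ConsentStore()
store.object_to_marketing("customer-42")   # opt-out made in the online store
print(store.may_send_marketing("customer-42", channel="app"))   # False
```

If a single database is not feasible, the same effect must be achieved by other technical and organisational measures, for example by synchronising objections between the systems before any campaign is sent.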


The report is available here.

The data controller may be fined for the data processor's breach of the GDPR

In a new case before the European Court of Justice, Advocate General Emiliou commented on whether the data controller can be held liable – and thus also be fined – as a result of the data processor's unlawful processing of personal data. In his opinion, the Advocate General concluded that Article 83 of the GDPR provides grounds for the data controller to be held liable for the data processor's breach of the GDPR – even if the data controller itself has not done anything unlawful. The opinion is highly relevant, since many companies use other companies to handle personal data.


However, the data controller, i.e. the contracting entity, cannot be held liable in every case – only when the data processor acts within the framework of the instructions given by the data controller. If the data processor exceeds this mandate and uses data for its own purposes, the data controller cannot be fined for the data processor's unlawful processing of personal data.


The fact that the data controller only becomes liable for the data processor when the latter acts within the framework of the instructions given by the former makes the instructions important. Questions to be clarified include who is to issue the instructions and in what form. Moreover, it is essential to clarify what the instructions specifically cover: the vaguer and more discretionary the instructions, the greater the risk that the data controller will be fined for the data processor's potential violations. I believe many data processing agreements can be improved on this point.

Large fine from the data protection authority in Croatia to a debt collection agency

In December 2022, the Croatian data protection authority received an anonymous complaint alleging that a debt collection agency, B2 Kapital, was processing a large amount of personal data on debtors unlawfully. The authority concluded that the agency had not informed the debtors about the processing of personal data in a clear and transparent manner in accordance with Article 13 of the GDPR. Nor did it have a data processing agreement with the data processor responsible for monitoring the personal data in the event of consumer bankruptcies. The data protection authority concluded that the debt collection agency did not have adequate security measures and issued a fine of EUR 2.265 million. The decision is discussed here.


When working with data protection issues, it is necessary to prioritise between many types of data protection risks. Processing personal data in ways that are not in line with the information provided to the data subjects is a fairly large risk, and something you should focus on specifically when carrying out internal checks in your own company – especially because I am afraid this is not entirely uncommon.

Compensation for any damage in the event of a GDPR breach?

In May, the European Court of Justice stated in case no. C-300/21 that not every breach of the GDPR triggers the right to compensation under Article 82 of the regulation. The case involved an Austrian citizen who sued the national postal service because it had predicted citizens' political views based on sociodemographic criteria without their knowledge or consent.


The European Court of Justice wrote that three conditions must be met to claim compensation under the regulation: 1) The processing of personal data must constitute a breach of the provisions of the GDPR, 2) The individual claiming compensation must have suffered damage, 3) There must be a causal link between the unlawful processing of information and the damage that has occurred.


Although it is required that the individual has suffered damage, the Court stated that it is not required that the damage meets a certain level of seriousness. This must be seen in light of the fact that it is important to have a broad concept of damage to ensure that the GDPR gives the individual the best protection possible. However, a far-reaching concept of damage may pose a major risk to businesses that risk being sued due to trivial damage suffered by the individual, for example if a breach of the GDPR has only resulted in the individual feeling upset and angry. In turn, this will also impose a heavy burden on the judicial system.


The GDPR contains no provisions laying down guidelines for assessing the scope of the compensation the injured individual is entitled to. In the absence of such provisions, the member states themselves must formulate rules on this. However, the European Court of Justice points out that the member states' rules must respect the EU law principles of equivalence and effectiveness.


However, such a solution could lead to different interpretations of the concept of compensation in the various EU countries, which in turn could result in the level of compensation differing depending on which EU country you live in. This is of course unfortunate from an overall perspective. We will probably see many more such cases in the future. TechCrunch has a good article on the matter here.

Data Protection Guide for small businesses

The EDPB has launched a data protection guide to help smaller businesses adapt to data protection regulations. The purpose is to raise awareness of the GDPR and make practical information more easily accessible and more understandable, through videos, graphics and other practical materials.


Upon closer examination of the guide, it may be objected that it is really quite advanced for small businesses. However, in light of the complexity of the regulations, there is a limit to how easily the material can be communicated.


An example of how the guide can be useful for businesses is the description of how a DPIA can be implemented and scaled. The guide can also be useful as an introduction for data protection beginners and can probably work well for training purposes in many companies.


The guide is available here.

Do you have any questions?