Newsletter

Privacy Corner

by Eva Jarbekk



The end of 2022 was marked by some very large fines from various data protection authorities. Most of the cases had been pending for several years before reaching their final decisions, and it will probably still be a while before all rounds in the legal systems are over for many of them. A central point in many of the cases is that the data protection authorities place great weight on the guidelines from the European Data Protection Board (EDPB) when interpreting the GDPR. It will be interesting to see whether the EDPB's guidelines carry enough legal weight to justify fines as large as the ones we are now seeing.

Looking at what will be most relevant in 2023, it is clear that data security will become increasingly important, especially as questions are being raised about whether it will even be possible to insure against such incidents in the future; see more about this below. Third-country transfers will continue to be a hot topic. There will probably be a new agreement between the US and the EU on the transfer of personal data to the US, but I am not sure it will make compliance work any easier. NOYB is clearly critical of the agreement, and I often wonder why there is so little talk of transfers of personal data to India. I predict we will be working with Transfer Impact Assessments (TIAs) for a long time to come, and not just for transfers to the USA. In Norway, it is conceivable that we will get clearer cookie rules, and many will have to change their practices quite quickly. We are all going to have enough to do. Happy new data protection year!


First of all, we have to look at what is, in many ways, one of the most important data protection cases in the world: the Cambridge Analytica case, in which the company was allowed to use data from Facebook and used that data to influence election results. On Christmas Eve, it became known that Facebook's parent company Meta has entered into a settlement in the USA after a long-running class action in the case. Meta has agreed to pay no less than USD 725 million, i.e. over NOK 7 billion. The settlement is said to be the largest pay-out ever in a data protection class action. It must be approved by a court in San Francisco, which is expected to happen in March this year. The case has been discussed in many places, and an overview can be found here.


The Norwegian website digi.no recently published an interesting article in which the head of the insurance giant Zurich Insurance Group (Zurich) says that it may become impossible to insure against cyber-attacks in the future. His simple point is that if someone takes control of critical infrastructure, the consequences are so expensive that they cannot be insured against. The article points out that Hydro lost NOK 1 billion in the 2019 hacker attack and reportedly received approximately NOK 800 million in insurance coverage. The NotPetya malware hit many businesses in 2017; the company Mondelez was one of them. Mondelez had cyber insurance with Zurich, and the parties ended up in a lawsuit over the compensation. Zurich argued that the incident was a "war-like act" and therefore not covered, in the same way as force majeure. The outcome of the case is not known, as the settlement is exempt from public disclosure. The insurance market Lloyd's of London takes another angle and has stated that cyber insurance policies must contain exclusions for state-sponsored cyber-attacks. One may wonder how it is possible to establish that an attack is state-sponsored. At the same time, it is obvious that private insurance companies will not insure situations where they will obviously lose money. The case is discussed in several places, see here and here.


At the end of November last year, the Irish Data Protection Authority announced its decision to fine Facebook for so-called "data scraping", an activity that was possible on Facebook between 25 May 2018 and September 2019. The personal data of 530 million users could be scraped (i.e. pulled from the website into reports generated by third parties), and Facebook was criticized for not having sufficient data protection by design and by default. Although Meta defended itself by saying that it had changed its systems during the period and reduced the possibility of scraping data from the platform, the Data Protection Authority did not give this weight when setting the fine. This is something we see in a number of cases: even if the processing is adjusted and brought into line with the regulations, this does not always mean that a fine is reduced. The fine of EUR 265 million – over NOK 2.5 billion – is extremely high, and it will certainly go through an appeal process and presumably a trial before it becomes final. For many companies, however, it is a useful reminder that increasing emphasis is being placed on ensuring that personal data cannot be misused. The data taken in this case consisted of phone numbers, Facebook IDs, names and dates of birth, and many will probably argue that it was not particularly compromising data. The case is discussed here.


At the beginning of December, the EDPB discussed a case, or actually three cases together, namely Meta's processing of personal data in Facebook, Instagram and WhatsApp. The underlying question is whether processing of personal data can be based on the legal basis "contract" when the purpose is behaviour-based marketing or improvement of the company's services. As usual, the Irish Data Protection Authority has handled the cases because of the one-stop-shop mechanism. However, its draft decisions were criticized by several other countries' data protection authorities, including the Norwegian Data Protection Authority. In its binding decisions, the EDPB has laid down clear guidelines for the decision the Irish Data Protection Authority is to make. The EDPB's decision is therefore not yet known, but it will be made public when the Irish Data Protection Authority issues its final decision. The case is well described on the Danish Data Protection Authority's website here, and NOYB mentions it here. We will come back to this when the decision is public, because it is highly relevant to the design of privacy statements and terms of use.


In Italy, the perfume retailer Douglas has been fined EUR 1.4 million for not respecting individuals' rights. A customer complained, and the data protection authority started an investigation of Douglas's database, which covered approximately 10 million customers. Although the inspection found that Douglas largely safeguarded individuals' rights, it found violations of several of the provisions of the GDPR. Douglas was required to make a number of improvements. Among other things, it must change the layout of its app so that there is a clear difference between the privacy statement and the cookie policy, and it is now required to delete personal data no later than 15 days after a customer fails to renew their loyalty card. The case is interesting and should be read by any business that uses loyalty programs and consumer apps. The fine appears high compared with practice in other countries. The case can be read about here.


At the end of 2022, the French Data Protection Authority (CNIL) announced that it is fining Microsoft EUR 60 million for incorrect use of cookies, the largest fine CNIL issued in 2022. The situation was simply that the search engine Bing did not allow users to decline cookies as easily as they could accept them. This is a principle about which there have been many cases over time, and although practices historically may not have been correct, there is no reason to get this wrong going forward. In Norway, the situation is still not quite parallel to the countries in the EU, but a legislative proposal on this will probably come during the first half of the year. The case is discussed here.
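
For those who build their own consent solutions, the underlying principle is easy to illustrate in code. Below is a minimal, hypothetical sketch of a banner where rejecting cookies takes exactly one click, just like accepting them; the element names, button texts and storage key are my own assumptions and are not taken from the CNIL decision.

```typescript
// Minimal illustrative sketch (not from the CNIL decision): a consent banner
// where rejecting cookies takes exactly one click, just like accepting them.
// All element names and the storage key are assumptions for the example.

type ConsentChoice = "accepted" | "rejected";

const CONSENT_KEY = "cookie-consent"; // hypothetical storage key

function rememberChoice(choice: ConsentChoice): void {
  // Persist the user's choice so the banner is not shown again.
  localStorage.setItem(CONSENT_KEY, choice);
}

function renderConsentBanner(onChoice: (choice: ConsentChoice) => void): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.textContent = "We use cookies for analytics and marketing. ";

  // Both buttons sit at the same level and each requires a single click,
  // so declining is no harder than accepting.
  const accept = document.createElement("button");
  accept.textContent = "Accept all";
  accept.onclick = () => { onChoice("accepted"); banner.remove(); };

  const reject = document.createElement("button");
  reject.textContent = "Reject all";
  reject.onclick = () => { onChoice("rejected"); banner.remove(); };

  banner.append(accept, reject);
  document.body.append(banner);
}

// Show the banner only if no choice has been stored yet.
if (localStorage.getItem(CONSENT_KEY) === null) {
  renderConsentBanner(rememberChoice);
}
```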


In Norway, one of the most discussed cases is a notification of a decision against Statistics Norway (SSB). Statistics Norway wanted to collect personal data about grocery purchases via receipt data ("bongdata") and use it to create new statistics on which groups buy what. Statistics Norway has special authority under Section 10 of the Statistics Act for such analysis, and the question is whether this legal basis actually covers the activity Statistics Norway envisages. A notification of a decision is not a final decision, and it remains to be seen how this will end. Nevertheless, the case shows that public sector players must identify the limits of what a statute permits, in much the same way as the private sector often discusses when a contract can provide a legal basis for processing. The case is discussed here.


I have been made aware of a rather special case from Germany, specifically from Cologne. The case stems from a particular German rule which says that a contract must be possible to terminate as easily as it was entered into. In other words, if a single click is enough to enter into a contract, a single click must be enough to terminate it. This is similar to how consent can be withdrawn under the GDPR. In this case, a customer of telecommunications services could not terminate the service as easily as he had entered into it, and he complained. He had to enter a password for the service in order to terminate it, and the court found that the provider could not require this. Of course, this also opens the door to abuse, in that services may be terminated without it being entirely certain that the right person is doing so. The case seems to be hotly debated in Germany and is relevant for those who offer services in the German market: you must ensure that a so-called "termination button" is implemented correctly. The case is discussed here.


When the GDPR entered into force, many wrote that the large fines could open the door to extortion: criminals holding personal data they should not have could blackmail the company they stole it from. There do not appear to have been many such cases, but now the already rather hard-tested Elon Musk may become a victim of exactly this. Last year, Twitter had a leak of fairly trivial personal data about 5.4 million of its users. Even if the data is as simple as phone numbers and email addresses, it will not always be considered common knowledge, and the potential fine for Twitter is no less than USD 276 million. The person who stole the data is threatening Musk that he will make the information public if he does not receive a ransom for it – and thus expose Twitter to a large fine. So far, Twitter has not officially commented on the matter, but it is discussed here.


As I wrote in the previous newsletter, India has updated its proposal for a new data protection act. The question is whether the revised draft is sufficient for India one day, perhaps in the not too distant future, to be approved as an adequate country in terms of data protection legislation. I will not go into the details of the new draft here, but there is still a lot of criticism of it for giving the government too much freedom and control. A more detailed review of the rules can be found here.


The Danish Data Protection Authority has recently emphasized that the GDPR also applies to personal data in printed material. That is nothing new in itself, but they have made a poster that you can hang next to a printer to remind users of it. A link to the poster can be found here.


In mid-December, Uber was hit by yet another security incident. One of its data processors was attacked, and data on 77,000 employees was stolen and quickly dumped on the dark web. One of the comments surrounding the case is that hackers who publish material in this way are almost worse than those who engage in extortion, because the data quickly becomes public and can be misused. It is claimed that the method of attack was to bombard an internal user with a large number of multi-factor authentication (MFA) prompts until the user accepted one in order to stop them. On the other hand, if you receive a flood of MFA prompts, you should perhaps realize that something is wrong. In any case, you have to check whether the security is set up correctly. Uber has had several incidents in recent months and is now criticized for not having learned its lesson. This is of course not good for the company's reputation, and it is claimed that the share price has fallen by 5.2% as a result. The case is discussed here.
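
Checking whether the security is set up correctly can be as simple as limiting how many MFA push prompts a user can receive within a short period. Below is a minimal, hypothetical sketch of such throttling; the thresholds, names and alerting choice are my own assumptions and say nothing about how Uber or its data processor actually configured their systems.

```typescript
// Minimal illustrative sketch: throttle MFA push prompts per user so a flood of
// requests is blocked and flagged instead of wearing the user down.
// Thresholds and names are assumptions for the example.

const MAX_PROMPTS = 3;              // allowed prompts per window
const WINDOW_MS = 10 * 60 * 1000;   // 10-minute sliding window

const promptLog = new Map<string, number[]>(); // user -> timestamps of recent prompts

type PromptDecision = "send" | "block-and-alert";

function shouldSendMfaPrompt(userId: string, now: number = Date.now()): PromptDecision {
  // Keep only the prompts that fall within the sliding window.
  const recent = (promptLog.get(userId) ?? []).filter(t => now - t < WINDOW_MS);

  if (recent.length >= MAX_PROMPTS) {
    // Too many prompts in a short time: stop sending, alert the security team,
    // and fall back to a phishing-resistant factor instead.
    promptLog.set(userId, recent);
    return "block-and-alert";
  }

  recent.push(now);
  promptLog.set(userId, recent);
  return "send";
}

// Example: the fourth prompt within ten minutes is blocked.
for (let i = 0; i < 4; i++) {
  console.log(`Prompt ${i + 1}: ${shouldSendMfaPrompt("employee-42")}`);
}
```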

Do you have any questions?