Newsletter

Privacy corner

by Eva Jarbekk

Published:

Security camera surveillance

As always, there are many new decisions and happenings in privacy. Below is an update on some of the most important events that took place in November 2022.

In ECJ decision no. C-77/21, the court comments for the first time, as far as I know, on the use of personal data in tests. The case shows that the principle of purpose limitation does not prevent a data controller from using personal data in a database for test purposes, even if the personal data had previously been collected and processed in another database, provided the testing is "compatible" with the original purpose(s). The ECJ also writes that the performance of tests and the correction of errors had a concrete connection with the original contract with the data subjects – which concerned a subscription. Read more here: https://www.linkedin.com/posts/dr-carlo-piltz-631571b_language-of-document-activity-6989552698605850624-aP9b/?utm_source=share&utm_medium=member_ios


AI is important to many, and its importance will only increase. Some have experienced difficulty finding a legal basis for using personal data to develop an AI. The previous round of the Norwegian Data Protection Authority's sandbox for AI included a case using so-called federated learning. In short, this means that a model can be trained on one set of data and then further trained on another set – without the data from the two sets ever being combined. This makes it easier to find a legal basis for the processing, because you do not have to find a basis for mixing the data. It is of course a precondition that no data is taken out of the first set. The Data Protection Authority has arranged a seminar on this, and if you are going to use AI in the future, this is well worth looking into. The link: https://www.datatilsynet.no/aktuelt/aktuelle-nyheter-2022/sandkasseseminaret-2022/
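To illustrate the idea, here is a minimal sketch of federated averaging, one common federated-learning scheme. It is a hypothetical toy example, not the approach used in the sandbox case: two "clients" each fit a simple one-variable linear model on their own data, and only the learned model weight (never the raw data) is shared with the aggregating server.

```python
# Toy sketch of federated averaging (FedAvg) with two clients.
# Hypothetical illustration: each client fits a 1-D linear model
# y = w * x on its own data; only the trained weight is shared.

def local_train(data, w, lr=0.01, epochs=100):
    """Train locally by gradient descent on squared error.
    The raw (x, y) pairs never leave this function."""
    for _ in range(epochs):
        grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(datasets, w_global):
    """Server step: collect client weights and average them,
    weighted by each client's dataset size."""
    updates = [(local_train(d, w_global), len(d)) for d in datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two separate data sets that are never combined (true slope = 2).
client_a = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
client_b = [(4.0, 8.0), (5.0, 10.0)]

w = 0.0
for _ in range(20):
    w = federated_round([client_a, client_b], w)
# w converges close to the true slope of 2.0
```

The point of the design is that the aggregation step only ever sees model parameters, which is why the legal-basis question for combining the underlying data sets does not arise.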


In the Netherlands, preparations are underway for a mass claim against Twitter on behalf of up to 11 million people. The background is that Twitter collected and sold information about its users through its mobile advertising platform MoPub. The information was used for targeted advertisements. In isolation, targeted advertisements may seem harmless, but when someone is building profiles on individuals, the consequences could potentially be very unfortunate. A simple example is that some employers do not want to show job announcements to pregnant women. A Dutch organisation is now running a campaign to get people to join the mass claim, and it envisions EUR 250–2,500 in compensation per person. This could get expensive, even for Twitter. Anyone who had a mobile phone between 2013 and 2021 can join – as long as apps collecting the information were installed on the phone. Both Apple and Google keep overviews of which apps are installed on a phone, including apps installed historically. The case is discussed here: https://nltimes.nl/2022/11/08/mass-claim-twitter-selling-data-11-million-dutch-users


In the US, the Federal Trade Commission (FTC) has taken action against the company Drizly and its CEO, James Cory Rellas. The company sells alcohol online and experienced a data breach in which data about 2.5 million customers was compromised. The company was naturally met with harsh criticism, but what is interesting is that the FTC is holding Rellas personally responsible for the data breach. The company had not implemented adequate information security, and specific reference was made to the fact that it had not hired a "senior executive" responsible for information security. What is even more astounding about the decision is that Rellas has been further ordered to implement good information security in other companies where he may be employed in the future (provided he is employed as CEO or in a leading role with responsibility for information security). The FTC's reasoning is that executives frequently move from company to company, and that it wants to ensure that he will safeguard information security if he gets a new job. The case is discussed here: https://www.huntonprivacyblog.com/2022/10/27/ftc-takes-action-against-drizly-and-its-ceo-for-alleged-security-failures-that-exposed-data-of-2-5-million-consumers/


There has also been a substantial settlement in the case between Google and the attorneys general of several US states. Google must pay USD 391.5 million for having misled consumers and tracked their location, in the biggest privacy settlement in the US to date. Even when location tracking had been disabled in mobile phone apps, Google continued to track users. Going forward, Google will have to clearly show whether tracking is enabled, and provide users with details about such tracking and what it is used for. Read more here: https://www.nytimes.com/2022/11/14/technology/google-privacy-settlement.html


You might think the EU Data Protection Authorities do not want more personal data breach notifications. That may not be the case. On 18 October, the EDPB published a draft for new guidelines on personal data breach notifications suggesting, for example, that companies outside the EU have to notify breaches in all countries where affected data subjects are, not just through the one-stop-shop mechanism. The proposal is available here: https://edpb.europa.eu/system/files/2022-10/edpb_guidelines_202209_personal_data_breach_notification_targetedupdate_en.pdf


There is still much attention around cookies in Europe, even though compliance is not enforced strictly in Norway yet. NOYB has filed a complaint against an Austrian company which lets users of its website first reject cookies, but then makes it mandatory to consent to Google's cookies and to ÖWA (Austrian Web Analysis). The website operator argues that the consent is necessary for the site to function properly. A mandatory consent is invalid, NOYB writes, and adds that a legitimate interest cannot be used as grounds for the processing, either. It is quite possible that Schrems will get support for his views. The matter is referenced here: https://noyb.eu/en/two-are-better-one-profilat-strikes-back-forced-banner-when-users-make-wrong-choice


Transfer of personal data to third countries is a hot topic for many these days, as the deadline for implementing the new Standard Contractual Clauses is approaching, and many third-country assessments are being made. Many choose an overall risk-based approach for mission-critical systems, even if they know this is in the "orange" zone and not 100% compliant. In many cases, the alternative is to shut down the business. This time I am linking to an article that looks at the situation with some humour: at what US intelligence is and is not doing, at the amount of work Schrems II has generated, and at possible solutions. The article is worth reading with a cup of coffee: https://world.hey.com/dhh/american-data-spies-will-never-care-where-the-servers-are-371d4016. At the same time, it will be interesting to read NSM's report on conceptual approaches to a national cloud solution in Norway, expected in December and discussed here: https://www.digi.no/artikler/sv-vil-utvide-utredningen-av-nasjonal-sky-bor-ogsa-omfatte-kommuner-br/523943

Do you have any questions?