Newsletter

Privacy Corner

by Eva Jarbekk

Decisions regarding third country transfers and deletion of data – an ever-relevant topic!

It is hard to avoid the topic of third country transfers and deletion of data: KOMBIT, a Danish IT solutions company, contacted the Danish Data Protection Authority with a specific question related to the disclosure of information to authorities in third countries (countries outside the EEA).


KOMBIT delivers computer systems that process personal data to the Danish municipalities. KOMBIT uses a subcontractor, Netcompany A/S, to deliver the services. Netcompany A/S in turn has a subcontracting agreement with Amazon Web Services ("AWS"). In principle, the relevant personal data is only processed within the EU/EEA, but the subcontracting agreement between Netcompany A/S and AWS states that this can be deviated from if AWS is required to disclose the data to public authorities.


In its response to KOMBIT's question, the Danish Data Protection Authority stated that invoking the deviation provision in the subcontracting agreement between Netcompany A/S and AWS would amount to an intentional transfer to a third country. This means that a provision that is common in most cloud service agreements, namely that the supplier will give surveillance authorities in third countries access to data if required to do so, is problematic.


This has broad consequences, as it will probably be difficult to get suppliers to change this type of clause in their contracts. It remains to be seen how the authorities will view this issue going forward. Read more here.


In another case, the Danish Data Protection Authority notified Danske Bank of a fine of DKK 10 million because the bank did not have sufficient measures in place for deleting personal data from its systems. The bank itself had contacted the Data Protection Authority and reported that it had problems deleting personal data that it no longer had any legitimate reason to keep. It eventually emerged that there were almost 400 different computer systems in which the required deletion could not be carried out. Together, these systems processed personal data on several million people.


In connection with the notified fine, the Data Protection Authority stated the following:


"One of the basic principles of the GDPR is that you can only process data you need – and when you no longer need it, it must be deleted. With regard to an organisation of Danske Bank's size, which has many and complex systems, it is particularly crucial that it can also be documented that the deletion actually takes place."


This is not the first time the Danish Data Protection Authority has been strict about the deletion of personal data. We have so far not had any similar cases in Norway, but there probably will be. The rules on deletion of personal data are without a doubt a fundamental principle of the GDPR. Read more here.

Be aware if you have personal data stored in DevOps environments

The Danish Data Protection Authority has also criticised the Danish Health Data Authority after it reported that personal data had remained in the Microsoft Azure DevOps development environment for more than a year after the pseudonymised data had been used by the Danish Health Data Authority for development purposes. The Danish Health Data Authority stated that it was not possible to establish technical security measures ensuring that similar errors would not occur again.


The Danish Data Protection Authority was not content with this explanation and stated firmly that the Danish Health Data Authority had not followed the rules on processing security, because it had not established appropriate control mechanisms, whether automatic or manual. In this regard, it is not sufficient to have guidelines and/or procedures for such control mechanisms if they are not implemented in practice.


The decision clearly shows the importance of having documented routines in place and of ensuring that established routines are actually followed in practice. Read more here.

Banks in trouble: incorrect credit reporting and breach of the disclosure requirement

Things happen in the banking world as well:


Bank of Ireland risks a number of civil lawsuits after the Irish Data Protection Commission (DPC) revealed that the bank had sent incorrect, in many cases negative, credit information about 47,000 customers to the Central Credit Register. Despite the fact that the bank had become aware of the errors in June 2019, it waited almost six months before informing its customers, which the DPC found particularly worthy of criticism.


The bank was notified of a fine of EUR 463,000, but that is perhaps the least of its concerns. The GDPR allows individuals to sue for non-pecuniary damage, and it is quite possible that such cases will be brought as a result of these errors, especially by customers who were registered with an incorrectly reduced credit rating. Read more here.


The Swedish company Klarna has also received notification of a fine, relating not to credit assessments but to breach of the disclosure requirement. The Swedish Data Protection Authority has notified a fine of SEK 7.5 million for not sufficiently informing customers about the purposes for which their personal data was collected and the legal basis for the collection. Klarna had also not sufficiently informed its customers about which recipients received which personal data when Klarna shared it with other Swedish and foreign credit information companies, nor about whether the data was transferred outside the EU/EEA.


All such information should be crystal clear. A number of companies still have a lot of work to do here. Read more here.

Speaking of data in registers: the Dutch tax authorities fined EUR 3.7 million for an unlawful register

The Dutch Data Protection Authority has imposed a hefty fine on the Dutch tax authorities because they had created a list of individuals in an effort to identify fraud cases. The list contained over 270,000 entries and was in use for more than six years. Among the indicators registered as fraud risk were the individuals' nationality and appearance. A number of persons were "marked" as possible fraudsters even though this was not correct.


In addition to the list itself being considered unlawful, the Data Protection Authority pointed out that the tax administration had violated the principles of transparency, purpose limitation, accuracy and storage limitation for personal data. Read more here.

Use of images for identification and recognition purposes causes headaches – diverging decisions

Three decisions regarding the use of images should be mentioned. The first comes from Austria, where a ski lift operator succeeded with its claim that it had a legitimate reason to photograph the individuals using the ski lift every time they passed the checkpoint. The Austrian Data Protection Authority (DSB) considered this to be a legitimate use of personal data, based on Article 6(1)(f) of the GDPR. Read more here.


The Danish Data Protection Authority, on the other hand, reached a different conclusion in a similar case. The gym chain FysioDanmark wanted to use facial recognition for access control to its gyms. The use was to be voluntary and based on consent. The collected data was also to be used for statistical and analytical purposes.


The Data Protection Authority found in favour of FysioDanmark in that it could use facial recognition if it was voluntary and based on consent, but held that the consent had to be differentiated: customers had to be able to consent to facial recognition without also consenting to their personal data being used for other purposes, such as statistics and analysis. Read more here.


The third recent decision on the use of images for identification and recognition is perhaps a more obvious one: the Italian Data Protection Authority has notified Clearview AI of a fine of EUR 20 million for having collected biometric data on faces based on images it has found on the internet. The Data Protection Authority also requires that all images of Italian citizens be deleted from Clearview's databases.


Similar decisions have been made by the French and British Data Protection Authorities, although without fines being issued. Clearview is a US company with no representation or subsidiaries in Europe, so it is unlikely that the Italian Data Protection Authority will succeed in recovering the fine. However, the European customer base is likely to disappear, because the authorities will be able to pursue any European company that uses the technology. This has already happened: the Swedish police have been fined for using the technology. Read more here.

Be careful when sharing information about employees' health data

The Danish Data Protection Authority has reprimanded a municipality for having shared an employee's health data with her colleagues in a joint e-mail sent about a temporary change in work tasks. The municipality pointed out that the whole thing was a misunderstanding and an accident, but the Data Protection Authority found the sharing of sensitive data so serious that a formal reaction was necessary.


One should be especially careful about distributing e-mails with potentially sensitive content to a wide audience. Read more here and here.

Finally – there are new guidelines to prevent so-called "dark patterns"

The European Data Protection Board (EDPB) has adopted draft guidelines on manipulative design, so-called "dark patterns", in social media. The guidelines provide practical advice and recommendations to developers and users of social media. The deadline for public consultation is May 2.


Providers of social media and similar services are responsible for ensuring compliance with the obligations in the GDPR. The guidelines provide concrete examples of the various types of "dark patterns" that exist and contain specific recommendations to developers of user interfaces on how best to facilitate the effective implementation of the GDPR obligations in their services. Many of the examples are also graphically illustrated, and an appendix contains a checklist of the different forms of "dark patterns".


The guidelines ought to be compulsory reading for everyone working with UX. There is every reason to believe that in a year or two the Data Protection Authorities will follow up the guidelines with inspections.


Read more here.

Do you have any questions?