Digital Omnibus – the EU's proposal for simplifying digital regulations
by Eva Jarbekk and Maja Helene Christiansen Steien
I am glad that I enjoy learning new things. I have a green card, but I rarely play golf. I have taken a course in celestial navigation for coastal skippers using a sextant, but I rarely sail. I have taken several ceramics courses, but I do not practice. The most enjoyable thing is actually just learning something new – and then learning something else new. But you don't need more courses when you work with digital laws from the EU. You just have to wait a week or two, and something new will come along. Now we have a whole omnibus to familiarize ourselves with.
The Digital Omnibus initiative comes after so-called "stakeholder consultations". The Commission has identified key areas that create challenges in the implementation of the "digital laws". The Draghi report has also played a role. Europe is afraid of being left behind because of too much regulation. Perhaps the US has also exerted pressure, but we will probably never know.
The purpose of Digital Omnibus is to simplify the regulations without compromising privacy. Nevertheless, several of the changes are already the subject of intense debate. And the process of simplifying the regulations does not stop there. The Commission is already looking at further simplification measures for digital regulations via a "Digital Fitness Check". Stakeholders are invited to submit their views by 11 March 2026. You can find out more about this initiative here.
Below is some information about the process leading up to a possible change in the regulations, followed by the proposed changes to the GDPR, the proposed changes to the AI Regulation and the changes to the Data Act. In some places, I think the proposed wording is so important that I have included it; in other places I only mention the changes. I have also included links to excellent tabular overviews of the changes to the GDPR and the AI Regulation, prepared by László Pók.
Overall, it is important to remember that these are proposals. I see some people writing that they are already changing their business flows. I think it is wise to wait and see what is actually adopted before adjusting business practices.
After reviewing the "omnibus", there is some other news worth knowing about. Among other things, there is a survey from the French Data Protection Authority on what people really think about paying for services as an alternative to targeted advertising. The Norwegian Data Protection Authority is not closing the door on sharing personal data to prevent fraud, the ICC is saying goodbye to Microsoft products and adopting OpenDesk – as well as a handful of other issues that caught my interest.
Happy reading!
Here is how the Commission's proposal may proceed to adoption:
Before the end of the year: The Commission submits the Digital Omnibus proposal to the European Parliament, which distributes the content to the Committee on the Internal Market and Consumer Protection (IMCO), the Committee on Industry, Research and Energy (ITRE) and the Committee on Civil Liberties, Justice and Home Affairs (LIBE). Political groups then appoint a rapporteur and shadow rapporteurs to prepare the Parliament's various recommendations.
Parliament: Members of the European Parliament (MEPs) will discuss and propose amendments to the proposals. The aim is to reach a final recommendation by the first quarter of 2026.
Council discussions: At the same time, representatives from the 27 Member States in the Council will hold discussions to prepare their position, known as the ‘general approach’, which is also expected in the first quarter of 2026.
Q2/Q3 2026: Trilogue negotiations between the Commission, Parliament and Council on their respective positions. The aim is to reach a compromise solution.
Accelerated procedure: Parliament may apply an urgent procedure (cf. Rule 170 of the Rules of Procedure). This has been done with previous omnibus packages. In that case, the committee stage is omitted and there is a direct vote in the plenary session of Parliament. It is actually possible to reach a decision as early as Q1 2026. Rumor has it that many people are working towards this.
Specific changes to the GDPR
If you would like a tabular overview of which articles are proposed to be amended and to what, you can find one here.
What is personal data?
The definition of personal data is set to change. What constitutes personal data for one data controller will not necessarily constitute personal data for another recipient. This is where the much-discussed SRB ruling comes into play. The proposal appears to extend the approach to data processors as well, a point the SRB ruling did not clarify.
The new proposed text in the definition of personal data is highlighted below:
"personal data" means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person; information relating to a natural person is not necessarily personal data for every other person or entity, merely because another entity can identify that natural person. Information shall not be personal for a given entity where that entity cannot identify the natural person to whom the information relates, taking into account the means reasonably likely to be used by that entity. Such information does not become personal for that entity merely because a potential subsequent recipient has means reasonably likely to be used to identify the natural person to whom the information relates."
The Commission shall "adopt implementing acts to specify means and criteria to determine whether data resulting from pseudonymisation no longer constitutes personal data for certain entities".
NOYB's comment on this is that "They stick with the 'double subjective' definition, which will be impossible to manage in reality (data would be 'personal' or not based on the intentions of a company and this would be different for each company)... It's like defining a gun based on the intention to shoot, not the objective characteristics of a gun..."
NOYB's point of view is not surprising. But I actually think you can have excellent privacy despite this change. And I don't think the comparison with a gun is apt: the whole point of the definition is precisely an entity's ability, or inability, to use the information (or the gun).
Based on the proposed amendment, it seems likely that the assessment can also be made for data processors. They are, after all, "recipients" in the GDPR's definition. This means that data processing agreements may not be necessary if the data processor cannot re-identify the individuals to whom the information relates. This will make things much more efficient – for example, general confidentiality clauses can be included in a SaaS agreement instead. This will be really exciting to follow in the future!
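To make the distinction concrete, here is a minimal sketch of keyed pseudonymisation – my own illustration, not taken from the proposal. The controller holds the secret key and can link tokens back to individuals; a processor that only ever receives the tokens has no means reasonably likely to re-identify anyone:

```python
import hmac
import hashlib

SECRET_KEY = b"controller-only-secret"  # held by the controller, never shared


def pseudonymise(identifier: str) -> str:
    """Derive a stable token from an identifier using a keyed hash (HMAC).

    Without SECRET_KEY, the token cannot feasibly be linked back to the
    person - roughly the position a processor would be in under the
    proposed definition.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()


# The controller can re-identify, because it holds the key:
token = pseudonymise("01019912345")  # hypothetical national ID

# The processor only ever sees records like this:
record = {"subject": token, "event": "login", "count": 3}
print(record)
```

Whether such data then "no longer constitutes personal data" for the processor will depend on the Commission's implementing acts, so treat this strictly as an illustration of the idea.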
Research
Further processing for research purposes is always compatible with the original purpose. It will also be possible to refrain from providing information about such (further) processing when this is impossible or would require a disproportionate effort.
The new definition of research in Article 4 reads:
"scientific research" means any research which can also support innovation, such as technological development and demonstration. These actions shall contribute to existing scientific knowledge or apply existing knowledge in novel ways, be carried out with the aim of contributing to the growth of society's general knowledge and wellbeing and adhere to ethical standards in the relevant research area. This does not exclude that the research may also aim to further a commercial interest.
This has significant practical implications for both academic and commercial research. I have had many cases where we have carefully considered whether a certain type of analysis can be considered research. In future, this may become much simpler.
Reporting deadline postponed
The current deadline of 72 hours for reporting breaches is proposed to be extended to 96 hours. And the reporting obligation will only apply to high-risk breaches of personal data security. There will be a single place where all incidents relating to GDPR, NIS2 and DORA, as well as the Critical Entities Resilience (CER) Directive, can be reported. These appear to be very practical changes.
The EDPB will draw up a "list of the circumstances in which a personal data breach is likely to result in a high risk to the rights and freedoms of a natural person". This could also be useful, especially if the guidance is more specific than what is already available.
Further harmonization
The EDPB will also draw up lists of activities that do and do not require a DPIA. It will create common DPIA templates and methodologies for performing DPIAs, as well as a template for reporting non-compliance. This is set out in additions to Article 70.
Automated individual decisions
GDPR Art. 22 is proposed to be clarified so that automated individual decisions are permitted for the conclusion of contracts, even if the decision could have been made manually, as well as on the basis of legal obligations and consent. This is probably a consequence of such systems becoming commonplace and the need for simpler regulation. I believe that privacy will survive this.
Rejection of requests from data subjects
Data controllers may reject a request from data subjects where the data subject "abuses the rights conferred by (the GDPR) for purposes other than the protection of their data". It is not clear what "abuse" means, but the data controller has the burden of proof in this regard.
Restriction of the duty to provide information
The duty to provide information is restricted. The new provision in Article 13 reads:
"Paragraphs 1, 2 and 3 shall not apply where the personal data have been collected in the context of a clear and circumscribed relationship between data subjects and a controller exercising an activity that is not data-intensive and there are reasonable grounds to assume that the data subject already has the information referred to in points (a) and (c) of paragraph 1, unless the controller transmits the data to other recipients or categories of recipients, transfers the data to a third country, carries out automated decision-making, including profiling, referred to in Article 22(1), or the processing is likely to result in a high risk to the rights and freedoms of data subjects within the meaning of Article 35."
Although the duty to provide information is restricted, it is good that it is retained when the information is to be disclosed to others, transferred to third countries or used for profiling.
The information obligation for research is also restricted:
"When the processing takes place for scientific research purposes and the provision of information referred to under paragraphs 1, 2 and 3 proves impossible or would involve a disproportionate effort subject to the conditions and safeguards referred to in Article 89(1) or in so far as the obligation referred to in paragraph 1 of this Article is likely to render impossible or seriously impair the achievement of the objectives of that processing, the controller does not need to provide the information referred to under paragraphs 1, 2 and 3. In such cases the controller shall take appropriate measures to protect the data subject's rights and freedoms and legitimate interests, including making the information publicly available."
It is reasonable to assume that many will try to avoid the obligation to provide information because it takes time. There have been some similar formulations in existing legislation, but at the same time it has been clear that they should be interpreted very restrictively. Guidance is now needed on how restrictively these formulations should be interpreted.
Legal basis for special categories in the development of AI
A new Article 9(2)(k) allows the use of special categories of personal data in AI development and operation. A new fifth paragraph allows such data to remain in training sets when removal would require "disproportionate effort", provided the controller effectively protects the data from being used to produce outputs or from being disclosed. Under a new Article 41a, the Commission, together with the EDPB, shall issue guidelines on when data is pseudonymised to the point where it no longer constitutes personal data.
New text in Article 9(2) shall read:
"Paragraph 1 shall not apply if one of the following applies: […]
(k) processing in the context of the development and operation of an AI system as defined in Article 3, point (1), of Regulation (EU) 2024/1689 or an AI model, subject to the conditions referred to in paragraph 5.
(l) processing of biometric data is necessary for the purpose of confirming the identity of a data subject (verification), where the biometric data or the means needed for the verification is under the sole control of the data subject.
For processing referred to in point (k) of paragraph 2, appropriate organisational and technical measures shall be implemented to avoid the collection and otherwise processing of special categories of personal data. Where, despite the implementation of such measures, the controller identifies special categories of personal data in the datasets used for training, testing or validation or in the AI system or AI model, the controller shall remove such data. If removal of those data requires disproportionate effort, the controller shall in any event effectively protect without undue delay such data from being used to produce outputs, from being disclosed or otherwise made available to third parties."
These are highly controversial changes. It is assumed that the exception in the fifth paragraph may be applied to the processing of special categories in connection with uncovering bias. Regardless of how this ultimately turns out, it appears that more sensitive personal data will be entered into AI. It is not certain that this will only have negative consequences.
A new Article 88c contains specific provisions on the development and training of AI. Legitimate interest is recognised as a possible basis for the development of AI, but a balance must be struck. It is not a free-for-all: national law may require consent, the rights of the individual may outweigh the interests of the controller, and the needs of children must be given particular weight.
Cookies incorporated into the GDPR
Provisions on cookies from the ePrivacy Directive are incorporated into the GDPR, see Article 88a. It is a clear advantage that these rules are now brought together, so that doubts of interpretation are avoided. The proposal sets out several clear exceptions to the consent requirement. Among other things, consent is not required for pure communication, to provide a service requested by the user, to create aggregated information about users for the controller's own use, or for security purposes. Where consent is required, it must be possible to refuse it with a single click. If consent is refused, the user shall not be asked again until six months have passed. Article 88b allows consent and refusal to be signalled via browser settings.
It is also proposed to introduce a special rule for "media service providers" so that they can ask the user for consent despite the settings in the operating system.
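Purely as an illustration of how the six-month rule might be operationalised – the final text may define the interval and the trigger differently – the logic of a consent banner could look roughly like this sketch (hypothetical field names, six months approximated as 183 days):

```python
from datetime import datetime, timedelta, timezone

REASK_INTERVAL = timedelta(days=183)  # roughly six months; the final text may define this differently


def may_ask_for_consent(last_refusal: datetime | None) -> bool:
    """Return True if the user may be shown a new consent request.

    Under the proposed rule, a refusal suppresses any new consent
    request until six months have passed.
    """
    if last_refusal is None:
        return True  # the user has never refused, so asking is allowed
    return datetime.now(timezone.utc) - last_refusal >= REASK_INTERVAL


# Example: the user refused two months ago, so no new banner yet.
refused_at = datetime.now(timezone.utc) - timedelta(days=60)
assert may_ask_for_consent(refused_at) is False
```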
Changes to the AI Act
As described above, a number of key changes for the development of AI are being incorporated into the GDPR itself – perhaps the most important changes for AI systems are found there. But the AI Act is also being amended, and the most important changes are described below. In addition, there will be easier access to regulatory sandboxes, general regulatory simplifications and reduced documentation requirements for small and medium-sized enterprises. If you would like a tabular overview of which articles are proposed to be amended and how, you can find one here.
Deadlines postponed
The high-risk rules are proposed to be postponed for a maximum of 16 months, so that time can be freed up to finalize standards in this area. It is interesting that this is happening when representatives of the Commission said that this was completely out of the question. But it is, of course, permissible to change one's mind.
AI Literacy Obligation
The AI literacy obligation is being transferred from businesses to the Commission and the Member States. This may seem like a small change, but it is a significant departure from what the AI Regulation has established.
Fewer companies will have to register AI systems
Currently, companies that consider their systems to be low risk must register them in the EU database. The proposal removes this obligation. However, companies must still maintain documentation showing their assessments and retain it for potential regulatory review.
Changes to the Data Act
Protection of trade secrets
The proposal introduces new measures for companies concerned that confidential information may fall into the wrong hands. Under Articles 4(8) and 5(11) of the draft Data Act, data holders may refuse to disclose information if there is a significant risk that trade secrets could be unlawfully used or shared with entities in third countries, particularly countries operating under legal frameworks that provide weaker protection than the EU. Any refusal to share information must be based on a specific assessment.
Restriction of data sharing between businesses and public authorities
The proposal restricts the ability of public authorities to request data. The trigger is changed from broadly defined "exceptional needs" to specifically defined "public emergencies" – only applicable when data is genuinely necessary to manage, prevent or recover from an emergency. Micro and small businesses will be entitled to compensation when they have to provide data in emergency situations. Larger companies will still have to provide data without compensation.
Three regulations merged into one
The Commission proposes merging three legal instruments into the Data Act: the Regulation on the free flow of non-personal data, the Data Governance Act and the Open Data Directive. The aim is to create a uniform set of rules on how data held by public authorities can be reused, and to eliminate overlapping and sometimes conflicting provisions across several instruments.
Now on to some other issues besides the Digital Omnibus.
Much of the content on the internet is offered "for free" – that is, without you paying money, but instead giving away your personal data. But now we are seeing new business models emerge, such as "consent or pay" solutions where you can choose between accepting tracking tools or paying to avoid them.
The French data protection authority CNIL has conducted a survey of over 2,000 French people to find out what people really think about paying for services as an alternative to targeted advertising. The results are actually quite interesting.
Over half of those surveyed already pay for video streaming services, with an average of €20 per month. However, for other services such as social media, health and fitness, news and generative AI, the proportion paying is less than 10%. Interestingly, between 24% and 33% say they are willing to pay for these services, which are currently "free".
The survey shows that between 25% and 48% of users would be willing to switch from free, ad-based access to a paid subscription without targeted advertising. They are willing to pay between €5.50 and €9 per month, depending on the service. For social networks, 72% of those willing to pay would pay less than €5, and 88% less than €10.
51% of respondents believe that privacy is one of the three most important criteria when choosing a digital service, and 21% rank it as the most important – almost as highly ranked as price and quality. 64% say they actively monitor their browser data, for example by changing browser settings or using private browsing.
The emergence of new economic models has made it clear that services presented as "free" are in fact based on a different form of payment – the extraction of users' personal data, sometimes in ways that are highly intrusive to privacy. It seems that people have realised that nothing is really free – the question is whether we want to pay with money or with data.
Read more about this issue here.
Some of the most influential think tanks in Brussels are pushing for the establishment of a single digital regulator. In October, digital ministers from the EU's so-called "digital frontrunner countries" – the D9+ group – met in Lisbon to discuss the idea of a common EU technology regulator. The fact that the proposal has been raised in the D9+ group shows that the idea is gaining attention, but at the same time, the Commission and national authorities are still fighting for control of the sector.
Why this sudden interest? Some experts believe that Europe needs a dedicated body to enforce technology laws and avoid conflicts of interest. European Commission President Ursula von der Leyen has been the public face of trade discussions with Washington, while also leading the enforcement of EU technology laws that Trump has publicly criticized. Some observers have reacted to this dual role, especially as the Commission has been slow to impose fines on American technology giants.
Some believe that an independent regulatory body could better protect decisions on the enforcement of digital laws from legislative priorities and geopolitical pressure.
But it is not just about politics. Another argument for joint technology supervision relates to the highly topical issue of simplifying EU regulations. Everything is connected, as someone once said. But whether this will become a reality is uncertain. The Commission is reluctant to relinquish power.
You can read more about the issue here.
Economic crime is a growing problem that hits both individuals and businesses hard. Now, the Ministry of Finance wants to give banks and other financial institutions greater access to share information in order to combat this. The Data Protection Authority supports the proposal, but also points out the need to limit the purposes of sharing and to set clearer boundaries.
The Authority believes that privacy rules do not prevent increased sharing of personal data when the aim is to prevent and detect financial crime. However – and this is an important point – it is crucial to have clear legal bases that set clear limits on how this sharing should take place. These limits should ensure confidence that financial institutions are processing both their own customers' and third parties' personal data in a responsible manner.
So what is the Data Protection Authority concerned about? They believe that the purpose seems broader than necessary – the consultation paper describes the need as preventing and detecting fraud and deception, while the proposal covers all financial crime and other serious crime. That is quite a lot broader.
The Data Protection Authority supports the measures in the proposal, but believes that the Ministry should consider adding further measures and safeguards to ensure the fundamental rights and freedoms of data subjects. It is also important to ensure that foreign financial institutions that receive information are subject to the same framework as Norwegian financial institutions.
Another important point is that the proposal for a new Section 16-17, first paragraph, of the Financial Institutions Regulation is limited to the disclosure of ordinary personal data. The Data Protection Authority recommends that the Ministry also consider the need for a general legal basis for processing, not just disclosing, special categories of personal data.
Unsurprisingly, the Data Protection Authority emphasizes that the fight against financial crime is important, but that privacy must be respected.
Read more about the case here.
It started with an employee who complained to the Norwegian Data Protection Authority about their employer. The Authority chose to impose corrective measures on the employer. However, the employee was not satisfied with the response and complained to the Privacy Appeals Board. The Board said no – the employee has no right to appeal this.
Then the Civil Ombudsman got involved and argued that the employee did in fact have the right to complain under Article 78(1) of the GDPR, which gives the right to an effective remedy against the Data Protection Authority's decisions. The Privacy Appeals Board therefore had to reconsider the case.
However, the conclusion remained the same: the employee still has no right of appeal. Why not? Because the person in question is not a party to the case against the employer. A "party" is "a person to whom a decision is addressed or to whom the case directly concerns".
What does the GDPR actually say?
Article 78 of the GDPR gives every person the right to an effective judicial remedy against a legally binding decision of a supervisory authority concerning them. The question is therefore whether this right can be invoked by a person who has complained to the supervisory authority but who is not the addressee of the decision that is subsequently made.
Recital 143 of the GDPR suggests that the right to an effective remedy can in principle only be invoked by the person or persons to whom the decision is binding. Other affected parties will not automatically have the right to bring legal proceedings, not even the person who originally complained to the Data Protection Authority.
The Privacy Appeals Board concluded that Article 78(1) does not automatically grant a right of appeal in all types of cases. You can appeal when your complaint is rejected, when the Data Protection Authority finds that there has been no breach of the GDPR, or when corrective measures directly affect your own privacy situation. However, as a rule, you cannot appeal simply because you believe that the Data Protection Authority should have reacted differently towards the data controller.
The Civil Ombudsman referred to two recent rulings from the European Court of Justice – the SCHUFA case from December 2023 and the Land Hessen case from September 2024 – which could support a certain extension of the right to bring legal action, and asked whether this indirectly implies a corresponding extension of the right to appeal. After a closer assessment of both judgments, the Privacy Appeals Board rejected this.
What does this mean in practice?
You have the right to appeal if the Data Protection Authority rejects your case or concludes that there has been no breach. In such cases, effective protection of your rights requires that you be able to appeal. However, if the case has been considered and a decision on corrective measures has been made, you will in principle only have the right to appeal if the breach constitutes an ongoing violation of your interests.
In this specific case, the violation had ceased, and the Board could not see that the Data Protection Authority's decision had any demonstrable actual effects on the complainant – beyond the desire that the Data Protection Authority should have reacted more severely.
Read more about the case here.
A Greek publisher, in this case the data controller, has been fined for revealing the real name and gender of an author who published books under a pseudonym. The data controller sent an email to the author's private address, but made it visible to around 55 other recipients – thus revealing the author's identity.
The author had deliberately chosen a pseudonym to conceal their authorship from their family and professional environment, and writes about topics related to gender minorities. The breach caused shock, serious mental health problems and jeopardised their career, according to the complaint.
The Greek Data Protection Authority found that the data controller shared special categories of personal data with third parties and could have avoided the whole situation with simple measures such as blind copy (BCC) or individual messages. It was considered particularly serious that the data controller had actually agreed to treat the author's real name as confidential.
The publisher also failed to notify either the data protection authority or the author of the breach, despite the obvious risk.
The result? A fine of EUR 9,000.
What can we learn from this? Double-check the recipient list before you press send – it could save you both headaches and fines. The fine itself may not be that large; the negative publicity is probably the worst part for the publisher here.
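Incidentally, the safe pattern is trivial to automate. Here is a minimal sketch using Python's standard library (hypothetical addresses and mail server), sending one message per recipient so that no one sees the others:

```python
import smtplib
from email.message import EmailMessage

recipients = ["author@example.com", "reader1@example.com"]  # hypothetical list

with smtplib.SMTP("smtp.example.com") as server:  # hypothetical mail server
    for rcpt in recipients:
        msg = EmailMessage()
        msg["From"] = "publisher@example.com"
        msg["To"] = rcpt  # one visible recipient per message, never the full list
        msg["Subject"] = "New release"
        msg.set_content("Dear reader, ...")
        server.send_message(msg)
```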
Read more about the case here.
The International Criminal Court in The Hague is replacing Microsoft's office suite with OpenDesk, an open source solution from the German Centre for Digital Sovereignty (ZenDiS). The background? As most people remember, ICC Chief Prosecutor Karim Khan suddenly lost access to his Microsoft email in May after President Trump signed an order imposing sanctions on ICC officials.
The incident has sparked an important debate about Europe's digital sovereignty: Can we really trust American technology companies with our data? Several European regions are thinking along the same lines – Schleswig-Holstein in Germany, Andalusia and Valencia in Spain, and even Denmark's Minister for Digitalisation has announced a transition to open source software. The European Commission is also negotiating to move its cloud services away from Microsoft to French OVHcloud.
It's not just about politics – it's about control. When critical infrastructure and data are stored and run locally on European soil, we become less vulnerable to geopolitical tensions and foreign interference. Maybe we should take a look at OpenDesk online over Christmas.
Read more about the case here.
In August, Swedish IT systems supplier Miljödata was hit by a massive ransomware attack. This affected around 200 municipalities and regions. The attackers gained access to sensitive personal data such as names, health certificates, rehabilitation plans, occupational injury cases and other sensitive health information.
The hackers demanded bitcoin worth approximately SEK 1.6 million in return for not publishing the data. The hacker group Datacarry claimed responsibility and threatened to publish the data – which it later did. A significant amount of personal data was published on the dark web, and over 1.5 million people were affected. Sensitive data on the dark web is not good.
The leak raises a number of questions about security and what types of personal data were stored in the systems. Several companies and organisations are now being investigated by the Swedish Data Protection Authority, including Miljödata itself, the City of Gothenburg, Älmhult Municipality and the Västmanland region.
The big risk now is that the attackers will start contacting private individuals using the leaked information and try to trick them into revealing more information.
It will take time before we know how this will end. In a Finnish case where health information was posted on the Dark Web, criminal proceedings were brought against the company's CEO – the case is discussed here.
Read more about the Swedish case here.
When the European Commission launched its proposal for new regulations against child sexual abuse material (CSAM) in 2022, no one thought we would still be discussing it in 2025. But the proposal met with massive resistance from day one, mainly because it would have forced messaging services such as WhatsApp and Signal to undermine encryption and scan private messages – which gave the regulation the nickname "chat control".
Now, three and a half years later, the Danish Presidency of the EU has announced that it is dropping the mandatory scanning requirements in order to finally gain enough support from member states. Sounds fine, right? Well, not quite.
The Danish Presidency wants to make the current temporary permission for voluntary CSAM scanning permanent. In addition, providers of high-risk services may still be required to develop "relevant technologies" to reduce the risk of child abuse on their services. What is actually meant by "relevant technologies" is unclear, and former MEP Patrick Breyer fears that in practice this could still mean chat control.
However, the Danish Presidency also wants to include a review clause that allows the Commission to assess the necessity and feasibility of introducing scanning requirements in the future, based on technological developments. In other words, the Commission can try again with mandatory scanning at any time.
Breyer points to three major problems. First, the proposal does not comply with the European Parliament's requirement that only courts can decide on access to communication channels. Second, Article 6 of the proposal would prohibit young people under the age of 16 from installing messaging apps such as WhatsApp, Telegram, Snapchat and X, allegedly to protect them from grooming; experience from the UK's Online Safety Act shows how easily teenagers circumvent such rules with VPNs and other tools. Third, Article 4(3) of the Danish proposal would effectively ban anonymous email and messaging accounts, as well as anonymous chatting – users would have to show ID or their face, making them identifiable and exposing them to the risk of data leaks. This is alarming for journalists and civil society organisations that rely on private communication with whistleblowers.
It remains unclear whether member states will accept the Danish Presidency's proposal – after all, not everyone was opposed to the original chat control proposal. So even though the headline says that chat control has failed, it might be more accurate to say that it has taken on a new form.
Read more about the case here and here.