
5 April 2024

As referenced last week in Part 1 of this series, the Commission designated 17 online platforms as VLOPs and 2 online search engines as VLOSEs. However, it’s not only the big players in this market segment that are impacted by this regulatory development. The scope of the DSA is very broad for a reason. The DSA is part of the “European Digital Package”, the main goal of which is to strengthen the European Union’s digital sovereignty and set standards with a clear focus on data, technology, and infrastructure. Hence, if your organisation is a small or medium digital business, it is very likely impacted as well.

The DSA applies to online intermediary services, including hosting services and online platforms. However, depending on the reach and functionality of the relevant service, additional compliance obligations might apply.

Is my organisation an “Intermediary Service”?

According to the DSA, an organisation is considered an “Intermediary Service” if it offers network infrastructure services, namely “mere conduit”, “caching” or “hosting” services.

Is my organisation a “Hosting Service”?

In line with the DSA definition, these types of services allow for the storage of information provided by, and at the request of, the recipient of the service.

Is my organisation an “Online Platform”?

According to the DSA, an online platform is a “hosting service that, at the request of a recipient of the service, stores and disseminates information to the public”. This means that content-sharing websites, online marketplaces, certain video games, and social networks will fall within the scope of the DSA.

Notwithstanding the above, certain exemptions apply to micro and small enterprises, such as reduced transparency obligations.

My organisation is not in the EU. Does the DSA apply outside the EU?

The DSA, like the GDPR, has an extra-territorial nature. It applies to organisations that fall within one of the definitions outlined above, irrespective of their place of establishment, provided the services they offer are “substantially connected” to the EU.

A “substantial connection” will exist where the intermediary service provider has an establishment in the EU, a significant number of users in one or more Member States, or targets its activities towards one or more Member States.

Relevant factors in determining whether a provider is targeting its activities towards a Member State may include the use of a language or currency generally used in that Member State, the ability to order products for delivery to that Member State, or the use of a relevant top-level domain.

Obligations

Intermediary Services

Articles 9 and 10: Removal of Illegal Content
Upon receipt of an order to act against illegal content or to provide information, inform the relevant authorities of any effect given to the order without undue delay, specifying when effect was given. “Illegal content” is defined broadly under the DSA to include any information or activity that is not in compliance with EU law or the law of a Member State.

Articles 11 and 12: Points of Contact
Designate single points of contact, one for supervisory authorities and another for users, and make information publicly available to facilitate communication with those points of contact.

Article 13: Legal Representative
As outlined above, the DSA may apply to intermediary services without an establishment in the EU. These providers must appoint a legal representative in the EU.

Article 14: Terms and Conditions
Comply with the obligations regulating the form and content of the terms and conditions applicable to their own services.

Article 15: Annual Report
Publish a “clear” and “easily comprehensible” annual report detailing content moderation activity.

Hosting Services

Article 15: Enhanced Transparency Reporting
Transparency reports must also include the number of notices submitted by trusted flaggers, the action taken as a result, and the response time.

Article 16: Notification System
Users must be able to notify the provider of content they believe to be illegal. In addition, users should be notified of the final decision and of any redress mechanisms available.

Article 17: Statement of Reasons
Provide a statement of reasons to a user if content is removed or disabled, or if services are terminated.

Article 18: Criminal Offences
Alert law enforcement authorities where there is a reasonable suspicion that a serious criminal offence involving a threat to the life or safety of individuals is being planned.

Online Platforms

Article 20: Appeal Process
Online platforms must put a system in place allowing users to appeal decisions to remove or restrict access to content determined to be illegal or in breach of the platform's terms and conditions. Such decisions must be taken under the supervision of qualified staff.

Article 21: Settlement of Disputes
Users must be given the opportunity to avail of an out-of-court dispute settlement body, and information on this option must be available on the platform.

Article 22: Trusted Flaggers
The criteria an entity must meet to become a trusted flagger are set out in the DSA. Notices submitted by trusted flaggers must be processed with priority and without undue delay.

Article 23: Repeat Offenders
Users who repeatedly upload illegal content can be suspended for a reasonable period of time. Users who repeatedly submit unfounded complaints can also be suspended.

Article 24: Transparency Reporting
Publish the average number of monthly active users in the EU, calculated over the last six months.

Article 25: Dark Patterns
Online platforms must not design or operate their user interfaces in a manner that deceives or manipulates users.

Articles 26 and 27: Advertising and Recommender Systems
The key parameters of recommender systems, as well as any options for users to modify them, must be set out clearly in the terms and conditions.

Article 28: Protection of Minors
Processing the personal data of minors, or sensitive personal data, for the purposes of targeted advertising is not allowed.

VLOPs and VLOSEs

Articles 14 and 33: Accessibility of Terms and Conditions
Provide a machine-readable summary of the terms and conditions, including remedies and redress mechanisms, and publish them in the official languages of each Member State where the services are offered.

Article 34: Risk Assessments
Perform risk assessments to identify the “significant systemic risks” arising from the provision of the services.

Article 35: Risk Mitigation Measures
Implement reasonable, proportionate, and effective mitigation measures to address the systemic risks identified in the risk assessment.

Articles 36 and 48: Crisis Response Mechanism
Take specific actions, including content moderation measures, in the event of a crisis affecting public security or public health.

Article 37: Independent Audits
Audits must be carried out at least once a year by an independent organisation to assess the level of compliance with certain obligations arising under the DSA.

Article 38: Recommender Systems
Offer users at least one recommender system option that is not based on profiling.

Article 42: Transparency Reporting
Transparency reports must include additional information, such as whether advertisements were targeted.

Article 40: Data Sharing with Regulators
Providers must share data with the relevant authorities, including explanations of their algorithmic systems.

Article 41: Compliance Function
Establish a compliance function, if one does not already exist, to monitor compliance with the DSA.

Article 43: Supervisory Fee
Providers must pay the Commission an annual supervisory fee.

Article 39: Ads Repository
Maintain a repository of the online advertisements displayed on the platform.

The information above may seem complex; however, the summary is quite simple: there are a significant number of obligations on any party covered by the DSA, and the larger and more complex the entity, the more onerous the obligations. Next week we will explain how this legislation links into the EU's overall digital strategy and how best you (in conjunction with the KPMG and KPMG Law teams) can be ready for the challenges ahead.

How can we help you?

At this stage, organisations within the scope of the DSA need to demonstrate compliance with the regulation in order to grow in the EU digital environment. We would emphasise that in this new digital landscape, the Commission and national regulators will be exercising broader powers, backed by a combination of financial and legal sanctions, to regulate the online space. It is therefore crucial to understand the ongoing obligations and best practices needed to ensure compliance with the applicable requirements. We are uniquely placed to assist with both your operational and legal needs, and KPMG Law can help you on your path to DSA compliance.

In addition, the KPMG Consulting team has deep technical expertise across all DSA-related areas, including implementing compliance functions, frameworks and operating models; implementing complaints-handling and issue-management systems and processes; designing and implementing Know Your Business Customer (KYBC) controls; and performing systemic risk assessments, independent reviews and audits. Our capabilities include experts from our IT Assurance, Risk Consulting, Technology Law, Algorithm Assurance, Privacy, Cybersecurity, and Forensic teams. Our DSA services are also powered by accelerators to ensure an efficient process.

Contact the team

David McMunn

Director & Head of Technology & Digital Law, KPMG Law LLP

Sean Redmond

Director, KPMG in Ireland

Daniela Mejuto Pita

Associate Director, KPMG Law LLP