EU Digital Services Act Explained


[Image: Big tech behind bars, illustrating the serious nature of the EU's Digital Services Act (DSA)]

The EU Digital Services Act (DSA) is a big piece of legislation, like GDPR was, so I’m going to leave a lot out and focus on summarizing. In particular, I’m going to answer the following questions:

  • Which companies are subject to the rules of the DSA?
  • What are those rules?
  • What is the penalty for noncompliance?
  • When is the deadline for companies to be in compliance with the DSA to avoid penalties?
  • What’s the difference between the EU’s Digital Services Act (DSA) and the EU’s Digital Markets Act (DMA)?

Let’s jump in.

Which companies are subject to the rules of the DSA?

Let’s start with some companies which are NOT subject to the DSA:

  • U.S. bloggers
  • U.S. e-commerce sellers (this exemption doesn’t extend to marketplaces such as Amazon or eBay)

The DSA only applies to companies which meet two conditions:

  1. The company is an “intermediary service provider”, and
  2. The company has a “substantial connection” to the European Union.

There are additional rules for companies which also qualify as “online platforms”, “online search engines”, and/or “very large” companies, and there are fewer rules for certain companies which are deemed to be “micro or small enterprises”.

Intermediary service provider

Intermediary service means one of the following “information society services” (defined in a few paragraphs):

  1. A mere conduit service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
  2. A caching service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate, and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request; and
  3. A hosting service, consisting of the storage of information provided by, and at the request of, a recipient of the service.

Examples of mere conduit services:

  • A private messaging app like Facebook Messenger

Examples of caching services:

  • Content delivery networks (CDNs)

Examples of hosting services:

  • Cloud storage providers like Dropbox
  • Cloud computing providers like AWS, GCP, and Azure

An information society service is any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. The definition is the same as provided in Article 1(1), point (b), of Directive (EU) 2015/1535.

For the purposes of this definition:

  • ‘at a distance’ means the service is provided without the parties being simultaneously present;
  • ‘by electronic means’ means that the service is sent initially and received at its destination by means of electronic equipment for the processing (including digital compression) and storage of data, and entirely transmitted, conveyed, and received by wire, radio, optical, or other electromagnetic means;
  • ‘at the individual request of a recipient of services’ means that the service is provided through the transmission of data on individual request.

Examples of information society services:

  • eBay
  • Hotels.com
  • Google Search
  • Uber

Non-Examples of information society services:

  • A taxi driver providing transportation services (service is not provided ‘at a distance’ from the passenger)
  • A medical examination at a doctor’s office using electronic equipment where the patient is physically present (service is not provided ‘at a distance’).
  • Telemarketing done by a human (service is not considered to be provided by ‘electronic means’)
  • Radio broadcasting (service is not supplied ‘at the individual request of a recipient of services’)

Substantial connection to the EU

A company is deemed to have a “substantial connection” to the EU if it meets any of the following conditions:

  • The service provider has an establishment in the EU,
  • The number of consumer & business users of the service in one or more EU member states is significant in relation to the population thereof, or
  • The service provider targets activities towards one or more EU member states.

Whether a service provider is considered to be targeting activities towards one or more member states is a matter of facts and circumstances, including the following factors:

  • The use of a language or currency generally used in a member state,
  • The possibility of residents of a member state ordering products or services,
  • The use of a relevant top-level domain,
  • The availability of an app in a relevant national app store,
  • The provision of local advertising,
  • The provision of advertising in a language used in a member state, or
  • Providing customer service in a language used in a member state.

A substantial connection will also be construed to exist if a service provider directs its activities to one or more member states within the meaning of Article 17(1), point (c) of Regulation (EU) No 1215/2012.

However, the mere technical accessibility of a website from the EU cannot, on that ground alone, be considered as establishing a substantial connection to the EU.

Online platforms

Online platforms are hosting service providers which also make hosted data publicly available. More specifically, an online platform is a provider of hosting services which, at the request of a user, stores and disseminates information to the public (unless that activity is a minor or purely ancillary feature of another service or a functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the Digital Services Act).

Examples of online platforms:

  • Social media apps (e.g. Facebook, Snapchat)
  • Online marketplaces (e.g. Amazon, eBay)

Non-Examples of online platforms:

  • The comment section of an online newspaper
  • Cloud computing, cloud storage, and web hosting services (i.e. web infrastructural services)
  • Email service providers
  • Private messaging apps

Online platforms are subject to additional rules beyond what all hosting service providers are subjected to.

Online search engines

An online search engine is an intermediary service provider that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject, in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found.

Notably, an online search engine is not necessarily an online platform or even a hosting service provider.

“Very large” online platforms and search engines

The DSA’s strictest rules only apply to “very large online platforms” (VLOPs) and “very large online search engines” (VLOSEs).

A VLOP is an online platform that, for at least 4 consecutive months, provides services to at least 45 million average monthly active users in the EU (corresponding to about 10% of the EU population), and that has been designated by the European Commission as a VLOP.

A VLOSE is an online search engine which reaches, on average, at least 45 million monthly active recipients of the service in the EU, and which has been designated as a VLOSE by the European Commission.

The European Commission has currently designated the following companies as VLOPs and VLOSEs:

  1. Alphabet (Google Search, YouTube, Google Play, Google Shopping, Google Maps)
  2. Microsoft (Bing, LinkedIn)
  3. Meta (Facebook, Instagram)
  4. ByteDance (TikTok)
  5. Snap (Snapchat)
  6. Pinterest
  7. Twitter
  8. Apple (Apple App Store)
  9. Wikimedia (Wikipedia)
  10. Amazon (Amazon Marketplace)
  11. Booking.com
  12. Alibaba (AliExpress)
  13. Zalando

What rules do intermediary service providers with a substantial connection to the EU have to comply with under the DSA?

The DSA prescribes rules based on two dimensions: (1) company size, and (2) type of service provided by the company. The table below shows which articles of the DSA apply to which sizes and types of companies.

Table 1:

| Service type | Micro and small enterprises | Companies that only allow consumers to buy from micro and small enterprises (which are not also very large companies) | All other companies | Very large companies |
|---|---|---|---|---|
| Mere conduit service providers | Articles 11-15 | n/a | Articles 11-15 | Articles 11-15 |
| Caching service providers | Articles 11-15 | n/a | Articles 11-15 | Articles 11-15 |
| Hosting service providers | Articles 11-15; Articles 16-18 | n/a | Articles 11-15; Articles 16-18 | Articles 11-15; Articles 16-18 |
| Online platforms | Articles 11-15; Articles 16-18; Article 24(3) | Articles 11-15; Articles 16-18; Articles 20-28 | Articles 11-15; Articles 16-18; Articles 20-28 | Articles 11-15; Articles 16-18; Articles 20-28; Articles 33-43 |
| Online platforms allowing consumers to buy goods or services | Articles 11-15; Articles 16-18; Article 24(3); Articles 30-32 | Articles 11-15; Articles 16-18; Articles 20-28 | Articles 11-15; Articles 16-18; Articles 20-28; Articles 30-32 | Articles 11-15; Articles 16-18; Articles 20-28; Articles 33-43 |
| Online search engines | Articles 11-15 | n/a | Articles 11-15 | Articles 11-15; Articles 33-43 |
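
To make the table easier to work with, here is a minimal Python sketch (my own illustration, not anything prescribed by the DSA) that encodes it as a lookup from service type and size category to the applicable article ranges. The labels are hypothetical, and I’ve left out the column for companies that only allow consumers to buy from micro and small enterprises to keep it short.

```python
# Illustrative only: encodes Table 1 above as a simple lookup.
# Service-type and size labels are my own, not DSA terms of art.

APPLICABLE_ARTICLES = {
    # (service type, size category) -> applicable DSA article ranges
    ("mere_conduit", "micro_or_small"): ["11-15"],
    ("mere_conduit", "other"): ["11-15"],
    ("mere_conduit", "very_large"): ["11-15"],
    ("caching", "micro_or_small"): ["11-15"],
    ("caching", "other"): ["11-15"],
    ("caching", "very_large"): ["11-15"],
    ("hosting", "micro_or_small"): ["11-15", "16-18"],
    ("hosting", "other"): ["11-15", "16-18"],
    ("hosting", "very_large"): ["11-15", "16-18"],
    ("online_platform", "micro_or_small"): ["11-15", "16-18", "24(3)"],
    ("online_platform", "other"): ["11-15", "16-18", "20-28"],
    ("online_platform", "very_large"): ["11-15", "16-18", "20-28", "33-43"],
    ("b2c_marketplace", "micro_or_small"): ["11-15", "16-18", "24(3)", "30-32"],
    ("b2c_marketplace", "other"): ["11-15", "16-18", "20-28", "30-32"],
    ("b2c_marketplace", "very_large"): ["11-15", "16-18", "20-28", "33-43"],
    ("search_engine", "micro_or_small"): ["11-15"],
    ("search_engine", "other"): ["11-15"],
    ("search_engine", "very_large"): ["11-15", "33-43"],
}

def applicable_articles(service_type: str, size: str) -> list[str]:
    """Return the DSA article ranges from Table 1 for a given company profile."""
    return APPLICABLE_ARTICLES[(service_type, size)]

print(applicable_articles("online_platform", "very_large"))
# ['11-15', '16-18', '20-28', '33-43']
```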

The primary obligations prescribed by the DSA fall into 4 categories: content moderation, fair (algorithm & business policy) design, transparency, and oversight. Table 2 below summarizes the obligations in each of those categories for each type of company subject to the DSA.

Table 2:

| Obligations | Universal (all providers of conduit, caching, and hosting services) | Basic (all hosting services) | Advanced (medium to large online platforms) | Special (VLOPs & VLOSEs) |
|---|---|---|---|---|
| Content moderation | Article 14 (fair content moderation) | Article 16 (notice); Article 17 (statement of reasons) | Article 20 (internal redress); Article 21 (out-of-court mechanism); Article 22 (trusted flaggers); Article 23 (anti-abuse provisions); Articles 30-32 (specific rules on B2C marketplaces) | Articles 34-35 (risk assessment and mitigation); Article 36 (crisis response mechanism) |
| Fair design (user interfaces, recommender systems, advertising, and other parts) | Article 14 (fair content moderation) | Article 16 (user-friendly notice and action) | Article 25 (fair design of user experience); Article 26(3) (advertising); Article 27 (recommender systems); Article 28 (protection of minors); Article 30 (traceability of traders); Article 31 (facilitating design for traders) | Article 38 (recommender systems); Article 39 (advertising archives) |
| Transparency | Article 15 (annual reporting) | Article 17(5) (database of all statements of reasons) | Article 22 (reports by trusted flaggers); Article 24 (content moderation reports); Article 26 (advertising disclosure) | Article 39 (advertising archives); Article 42 (content moderation transparency) |
| Oversight | Article 11 (regulator’s contact point); Article 12 (recipient’s contact point); Article 13 (legal representative) | Article 18 (notification of suspected relevant crimes) | (-) | Article 37 (auditing); Article 40 (data access/scrutiny); Article 41 (compliance function) |

Content moderation:

All intermediary service providers must comply with DSA Article 14. Hosting service providers must also comply with Articles 16 and 17. Online platforms must also comply with Articles 20-23 and 30-32. And VLOPs and VLOSEs must comply with Articles 34-36.

Article 14 (content moderation): In their terms of service, companies must include information on any policies, procedures, measures, and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. This information must be set out in clear, user-friendly, and unambiguous language, and must also be made publicly accessible in a machine-readable format.
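
The DSA doesn’t prescribe a particular machine-readable format for these disclosures, so here is a purely hypothetical sketch, in Python, of what a serialized policy summary could look like. All field names and values are my own invention.

```python
import json

# Hypothetical schema: the DSA requires machine-readable disclosure but does
# not mandate any particular format or field names.
content_moderation_policy = {
    "provider": "example-hosting.eu",          # hypothetical provider
    "last_updated": "2024-02-17",
    "automated_tools": ["hash matching", "spam classifier"],
    "human_review": True,
    "possible_restrictions": ["removal", "demotion", "account suspension"],
    "complaint_handling": {
        "internal_mechanism": True,
        "target_response_days": 14,            # illustrative figure, not a DSA requirement
    },
}

print(json.dumps(content_moderation_policy, indent=2))
```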

Article 16 (content moderation): Hosting service providers must put user-friendly mechanisms in place to allow any individual or entity to notify them (electronically) of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Hosting providers must then act on this knowledge to prevent becoming liable for the content. In particular, they must process any such notices and make a decision (possibly using automation) in a timely, diligent, non-arbitrary, and objective manner.
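
For a sense of what an Article 16 notice mechanism needs to capture, here is a minimal Python sketch of a notice record. The class and field names are hypothetical, but the fields track the elements Article 16(2) asks notice mechanisms to facilitate: an explanation of why the content is allegedly illegal, the exact URL(s), the notifier’s name and email address (with exceptions for certain offenses), and a good-faith statement.

```python
from dataclasses import dataclass

@dataclass
class IllegalContentNotice:
    """Sketch of an Article 16 notice. Field names are illustrative, not prescribed by the DSA."""
    explanation: str            # why the notifier considers the content illegal
    urls: list[str]             # exact electronic location(s) of the content
    notifier_name: str | None   # may be omitted for certain offense categories
    notifier_email: str | None
    good_faith_statement: bool  # confirmation the notice is accurate and complete
    submitted_at: str = ""      # timestamp added by the provider on receipt

def is_actionable(notice: IllegalContentNotice) -> bool:
    """Very rough completeness check before the notice enters the review queue."""
    return bool(notice.explanation and notice.urls and notice.good_faith_statement)
```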

Article 17 (content moderation): If a hosting service provider imposes any restrictions on a user’s account, the provider must provide that user with a clear and specific reason why such restriction was taken. The provider must also provide information on the possibilities for redress available to the user in respect of the decision, such as through an internal complaint-handling mechanism, out-of-court dispute settlement, and/or judicial redress.

Article 20 (content moderation): An online platform must provide its users (and any person who files an Article 16 notice) with a mechanism for lodging complaints against the online platform’s decision to restrict or not restrict a piece of content or the user who posted it. Complaints must not be handled on a purely automated basis.

Article 21 (content moderation): Out-of-court dispute settlement. Users and Article 16 notice filers lodging an Article 20 complaint are entitled to select any out-of-court dispute settlement body that has been certified in accordance with Article 21(3) in order to resolve disputes relating to those decisions. The settlement body does NOT have the power to impose a binding settlement on the parties, however. The fees charged by the settlement body to the online platform must be reasonable and must not exceed the costs incurred by the body. For the complaint lodger, dispute settlement must be available free of charge or for a nominal fee.

Article 22 (content moderation): Trusted flaggers. Online platforms must allow government-certified “trusted flaggers” to submit Article 16 notices and have those notices prioritized for speedy processing & decision.

Article 23 (content moderation): Online platforms must suspend the accounts of users who, after having been issued a prior warning, continue to frequently provide manifestly illegal content. Additionally, online platforms must suspend the ability of a person to file an Article 16 notice if that person frequently submits unfounded complaints through the Article 16 mechanism. Online platforms must have clearly disclosed policies for when either of the previously mentioned types of suspensions will be implemented.

Article 30 (content moderation): E-commerce seller information. Online platforms that allow consumers to buy goods or services online (i.e. to “conclude distance contracts with traders”) must collect, and verify using freely accessible online databases, the following: (a) the name, address, phone number, and email address of each trader, (b) a copy of the trader’s identification document or other electronic identification as referred to in Article 3 of Regulation (EU) No 910/2014, (c) the payment account details of the trader, (d) where the trader (business) is registered, and (e) a self-certification by the trader committing to only offer products or services that comply with EU law. If a trader fails to provide the information or provides inaccurate, incomplete, or out-of-date information, then, after warnings, the online platform must suspend the trader’s ability to use the service. Traders must be able to lodge an Article 20/21 complaint if they disagree with their suspension. The online platform must make the information from (a), (d), and (e) available to users of the platform.

Article 31 (content moderation): Online platforms that allow traders to sell goods or services must ensure traders provide basic info about the products or services they are selling, and any trademark or logo of the seller. Platforms also must make reasonable efforts to perform random checks to ensure that the products or services sellers are offering are not illegal.

Article 32 (content moderation): If an online platform that allows traders to sell goods or services learns that an illegal product or service has been offered by a trader to consumers located in the EU, then the platform must notify consumers who purchased the product or service, and the notice must include information about any relevant means of redress.

Article 34 (content moderation): VLOPs and VLOSEs must perform an EU systemic risk assessment annually, and before deploying any functionalities likely to have a critical impact on such risks. Systemic risks that must be analyzed include:

  • (a) the dissemination of illegal content through their services,
  • (b) any actual or foreseeable negative effects on the exercise of fundamental rights (including Charter Article 1 on human dignity, Article 7 on respect for private and family life, Article 8 on protection of personal data, Article 11 on freedom of expression and information, including the freedom and pluralism of the media, Article 21 on non-discrimination, Article 24 on the rights of the child, and Article 38 on consumer protection),
  • (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security, and
  • (d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to a person’s physical and mental well-being.

When conducting risk assessments, VLOPs and VLOSEs must take into account their content moderation systems.

Article 35 (content moderation): VLOPs and VLOSEs must put in place reasonable, proportionate, and effective mitigation measures tailored to the specific systemic risks identified in their Article 34 risk assessment(s). Such measures may include:

  • (c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;
  • (g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;
  • (k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio, or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

Article 36 (content moderation): When the Commission invokes the crisis response mechanism, VLOPs and VLOSEs must assess whether and to what extent their services contribute to a particular threat to public security or public health, identify specific measures to prevent or eliminate any such contribution, and report to the Commission on the implementation and effectiveness of those measures.

Fair design:

All intermediary service providers must comply with DSA Article 14. Hosting service providers must also comply with Article 16. Online platforms must also comply with Articles 25-28 and 30-31. And VLOPs and VLOSEs must comply with Articles 38-39.

Article 14 (fair design): Any restrictions imposed by the terms of service on the use of an intermediary service must be applied and enforced in a fair, diligent, objective, and proportionate manner, to all parties.

Article 16 (fair design): Hosting service providers shall process any Article 16 notices in a timely, diligent, non-arbitrary, and objective manner.

Article 25 (fair design): Online platforms must not design, organize, or operate their online interfaces in a way that deceives or manipulates the recipients of the service, or in a way that otherwise materially distorts or impairs the ability of the recipients of the service to make free and informed decisions. This prohibition does not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679. The Commission may issue guidelines on how the prohibition applies to specific practices, including:

  • Giving more prominence to certain choices when asking the recipient of the service for a decision,
  • Repeatedly requesting the recipient of the service make a choice (e.g. via a pop-up) where that choice has already been made,
  • Making the procedure for terminating a service more difficult than subscribing to it.

Article 26 (fair design): Ads on online platforms must be clearly marked as ads, must disclose who is running the ad, and must include information, clearly accessible from the ad, about the main parameters used to determine the recipient to whom the ad is presented and, where applicable, how to change those parameters. Online platforms may not present ads based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using special categories of personal data referred to in Article 9(1) of Regulation (EU) 2016/679.

Article 27 (fair design): Where several options of recommender system are available under the terms of service of an online platform, that online platform must make available a functionality that allows the recipient of the service to select and modify at any time their preferred option.

Article 28 (fair design): Providers of online platforms shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.

Article 30 (fair design): Traders who are suspended from an e-commerce platform pursuant to Article 30 are entitled to fair treatment if they lodge a complaint about the suspension.

Article 31 (fair design): Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that their online interfaces are designed and organized in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance, and product safety information under applicable Union law.

Article 38 (fair design): VLOPs and VLOSEs that use a recommender system must provide at least one option for each of their recommender systems which is not based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679.

Article 39 (fair design): VLOPs and VLOSEs that present ads on their online interfaces must make publicly available a repository of all the ads run on the platform.

Transparency:

All intermediary service providers must comply with DSA Article 15. Hosting service providers must also comply with Article 17. Online platforms must also comply with Articles 22, 24, and 26. And VLOPs and VLOSEs must comply with Articles 39 and 42.

Article 15 (transparency): Intermediary service providers (unless they are small or micro enterprises and not also VLOPs or VLOSEs) must make publicly available, at least once a year, a report on any content moderation they engaged in.

Article 17 (transparency): Hosting service providers must provide a clear and specific statement of reasons to any recipients of the service affected by restrictions due to illegal or incompatible content.

Article 22 (transparency): Trusted flaggers must publish, at least once a year, a comprehensive and detailed report on notices submitted in accordance with Article 16.

Article 24 (transparency): Online platforms must include in their Article 15 reports information about the number, processing times, and outcomes of disputes submitted to Article 21 out-of-court dispute settlement bodies.

Article 26 (transparency): Online platforms that present ads must clearly mark content as ads and disclose info about who is running the ad and how the ad is targeted.

Article 39 (transparency): VLOPs and VLOSEs must make publicly available (through both a human-readable interface and an API) a database of all ads served through their online interfaces, including the content of each ad, the name of the product or service being sold or the brand being advertised, the person (natural or legal) on whose behalf the ad was run, the person who paid for the ad, the period during which the ad was run, and the parameters used to target the ad.
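
Using just the fields listed above, a single record in such an ad repository might look something like the following Python sketch. The structure and names are mine, not a schema from the DSA.

```python
import json

# Hypothetical record shape for an Article 39 ad repository entry.
ad_record = {
    "ad_content": "Buy ExampleWidget today!",   # the content of the ad (invented)
    "product_or_brand": "ExampleWidget",
    "advertiser": "Example GmbH",               # person on whose behalf the ad was run
    "payer": "Example GmbH",                    # person who paid for the ad
    "run_period": {"start": "2024-03-01", "end": "2024-03-31"},
    "targeting_parameters": {"country": "DE", "age_range": "25-44", "interests": ["gadgets"]},
}

print(json.dumps(ad_record, indent=2))
```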

Article 42 (transparency): VLOPs and VLOSEs must publish Article 15 reports every six months. VLOPs must also include additional information in these reports, including:

  • The human resources that the VLOP dedicates to content moderation for its EU services, broken down by each applicable official language of the EU member states,
  • The qualifications and linguistic expertise of the persons carrying out those content moderation services, and
  • The indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the EU member states.

VLOPs and VLOSEs must also make publicly available (a) a report with the results of their Article 34 risk assessment, (b) the specific mitigation measures they put in place pursuant to Article 35, (c) the audit report required by Article 37, (d) the audit implementation report described in Article 37, and (e) where applicable, info about the consultations conducted by the company in support of the risk assessments and the design of the risk mitigation measures.

Oversight:

All intermediary service providers must comply with DSA Articles 11-13. Hosting service providers must also comply with Article 18. And VLOPs and VLOSEs must comply with Articles 37 and 40-41.

Article 11 (oversight): Providers of intermediary services must designate a single point of contact to enable them to communicate electronically with EU member states’ authorities, the Commission, and the Article 61 Board.

Article 12 (oversight): Providers of intermediary services must designate a single point of contact that recipients of the service can use to directly and rapidly communicate with them electronically, in a manner which is not entirely automated.

Article 13 (oversight): Intermediary service providers serving the EU but not having an establishment in the EU must designate a legal representative in the EU, and this representative can be held liable for non-compliance with the DSA.

Article 18 (oversight): Where a hosting service provider becomes aware of any information giving rise to a suspicion that a criminal offense involving a threat to the life or safety of a person has taken place, is taking place, or is likely to take place, the provider must promptly notify law enforcement.

Article 37 (oversight): VLOPs and VLOSEs must pay for an independent audit at least once a year to assess their compliance with the DSA.

Article 40 (oversight): VLOPs and VLOSEs must, if and as requested, provide EU regulators and/or vetted researchers with the data necessary to monitor and assess compliance with the DSA.

Article 41 (oversight): VLOPs and VLOSEs must establish a compliance department which is independent from their operational departments and is composed of one or more compliance officers, including the head of the compliance department.

What are the penalties for companies that fail to comply with the Digital Services Act?

Individual EU member states can set their own rules on penalties applicable to infringements of the DSA by providers of intermediary services. However, the states are limited to a maximum fine equal to 6% of the annual worldwide turnover (revenue minus VAT and other taxes) of the intermediary service provider in the preceding financial year.

Additionally, the max fine that may be imposed for supplying incorrect, incomplete, or misleading information, for failing to reply or rectify such information, or failing to submit to an inspection, shall be 1% of the annual income or worldwide turnover of the intermediary service provider in the preceding financial year.

The max amount of any periodic penalty payment shall be 5% of the average daily worldwide turnover or income of the provider of intermediary services for the preceding financial year, per day.
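
To make these caps concrete, here is a small worked example in Python for a hypothetical provider with €10 billion in annual worldwide turnover (all figures are invented for illustration):

```python
# Hypothetical worked example of the DSA fine caps (amounts in EUR).
annual_turnover = 10_000_000_000  # invented figure for illustration

max_fine_noncompliance = 0.06 * annual_turnover    # 6% cap for DSA infringements
max_fine_bad_information = 0.01 * annual_turnover  # 1% cap for incorrect/incomplete info

average_daily_turnover = annual_turnover / 365
max_periodic_penalty_per_day = 0.05 * average_daily_turnover  # 5% of average daily turnover, per day

print(f"Max fine for noncompliance:   €{max_fine_noncompliance:,.0f}")        # €600,000,000
print(f"Max fine for bad information: €{max_fine_bad_information:,.0f}")      # €100,000,000
print(f"Max periodic penalty per day: €{max_periodic_penalty_per_day:,.0f}")  # ≈ €1,369,863
```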

For comparison, the maximum fine under the GDPR is 4% of global turnover. That means the DSA has 50% more teeth than the GDPR.

When is the deadline for companies to be in compliance with the DSA to avoid penalties?

The DSA was published in the Official Journal of the European Union on October 27, 2022. The DSA “entered into force” on November 16, 2022. However, this only started the processes to determine which tech companies qualify as “very large”. The actual rules of the DSA do not apply to companies until later.

For most companies, the DSA rules only apply starting 15 months later, on February 17, 2024. However, VLOPs and VLOSEs have to comply with all obligations in Articles 11-43 of the DSA starting 4 months after being designated as VLOPs or VLOSEs. The first set of companies were given those designations on April 25, 2023, which means Facebook, Google, and other large companies have been liable for non-compliance since August 25, 2023.

TL;DR:

Most companies will need to comply with DSA rules starting on February 17, 2024, but some large tech companies have been subject to the DSA since August 25, 2023.

3 Business Opportunities Created by the Digital Services Act

1. Article 21 case law data & analytics tool

The certified out-of-court dispute settlement bodies described in Article 21 of the DSA are essentially going to be mini courts. These mini courts will generate precedents with each case reviewed, and like with ordinary courts, there will be a need for a software tool that can track and search all of these cases in the settlement bodies’ efforts to make fair and consistent decisions.

I think you could create a mini LexisNexis or Westlaw by partnering with all of the organizations that get certified as settlement bodies under the DSA. You’d collect their data, combine it with the data from all the other certified settlement bodies, index the data and make it searchable, and then sell the resulting packaged data tool back to the certified settlement bodies.

2. Article-16-as-a-Service

All intermediary service providers of all sizes must comply with Article 14 of the DSA, which requires them to disclose their policies for content moderation and internal complaint handling in both clear, user-friendly natural language and a machine-readable format. Additionally, all hosting service providers are subject to Article 16’s requirement to have a mechanism for users and third parties to notify them of illegal content on the hosting service.

I think you could create a SaaS product that is “Article-16-as-a-service” for intermediary service providers. The tool would be configurable to hook into the intermediary service provider’s database and would give users and third parties an interface for filing and disputing notices about content. The tool could also generate the terms-of-service text and the machine-readable disclosure describing those processes, and make those generated disclosures available for simple insertion into the service provider’s website.
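
Here is a very rough sketch, in Python, of what the core notice-handling step of such a tool could look like. Everything in it (names, fields, the review logic) is hypothetical; it is only meant to show the shape of the workflow, not a real implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class NoticeDecision:
    """Hypothetical decision record a platform could also reuse for its Article 17 statement of reasons."""
    notice_id: str
    action: str                 # e.g. "rejected_incomplete", "queued_for_review"
    statement_of_reasons: str
    decided_at: str

def _now() -> str:
    return datetime.now(timezone.utc).isoformat()

def handle_notice(notice_id: str, explanation: str, urls: list[str], good_faith: bool) -> NoticeDecision:
    """Toy pipeline: validate an incoming notice and either reject it as incomplete
    or queue it for (human or automated) review. A real system would persist the
    notice, notify the parties, and eventually record the final moderation decision."""
    if not (explanation and urls and good_faith):
        return NoticeDecision(notice_id, "rejected_incomplete",
                              "Notice is missing one or more required elements.", _now())
    # Placeholder for the provider's actual moderation logic / review queue.
    return NoticeDecision(notice_id, "queued_for_review",
                          "Notice accepted and queued for timely, diligent review.", _now())

print(handle_notice("n-001", "Counterfeit goods listing", ["https://example.com/item/42"], True))
```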

3. Email newsletter

Article 39 of the DSA requires big companies like Facebook to disclose everyone’s ads publicly. That is bad for advertisers because it means as soon as you start running an ad that works, all your competitors will be able to see that it works and copy it. However, Article 39 only applies to “very large” online platforms and search engines. That means if you advertise on an email newsletter with 1 million subscribers, you don’t have to disclose that ad publicly to your competitors. Email also gives you the option to target your ad to different subsets of your audience, unlike sponsoring a YouTuber (you could of course buy ads on a segment of the audience of a YouTube channel, but then Google would disclose your ad publicly).

All of this is to say, email newsletters are going to be a more attractive advertising channel for European businesses.

You could also start a newsletter ad agency that partners with lots of small to medium size newsletters instead of building a newsletter yourself.

Digital Services Act (DSA) vs Digital Markets Act (DMA)

The EU’s Digital Services Act (DSA) and Digital Markets Act (DMA) are two separate but related pieces of legislation passed in 2022. The DSA is concerned with harmful and illegal goods, services, and content online. In contrast, the DMA is concerned with anti-competitive behavior of internet companies.

The key concepts from the DMA are “core platform service” and “gatekeeper”.

Core platform service means any of the following:

  • Online intermediation services (not to be confused with “intermediary services” as defined in the DSA, an online intermediation service is defined in Article 2, point (2) of Regulation (EU) 2019/1150 as an information society service that allows business users to offer goods or services to consumers, with the goal of initiating direct transactions between those business users and consumers, with the information society service provided to business users on the basis of contractual relationships.)
  • Online search engines
  • Online social networking services
  • Video-sharing platform services
  • Number-independent interpersonal communications services
  • Operating systems
  • Web browsers
  • Virtual assistants
  • Cloud computing services
  • Online advertising services, including any advertising networks, ad exchanges and any other ad intermediation services, provided by an undertaking that provides any of the previously mentioned core platform services

A “gatekeeper” is an entity providing core platform services which has been designated as a gatekeeper by the EU Commission.

Dos and Don’ts for Gatekeepers under the DMA

Do:

  • Allow third parties to interoperate with the gatekeeper’s own services in certain specific situations
  • Allow their business users to access the data that they generate in their use of the gatekeeper’s platform
  • Provide companies advertising on their platform with the tools and information necessary for advertisers and publishers to carry out their own independent verification of their advertisements hosted by the gatekeeper
  • Allow their business users to promote their offers and conclude contracts with their customers outside the gatekeeper’s platform

Don’t:

  • Treat services and products offered by the gatekeeper itself more favorably in ranking than similar services or products offered by third parties on the gatekeeper’s platform
  • Prevent consumers from linking up to businesses outside their platforms
  • Prevent users from uninstalling any pre-installed software or app if they wish to
  • Track end users outside of the gatekeeper’s core platform service for the purpose of targeted advertising, without effective consent having been granted

The Changing Legal Landscape for Tech Companies

| | United States | European Union |
|---|---|---|
| Generation 1 laws (1996-2000) | Sec 230 CDA; Sec 512 DMCA | Articles 12-15 ECD (Directive 2000/31/EC) |
| Generation 2 laws (2020-?) | Unknown | Digital Services Act; Digital Markets Act |

Goal of generation 1 laws: Protect free speech and innovation

Goal of generation 2 laws: Protect consumers and competition

The shift: from liability for content to liability (accountability) for the design of services.

Glossary (EU tech law & regulation)

Content moderation means the activities, whether automated or not, undertaken by the providers of intermediary services, that are aimed, in particular, at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal content or that information, such as demotion, demonetization, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account.

DSA stands for the EU Digital Services Act.

An information society service is any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. The definition is the same as provided in Article 1(1), point (b), of Directive (EU) 2015/1535.

For the purposes of this definition:

  • ‘at a distance’ means the service is provided without the parties being simultaneously present;
  • ‘by electronic means’ means that the service is sent initially and received at its destination by means of electronic equipment for the processing (including digital compression) and storage of data, and entirely transmitted, conveyed, and received by wire, radio, optical, or other electromagnetic means;
  • ‘at the individual request of a recipient of services’ means that the service is provided through the transmission of data on individual request.

Examples of information society services:

  • eBay
  • Hotels.com
  • Google Search
  • Uber

Non-Examples of information society services:

  • A taxi driver providing transportation services (service is not provided ‘at a distance’ from the passenger)
  • A medical examination at a doctor’s office using electronic equipment where the patient is physically present (service is not provided ‘at a distance’).
  • Telemarketing done by a human (service is not considered to be provided by ‘electronic means’)
  • Radio broadcasting (service is not supplied ‘at the individual request of a recipient of services’)

Intermediary service means one of the following information society services:

  1. A mere conduit service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
  2. A caching service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate, and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request; and
  3. A hosting service, consisting of the storage of information provided by, and at the request of, a recipient of the service.

Examples of mere conduit services:

  • A private messaging app like Facebook Messenger

Examples of caching services:

  • Content delivery networks (CDNs)

Examples of hosting services:

  • Cloud storage providers like Dropbox
  • Cloud computing providers like AWS, GCP, and Azure

An online platform is a provider of hosting services which, at the request of a user, stores and disseminates information to the public (unless that activity is a minor or purely ancillary feature of another service or a functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the Digital Services Act).

Examples of online platforms:

  • Social media apps (e.g. Facebook, Snapchat)
  • Online marketplaces (e.g. Amazon, eBay)

Non-Examples of online platforms:

  • The comment section of an online newspaper
  • Cloud computing, cloud storage, and web hosting services (i.e. web infrastructural services)
  • Email service providers
  • Private messaging apps

A recipient of the service means, with respect to an intermediary service, any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making it accessible.

Recommender system means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritize that information, including as a result of a search initiated by the recipient of the service, or otherwise determining the relative order or prominence of information displayed.

An intermediary service provider is considered to have a substantial connection to the EU if any of the following conditions is met:

  1. The service provider has an establishment in the EU,
  2. The number of consumer & business users of the service in one or more EU member states is significant in relation to the population thereof, or
  3. The service provider targets activities towards one or more EU member states.

Whether a service provider is considered to be targeting activities towards one or more member states is a matter of facts and circumstances, including the following factors:

  • The use of a language or currency generally used in a member state,
  • The possibility of residents of a member state ordering products or services,
  • The use of a relevant top-level domain,
  • The availability of an app in a relevant national app store,
  • The provision of local advertising,
  • The provision of advertising in a language used in a member state, or
  • Providing customer service in a language used in a member state.

A substantial connection will also be construed to exist if a service provider directs its activities to one or more member states within the meaning of Article 17(1), point (c) of Regulation (EU) No 1215/2012.

However, the mere technical accessibility of a website from the EU cannot, on that ground alone, be considered as establishing a substantial connection to the EU.

VLOP stands for Very Large Online Platform. This is a technical term used in the DSA. A VLOP is an online platform that, for at least 4 consecutive months, provides services to at least 45 million average monthly active users in the EU (corresponding to about 10% of the EU population), and that has been designated by the European Commission as a VLOP.

An online search engine is an intermediary service provider that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject, in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found.

VLOSE stands for Very Large Online Search Engine. This is a technical term used in the DSA. A VLOSE is an online search engine which reaches, on average, at least 45 million monthly active recipients of the service in the EU, and which has been designated as a VLOSE by the European Commission.


Appendix: Video Summary of the Digital Services Act

[Transcript]

TikTok is now legally forbidden from shadow banning people in Europe, thanks to a new law that went into effect last week — the Digital Services Act. This law requires all big tech companies including TikTok and Instagram to disclose to any user when they have been restricted, and to clearly explain why.

The new law also mandates that big tech companies give Europeans the option to opt out of personalized recommendation algorithms, and give them an easy way to change the parameters used to target them with ads.

Perhaps most importantly of all though, the new law requires that big tech companies publish annual reports on how their services might negatively impact people, elections, and society, and to alter their services to mitigate those risks. Independent researchers and government regulators will audit those companies for compliance, and companies can be fined up to 6% of their global revenue if they don’t comply.

If you live in the U.S. and are feeling a bit jealous, you aren’t alone. Several U.S. lawmakers have been pushing for similar regulations in the U.S., although it may take years before anything gets passed.

Ricky Nave

In college, Ricky studied physics & math, won a prestigious research competition hosted by Oak Ridge National Laboratory, started several small businesses including an energy chewing gum business and a computer repair business, and graduated with a thesis in algebraic topology. After graduating, Ricky attended grad school at Duke University in the mathematics PhD program where he worked on quantum algorithms & non-Euclidean geometry models for flexible proteins. He also worked in cybersecurity at Los Alamos during this time before eventually dropping out of grad school to join a startup working on formal semantic modeling for legal documents. Finally, he left that startup to start his own in the finance & crypto space. Now, he helps entrepreneurs pay less capital gains tax.
