Consultation on the Digital Services Act Package

2020

 

This page contains the questions from the open public consultation by the European Commission on its Digital Services Act package. It is intended as a working document for Wikimedians to collaboratively draft Wikimedia's answers to this legislative initiative of the EU.

The EU's survey will remain open until 8 September 2020, but we will only take into account input added here by 25 August 2020.

In parallel, the Commission is also seeking views on two roadmaps until 30 June 2020:

  1. an ex ante regulatory instrument for very large online platforms acting as gatekeepers
  2. deepening the Internal Market and clarifying responsibilities for digital services

Introduction

The Commission announced a Digital Services Act package with two main pillars: first, a proposal of new and revised rules to deepen the Single Market for Digital Services, by increasing and harmonising the responsibilities of online platforms and information service providers and reinforcing the oversight over platforms’ content policies in the EU; second, ex ante rules to ensure that markets characterised by large platforms with significant network effects acting as gatekeepers remain fair and contestable for innovators, businesses, and new market entrants.

The Commission is initiating the present open public consultation as part of its evidence-gathering exercise, in order to identify issues that may require intervention at the EU level. This consultation additionally covers a series of topics related to the environment of digital services and online platforms, which will be further analysed in view of possible upcoming initiatives, should the issues identified require a regulatory intervention. The consultation is structured in several modules, as follows:

  1. How to effectively keep users safer online?
  2. Reviewing the liability regime of digital services acting as intermediaries?
  3. What issues derive from the gatekeeper power of digital platforms?
  4. Other emerging issues and opportunities, including online advertising and smart contracts
  5. How to address challenges around the situation of self-employed individuals offering services through online platforms?
  6. What governance is needed to reinforce and complete the Single Market for digital services?

The questionnaire refers to digital services (or ‘information society services’, within the meaning of the E-Commerce Directive), as 'services provided through electronic means, at a distance, at the request of the user'. It also refers more narrowly to a subset of digital services here termed online intermediary services. By this we mean services such as internet access providers, cloud services, online platforms, messaging services, etc., i.e. services that generally transport or intermediate content, goods or services made available by third parties. Parts of the questionnaire specifically focus on online platforms – such as e-commerce marketplaces, search engines, app stores, online travel and accommodation platforms or mobility platforms and other collaborative economy platforms, etc. A glossary further clarifies the terms and other technical concepts.

You may respond to one, several or all of the modules. At the end of the questionnaire, you will also be able to upload a document, if you wish to do so, or add other issues not specifically covered in the questionnaire. You can save your replies and get back to the questionnaire at a later stage. Please make sure to save a draft of the questionnaire regularly as you fill it in, and to submit the questionnaire before the end of the consultation period on 8 September 2020. The questionnaire is published in English, but will soon be available in all the official languages of the European Union. Respondents may submit their replies in any official language.

I. How to effectively keep users safer online?

This module of the questionnaire is structured into several subsections:

First, it seeks evidence, experience, and data from the perspective of different stakeholders regarding illegal activities online, as defined by national and EU law. This includes the availability online of illegal goods (e.g. dangerous products, counterfeit goods, prohibited and restricted goods, protected wildlife, pet trafficking, illegal medicine, misleading offerings on food supplements), content (e.g. illegal hate speech, child sexual abuse material, content infringing intellectual property rights) and services, or practices infringing consumer law (such as scams, misleading advertising, exhortation to purchase made to children); it covers all types of illegal activities, under both criminal and civil law. It then inquires about other activities online which are not necessarily illegal but could cause harm to users, such as the spread of online disinformation or content harmful to minors. It also seeks facts and informed views on the potential risks of erroneous removal of legitimate content. It further inquires about transparency and accountability in the measures taken by digital services, and online platforms in particular, in intermediating users’ access to their content, and about enabling oversight by third parties. Respondents might also be interested in related questions in the module of the consultation focusing on online advertising.

Second, it explores proportionate and appropriate responsibilities and obligations that could be required from online intermediaries, in particular online platforms, in addressing the set of issues discussed in the first sub-section. This module does not address the liability regime for online intermediaries, which is further explored in the next module of the consultation.

1. Main issues and experiences

A. Experiences and data on illegal activities online

Illegal goods

Did you ever come across illegal goods on online platforms (e.g. a counterfeit product, prohibited and restricted goods, protected wildlife, pet trafficking, illegal medicine, misleading offerings on food supplements)?

  • No, never
  • Yes, once
  • Yes, several times
  • I don’t know

Please specify

Comments

How easy was it for you to find information on where you could report the illegal good?
Please rate from 1 star (very difficult) to 5 stars (very easy)

How easy was it for you to report the illegal good? Please rate from 1 star (very difficult) to 5 stars (very easy)

How satisfied were you with the procedure following your report? Please rate from 1 star (very dissatisfied) to 5 stars (very satisfied)

Are you aware of the action taken following your report?

  • Yes
  • No

Please explain

Comments

In your experience, were such goods more easily accessible online since the start of the COVID-19 outbreak?

  • No, I do not think so
  • Yes, I came across illegal offerings more frequently
  • I don’t know

What good practices can you point to in handling the availability of illegal goods online since the start of the COVID-19 outbreak?

Comments

Illegal content

Did you ever come across illegal content online (for example illegal incitement to violence, hatred or discrimination on any protected grounds such as race, ethnicity, gender or sexual orientation; child sexual abuse material; terrorist propaganda; defamation; content infringing intellectual property rights, consumer law infringements)?

  • No, never
  • Yes, once
  • Yes, several times
  • I don’t know

How has the dissemination of illegal content changed since the outbreak of the COVID-19 pandemic? Please explain. 3000 character(s) maximum

Comments

What good practices can you point to in handling the dissemination of illegal content online since the start of the COVID-19 outbreak? 3000 character(s) maximum

Comments

What actions do online platforms take to minimise risks for consumers to be exposed to scams and other unfair practices (e.g. misleading advertising, exhortation to purchase made to children)? 3000 character(s) maximum

Comments

Do you consider these measures appropriate?

  • Yes
  • No
  • I don't know

Please explain. 3000 character(s) maximum

Comments


B. Transparency

If your content or offering of goods and services was ever removed or blocked from an online platform, were you informed by the platform?

  • Yes, I was informed before the action was taken
  • Yes, I was informed afterwards
  • Yes, but not on every occasion / not by all the platforms
  • No, I was never informed
  • I don’t know

Please explain. 3000 character(s) maximum

Comments


If you provided a notice to a digital service asking for the removal or disabling of access to such content or offering of goods or services, were you informed about the follow-up to the request?

  • Yes, I was informed
  • Yes, but not on every occasion / not by all the platforms
  • No, I was never informed
  • I don’t know


When content is recommended to you - such as products to purchase on a platform, or videos to watch, articles to read, users to follow - are you able to obtain enough information on why such content has been recommended to you? Please explain. 3000 character(s) maximum

Comments



C. Activities which could cause harm but are not, in themselves, illegal

In your experience, are children adequately protected online from harmful behaviours, such as grooming and bullying, or inappropriate content? 3000 character(s) maximum

Comments

To what extent do you agree with the following statements related to online disinformation? (Fully agree / Somewhat agree / Neither agree nor disagree / Somewhat disagree / Fully disagree / I don't know / No reply)

  • Online platforms can easily be manipulated by foreign governments or other coordinated groups to spread divisive messages
  • To protect freedom of expression online, diverse voices should be heard
  • Disinformation is spread by manipulating algorithmic processes on online platforms
  • Online platforms can be trusted to ensure that their internal practices sufficiently guarantee democratic integrity, pluralism, non-discrimination, tolerance, justice, solidarity and gender equality.

Please explain

Comments

In your personal experience, how has the spread of harmful (but not illegal) activities online changed since the outbreak of the COVID-19 pandemic? 3000 character(s) maximum

Comments

What good practices can you point to in handling such harmful activities since the start of the COVID-19 outbreak? 3000 character(s) maximum

Comments


D. Experiences and data on erroneous removals

This section covers situations where content, goods or services offered online may be removed erroneously, as opposed to situations where such removal may be justified, for example due to the illegal nature of the content, good or service (see the sections of this questionnaire above).


The following questions are targeted at organisations. Individuals responding to the consultation are invited to go to section 2 below, on responsibilities for online platforms and other digital services.


What is your experience in flagging content, or offerings of goods or services you deemed illegal to online platforms and/or other types of online intermediary services? Please explain in what capacity and through what means you flag content. 3000 character(s) maximum

Comments


If applicable, what costs does your organisation incur in such activities? 3000 character(s) maximum

Comments


Have you encountered any issues, in particular, as regards illegal content or goods accessible from the EU but intermediated by services established in third countries? If yes, how have you dealt with these? 3000 character(s) maximum

Comments

If part of your activity is to send notifications or orders for removing illegal content or goods or services made available through online intermediary services, or taking other actions in relation to content or goods, please explain whether you report on your activities and their outcomes:

  • Yes, through regular transparency reports
  • Yes, through reports to a supervising authority
  • Yes, upon requests for public information
  • Yes, through other means. Please explain
  • No, no such reporting is done

Does your organisation access any data or information from online platforms?

  • Yes, data regularly reported by the platform, as requested by law
  • Yes, specific data, requested as a competent authority
  • Yes, through bilateral or special partnerships
  • On the basis of a contractual agreement with the platform
  • Yes, generally available transparency reports
  • Yes, through generally available APIs (application programming interfaces)
  • Yes, through web scraping or other independent web data extraction approaches
  • Yes, because users made use of their right to port personal data
  • Yes, other. Please specify in the text box below
  • No

What sources do you use to obtain information about users of online platforms and other digital services – such as sellers of products online, service providers, website holders or providers of content online? For what purpose do you seek this information? 3000 character(s) maximum

Comments

Do you use WHOIS information about the registration of domain names and related information?

  • Yes
  • No
  • I don't know

How valuable is this information for you? Please rate from 1 star (not particularly important) to 5 stars (extremely important)

Do you use or are you aware of alternative sources of such data? Please explain. 3000 character(s) maximum

Comments

Are you aware of evidence on the scale and impact of erroneous removals of content, goods, services, or banning of accounts online? Are there particular experiences you could share? 5000 character(s) maximum

Comments


The following questions are targeted at online intermediaries.

A. Measures taken against illegal offering of goods and services online and content shared by users

What systems, if any, do you operate for addressing illegal activities conducted by the users of your service (sale of illegal goods – e.g. a counterfeit product, an unsafe product, prohibited and restricted goods, wildlife and pet trafficking – dissemination of illegal content or illegal provision of services)?

  • A notice-and-action system for users to report illegal activities
  • A dedicated channel through which authorities report illegal activities
  • Cooperation with trusted organisations who report illegal activities, following a fast-track assessment of the notification
  • A system for the identification of professional users (‘know your customer’)
  • A system for sanctioning users who are repeat infringers
  • A system for informing consumers that they have purchased an illegal good, once you become aware of this
  • Multi-lingual moderation teams
  • Automated systems for detecting illegal activities. Please specify the detection system and the type of illegal content it is used for
  • Other systems. Please specify in the text box below
  • No system in place

Please explain. 5000 character(s) maximum

Comments

What issues have you encountered in operating these systems? 5000 character(s) maximum

Comments

On your marketplace (if applicable), do you have specific policies or measures for the identification of sellers established outside the European Union?

  • Yes
  • No

Please quantify, to the extent possible, the costs of the measures related to ‘notice-and-action’ or other measures for the reporting and removal of different types of illegal goods, services and content, as relevant. 5000 character(s) maximum

Comments

Please provide information and figures on the amount of different types of illegal content, services and goods notified, detected, removed, reinstated and on the number of complaints received from users. Please explain and/or link to publicly reported information if you publish this in regular transparency reports. 5000 character(s) maximum

Comments

Do you have in place measures for detecting and reporting the incidence of suspicious behaviour (i.e. behaviour that could lead to criminal acts such as acquiring materials for such acts)? 3000 character(s) maximum

Comments


B. Measures against other types of activities which might be harmful but are not, in themselves, illegal

Do your terms and conditions and/or terms of service ban activities such as:

  • Spread of political disinformation in election periods?
  • Other types of coordinated disinformation, e.g. in a health crisis?
  • Harmful content for children?
  • Online grooming, bullying?
  • Harmful content for other vulnerable persons?
  • Content which is harmful to women?
  • Hatred, violence and insults (other than illegal hate speech)?
  • Other activities which are not illegal per se but could be considered harmful?

Please explain your policy. 5000 character(s) maximum

Comments


Do you have a system in place for reporting such activities? What actions do they trigger? 3000 character(s) maximum

Comments

What other actions do you take? Please explain for each type of behaviour considered. 5000 character(s) maximum

Comments

Please quantify, to the extent possible, the costs related to such measures. 5000 character(s) maximum

Comments

Do you have specific policies in place to protect minors from harmful behaviours such as online grooming or bullying?

  • Yes
  • No

C. Measures for protecting legal content, goods and services

Does your organisation maintain an internal complaint and redress mechanism to your users for instances where their content might be erroneously removed, or their accounts blocked?

  • Yes
  • No

What action do you take when a user disputes the removal of their good or content or service, or restrictions on their account? Is the content/good reinstated? 5000 character(s) maximum

Comments

What are the quality standards and control mechanisms you have in place for the automated detection or removal tools you are using for e.g. content, goods, services, user accounts or bots? 3000 character(s) maximum

Comments

Do you have an independent oversight mechanism in place for the enforcement of your content policies?

  • Yes
  • No

Please explain. 5000 character(s) maximum

Comments

D. Transparency and cooperation

Do you actively provide the following information (multiple choice):

  • Information to users when their good or content is removed, blocked or demoted
  • Information to notice providers about the follow-up on their report
  • Information to buyers of a product which has then been removed as being illegal

Do you publish transparency reports on your content moderation policy?

  • Yes
  • No

What information is available about the automated tools you use for identification of illegal content, goods or services and their performance, if applicable? Who has access to this information? In what formats? 5000 character(s) maximum

Comments

How can data related to your digital service be accessed by third parties and under what conditions?

  • Contractual conditions
  • Special partnerships
  • Available APIs (application programming interfaces) for data access
  • Reported, aggregated information through reports
  • Portability at the request of users towards a different service
  • At the direct request of a competent authority
  • Regular reporting to a competent authority
  • Other means. Please specify
Comments

Please explain or give references for the different cases of data sharing and explain your policy on the different purposes for which data is shared. 5000 character(s) maximum

Comments


The following questions are open for all respondents.

2. Clarifying responsibilities for online platforms and other digital services

What responsibilities should be legally required from online platforms and under what conditions?

Should such measures be taken, in your view, by all online platforms, or only by specific ones (e.g. depending on their size, capability, extent of risks of exposure to illegal activities conducted by their users)? If you consider that some measures should only be taken by large online platforms, please identify which these measures would be.

(1. Yes, by all online platforms, according to the activities they intermediate (e.g. content hosting, selling goods or services); 2. Yes, only by larger online platforms; 3. Yes, only by platforms at particular risk of exposure to illegal activities by their users; 4. Such measures should not be legally required)

  • Maintain an effective ‘notice and action’ system for reporting illegal goods or content
  • Maintain a system for assessing the risk of exposure to illegal goods or content
  • Have content moderation teams, appropriately trained and resourced
  • Systematically respond to requests from law enforcement authorities
  • Cooperate with national authorities and law enforcement, in accordance with clear procedures
  • Cooperate with trusted organizations with proven expertise who can report illegal activities for fast analysis ('trusted flaggers')
  • Detect illegal content, goods or services
  • In particular where they intermediate sales of goods or services, inform their professional users about their obligations under EU law
  • Request professional users to identify themselves clearly (‘know your customer’ policy)
  • Provide technical means allowing professional users to comply with their obligations (e.g. enable them to publish on the platform the pre-contractual information consumers need to receive in accordance with applicable consumer law)
  • Inform consumers when they become aware of product recalls or sales of illegal goods
  • Cooperate with other online platforms for exchanging best practices, sharing information or tools to tackle illegal activities
  • Be transparent about their content policies, measures and their effects
  • Maintain an effective ‘counter-notice’ system for users whose goods or content is removed to dispute erroneous decisions
  • Other. Please specify

Please elaborate if you wish to further explain your choices.

Comments

What information would be, in your view, necessary and sufficient for users and third parties to send to an online platform in order to notify an illegal activity (sales of illegal goods, offering of services or sharing illegal content) conducted by a user of the service?

  • Precise location: e.g. URL
  • Precise reason why the activity is considered illegal
  • Description of the activity
  • Identity of the person or organisation sending the notification. Please explain under what conditions such information is necessary:
  • Other, please specify

Please explain 3000 character(s) maximum

Comments

How should the reappearance of illegal content, goods or services be addressed, in your view? What approaches are effective and proportionate? 5000 character(s) maximum

Comments

Where automated tools are used for detection of illegal content, goods or services, what opportunities and risks does their use represent as regards different types of illegal activities and the specificities of the different types of tools? 3000 character(s) maximum

Comments

How should the spread of illegal goods, services or content across multiple platforms and services be addressed? Are there specific provisions necessary for addressing risks brought by:

  • a. Digital services established outside of the Union?
  • b. Sellers established outside of the Union, who reach EU consumers through online platforms?

3000 character(s) maximum

Comments

What would be appropriate and proportionate measures that digital services acting as online intermediaries, other than online platforms, should take – e.g. other types of hosting services, such as web hosts, or services deeper in the Internet stack, like cloud infrastructure services, content distribution services, DNS services, etc.? 5000 character(s) maximum

Comments

What should be the rights and responsibilities of other entities, such as authorities, or interested third parties such as civil society organisations or equality bodies, in contributing to tackling illegal activities online? 5000 character(s) maximum

Comments

What would be, in your view, appropriate and proportionate measures for online platforms to take in relation to activities or content which might cause harm but are not necessarily illegal? 5000 character(s) maximum

Comments

In particular, are there specific measures you would find appropriate and proportionate for online platforms to take in relation to potentially harmful activities or content concerning minors? Please explain. 5000 character(s) maximum

Comments

Please rate the necessity of the following measures for addressing the spread of disinformation online. (Please rate from 1 (not at all necessary) to 5 (very necessary) each option below.)

  • Transparently inform consumers about political advertising and sponsored content, in particular during electoral periods
  • Provide users with tools to flag disinformation online and establish transparent procedures for dealing with users’ complaints
  • Tackle the use of fake accounts, fake engagements, bots and inauthentic user behaviour aimed at amplifying false or misleading narratives
  • Transparency tools and secure access to platforms’ data for trusted researchers in order to monitor inappropriate behaviours and better understand the impact of disinformation and the policies designed to counter it
  • Transparency tools and secure access to platforms’ data for authorities in order to monitor inappropriate behaviours and better understand the impact of disinformation and the policies designed to counter it
  • Adapted risk assessments and mitigation strategies undertaken by online platforms
  • Ensure effective access and visibility of a variety of authentic and professional journalistic sources
  • Auditing systems over platforms’ actions and risk assessments
  • Regulatory oversight and auditing competence over platforms’ actions and risk assessments, including on sufficient resources and staff, and responsible examination of metrics and capacities related to fake accounts and their impact on manipulation and amplification of disinformation.
  • Other, please specify
Comments


In special cases, where crises emerge and involve systemic threats to society, such as a health pandemic, and the fast spread of illegal and harmful activities online, what are, in your view, the appropriate cooperation mechanisms between digital services and authorities? 3000 character(s) maximum

Comments

What would be effective measures service providers should take, in your view, for protecting the freedom of expression of their users? (Please rate from 1 (not at all necessary) to 5 (very necessary).)

  • High standards of transparency on their terms of service and removal decisions
  • Diligence in assessing the content notified to them for removal or blocking
  • Maintaining an effective complaint and redress mechanism
  • Diligence in informing users whose content/goods/services were removed or blocked or whose accounts are at risk of suspension
  • High accuracy and diligent control mechanisms, including human oversight, when automated tools are deployed for detecting, removing or demoting content or suspending users’ accounts
  • Enabling third party insight – e.g. by academics – into main content moderation systems
  • Other. Please specify
Comments

Are there other concerns and mechanisms to address risks to other fundamental rights such as freedom of assembly, non-discrimination, gender equality, freedom to conduct a business, or rights of the child? How could these be addressed? 5000 character(s) maximum

Comments

In your view, what information should online platforms make available in relation to their policy and measures taken with regards to content and goods offered by their users? Please elaborate, with regards to the identification of illegal content and goods, removal, blocking or demotion of content or goods offered, complaints mechanisms and reinstatement, the format and frequency of such information, and who can access the information. 5000 character(s) maximum

Comments

What type of information should be shared with users and/or competent authorities and other third parties such as trusted researchers with regard to the use of automated systems used by online platforms to detect, remove and/or block illegal content, goods, or user accounts? 5000 character(s) maximum

Comments

In your view, what measures are necessary with regard to algorithmic recommender systems used by online platforms? 5000 character(s) maximum

Comments

In your view, is there a need for enhanced data sharing between online platforms and authorities, within the boundaries set by the General Data Protection Regulation? Please select the appropriate situations, in your view:

  • For supervisory purposes concerning professional users of the platform - e.g. in the context of platform intermediated services such as accommodation or ride-hailing services, for the purpose of labour inspection, for the purpose of tax collection, for the purpose of collecting social security contributions
  • For supervisory purposes of the platforms’ own obligations – e.g. with regard to content moderation obligations, transparency requirements, actions taken in electoral contexts and against inauthentic behaviour and foreign interference
  • Specific request of law enforcement authority or the judiciary
  • On a voluntary and/or contractual basis in the public interest or for other purposes

Please explain. What would be the benefits? What would be concerns for the companies, consumers or other third parties? 5000 character(s) maximum

Comments

What types of sanctions would be effective, dissuasive and proportionate for online platforms which systematically fail to comply with their obligations (See also the last module of the consultation)? 5000 character(s) maximum

Comments

Are there other points you would like to raise? 3000 character(s) maximum

Comments

II. Reviewing the liability regime of digital services acting as intermediaries?

The liability of online intermediaries is a particularly important area of internet law in Europe and worldwide. The E-Commerce Directive harmonises the liability exemptions applicable to online intermediaries in the single market, with specific provisions for different services according to their role: from Internet access providers, to messaging services, to hosting service providers.

The previous section of the consultation explored obligations and responsibilities which online platforms and other services can be expected to take – i.e. processes they should put in place to address illegal activities which might be conducted by the users abusing their service. In this section, the focus is on the legal architecture for the liability regime for service providers when it comes to illegal activities conducted by their users. The Commission seeks informed views on the functioning of the current liability exemption regime and the areas where an update might be necessary.

The liability regime for online intermediaries is primarily established in the E-Commerce Directive, which distinguishes between different types of services: so called ‘mere conduits’, ‘caching services’, and ‘hosting services’. In your understanding, are these categories sufficiently clear and complete for characterising and regulating today’s digital intermediary services? Please explain. 5000 character(s) maximum

Comments

For hosting services, the liability exemption for third parties’ content or activities is conditioned by a knowledge standard (i.e. when they get ‘actual knowledge’ of the illegal activities, they must ‘act expeditiously’ to remove it, otherwise they could be found liable). Are there elements that require further legal clarification? 5000 character(s) maximum

Comments

Does the current legal framework disincentivise service providers from taking proactive measures against illegal activities? If yes, please provide your view on how these disincentives could be corrected. 5000 character(s) maximum

Comments

Do you think that the concept characterising intermediary service providers as playing a role of a 'mere technical, automatic and passive nature' in the transmission of information (recital 42 of the E-Commerce Directive) is sufficiently clear and still valid? Please explain. 5000 character(s) maximum

Comments

The E-commerce Directive also prohibits Member States from imposing on intermediary service providers general monitoring obligations or obligations to seek facts or circumstances of illegal activities conducted on their service by their users. In your view, is this approach, balancing risks to different rights and policy objectives, still appropriate today? Is there further clarity needed as to the parameters for ‘general monitoring obligations’? Please explain. 5000 character(s) maximum

Comments

Do you see any other points where an upgrade may be needed for the liability regime of digital services acting as intermediaries? 5000 character(s) maximum

Comments


III. What issues derive from the gatekeeper power of digital platforms?

There is wide consensus concerning the benefits for consumers and innovation, and the wide range of efficiencies, brought about by online platforms in the European Union’s Single Market. Online platforms facilitate cross-border trading within and outside the Union and open entirely new business opportunities to a variety of European businesses and traders by facilitating their expansion and access to new markets. At the same time, regulators and experts around the world consider that large online platforms are able to control increasingly important online platform ecosystems in the digital economy. Such large online platforms connect many businesses and consumers. In turn, this enables them to leverage their advantages – economies of scale, network effects and important data assets – from one area of their activity to improve or develop new services in adjacent areas. The concentration of economic power in the platform economy creates a small number of ‘winner takes all/most’ online platforms. The winning online platforms can also readily take over (potential) competitors, and it is very difficult for an existing competitor or potential new entrant to overcome the winner’s competitive edge. The Commission announced that it ‘will further explore, in the context of the Digital Services Act package, ex ante rules to ensure that markets characterised by large platforms with significant network effects acting as gatekeepers, remain fair and contestable for innovators, businesses, and new market entrants’.

This module of the consultation seeks informed views from all stakeholders on this framing, on the scope, the specific perceived problems, and the implications, definition and parameters for addressing possible issues deriving from the economic power of large, gatekeeper platforms. The Communication ’Shaping Europe’s Digital Future’ also flagged that ‘competition policy alone cannot address all the systemic problems that may arise in the platform economy’. Stakeholders are invited to provide their views on potential new competition instruments through a separate, dedicated open public consultation that will be launched soon.

In parallel, the Commission is also engaged in a process of reviewing EU competition rules and ensuring they are fit for the modern economy and the digital age. As part of that process, the Commission has launched a consultation on the proposal for a New Competition Tool aimed at addressing the gaps identified in enforcing competition rules. The initiative intends to address as specific objectives the structural competition problems that prevent markets from functioning properly and that can tilt the level playing field in favour of only a few market players. This could cover certain digital or digitally-enabled markets, as identified in the report by the Special Advisers and other recent reports on the role of competition policy, and/or other sectors. As such, the work on a proposed New Competition Tool and the initiative at stake complement each other. The work on the two impact assessments will be conducted in parallel in order to ensure a coherent outcome. In this context, the Commission will take into consideration the feedback received from both consultations. We would therefore invite you, in preparing your responses to the questions below, to also consider your response to the parallel consultation on a New Competition Tool.

To what extent do you agree with the following statements? (Fully agree / Somewhat agree / Neither agree nor disagree / Somewhat disagree / Fully disagree / I don't know / No reply)

  • Consumers have sufficient choices and alternatives to the offerings of online platforms.
  • It is easy for consumers to switch between services provided by online platform companies and use the same or similar services provided by other online platform companies (“multi-home”).
  • It is easy for individuals to port their data in a useful form for alternative service providers outside of an online platform.
  • There is sufficient level of interoperability between services of different online platform companies.
  • There is an asymmetry of information between the knowledge of online platforms about consumers, which enables them to target them with commercial offers, and the knowledge of consumers about market conditions.
  • It is easy for innovative SME online platforms to expand or enter the market.
  • Traditional businesses are increasingly dependent on a limited number of very large online platforms.
  • There are imbalances in the bargaining power between these online platforms and their business users.
  • Businesses and consumers interacting with these online platforms are often asked to accept unfavourable conditions and clauses in the terms of use/contract with the online platforms.
  • Certain large online platform companies create barriers to entry and expansion in the Single Market (gatekeepers).
  • Large online platforms often leverage their assets from their primary activities (customer base, data, technological solutions, skills, financial capital) to expand into other activities.
  • When large online platform companies expand into such new activities, this often poses a risk of reducing innovation and deterring competition from smaller innovative market operators.

Main features of gatekeeper online platform companies and main relevant criteria for assessing their economic power

Which characteristics are relevant in determining the gatekeeper role of large online platform companies? Please rate each criterion identified below from 1 (not relevant) to 5 (very relevant)

  • Large user base
  • Wide geographic coverage in the EU
  • They capture a large share of the total revenue of the market you are active in / of a sector
  • Impact on a certain sector
  • They build on and exploit strong network effects
  • They leverage their assets for entering new areas of activity
  • They raise barriers to entry for competitors
  • They accumulate valuable and diverse data and information
  • There are very few, if any, alternative services available on the market
  • Lock-in of users/consumers
  • Other

If you replied "other", please list 3000 character(s) maximum

Comments

Please explain your answer. How could different criteria be combined to accurately identify large online platform companies with gatekeeper role? 3000 character(s) maximum

Comments

Do you believe that the integration of any or all of the following activities within a single company can strengthen the gatekeeper role of large online platform companies (‘conglomerate effect’)? Please select the activities you consider to strengthen the gatekeeper role:

  • online intermediation services (i.e. consumer-facing online platforms such as e-commerce marketplaces, social media, mobile app stores, etc., as per Regulation (EU) 2019/1150 - see glossary)
  • search engines
  • operating systems for smart devices
  • consumer reviews on large online platforms
  • network and/or data infrastructure/cloud services
  • digital identity services
  • payment services (or other financial services)
  • physical logistics such as product fulfilment services
  • data management platforms
  • online advertising intermediation services
  • other. Please specify in the text box below.

Other - please list 1000 character(s) maximum

Comments

Emerging issues

The following questions are targeted particularly at businesses and business users of large online platform companies.

As a business user of large online platforms, do you encounter issues concerning trading conditions on large online platform companies?

  • Yes
  • No

Please specify which issues you encounter and explain what types of platform these are related to (e.g. e-commerce marketplaces, app stores, search engines, operating systems, social networks). 5000 character(s) maximum

Comments

Have you been affected by unfair contractual terms or unfair practices of very large online platform companies? Please explain your answer in detail, pointing to the effects on your business, your consumers and possibly other stakeholders in the short, medium and long term. 5000 character(s) maximum

Comments

The following questions are targeted particularly at consumers who are users of large online platform companies.

Do you encounter issues concerning commercial terms and conditions when accessing services provided by large online platform companies? Please specify which issues you encounter and explain what types of platform these are related to (e.g. e-commerce marketplaces, app stores, search engines, operating systems, social networks). 5000 character(s) maximum

Comments

Have you considered any of the practices by large online platform companies as unfair? Please explain. 3000 character(s) maximum

Comments

The following questions are open to all respondents.

Are there specific issues and unfair practices you perceive on large online platform companies? 5000 character(s) maximum

Comments

In your view, what practices related to the use and sharing of data in the platforms’ environment are raising particular challenges? 5000 character(s) maximum

Comments

What impact could the identified unfair practices have on innovation, competition and consumer choice in the single market? 3000 character(s) maximum

Comments

Do startups or scaleups depend on large online platform companies to access or expand? Do you observe any trend as regards the level of dependency in the last five years (i.e. increases; remains the same; decreases)? Which difficulties in your view do start-ups or scale-ups face when they depend on large online platform companies to access or expand on the markets? 3000 character(s) maximum

Comments

Which are possible positive and negative societal (e.g. on freedom of expression, consumer protection, media plurality) and economic (e.g. on market contestability, innovation) effects, if any, of the gatekeeper role that large online platform companies exercise over whole platform ecosystems? 3000 character(s) maximum

Which issues specific to the media sector (if any) would, in your view, need to be addressed in light of the gatekeeper role of large online platforms? If available, please provide additional references, data and facts. 3000 character(s) maximum

Comments

Regulation of large online platform companies acting as gatekeepers

Do you believe that in order to address any negative societal and economic effects of the gatekeeper role that large online platform companies exercise over whole platform ecosystems, there is a need to consider dedicated regulatory rules?

  • I fully agree
  • I agree to a certain extent
  • I disagree to a certain extent
  • I disagree
  • I don’t know

Please explain 3000 character(s) maximum

Comments

Do you believe that such dedicated rules should prohibit certain practices by large online platform companies with gatekeeper role that are considered particularly harmful for users and consumers of these large online platforms?

  • Yes
  • No
  • I don't know

Please explain your reply and, if possible, detail the types of prohibitions that should in your view be part of the regulatory toolbox. 3000 character(s) maximum

Comments

Do you believe that such dedicated rules should include obligations on large online platform companies with gatekeeper role?

  • Yes
  • No
  • I don't know

Please explain your reply and, if possible, detail the types of obligations that should in your view be part of the regulatory toolbox. 3000 character(s) maximum

Comments

If you consider that there is a need for such dedicated rules setting prohibitions and obligations, as those referred to in your replies to questions 3 and 5 above, do you think there is a need for a specific regulatory authority to enforce these rules?

  • Yes
  • No
  • I don't know

Please explain your reply. 3000 character(s) maximum

Comments

Do you believe that such dedicated rules should enable regulatory intervention against specific large online platform companies, when necessary, with case-by-case adapted remedies?

  • Yes
  • No
  • I don't know

If yes, please explain your reply and, if possible, detail the types of case by case remedies. 3000 character(s) maximum

Comments

If you consider that there is a need for such dedicated rules, as referred to in question 9 above, do you think there is a need for a specific regulatory authority to enforce these rules?

  • Yes
  • No

Please explain your reply. 3000 character(s) maximum

Comments

If you consider that there is a need for a specific regulatory authority to enforce the dedicated rules referred to in questions 3, 5 and 9 respectively, would these rules in your view need to be enforced by the same regulatory authority or could they be enforced by different regulatory authorities? Please explain your reply. 3000 character(s) maximum

Comments

At what level should the regulatory oversight of platforms be organised?

  • At national level
  • At EU level
  • Both at EU and national level.
  • I don't know


If you consider such dedicated rules necessary, what should in your view be the relationship of such rules with the existing sector specific rules and/or any future sector specific rules? 3000 character(s) maximum

Comments

Should such rules have an objective to tackle both negative societal and negative economic effects deriving from the gatekeeper role of these very large online platforms? Please explain your reply. 3000 character(s) maximum

Comments

Specifically, what could be effective measures related to data held by very large online platform companies with a gatekeeper role beyond those laid down in the General Data Protection Regulation in order to promote competition and innovation as well as a high standard of personal data protection and consumer welfare? 3000 character(s) maximum

Comments

What could be effective measures concerning large online platform companies with a gatekeeper role in order to promote media pluralism, while respecting the subsidiarity principle? 3000 character(s) maximum

Comments

Which, if any, of the following characteristics are relevant when considering the requirements for a potential regulatory authority overseeing the large online platform companies with the gatekeeper role:

  • Institutional cooperation with other authorities addressing related sectors – e.g. competition authorities, data protection authorities, financial services authorities, consumer protection authorities, cyber security, etc
  • Pan-EU scope
  • Swift and effective cross-border cooperation and assistance across Member States
  • Capacity building within Member States
  • High level of technical capabilities including data processing, auditing capacities
  • Cooperation with extra-EU jurisdictions
  • Other

Please explain if these characteristics would need to be different depending on the type of ex ante rules (see questions 3, 5, 9 above) that the regulatory authority would be enforcing? 3000 character(s) maximum

Comments

Which, if any, of the following requirements and tools could facilitate regulatory oversight over very large online platform companies (multiple answers possible):

  • Reporting obligation on gatekeeping platforms to send a notification to a public authority announcing its intention to expand activities
  • Monitoring powers for the public authority (such as regular reporting)
  • Investigative powers for the public authority
  • Other

Please explain if these requirements would need to be different depending on the type of ex ante rules (see questions 3, 5, 9 above) that the regulatory authority would be enforcing? 3000 character(s) maximum

Comments

Taking into consideration the parallel consultation on a proposal for a New Competition Tool, which focuses on addressing structural competition problems that prevent markets from functioning properly and tilt the level playing field in favour of only a few market players, please rate the suitability of each option below to address market issues arising in online platform ecosystems. Please rate the policy options below from 1 (not effective) to 5 (most effective).

  1. Current competition rules are enough to address issues raised in digital markets
  2. There is a need for an additional regulatory framework imposing obligations and prohibitions that are generally applicable to all large online platforms with gatekeeper power
  3. There is a need for an additional regulatory framework allowing for the possibility to impose tailored remedies on individual large online platforms with gatekeeper power, on a case-by-case basis
  4. There is a need for a New Competition Tool allowing to address structural risks and lack of competition in (digital) markets on a case-by-case basis
  5. There is a need for a combination of two or more of options 2 to 4.

Please explain which of the options, or combination of these, would be, in your view, suitable and sufficient to address the market issues arising in the online platforms ecosystems. 3000 character(s) maximum

Comments

Are there other points you would like to raise? 3000 character(s) maximum

IV. Other emerging issues and opportunities, including online advertising and smart contracts

Online advertising has substantially evolved over the past years and represents a major revenue source for many digital services, as well as other businesses present online, and opens unprecedented opportunities for content creators, publishers, etc. To a large extent, maximising revenue streams and optimising online advertising is a major business incentive for the business users of the online platforms and for the data policy of the platforms. At the same time, revenues from online advertising and increased visibility and audience reach are also an important incentive for potentially harmful intents, as is the case with online disinformation campaigns, for example.

Another emerging issue is linked to the conclusion of so-called ‘smart contracts’ which represent an important innovation for digital and other services, but face some legal uncertainties. This section of the open public consultation seeks to collect data, information on current practices, and informed views on potential issues emerging in the space of online advertising and smart contracts, and invites the respondents to reflect on other areas where further measures may be needed to facilitate innovation in the single market. This does not address privacy and data protection concerns; it is understood that all aspects related to data sharing and data collection are to be regarded with the highest standard of protection of personal data.

Online advertising

When you see an online ad, is it clear to you who has placed the advertisement online?

  • Yes, always
  • Sometimes: but I can find the information when this is not immediately clear
  • Sometimes: but I cannot always find this information
  • I don’t know
  • No

As a publisher online (e.g. owner of a website where ads are displayed), what types of advertising systems do you use for covering your advertising space? What is their relative importance? (Answer for each either "% of ad space" or "% of ad revenue")

  • Intermediated programmatic advertising through real-time bidding
  • Private marketplace auctions
  • Programmatic advertising with guaranteed impressions (non-auction based)
  • Behavioural advertising (micro-targeting)
  • Contextual advertising
  • Other

What information is publicly available about ads displayed on an online platform that you use?
3000 character(s) maximum

Comments

As a publisher, what type of information do you have about the advertisement placed next to your content/on your website?
3000 character(s) maximum

Comments

To what extent do you find the quality and reliability of this information satisfactory for your purposes?
Please rate your level of satisfaction from 1 to 5

As an advertiser or an agency acting on the advertisers’ behalf (if applicable), what types of programmatic advertising do you use to place your ads? What is their relative importance in your ad inventory?
Please answer either "% of ad inventory" or "% of ad expenditure" for each.

  • Intermediated programmatic advertising through real-time bidding
  • Private marketplace auctions
  • Programmatic advertising with guaranteed impressions (non-auction based)
  • Behavioural advertising (micro-targeting)
  • Contextual advertising
  • Other

As an advertiser or an agency acting on the advertisers’ behalf (if applicable), what type of information do you have about the ads placed online on your behalf?

Comments

To what extent do you find the quality and reliability of this information satisfactory for your purposes?
Please rate your level of satisfaction from 1 to 5

The following questions are targeted specifically at online platforms.

As an online platform, what options do your users have with regard to the advertisements they are served and the grounds on which the ads are being served to them? Can users access your service under conditions other than viewing advertisements? Please explain.
3000 character(s) maximum

Comments

Do you publish or share with researchers, authorities or other third parties detailed data on the advertisements published, their sponsors and viewership rates? Please explain.
3000 character(s) maximum

Comments

What systems do you have in place for detecting illicit offerings in the advertisements you intermediate? 3000 character(s) maximum

Comments

The following questions are open to all respondents.

Based on your experience, what actions and good practices can tackle the placement of ads next to illegal content or goods, and/or on websites that disseminate such illegal content or goods, and to remove such illegal content or goods when detected?
3000 character(s) maximum

Comments

From your perspective, what measures would lead to meaningful transparency in the ad placement process?
3000 character(s) maximum

Comments

What information about ads displayed online should be made publicly available?
3000 character(s) maximum

Comments

Based on your expertise, which effective and proportionate auditing systems could bring meaningful accountability in the ad placement system?
3000 character(s) maximum

Comments

What is, from your perspective, a functional definition of ‘political advertising’? Are you aware of any specific obligations attaching to 'political advertising' at European or national level?
3000 character(s) maximum

Comments

What information disclosure would meaningfully inform consumers in relation to political advertising? Are there other transparency standards and actions needed, in your opinion, for an accountable use of political advertising and political messaging?
3000 character(s) maximum

Comments

What impact would enhanced transparency and accountability in the online advertising value chain have, in your view, on the gatekeeper power of major online platforms, and what other potential consequences, for example for media pluralism, might it have?
3000 character(s) maximum

Comments

Are there other emerging issues in the space of online advertising you would like to flag?
3000 character(s) maximum

Comments

Smart contracts

Is there sufficient legal clarity in the European Union for the provision and use of “smart contracts” – e.g. with regard to validity, applicable law and jurisdiction?
Please rate from 1 (lack of clarity) to 5 (sufficient clarity)

Please explain the difficulties you perceive.
3000 character(s) maximum

Comments

In which of the following areas do you find further regulatory clarity necessary?

  • Mutual recognition of the validity of smart contracts in the European Union, as concluded in accordance with national law
  • Minimum standards for the validity of “smart contracts” in the European Union
  • Measures to ensure that legal obligations and rights flowing from a smart contract and the functioning of the smart contract are clear and unambiguous, in particular for consumers
  • Allowing interruption of smart contracts
  • Clarity on liability for damage caused in the operation of a smart contract
  • Further clarity for payment and currency-related smart contracts.

Please explain. 3000 character(s) maximum

Comments

Are there other points you would like to raise? 3000 character(s) maximum

Comments

V. How to address challenges around the situation of self-employed individuals offering services through online platforms?

Individuals providing services through platforms may have different legal status (workers or self-employed). This section aims to gather initial information and views on the situation of self-employed individuals offering services through platforms (such as ride-hailing, food delivery, domestic work, design work, micro-tasks, etc.). Furthermore, it seeks to gather initial views on whether any detected problems are specific to the platform economy and what the perceived obstacles are to improving the situation of individuals providing services through platforms. This consultation is not intended to address the criteria by which persons providing services on such platforms are deemed to have one or the other legal status.

The issues explored here do not refer to the selling of goods (e.g. online marketplaces) or the sharing of assets (e.g. sub-renting houses) through platforms.

The following questions are targeted at self-employed individuals offering services through online platforms. SKIPPING THIS SECTION AS IT SURELY DOESN'T APPLY TO WIKIMEDIA

The following questions are targeted at online platforms.

Role of platforms

What is the role of your platform in the provision of the service and the conclusion of the contract with the customer?

Comments


What are the risks and responsibilities borne by your platform for the non-performance of the service or unsatisfactory provision of the service?

Comments


What happens when the service is not paid for by the customer/client?

Comments


Does your platform own any of the assets used by the individual offering the services?

  • Yes
  • No

Out of the total number of service providers offering services through your platform, what is the percentage of self-employed individuals?

  • Over 75%
  • Between 50% and 75%
  • Between 25% and 50%
  • Less than 25%

Rights and obligations

What is the contractual relationship between the platform and individuals offering services through it? 3000 character(s) maximum

Comments

Who sets the price paid by the customer for the service offered?

  • The platform
  • The individual offering services through the platform
  • Others, please specify

Please explain. 3000 character(s) maximum

Comments

How is the price paid by the customer shared between the platform and the individual offering the services through the platform?
3000 character(s) maximum

Comments

On average, how many hours per week do individuals spend offering services through your platform?
3000 character(s) maximum

Comments

Do you have measures in place to enable individuals providing services through your platform to contact each other and organise themselves collectively?

  • Yes
  • No

Please describe the means through which the individuals who provide services on your platform contact each other.
3000 character(s) maximum

Comments

What measures, if any, do you have in place for ensuring that individuals offering services through your platform work legally (e.g. comply with applicable rules on minimum working age or hold a work permit, where applicable)?
(If you replied to this question in your answers in an earlier module of the consultation, there is no need to repeat your answer here.) 3000 character(s) maximum

Comments

The following questions are open to all respondents.

Situation of self-employed individuals providing services through platforms

Are there areas in the situation of individuals providing services through platforms that would need further improvement?
Please rate the following issues from 1 (no improvements needed) to 5 (substantial issues need to be addressed).

  • Earnings
  • Flexibility of choosing when and/or where to provide services
  • Transparency on remuneration
  • Measures to tackle non-payment of remuneration
  • Transparency in online ratings
  • Ensuring that individuals providing services through platforms can contact each other and organise themselves for collective purposes
  • Tackling the issue of work carried out by individuals lacking legal permits
  • Prevention of discrimination of individuals providing services through platforms, for instance based on gender, racial or ethnic origin
  • Allocation of liability in case of damage
  • Other, please specify

Please explain the issues that you encounter or perceive.
3000 character(s) maximum

Comments

Do you think individuals providing services in the 'offline/traditional' economy face similar issues as individuals offering services through platforms?

  • Yes
  • No
  • I don't know

Please explain and provide examples.
3000 character(s) maximum

Comments

In your view, what are the obstacles to improving the situation of individuals providing services

  • through platforms?
  • in the offline/traditional economy?

3000 character(s) maximum

Comments

To what extent could the possibility to negotiate collectively help improve the situation of individuals offering services:

  • through online platforms?
  • in the offline/traditional economy?

Which areas would you consider most important for enabling such collective negotiations?
3000 character(s) maximum

Comments

In this regard, do you see any obstacles to such negotiations?
3000 character(s) maximum

Comments

Are there other points you would like to raise?
3000 character(s) maximum

Comments


VI. What governance for reinforcing the Single Market for digital services?

The EU’s Single Market offers rich potential for digital services to scale, including for innovative European companies. Today there is a certain degree of legal fragmentation in the Single Market. One of the main objectives of the Digital Services Act will be to enhance innovation opportunities and ‘deepen the Single Market for Digital Services’.

This section of the consultation intends to collect evidence and views on the current state of the Single Market and on steps for further improvement towards a competitive and vibrant Single Market for digital services. This module also inquires about the relative impact of the COVID-19 crisis on digital services in the Union.

It then focuses on the appropriate governance and oversight over digital services across the EU and means to enhance the cooperation across authorities for an effective supervision of services and for the equal protection of all citizens across the single market. It also inquires about specific cooperation arrangements such as in the case of consumer protection authorities across the Single Market, or the regulatory oversight and cooperation mechanisms among media regulators. This section is not intended to focus on the enforcement of GDPR provisions.

Main issues

How important are digital services such as accessing websites and social networks, downloading apps, reading news online, shopping online or selling products online in your daily life or your professional transactions?
Please answer on a scale from 1 to 5 for each.

  • Overall
  • Those offered from outside of your Member State of establishment

The following questions are targeted at digital service providers.

Approximately, what share of your EU turnover is generated by the provision of your service outside of your main country of establishment in the EU?

  • Less than 10%
  • Between 10% and 50%
  • Over 50%
  • I cannot compute this information

To what extent are the following obligations a burden for your company in providing its digital services when expanding to one or several other EU Member States?
Please rate the following obligations from 1 (not at all burdensome) to 5 (very burdensome).

  • Different processes and obligations imposed by Member States for notifying, detecting and removing illegal content/goods/services
  • Requirements to have a legal representative or an establishment in more than one Member State
  • Different procedures and points of contact for obligations to cooperate with authorities
  • Other types of legal requirements. Please specify below

Have your services been subject to enforcement measures by an EU Member State other than your country of establishment?

  • Yes
  • No
  • I don't know

Were you requested to comply with any ‘prior authorisation’ or equivalent requirement for providing your digital service in an EU Member State?

  • Yes
  • No
  • I don't know

Are there other issues that you consider necessary to address in order to facilitate the provision of cross-border digital services in the European Union?
3000 character(s) maximum

Comments

What has been the impact of the COVID-19 outbreak and crisis management measures on your business’ turnover?

  • Significant reduction of turnover
  • Limited reduction of turnover
  • No significant change
  • Modest increase in turnover
  • Significant increase of turnover
  • Other

Do you consider that deepening the Single Market for digital services could help the economic recovery of your business?

  • Yes
  • No
  • I don't know

Please explain. 3000 character(s) maximum

Comments

The following questions are targeted at all respondents.

Governance of digital services and aspects of enforcement

The ‘country of origin’ principle is the cornerstone of the Single Market for digital services. It ensures that digital innovators, including start-ups and SMEs, have one set of rules to follow (that of their home country), rather than 27 different rules.

This is an important precondition for services to be able to scale up quickly and offer their services across borders. In the aftermath of the COVID-19 outbreak, and as part of an effective recovery strategy, a strong Single Market is needed more than ever to boost the European economy and to restart economic activities in the EU.

At the same time, enforcement of rules is key: the protection of all EU citizens, regardless of their place of residence, will be at the centre of the Digital Services Act.

The current system of cooperation between Member States foresees that the Member State where a provider of a digital service is established has the duty to supervise the services provided and to ensure that all EU citizens are protected. A cooperation mechanism for cross-border cases is established in the E-Commerce Directive.

Based on your own experience, how would you assess the cooperation in the Single Market between authorities entrusted to supervise digital services?
5000 character(s) maximum

Comments

What governance arrangements would lead to an effective system for supervising and enforcing rules on online platforms in the EU, in particular as regards the intermediation of third-party goods, services and content (see also Chapter 1 of the consultation)?
Please rate, on a scale of 1 (not at all important) to 5 (very important), each of the following elements.

  • Clearly assigned competent national authorities or bodies, as established by Member States, for supervising the systems put in place by online platforms
  • Cooperation mechanism within Member States across different competent authorities responsible for the systematic supervision of online platforms and sectorial issues (e.g. consumer protection, market surveillance, data protection, media regulators, anti-discrimination agencies, equality bodies, law enforcement authorities etc.)
  • Cooperation mechanism with swift procedures and assistance across national competent authorities across Member States
  • Coordination and technical assistance at EU level
  • An EU-level authority
  • Cooperation schemes with third parties such as civil society organisations and academics for specific inquiries and oversight
  • Other: please specify in the text box below

Comments

Please explain. 5000 character(s) maximum

Comments

What information should competent authorities make publicly available about their supervisory and enforcement activity?
3000 character(s) maximum

Comments

What capabilities – type of internal expertise, resources, etc. – are needed within competent authorities in order to effectively supervise online platforms?
3000 character(s) maximum

Comments

In your view, is there a need to ensure similar supervision of digital services established outside of the EU that provide their services to EU users?

  • Yes, if they intermediate a certain volume of content, goods and services provided in the EU
  • Yes, if they have a significant number of users in the EU
  • No
  • Other
  • I don’t know

Please explain. 3000 character(s) maximum

Comments

How should the supervision of services established outside of the EU be set up in an efficient and coherent manner, in your view?
3000 character(s) maximum

Comments

In your view, what governance structure could ensure that multiple national authorities, in their respective areas of competence, supervise digital services coherently and consistently across borders?
3000 character(s) maximum

Comments

As regards specific areas of competence, such as on consumer protection or product safety, please share your experience related to the cross-border cooperation of the competent authorities in the different Member States.
3000 character(s) maximum

Comments

In the specific field of audiovisual media, the Audiovisual Media Services Directive established a regulatory oversight and cooperation mechanism for cross-border cases between media regulators, coordinated at EU level within the European Regulators’ Group for Audiovisual Media Services (ERGA). In your view, is this sufficient to ensure that users remain protected against illegal and harmful audiovisual content (for instance, if services are offered to users from a different Member State)? Please explain your answer and provide practical examples if you consider that the arrangements may not suffice.
3000 character(s) maximum

Comments

Would the current system need to be strengthened? If yes, which additional tasks would be useful to ensure a more effective enforcement of audiovisual content rules?
Please assess from 1 (least beneficial) to 5 (most beneficial). You can assign the same number to several actions should you consider them equally important.

  • Coordinating the handling of cross-border cases, including jurisdiction matters
  • Agreeing on guidance for consistent implementation of rules under the AVMSD
  • Ensuring consistency in cross-border application of the rules on the promotion of European works
  • Facilitating coordination in the area of disinformation
  • Other areas of cooperation

Other areas of cooperation - (please, indicate which ones)
3000 character(s) maximum

Comments

Are there other points you would like to raise?
3000 character(s) maximum

Comments


Final remarks

Should you wish to upload a position paper, article, report, or other evidence and data you would like to flag to the European Commission, please do so.

Other final comments
3000 character(s) maximum

Comments