An opinion by Alexandra Geese, Patrick Breyer and Marcel Kolaja, Greens/EFA MEPs.
With the Digital Services Act (DSA), Europe has the chance to set global standards. The DSA will regulate how we can exercise our rights and freedoms online in the future.
The European Parliament’s reports contain recommendations for the European Commission to take into account in its upcoming legislative proposal for a Digital Services Act.
Our priorities and successes in the DSA reports:
- Put rights and freedoms at the heart of the Digital Services Act: Ensure that effective complaint and redress mechanisms are available on all platforms and that the reporting and removal of toxic content improves significantly. Unfortunately, platforms such as YouTube, Instagram, Twitter and TikTok filter and moderate with a lot of collateral damage: too often, hate crime, especially that targeting minority groups, remains online, while legitimate posts, videos, accounts and ads are removed and the platforms make it difficult to contest these decisions. This has serious implications for freedom of expression online. Terms and conditions are not above the law: we successfully pushed for language that introduces legal certainty and binding, democratically decided rules on illegal content online, instead of letting big tech companies decide what we can read, write or watch, or what counts as “unacceptable” content in our democratic society.
- Stop automated censorship: Fight against the broadening of flawed automated content filters, which result in the over-blocking of legal speech, including speech by minorities and disadvantaged groups. In many areas, automated tools are unable to distinguish illegal content from content that is legal in a given context. The Digital Services Act should explicitly exclude any obligation to use automated tools for content moderation and should regulate their voluntary use by providing for robust safeguards and transparency. Within the context of the DSA, content moderation procedures used by providers shall not lead to any ex-ante control measures based on automated tools or upload filtering of content.
- Address the business model of big platforms that contributes to spreading problematic and illegal content: Social networks have become important players in our democracy. The spread of legal but unwanted content such as disinformation and hate speech on social media should not be addressed by take-down measures but instead be contained by introducing a tiered transparency model and giving users control over the content proposed to them: 1. supervisory authorities should be able to go into the companies to carry out audits of their software and decision-making processes, 2. scientists should have access to content curation and recommendation algorithms, and 3. platforms should give users a meaningful explanation of how their algorithms work. In addition, users should have the right to see their timeline in chronological order without any content curation. Major platforms should provide a technical interface (API) that allows users to have content curated by software or services of their choice (see the illustrative sketch after this list).
- End surveillance capitalism: The online activities of individuals allow for deep insights into their personality and make it possible to manipulate them. The collection and use of personal data concerning the use of digital services shall therefore be limited to what is strictly necessary to provide the service and to bill the users. We introduced text in the DSA reports for a ban on behavioural advertising and micro-targeting, in order to allow users to regain control over their data, to ensure the independence of the press from gatekeepers, and to protect free and robust elections. Since anonymity effectively prevents unauthorised data disclosure, identity theft and other forms of abuse of personal data collected online, intermediaries shall be required to enable the anonymous use of, and anonymous payment for, their services. On the other hand, commercial traders on online marketplaces shall be identified to make sure consumers can exercise their rights.
- For a diverse online ecosystem and user choice: It is our goal that users should be able to communicate with each other across different services (e.g. messaging services, social networks). This means, for instance, that we should be able to reach our contacts on Facebook even if we use a different messaging service. Through real interoperability, called for in both the JURI and IMCO reports, we will enable a competitive market for the most innovative services and genuine user choice.
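To make the idea of user-chosen content curation (third bullet above) more concrete, here is a minimal, purely illustrative sketch of the kind of open interface a platform could expose. All type and function names are hypothetical assumptions for illustration; no existing platform offers this API, and the DSA reports do not prescribe any particular technical design.

```typescript
// Hypothetical sketch only: no real platform exposes this interface today.

/** A post as a platform might expose it to an external curation service. */
interface Post {
  id: string;
  authorId: string;
  createdAt: Date;
  text: string;
}

/** A curation service chosen by the user (platform-provided or third-party). */
interface Curator {
  /** Filter and/or reorder the user's raw timeline. */
  curate(posts: Post[]): Post[];
}

/** The default asked for in this paper: plain chronological order, no ranking. */
const chronologicalCurator: Curator = {
  curate: (posts) =>
    [...posts].sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime()),
};

/** The platform applies whichever curator the user has selected. */
function buildTimeline(posts: Post[], curator: Curator = chronologicalCurator): Post[] {
  return curator.curate(posts);
}

// Usage: with no curator selected, the user sees a purely chronological feed.
const feed = buildTimeline([
  { id: "1", authorId: "alice", createdAt: new Date("2020-10-01"), text: "First post" },
  { id: "2", authorId: "bob", createdAt: new Date("2020-10-02"), text: "Second post" },
]);
console.log(feed.map((p) => p.id)); // ["2", "1"] – newest first
```

The design choice in this sketch simply mirrors the two demands in the bullet above: chronological ordering is the default, and any ranking logic is a plug-in the user explicitly opts into.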
Greens/EFA MEPs Alexandra Geese, Patrick Breyer and Marcel Kolaja are the authors of this paper. Alexandra Geese is our shadow rapporteur in the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO), and Patrick Breyer is our shadow rapporteur in the Committee on Legal Affairs (JURI). Marcel Kolaja has been involved as IMCO shadow rapporteur on the opinions from the JURI Committee.