EU deal on social media regulation a world first
EU lawmakers strike provisional agreement to gain access to social media algorithms and create a safer and more open digital space for users
The European Union has reached a provisional agreement to regulate digital markets and social media services in Europe – a world first in the field of digital regulation.
The Digital Services Act (DSA), together with the Digital Markets Act, will set standards for a safer and more open digital space for users and a level playing field for companies for years to come.
The rules still have to be finalised at a technical level before both MEPs and EU ministers give their formal approval.
Once the process is completed, the rules will start to apply 15 months later.
MEPs from the Internal Market Committee will now visit several social media giants in Silicon Valley – Meta, Google, Apple and others – in May to discuss the DSA package and other digital legislation in the pipeline in person, and to hear the positions of American companies, start-ups, academia and government officials.
“The Digital Services Act will set new global standards. Citizens will have better control over how their data are used by online platforms and big tech companies. We have finally made sure that what is illegal offline is also illegal online. For the European Parliament, additional obligations on algorithmic transparency and disinformation are important achievements,” said Danish rapporteur MEP Christel Schaldemose (S&D).
“These new rules also guarantee more choice for users and new obligations for platforms on targeted ads, including bans on targeting minors and restrictions on data harvesting for profiling.”
More responsible online platforms
Under the new rules, online platforms such as social media and marketplaces will have to protect their users from illegal content, goods and services.
The European Commission and member states will also have access to the algorithms of very large online platforms.
Companies will have to swiftly remove illegal content, including illegal products and services. They will have to ensure that consumers can purchase safe products and services online by strengthening checks on the reliability of the information provided by traders, and they will have to make efforts to prevent illegal content from appearing on their platforms.
Victims of cyber violence will be better protected, especially against non-consensual sharing such as revenge porn, through immediate takedowns.
Online platforms and search engines that fail to comply can be fined up to 6% of their worldwide turnover.
New transparency obligations will allow users to be better informed about how content is recommended to them (so-called recommender systems) and to choose at least one option that is not based on profiling.
Online platforms will not be allowed to nudge people into using their services, for example by giving more prominence to a particular choice or urging the recipient to change their choice through interfering pop-ups.
Harmful content and disinformation
Very large online platforms will have to comply with stricter obligations under the DSA, proportionate to the significant societal risks they pose when disseminating illegal and harmful content, including disinformation.
Very large online platforms will have to assess and mitigate systemic risks and be subject to independent audits each year. In addition, those large platforms that use so-called “recommender systems” (algorithms that determine what users see) must provide at least one option that is not based on profiling.
And when a crisis occurs, such as a public security or health threat, the Commission may require very large platforms to limit any urgent threats on their platforms. These specific actions are limited to three months.
This article is part of a content series called Ewropej. This is a multi-newsroom initiative part-funded by the European Parliament to bring the work of the EP closer to the citizens of Malta and keep them informed about matters that affect their daily lives. This article reflects only the author’s view. The action was co-financed by the European Union in the frame of the European Parliament's grant programme in the field of communication. The European Parliament was not involved in its preparation and is, in no case, responsible for or bound by the information or opinions expressed in the context of this action. In accordance with applicable law, the authors, interviewed people, publishers or programme broadcasters are solely responsible. The European Parliament can also not be held liable for direct or indirect damage that may result from the implementation of the action.