Digital Services Act Ushers in New Era of Online Content Moderation

After months of negotiation, the European Union has reached an agreement on the Digital Services Act.

The legislation has been touted as a landmark for online content regulation. Margrethe Vestager, Executive Vice-President of the European Commission, said the Act would effectively carry over the principles on illegal content and disinformation from existing laws into the online environment.

“The Digital Services Act will make sure that what is illegal offline is also seen [and] dealt with as illegal online,” she tweeted.

The Digital Services Act promises to be tougher on large tech firms, adopting more rigorous requirements regarding content moderation. Organisations will also be forced to demonstrate greater transparency about the information published on their sites.

In a statement, European Commission President Ursula von der Leyen said that the Digital Services Act “will upgrade the ground rules for all online services in the EU. It will ensure that the online environment remains a safe space, safeguarding freedom of expression and opportunities for digital businesses”.

What is the Digital Services Act?

The EU has described the Digital Services Act as “a world first in the field of digital regulation”. It aims to make the Internet safer for people by implementing rules for digital service providers, including social media platforms, search engines and online marketplaces.

Tech giants such as Google and Twitter will be required to conduct regular assessments to spot deceptive or exploitative content, and to implement appropriate controls to prevent such material from appearing on their sites.

The Act establishes a framework for removing harmful and illegal content. It also requires organisations to clearly label paid, promoted content.

Similarly, there are strict rules on profiling and targeted advertising – specifically when it relates to the use of children’s personal data.

Online platforms will also be required to explain how they personalise content for users via their recommendation systems. This could include, for example, disclosing the purpose of collecting cookies and explaining the effect that customised settings will have on the user experience.

A late addition to the Digital Services Act is the requirement to implement a crisis response mechanism. It enables the European Commission to mandate that large tech firms take specific actions when dealing with content related to social and political emergencies.

The mechanism was added in response to Russia’s invasion of Ukraine and the abundance of political videos posted during the conflict. Under the new rules, large tech firms can be required to remove such content in certain circumstances.

According to the European Commission, a crisis would only be declared following a majority vote from the board of national authorities.

The consequences of non-compliance

Organisations that are subject to the Digital Services Act will be required to complete regular compliance audits. Those that fail the audit – or fail to complete it – will be subject to fines of up to 6% of their annual global turnover.

That is a higher ceiling than under the GDPR (General Data Protection Regulation), which caps fines at 4% of annual global turnover and which drew criticism upon its publication for the scale of fines that could be issued.

Under the Digital Services Act, the largest tech firms could receive penalties of up to €6.5 billion. However, given the restraint that regulators have shown in relation to GDPR violations, it’s highly unlikely that we’ll see fines anywhere close to this scale.
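
To put the 6% cap in perspective, here is a minimal sketch of the arithmetic behind these figures. The turnover value is a hypothetical illustration, not a reported company financial: a firm with roughly €108 billion in annual global turnover would face a maximum DSA fine of about €6.5 billion, against roughly €4.3 billion under the GDPR’s 4% cap.

```python
# Rough illustration only: the DSA caps fines at 6% of annual global
# turnover. The turnover figure below is a hypothetical example, not a
# reported company financial.

DSA_FINE_CAP = 0.06   # DSA: up to 6% of annual global turnover
GDPR_FINE_CAP = 0.04  # GDPR: comparable cap of 4%


def max_fine(annual_global_turnover_eur: float, cap: float = DSA_FINE_CAP) -> float:
    """Return the maximum fine for a given annual global turnover."""
    return annual_global_turnover_eur * cap


# A hypothetical firm with €108bn in annual global turnover:
turnover = 108_000_000_000
print(f"DSA maximum fine:  €{max_fine(turnover):,.0f}")                  # ≈ €6.5bn
print(f"GDPR maximum fine: €{max_fine(turnover, GDPR_FINE_CAP):,.0f}")   # ≈ €4.3bn
```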

Nonetheless, even comparatively lenient penalties, such as the €225 million fine levied against WhatsApp for a GDPR breach last year, will have serious consequences.

If the Digital Services Act delivers penalties on that scale, it will demonstrate how seriously the European Commission takes online content moderation.

Another signal of the European Commission’s intent is its promise that organisations that repeatedly breach its rules could face an outright ban across the European Union.

What happens next?

The text of the Digital Services Act is still being finalised by the European Union’s legal team. Once that process is complete, the legislation needs to be formally approved and published.

Organisations will then have 15 months to bring their practices into line with its requirements.

IT Governance will be tracking the Digital Services Act until it takes effect. You can keep abreast of any developments by following our blog or subscribing to our Weekly Round-up, which delivers the latest cyber security news and advice straight to your inbox.
