The Digital Services Act is another landmark piece of EU legislation, demanding that tech companies take greater responsibility for content moderation.
An agreement was reached on the EU’s Digital Services Act after more than 16 hours of negotiations that began on Friday (22 April).
The core principle of the Digital Services Act is that what is illegal offline will be illegal online. “Not as a slogan, as reality,” tweeted European commissioner Margrethe Vestager.
European Commission president Ursula von der Leyen tweeted that the “historic” agreement on these rules “will protect users online, ensure freedom of expression and opportunities for businesses”.
Amnesty International agreed this was a landmark moment for tech regulation, but legal and policy adviser Claudia Prettner flagged a “missed opportunity” to “phase out all invasive surveillance-based advertising practices”.
What is the Digital Services Act?
The EU has billed the Digital Services Act (DSA) as “a world first in the field of digital regulation”. It sets out to make the internet safer with new rules for all digital services, from social media platforms to search engines to online marketplaces and more.
It was first proposed in December 2020 along with the Digital Markets Act, and follows in the footsteps of the General Data Protection Regulation (GDPR), another watershed piece of EU legislation.
Where GDPR focuses on data protection and privacy and the Digital Markets Act takes aim at the market dominance of Big Tech, the focus of the DSA is illegal content and the protection of users’ rights.
“Citizens will have better control over how their data are used by online platforms and Big Tech companies,” said rapporteur Christel Schaldemose. “These new rules also guarantee more choice for users and new obligations for platforms on targeted ads, including bans to target minors and restricting data harvesting for profiling.”
How will this impact Big Tech?
The measures set out in the Digital Services Act are proportionate to the scale of platforms.
“Very large” platforms are defined as those with more than 45m monthly active users in the EU, and these will face more stringent requirements. This threshold will be adjusted over time so that it continues to represent 10pc of the EU population.
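The relationship between the 45m figure and the population is simple arithmetic. A minimal sketch, assuming an EU population of roughly 447 million (an approximation for 2022, not a figure from the act itself):

```python
# Illustrative sketch: the "very large platform" threshold tracks
# 10pc of the EU population. The 447m population figure is an
# assumed approximation, not taken from the act.
def very_large_threshold(eu_population: int) -> int:
    """Return the monthly-active-user threshold (10pc of population)."""
    return eu_population // 10

eu_population = 447_000_000  # approximate EU population (assumption)
print(very_large_threshold(eu_population))  # 44,700,000 -- rounded to 45m in the act
```

As the population figure is revised, the threshold computed this way moves with it, which is the mechanism the act describes.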
Major tech platforms such as Facebook, Instagram, WhatsApp, YouTube, TikTok and Amazon will qualify as very large platforms. These will be required to stay on top of content moderation and can expect annual audits of these practices.
The act also calls for simple measures for users to flag content and for swift action to be taken on such reports.
Platforms with fewer than 45m monthly active users as well as businesses that qualify as micro or small enterprises will be exempt from some obligations of the DSA.
What counts as illegal content?
The illegal content targeted under the DSA is broad and sweeping. It includes hate speech, child sexual abuse material, scams, non-consensual sharing of private images, promotion of terrorism, the sale of counterfeit or unsafe products and copyright infringement.
The onus is on marketplaces to vet third-party traders and ensure that products and services sold on their platforms are genuine and safe. This means adopting Know Your Business Customer principles, similar to the vetting practices used in financial services. The act also expects marketplaces to conduct randomised checks for illegal content.
Is there more to it than illegal content?
Yes, much more.
Very large platforms need to be able to monitor and manage any harmful content, which includes disinformation.
Platforms are also going to have to ensure their interfaces don’t intentionally mislead users using what the European Parliament calls “dark patterns”.
These UI tricks include manipulative ‘nudge tactics’, such as giving more prominence to buttons or links that lead users to opt in to something while obscuring the steps to opt out. According to the Digital Services Act, cancelling a subscription should be as easy as subscribing.
What about targeted content?
Remarkably, the EU is also demanding access to platforms’ recommendations engines to ensure algorithmic accountability and transparency. The algorithms that recommend content to users are very much the secret sauce of online platforms and not something they will be keen to expose. (Though advocates for ‘explainable AI’ argue that this makes systems more trustworthy and could drive innovation.)
On the users’ side, platforms will have to offer the option to switch off any profiling used for recommendations.
Ad targeting also takes a hit under these rules. Users are to be given more control over the advertising they are exposed to, and targeting users based on sensitive information such as religion, ethnicity or sexual orientation is now prohibited.
When it comes to children, all ad targeting is effectively banned. In fact, where platforms are aware of users that are minors, they will be required to have special protection measures in place.
Is that it?
The DSA also includes provisions for a crisis response mechanism. These measures make sense in light of recent crises such as the coronavirus pandemic and Russia’s invasion of Ukraine, where disinformation campaigns have been used to manipulate users and cause harm.
In times of crisis, the EU will decide proportionate measures to mitigate the impact of such content manipulation, limited to a three-month time frame.
What happens if the rules are broken?
Users will have the right to seek redress for any damages or losses incurred due to infringements of the DSA.
Regulators who find businesses to be non-compliant will be able to issue fines of up to 6pc of a company’s global turnover. For a multi-platform giant such as Meta, this could amount to about $7bn.
This is a higher threshold than GDPR fines, which go up to 4pc of global turnover, and lower than that of the Digital Markets Act, which allows penalties of up to 10pc, or 20pc in the case of repeated infringements.
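To put these ceilings side by side, here is a rough sketch of the maximum exposure under each regime. The turnover figure is an assumption based on Meta’s reported 2021 revenue of about $117.9bn; the percentages come from the rules described above.

```python
# Rough comparison of maximum fine ceilings under the three EU regimes.
# The turnover figure is an assumed approximation of Meta's 2021
# revenue; the percentage caps are those stated in the legislation.
FINE_CAPS = {
    "DSA": 0.06,           # up to 6pc of global turnover
    "GDPR": 0.04,          # up to 4pc
    "DMA": 0.10,           # up to 10pc
    "DMA (repeat)": 0.20,  # up to 20pc for repeated infringements
}

turnover = 117_900_000_000  # assumed annual global turnover in USD

for regime, cap in FINE_CAPS.items():
    print(f"{regime}: up to ${cap * turnover / 1e9:.1f}bn")
```

On that assumed turnover, the DSA ceiling works out to roughly $7.1bn, in line with the “about $7bn” figure cited for Meta.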
Repeat violators of the DSA, however, could face an outright ban across the EU.
Who will enforce these rules?
When it comes to the Big Tech players, the EU will be directly involved with supervision, in cooperation with member states. Other entities and requirements covered by the DSA will be supervised by regulators in the country of origin.
Speaking on RTÉ Radio 1, Dr Johnny Ryan from the Irish Council for Civil Liberties dubbed this a “missed opportunity” for Ireland to be a “super regulator” under the DSA, seeing as so many of the major tech players have European bases here.
In the December 2020 proposal for the act, the European Commission estimated it would be actively monitoring 20 to 25 very large platforms, requiring a team of about 50 to be in place by 2025.
This work will reportedly be supported by the companies that are the subject of the act, through a supervisory fee of up to 0.1pc of annual global net income. This could accrue €20m to €30m per year, according to Reuters.
By comparison, the Irish Data Protection Commission, which is charged with investigating the GDPR compliance of many tech giants in the EU, has a budget of €23.2m for 2022.
What happens next?
The text of the DSA is still being finalised by the EU’s legal language experts. Once this has been prepared, the act needs to be formally approved. It will then come into effect 20 days after publication.
Companies will then have 15 months to comply with the rules, which is expected to bring enforcement of the Digital Services Act into 2024.
Next month, European Parliament representatives will visit the US headquarters of major tech companies such as Meta, Google and Apple, to hear their position on this and other digital legislation in the pipeline.
This legislation could have a knock-on effect across the Atlantic, as happened with the implementation of GDPR. There are legislators in the US who have been calling for tighter regulation of online platforms, and other prominent figures who support such measures.
Margrethe Vestager pictured at the European Parliament. Image: © European Union 2019. Source: EP (CC-BY-4.0)