
A look at the challenges Elon Musk will face at Twitter in light of the EU's DSA


Will the Digital Services Act, or DSA, make Elon Musk regret buying Twitter?

The European Union has been busy this year agreeing on landmark legislation: the Digital Markets Act, or DMA, in March, and almost a month later the European Parliament and the Council of the EU reached an agreement on the Digital Services Act, or DSA.

The legal language still needs to be finalized, and both acts have to be formally adopted to come into effect – the DMA is expected “sometime in October”, the DSA in the coming weeks. Both will apply fifteen months after entry into force or from 1 January 2024, whichever comes later.

The DMA aims to rebalance competition in the tech world, while the DSA’s ambition is to set a standard for dealing with illegal and harmful content and to bring more transparency to algorithms and data gathering.

Into all of this came the news that Elon Musk had reached a deal to buy Twitter, the social network with 217 million active users, which he is taking private. Still, the rules mentioned above will also apply to him – or rather to his social network.

What will the DSA bring to platforms?

In the press release, the European Commission explains:

The DSA sets out an unprecedented new standard for the accountability of online platforms regarding illegal and harmful content. It will provide better protection for internet users and their fundamental rights, as well as define a single set of rules in the internal market, helping smaller platforms to scale up.

Ursula von der Leyen, the European Commission President, reiterated what EU officials have been saying since before they started to craft the legislation: “It [DSA] gives practical effect to the principle that what is illegal offline, should be illegal online.”

The Verge summed up nicely a number of obligations for the platforms:

  • A ban on targeted ads based on religion, sexual orientation or ethnicity, as well as on ads targeted at minors.
  • No ‘dark patterns’ (deceptive user interface design).
  • Canceling subscriptions should be as easy as signing up for them (something the FTC in the US vowed last year to ramp up enforcement on).
  • Large platforms will have to make it clear how their algorithms work and also offer a recommender system “not based on profiling” (such as the chronological feed Instagram recently re-introduced as an option).
  • If illegal content is removed, hosting services and platforms have to give a reason and offer users a way to appeal.
  • Large platforms will also have to provide key data to researchers.
  • Online marketplaces must keep basic information about traders on their platform to track down individuals selling illegal goods or services.
  • Large platforms will also have to introduce new strategies for dealing with misinformation during crises (a provision inspired by the recent invasion of Ukraine).

As Daphne Keller from Stanford’s Cyber Policy Center explains, the DSA creates a range of new legal protections and tools for understanding or shaping platform behavior. Keller also stressed that the new legislation is as important as the GDPR.

There are still many questions and details to be clarified, but you can already hear voices like that of Jacob Mchangama, the executive director of Justitia, a Copenhagen-based think tank focusing on human rights, who argues that the DSA will weaken free speech protections beyond the breaking point.

Writing for Foreign Policy, Mchangama notes that while many politicians invoke terrorist propaganda, hate speech, and disinformation, the available data suggest that most of the problematic content online is actually legal.

Mchangama also mentions the Network Enforcement Act, or NetzDG, adopted in 2017 in Germany as a piece of legislation that was copied by authoritarian regimes to weaken free speech.

Last year, The New York Times looked at NetzDG just before the German elections and concluded that even though it is labeled one of the world’s toughest laws against online hate speech and harassment, it had little to no effect on stopping problematic posts. And in a 2019 survey, German women said they did not share political opinions online for fear of abuse.

Free speech activists raise the same reservations about NetzDG and the DSA – the law encouraged (or, in the case of the DSA, will encourage) companies to remove potentially offensive speech that is perfectly legal, in effect undermining free expression rights.

Elon, the DSA and the misunderstanding of what free speech is

That’s where Elon Musk and his purchase of Twitter come into the picture. Musk describes himself as a ‘free speech absolutist‘ and has tweeted that “given that Twitter serves as the de facto public town square, failing to adhere to free speech principles fundamentally undermines democracy.”

“Free speech is the bedrock of a functioning democracy, and Twitter is the digital town square where matters vital to the future of humanity are debated,” Musk said in the press release announcing the deal to buy the social network.

First, let’s look at how content moderation and free speech have evolved on social media over the past decade or so. Every social network founded more than ten years ago (Facebook, Twitter, YouTube, etc.) started with its founders and employees saying they had built the platform for free speech.

A decade later, each of these platforms spends hundreds of millions or billions of dollars on content moderation and has community rules and guidelines for removing harmful content, while journalists can’t stop explaining that social media is not a public square – these are all private companies.

Social network founders are really to blame here. They started these platforms without thinking about bad actors and bad behaviour, talked about free speech all the time, and are now scrambling to set up the right rules.

Europe’s DSA is in a way a response to this mess and aims to bring more accountability; as Politico.eu put it, it “wrote the new rulebook for how internet players moderate and manage content.”

Daphne Keller also stressed the extra rules and obligations for “Very Large Online Platforms”, or VLOPs, which have at least 45 million monthly active users in the EU.

Twitter will likely fall into that category, which brings extra responsibilities: ongoing engagement with regulators, providing vetted researchers and regulators with access to internal data, labelling deep fakes, publishing transparency reports, removing content in emergencies in compliance with crisis protocols, and more.

At a recent TED conference, Musk called Twitter the “de facto town square” and said there needs to be an arena for inclusive free speech. Both Zuckerberg and Dorsey are on record saying something similar in the past.

Yet when harmful content started to spread and affect a wider public, and the platforms came under scrutiny from legislators all over the world, suddenly it was OK to police this “digital town square”.

Last year, Wired published a great interview with Evelyn Douek, an Australian scholar at Harvard Law School, on how social networks all started out with the idea of American-style free speech and all ended up regulating speech, not only to please regulators but also to try to maintain a normal discourse on their platforms.

Elon Musk will be facing all of this as Twitter’s owner.

In a recent piece titled Elon Musk won’t fix Twitter (but he won’t kill it, either), Max Read suggests the Tesla CEO might change less about the social network than we expect, since he uses Twitter as a marketing platform for his other businesses and it’s in his interest not to upset the status quo.

Twitter will likely eat up more of his time than he expects. He could probably go and ask Zuckerberg how many headaches content moderation causes him every day.

David Tvrdon

This piece was originally published in The Fix and is re-published with permission.