Pavel Durov, the founder of messaging app Telegram, may be feeling as exposed as he looked in the shirtless pictures he recently posted to Instagram, now that he is under the intense gaze of international regulators.
He’s been formally accused by French authorities of permitting illegal activities to proliferate on Telegram. For the time being, Durov has avoided imprisonment, having been released on €5m (£4.2m) bail.
The Russian-born tycoon, who also holds French citizenship, was arrested last weekend over allegations that Telegram serves as a breeding ground for nefarious online behaviour, ranging from child sexual exploitation to drug trafficking and financial scams, as reported by City AM.
The case is a rare instance of a tech executive being held directly accountable for user conduct on his platform.
Despite mounting pressure, Telegram maintains that it complies with European legislation, asserting: “It is absurd to claim that a platform, or its owner, are responsible for abuse of that platform.”
Although Durov has consistently championed freedom of expression and a policy of limited moderation on his app, detractors insist that such a laissez-faire stance has transformed Telegram into a sanctuary for terrorists, narcotics peddlers, arms dealers, and radical extremists.
Telegram’s relatively modest team of about 50 employees also stands in stark contrast to competitors such as Meta, owner of Facebook and Instagram, which employs tens of thousands of people to moderate content. That disparity has sharpened the debate over how social media platforms should be regulated.
Social media regulation

The case against Durov arrives amid growing scrutiny of social media firms and their content moderation responsibilities.
Telegram was recently used to spread messages that incited nearly a week of rioting in the UK following the stabbings in Southport. The platform responded by closing the channels involved, one of which had more than 13,000 members.
Violence-inciting posts also appeared on Facebook and X, while the UN-backed Tech Against Terrorism pinpointed a TikTok account posting solely provocative content about Southport, attracting over 57,000 views in just hours.
Following the unrest, there have been louder calls for tougher regulations and enhanced powers to censor online content. London Mayor Sadiq Khan has urged a reassessment of the Online Safety Act 2023, which has faced criticism for not fully achieving its objectives, especially concerning misinformation.
Legal expert Mark Jones, a partner at Payne Hicks Beach, commented that the Act could have been a turning point but ultimately “provides no additional support to the pre-existing criminal law covering incidents of incitement of violence.”
Conversely, the legal action taken against Durov has raised concerns within certain circles about a potential “chilling effect,” where the fear of legal consequences might drive social media leaders to excessively moderate or censor content.
Mark Zuckerberg, the head of Meta, has recently admitted to censoring content on the company’s social media platforms during the Covid-19 pandemic. He disclosed that his firm succumbed to pressure from the US government to suppress anti-vaccination posts and other materials, including memes.
Not one to shy away from the topic of free speech, X owner Elon Musk, who has previously stated that “moderation is a propaganda word for censorship,” has joined the debate.
Following Durov’s arrest, Musk humorously tweeted: “POV: It’s 2030 in Europe, and you’re being executed for liking a meme.”
Could this be the precursor to a more extensive crackdown?
With the European Commission’s ongoing probe into X for purported non-compliance with disinformation rules, coupled with demands for more stringent legislation in the UK post-riots, it seems the regulatory environment for social media could be set to tighten.
Public opinion may already be tipping the scales: a YouGov survey found that two-thirds of Britons think social media companies should be held liable for posts that incite criminal activity, and 70 per cent believe current regulation of the platforms is not robust enough.
As the boundaries between free speech and responsible moderation become increasingly blurred, Durov’s case could establish a precedent for how governments globally deal with social media platforms that fail to curb unlawful activity.