telegram – Techdirt

Ctrl-Alt-Speech: Is This The Real Life? Is This Just Fakery?

from the ctrl-alt-speech dept

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation’s Ben Whitelaw.

Subscribe now on Apple Podcasts, Overcast, Spotify, Pocket Casts, YouTube, or your podcast app of choice — or go straight to the RSS feed.

In this week’s round-up of the latest news in online speech, content moderation and internet regulation, Ben is joined by guest host Cathryn Weems, who has held T&S roles at Yahoo, Google, Dropbox, Twitter and Epic Games. They cover:

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund, and by our sponsor Concentrix, the technology and services leader driving trust, safety, and content moderation globally. In our Bonus Chat at the end of the episode, clinical psychologist Dr Serra Pitts, who leads the psychological health team for Trust & Safety at Concentrix, talks to Ben about how to keep moderators healthy and safe at work and the innovative use of heart rate variability technology to monitor their physical response to harmful content.

Filed Under: ai, artificial intelligence, content moderation, disinformation, elon musk, misinformation
Companies: google, telegram, twitter, x

Ctrl-Alt-Speech: Judge, Jury And Moderator

from the ctrl-alt-speech dept

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation’s Ben Whitelaw.

Subscribe now on Apple Podcasts, Overcast, Spotify, Pocket Casts, YouTube, or your podcast app of choice — or go straight to the RSS feed.

In this week’s round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.

Filed Under: brazil, content moderation, internet archive, oversight board, pavel durov, starlink, texas
Companies: meta, spacex, telegram, twitter, x

Ctrl-Alt-Speech: The Platform To Prison Pipeline

from the ctrl-alt-speech dept

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation’s Ben Whitelaw.

Subscribe now on Apple Podcasts, Overcast, Spotify, Pocket Casts, YouTube, or your podcast app of choice — or go straight to the RSS feed.

In this week’s round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.

Filed Under: brazil, content moderation, donald trump, mark zuckerberg, pavel durov, section 230, third circuit
Companies: telegram, tiktok, twitter, x

Durov’s Arrest Details Released, Leaving More Questions Than Answers

from the still-concerning dept

Is the arrest of Pavel Durov, founder of Telegram, a justified move to combat illegal activities, or is it a case of dangerous overreach that threatens privacy and free speech online? We had hoped that when French law enforcement released the details of the charges we’d have a better picture of what happened. Instead, we’re actually just left with more questions and concerns.

Earlier today we wrote about the arrest and how it already raised a lot of questions that didn’t have easy answers. Soon after that post went out, the Tribunal Judiciaire de Paris released a press release with some more details about the investigation (in both French and English). All it does is leave most of the questions open, which might suggest they don’t have very good answers.

First, the press release notes that these come in “the context of the judicial investigation,” and so may differ from what Durov is eventually charged with, even though the issues are listed as “charges.”

I would bucket the list of charges into four categories, each of which raises concerns. If I had to put these in order of greatest concern to least, it would be as follows:

  1. Stuff about encryption. The last three charges are all variations on “providing a cryptology service/tool” without some sort of “prior declaration” or “certified declaration.” Apparently, France (like some other countries) has certain import/export controls on encryption. It appears they’re accusing Durov of violating those by not going through the official registration process. But, here, it’s hard not to see that as totally pretextual: an excuse to arrest Durov over other stuff they don’t like him doing.
  2. “Complicity” around a failure to moderate illegal materials. There are a number of charges around this: complicity to “enable illegal transactions,” to “possess” and “distribute” CSAM, to sell illegal drugs and hacking tools, and to commit organized fraud. But what is the standard for “complicity” here? This is where it gets worrisome. If it’s just a failure to proactively moderate, that seems very problematic. If it’s ignoring direct reports of illegal behavior, then it may be understandable. If it’s more directly and knowingly assisting criminal behavior, then things get more serious. But the lack of details here makes me worry it’s one of the earlier options.
  3. Refusal to cooperate with law enforcement demands for info. This follows on from my final point in number two. There’s a suggestion in the charges (the second one) that Telegram potentially ignored demands from law enforcement. It says there was a “refusal to communicate, at the request of competent authorities, information or documents necessary for carrying out and operating interceptions allowed by law.” This could be about encryption, and a refusal to provide info they didn’t have, or about not putting in a backdoor. If it’s either of those, that would be very concerning. However, if it’s just “they didn’t respond to lawful subpoenas/warrants/etc.” that… could be something that’s more legitimate.
  4. Finally, money laundering. Again, this one is a bit unclear, but it says “laundering of the proceeds derived from organized group’s offences and crimes.” It’s difficult to know how serious any of this is, as that could represent something legitimate, or it could be French law enforcement saying “and they profited off all of this!” We’ve seen charges in other contexts where the laundering claims are kind of thrown in. Details could really matter here.

In the end, though, a lot of this does seem potentially very problematic. So far, there’s been no revelation of anything that makes me say “oh, well, that seems obviously illegal.” A lot of the things listed in the charge sheet are things that lots of websites and communications providers could be said to have done themselves, though perhaps to a different degree.

So we still don’t really have enough details to know if this is a ridiculous arrest, but it does seem to be trending towards that so far. Yes, some will argue that Durov somehow “deserves” this for hosting bad content, but it’s way more complicated than that.

I know from the report that Stanford put out earlier this year that Telegram does not report CSAM to NCMEC at all. That is very stupid. I would imagine Telegram would argue that as a non-US company it doesn’t have to abide by such laws. These charges are in France rather than the US, but it still seems bad that the company does not report any CSAM to the generally agreed-upon organization that handles such reports, and to which companies operating in the US have a legal requirement to report.

But, again, there are serious questions about where you draw these lines. CSAM is content that is outright illegal. But some other stuff may just be material that some people dislike. If the investigation is focused just on the outright illegal content that’s one thing. If it’s not, then this starts to look worse.

On top of that, as always, are the intermediary liability questions, where the question should be how much responsibility a platform has for its users’ use of the system. The list of “complicity” in various bad things worries me because every platform has some element of that kind of content going on, in part because it’s impossible to stop entirely.

And, finally, as I mentioned earlier today, it still feels like many of these issues would normally be worthy of a civil procedure, perhaps by the EU, rather than a criminal procedure by a local court in France.

So in the end, while it’s useful to see the details of this investigation, and it makes me lean ever so slightly in the direction of thinking these potential charges go too far, we’re still really missing many of the details. Nothing released today has calmed the concerns that this is overreach, but nothing has made it clear that it definitely is overreach either.

Filed Under: complicity, content moderation, csam, encryption, france, law enforcement, pavel durov
Companies: telegram

Arrest Of Telegram’s Pavel Durov Raises Questions, But The Answers May Not Be Known For A While

from the let's-not-jump-to-conclusions-either-way dept

There’s plenty of news flying around over the past few days after it was reported on Saturday that Pavel Durov, the founder and CEO of Telegram, had been arrested at Bourget airport in France after taking his private plane there from Azerbaijan. Durov, who obtained French citizenship in 2021, apparently knew that there was a risk he might be arrested, but chose to go anyway.

The reporting on why he was arrested has been somewhat vague, to the point that it could be hyped-up nonsense, or it could actually be legit. Initial reports claimed that he was arrested over a “lack of moderation,” but other reports suggested potentially more serious claims around drug trafficking, terrorism, and CSAM.

France’s OFMIN, an office tasked with preventing violence against minors, had issued an arrest warrant for Durov in a preliminary investigation into alleged offences including fraud, drug trafficking, cyberbullying, organised crime and promotion of terrorism, one of the sources said.

Durov is accused of failing to take action to curb the criminal use of his platform.

“Enough of Telegram’s impunity,” said one of the investigators, adding they were surprised Durov came to Paris knowing he was a wanted man.

The problem is, without more details, we have no idea what is actually being charged and what his alleged responsibility is. After all, we’ve seen other cases where people were charged with sex trafficking, when the reality was simply that law enforcement had spun a refusal to hand over data on users.

On top of that, leaping to criminal charges against an exec, rather than civil penalties for a company… seems strange. For that to make any sense, someone would need to show actual criminal behavior by Durov, and not just “his service hosted bad stuff.”

The head of OFMIN, the French police agency that issued the warrant, posted to LinkedIn (of all places) that: “At the heart of this issue is the lack of moderation and cooperation of the platform (which has nearly 1 billion users), particularly in the fight against paedophilia.” Again, that is frightfully unclear. Is it just that Telegram wasn’t doing enough to fight CSAM? And if so, what “lack of moderation and cooperation” is enough? Because lots of websites are accused (often unfairly) of not doing enough in the fight against CSAM. Or is there something more?

And if it was just that they weren’t “cooperating” does it make sense to jump straight to criminal charges against the CEO, rather than penalties and fines for the company?

One thing I’ve talked about on the Ctrl-Alt-Speech podcast a few times is how often Telegram comes up in discussions of content moderation and bad behavior, while politicians kind of wave it off as untouchable. Telegram had claimed to be under the threshold that would require it to be designated a “Very Large Online Platform” (VLOP) in the EU, and EU officials seemed to buy that claim.

But the numbers were still quite close (a claimed 41 million EU users, when the threshold is 45 million). And even if you’re not a VLOP, there were some requirements for smaller platforms, and it was unclear if Telegram was even remotely concerned with complying.

On top of that there were plenty of stories of bad behavior across social media first being planned on Telegram. The most recent example was the riots in the UK. While lots of people talked about misinformation on ExTwitter that contributed to that, much of that content originated on Telegram.

But, hosting bad behavior alone shouldn’t lead to criminal charges. Even ignoring law enforcement demands seems like it should lead to civil penalties before reaching criminal charges. That’s why I’m really hoping that there are more details here that justify the arrest. Without the details, though, it’s really difficult to know if this is an attack on free speech, or legitimate charges over actual criminal behavior.

I know that many people are leaping to conclusions one way or the other, but until we know the details, everyone’s guessing.

Earlier this year, Durov had given a surprising and rare interview with the Financial Times, where he actually talked about some of the effort (or lack thereof?) that Telegram puts into dealing with criminal behavior on the platform:

Durov said Telegram planned to improve its moderation processes this year as multiple global elections unfold and “deploy AI-related mechanisms to address potential issues”.

But “unless they cross red lines, I don’t think that we should be policing people in the way they express themselves”, said Durov. “I believe in the competition of ideas. I believe that any idea should be challenged . . . Otherwise, we can quickly degrade into authoritarianism.”

That same interview noted that the company had only 50 full-time employees, though some reports have suggested it also used some outsourced moderators. But in general it took a pretty hands-off approach. That alone should never lead to criminal charges, though.

Also, there are different parts to Telegram’s service. There are the various channels, which act as sort of semi-public “groups” around certain topics. That part is more like social media communities. But there are also parts that are more about person-to-person communication, which the company has long insisted are end-to-end encrypted, though many people have doubted the security of that encryption, since Telegram does not reveal how it works.

On top of that, the “encrypted” messaging is not enabled by default, only works in one-to-one communications (any group messaging is unencrypted) and is quite hard to actually turn on. In other words, the vast, vast, vast majority of content on Telegram is not encrypted and can be seen by the company.

So, there are big questions about whether or not the charges against him relate to the more social media style content, or the (supposedly) encrypted communications part.

On top of that, there’s the Russia question in all of this. Telegram was based in Dubai, and part of the reason for that was that the Russian-born Durov was effectively forced to flee Russia and sell his former company, VK (basically a Russian clone of Facebook that was quite successful), after refusing to remove some content that the Kremlin didn’t like.

However, more recently, there have been claims that the Russian government has access to private Telegram communications, and Russian officials have said that the company “cooperates with Russian law enforcement.” The response to Durov’s arrest suggests that Russian officials are not happy about it. While the Kremlin itself has been somewhat cautious in its public response, Russian media has been condemning the arrest, and various politicians have been calling for the French to release Durov.

The other interesting point is how central Telegram has been to Russia’s war in Ukraine, for both sides.

Of course, Europol has also said that Telegram cooperates with its requests for dealing with terrorism online. And other reports have talked about Telegram cooperating with German officials and handing over data on users.

Combine all that and, basically, at this point, no one really knows what’s going on. It’s possible that Telegram cooperated on some law enforcement efforts and didn’t on others. It’s possible that it had good reasons to cooperate or not cooperate. It’s possible the team got overwhelmed. But it’s also possible that it just said “fuck it” and decided to ignore legal demands because they didn’t care.

As of right now, we just don’t know.

It sounds potentially worrisome, because if it’s really just “well, they refused to take down what we wanted,” that would be a dangerous attack on free speech. But if it’s “Durov himself was actively involved in the creation of and the sharing of illegal content,” then it could be trickier. And there’s a wide spectrum in between.

I will note that, over on Twitter, Elon’s fans are insisting that this is a test run before officials arrest Elon, but that seems ridiculously unlikely.

Also, I have to remind folks that a little over two decades ago, France also put out an arrest warrant on Yahoo CEO Tim Koogle, charging him as a war criminal, because Yahoo’s auction site in the US (notably, not the French version) allowed people to sell Nazi memorabilia. Eventually he was acquitted. You would hope in the two decades since then that officials would be a bit more sophisticated about this stuff, but at this moment, it’s just not clear at all.

Filed Under: arrest, content moderation, criminal liability, france, intermediary liability, pavel durov
Companies: telegram

Georgia Prosecutors Stoke Fears Over Use Of Encrypted Messengers And Tor

from the prosecutorial-fud dept

In an indictment against Defend the Atlanta Forest activists in Georgia, state prosecutors are citing the use of encrypted communications to fearmonger. Alleging that the defendants named in the indictment, who include journalists and lawyers in addition to activists, were responsible for a number of crimes related to the Stop Cop City campaign, the state Attorney General’s prosecutors cast suspicion on the defendants’ use of Signal, Telegram, Tor, and other everyday data-protecting technologies.

“Indeed, communication among the Defend the Atlanta Forest members is often cloaked in secrecy using sophisticated technology aimed at preventing law enforcement from viewing their communication and preventing recovery of the information,” the indictment reads. “Members often use the dark web via Tor, use end-to-end encrypted messaging app Signal or Telegram.”

The secure messaging app Signal is used by tens of millions of people, and has hundreds of millions of global downloads. In 2021, users moved to the nonprofit-run private messenger en masse as concerns were raised about the data-hungry business models of big tech. In January of that year, former world’s richest man Elon Musk tweeted simply “Use Signal.” And world-famous NSA whistle-blower Edward Snowden tweeted in 2016 what in information security circles would become a meme and truism: “Use Tor. Use Signal.”

Despite what the bombastic language would have readers believe, installing and using Signal and Tor is not an initiation rite into a dark cult of lawbreaking. The “sophisticated technology” being used here consists of apps that are free, popular, openly distributed, and widely accessible to anyone with an internet connection. Going further, the indictment ascribes to those using the apps a single intention: obstructing law enforcement surveillance. Taking this assertion at face value, any judge or reporter reading the indictment is led to believe that everyone using the apps simply wants to evade the police. These apps make it harder for law enforcement to access communications precisely because the encryption protocol protects messages from everyone not intended to receive them, including the users’ ISP, local network hackers, and even the Signal nonprofit itself.

Elsewhere, the indictment homes in on the use of anti-surveillance techniques to further its tenuous attempts to malign the defendants: “Most ‘Forest Defenders’ are aware that they are preparing to break the law, and this is demonstrated by premeditation of attacks.” Among a laundry list of other techniques, the preparation is supposedly marked by “using technology avoidance devices such as Faraday bags and burner phones.” Stoking fears around the use of anti-surveillance technologies sets a dangerous precedent for all people who simply don’t want to be tracked wherever they go. In protest situations, carrying a prepaid disposable phone can be a powerful defense against being persecuted for participating in First Amendment-protected activities. Vilifying such activities as the acts of wrongdoers would befit totalitarian societies, not ones in which speech is allegedly a universal right.

To be clear, prosecutors have apparently not sought to use court orders to compel either the defendants or the companies named to enter passwords or otherwise open devices or apps. But vilifying the defendants’ use of common sense encryption is a dangerous step in cases that the Dekalb County District Attorney has already dropped out of, citing “different prosecutorial philosophies.”

Using messengers that protect user communications, using browsers that protect user anonymity, and employing anti-surveillance techniques when out and about are all useful strategies in a range of situations. Whether you’re looking into a sensitive medical condition, visiting a reproductive health clinic with the option of terminating a pregnancy, protecting trade secrets from a competitor, trying to avoid stalkers or abusive domestic partners, protecting attorney-client exchanges, or simply keeping your communications, browsing, and location history private, these techniques can come in handy. It is their very effectiveness that has led to the widespread adoption of privacy-protective technologies and techniques. When state prosecutors spread fear around the use of these powerful techniques, they set us down a dangerous path where citizens are more vulnerable and at risk.

Republished from the EFF’s Deeplinks blog.

Filed Under: encrypted messaging, encryption, fud, georgia, stop cop city
Companies: signal, telegram, tor

Ctrl-Alt-Speech: Won’t Someone Please Think Of The Adults?

from the ctrl-alt-speech dept

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation’s Ben Whitelaw.

Subscribe now on Apple Podcasts, Overcast, Spotify, Pocket Casts, YouTube, or your podcast app of choice — or go straight to the RSS feed.

In this week’s round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.

Filed Under: age estimation, age verification, digital services act, dsa, ireland, nist, social media
Companies: nextdoor, telegram, tiktok, twitter, x

from the it's-like-the-old-days dept

In a bizarre turn of events over the past few weeks, Spain’s high court ordered a ban on Telegram because some users (gasp!) used the tool to share copyright-protected content. The judge then suspended his own order a few days later after receiving a lot of criticism. Then, the judge asked the police to investigate the potential impact of such a ban on users. Confused? Welcome to the twisted world of copyright nonsense.

Sometimes it’s odd to me how basically every internet speech story in the first decade of the 2000s was really a story about copyright. And then, post-SOPA, there were still some copyright stories, but things focused more on other legal issues, such as Section 230 or now the DSA. That’s not to say that copyright’s impact on speech has gone away, because of course it has not. But it’s felt strange how it seemed to at least fade a bit towards the background.

Last week, though, we had a story that felt very much like a story from a decade or so ago: Spain’s high court ordered a ban of the entire (extremely popular) Telegram messaging app after four large Spanish media companies whined about people sharing infringing materials via the app.

Judge Santiago Pedraz agreed to temporarily ban the platform after four of the country’s main media groups – Mediaset, Atresmedia, Movistar and Egeda – complained that the app was disseminating content generated by them and protected by copyright without authorisation from the creators.

Access to the platform – which is the fourth most-used messaging service in the country – will be suspended from Monday but it was already being suppressed on certain mobile phone providers on Saturday.

Just a few days later, though, after there was widespread outrage and concern about banning an entire app and what that meant for free speech, the judge suspended his own order and asked the police to determine what the impact would be of the ban:

Pedraz has now halted the order and called for a police report to investigate the impact the temporary ban might have on users.

The whole thing is bizarre on multiple levels. As I discussed on last week’s Ctrl-Alt-Speech episode, there was a period of time when Spain seemed like the one country in the world that recognized how copyright law should work in the internet era, making it clear that liability should land on the actual infringers, not the tools they used. However, the US entertainment industry completely lost its collective mind over that possibility and effectively handed the Spanish government new copyright laws to pass. Then it got the US government to declare Spain a pirate nation in the Special 301 report and threaten sanctions.

So, in response, Spain passed a long series of increasingly draconian copyright laws, even as economists noted the harm they would do. But, the Spanish government admitted that they felt they needed to pass the laws to avoid more pressure from the U.S. And the laws have only gotten worse since then.

Blocking an entire app from the entire country because a few users are abusing it to share infringing content should obviously be seen as overkill. But, again, Spanish copyright law these days is weighted so heavily in favor of industry, it doesn’t even feel all that surprising.

Still, it seems bizarre for the Judge to then ask the police to investigate the potential impact of banning an app used by 8 million Spaniards, or approximately 20% of the country’s population. Isn’t it supposed to be the judge’s job to figure that out?

Anyway, the case is still going on, so it’s possible that Telegram will get banned again down the road. But just the fact that anyone is seriously thinking about banning an entire app because some people misuse it to infringe… just kinda takes us back to ridiculous copyright takes from the early 2000s.

Filed Under: banning apps, copyright, intermediary liability, spain
Companies: telegram

Ctrl-Alt-Speech: The Most Moderated Word On Meta

from the ctrl-alt-speech dept

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation’s Ben Whitelaw.

Subscribe now on Apple Podcasts, Overcast, Spotify, Amazon Music, YouTube, or your podcast app of choice — or go straight to the RSS feed.

In this week’s round-up of the latest news in online speech, content moderation, and internet regulation, Mike and Ben cover:

The episode is brought to you with financial support from the Future of Online Trust & Safety Fund. We weren’t able to schedule our usual Bonus Chat at the end of the episode so Mike and Ben talk about their thinking on podcast sponsorship, why advertising content doesn’t have to be all bad and how you get in touch if you’re a company or organization looking to reach Ctrl-Alt-Speech’s growing and global audience.

Filed Under: content moderation, oversight board, spain
Companies: bluesky, facebook, meta, telegram

Ctrl-Alt-Speech: The Global Internet – Or Is It?

from the ctrl-alt-speech dept

Ctrl-Alt-Speech is a weekly podcast about the latest news in online speech, from Mike Masnick and Everything in Moderation's Ben Whitelaw.

Subscribe now on Apple Podcasts, Spotify, Amazon Music, YouTube, or your podcast app of choice.

In this week’s round-up of news about online speech, content moderation and internet regulation, Mike and Ben cover:

The episode is brought to you with financial support from the Future of Online Trust & Safety Fund, and by our launch sponsor Modulate, the prosocial voice technology company making online spaces safer and more inclusive. In our Bonus Chat at the end of the episode, Modulate CEO Mike Pappas joins us to talk about how safety lessons from the gaming world can be applied to the broader T&S industry and how advances in AI are helping make voice moderation more accurate.

Filed Under: content moderation, ctrl-alt-speech, online speech, podcast, trust and safety
Companies: bluesky, telegram, tiktok