Pavel Durov’s arrest suggests that the law enforcement dragnet is being widened from private financial transactions to private speech.
The arrest of the Telegram CEO Pavel Durov in France this week is extremely significant. It confirms that we are deep into the second crypto war, where governments are systematically seeking to prosecute developers of digital encryption tools because encryption frustrates state surveillance and control. While the first crypto war in the 1990s was led by the United States, this one is led jointly by the European Union — now its own regulatory superpower.
Durov, a Russian-born French citizen, was arrested in Paris on Saturday, and has now been indicted. You can read the French accusations here. They include complicity in drug possession and sale, fraud, child pornography and money laundering. These are extremely serious crimes — but note that the charge is complicity, not participation. The meaning of that word “complicity” seems to be revealed by the last three charges: Telegram has been providing users a “cryptology tool” unauthorised by French regulators.
Well, except Telegram isn’t a good tool for privacy.
There is no E2EE by default. End-to-end encryption is only available for 1:1 “secret chats” and is disabled by default. Telegram’s server-side code is closed-source, so there is no way to verify how regular cloud chats are stored or handled. Telegram is able to block channels from their end, so there is no privacy from the operator either.
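The distinction the comment draws (client-to-server encryption the operator can read, versus end-to-end encryption it cannot) can be sketched as a toy model. The `Relay` class and the XOR keystream below are illustrative stand-ins, not real cryptography and not Telegram’s actual protocol:

```python
# Toy model (illustration only, NOT real cryptography): contrasts a
# cloud-chat model, where the relay operator handles plaintext, with
# an E2EE model, where only the endpoints hold the key.
import hashlib
import secrets

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with a hash-derived keystream -- a stand-in for a real cipher.
    stream = hashlib.sha256(key).digest()
    stream = (stream * (len(plaintext) // len(stream) + 1))[:len(plaintext)]
    return bytes(a ^ b for a, b in zip(plaintext, stream))

toy_decrypt = toy_encrypt  # XOR is its own inverse

class Relay:
    """The service operator: stores whatever it receives."""
    def __init__(self):
        self.stored = []
    def forward(self, blob: bytes) -> bytes:
        self.stored.append(blob)
        return blob

relay = Relay()
msg = b"meet at noon"

# Cloud-chat model: the client encrypts to the *server*, so the
# operator decrypts on arrival and holds plaintext at rest.
server_key = secrets.token_bytes(32)
relay.forward(toy_decrypt(server_key, toy_encrypt(server_key, msg)))
assert relay.stored[-1] == msg  # operator can read the message

# E2EE model: only the two endpoints share the key; the relay
# stores ciphertext it cannot open.
endpoint_key = secrets.token_bytes(32)
blob = relay.forward(toy_encrypt(endpoint_key, msg))
assert relay.stored[-1] != msg                  # operator sees ciphertext
assert toy_decrypt(endpoint_key, blob) == msg   # recipient can read it
```

The point of the sketch is the trust boundary: in the first model privacy depends entirely on what the operator chooses (or is compelled) to do with the plaintext it holds; in the second, the operator never has the key at all.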
That’s not the point. The hunting down of tools and their creators (and of our right to privacy) is the issue here. At least, imho.
It has nothing to do with privacy. Telegram is an old-school social network in that it doesn’t even require that you register to view the content pages. It’s also a social network taken to the extreme of free speech absolutism in that it doesn’t mind people talking openly about every kind of crime and their use of its tools to make it easier to obtain the related services. All that with no encryption at all.
Free speech is good. Government regulated speech is bad.
free speech can be good. free speech can also be bad. overall, it’s more good than bad. however, society seems to agree that free speech has limits: you can’t defame someone, for example
free speech absolutism is fucking dumb; just like most other absolutist stances
this also isn’t even about free speech - this is about someone having access to information requested by investigators to solve crimes, and then refusing to give that information
This is pure nonsense.
Western governments hate Telegram because until now Telegram didn’t cooperate with Western intelligence services the way American social media companies do. Everything on Meta or Google gets fed to the NSA, but Telegram has been uncooperative.
This will likely change after Durov’s arrest, but it was nice while it lasted.
we don’t disagree about that: governments don’t like that telegram doesn’t cooperate; that’s not in dispute
where the disagreement comes is the part after. telegram (and indeed meta, google, etc) have that data at their disposal. when served with a legal notice to provide information to authorities or shut down illegal behaviour on their platforms, they comply - sometimes that’s a bad thing if the government is overreaching, but sometimes it’s also a good thing (in the case of CSAM and other serious crimes)
there are plenty of clear cut examples of where telegram should shut down channels - CSAM etc… that’s what this arrest was about; the rest is academic
Was it? The French authorities did not provide any convincing evidence, just accusations.
you think they’re going to link to still available (that’s the point - they’re still available) sources of CSAM?
if that’s your burden of proof then buddy i’m sorry to say there’s no way anyone’s going to convince you, and that’s not a good thing
Why use a tool that relies on the goodwill of the operator to secure your privacy? It’s foolish in the first place.
The operator of that tool tomorrow may not be the operator of today, and the operator of today can be compromised by blackmail, legally compelled (see OP), physically compelled, etc., to break that trust.
ANYONE who understands how Telegram works and still thinks it’s a tool for privacy doesn’t really understand privacy in the digital age.
Quoting @[email protected] :
And frankly, if they have knowledge of who is sharing CSAM, it’s entirely ethical for them to be compelled to share it.
But what about when it’s who is questioning their sexuality or gender? Or who is organizing a protest in a country that puts down protests and dissent violently? Or… Or… Or… There are so many examples where privacy IS important AND ethical, but in zero of those does it make sense to rely on the goodwill of the operator to safeguard that privacy.
Telegram is the most realistic alternative to breaking Meta’s monopoly. You might like Signal very much, but nobody uses it and the user experience is horrible.
even if Meta’s monopoly is literally the only thing you care about, replacing a terrible platform with another platform that lacks privacy protections is not much of an upgrade
deleted by creator
Joke’s on you, I use nothing by Meta, nor Signal, nor Telegram. My comment had nothing whatsoever to do with what I like or not.
That apparently applies to child abuse and CSAM
Questionable interpretation. Privacy doesn’t mean mathematically proven privacy. A changing booth in a store provides privacy but it’s only private because the store owner agreed to not monitor it (and in many cases is required by law not to monitor it).
Effectively what you and the original commenter are saying (collectively) is that mathematically proven privacy is the only privacy that matters for the Internet. Operators that do not mathematically provide privacy should just do whatever government officials ask them to do.
We only have the French government’s word to go off of right now. Maybe Telegram’s refusals are totally unreasonable but maybe they’re not.
A smarter route probably would’ve been to fight through the court system in France on a case by case level rather than ignore prosecutors (assuming the French narrative is the whole story). Still, I think this is all murkier than you’d like to think.
It’s a street, not a changing booth. Also, I’m familiar with every charge against Durov, and I personally have seen the illegal content I talked about. If it’s so easily accessible to the public and persists for years, it has nothing to do with privacy and there is no moderation — Durov’s own statements underscore that lack of moderation too.
Who said it’s a street? What makes it a street?
Did you seek it out? Neither I nor anyone I know personally has ever encountered anything like what was described on that platform, and I’ve been on it for years.
Was it the same “channel” or “group chat” that persisted for years?
What gives them the right or responsibility to moderate a group chat or channel more than say Signal or Threema? Just because their technical back end lets them?
I mean by that argument Signal could do client side scanning on everything (that’s an enforcement at the platform level that fits their technical limitations). Is that where we’re at? “If you can figure out how to violate privacy in the name of looking for illegal content, you should.”
Nothing Telegram offers is equivalent to the algorithmic feeds that require moderation, like YouTube, Twitter, Instagram, or Facebook; everything has to be sought out.
Make no mistake, I’m not defending the content. The people who used the platform to share that content should be arrested. However, I’m not sure I agree with the moral dichotomy we’ve gotten ourselves into where e.g., the messenger is legally responsible for refusing service to people doing illegal activity.
I won’t go into the specific channels so as not to promote them or what they do, but we can talk about one known example, which is how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data. They used two highly popular bots called H****a and the E** ** G**, which allow you to get everything known to the government and other social networks on every citizen of Russia for about $1 to $5. They use the Telegram API and have been there for years. How do you moderate that? You don’t. You take it down as the illegal, privacy-violating, and doxing-enabling content that it is.
Edit: “Censored” the names of the bots, as I still don’t want to make them even easier to find.
Was that a bad thing? I’ve never heard the name Bellingcat before, but it sounds like this would’ve been partially responsible for the reporting about the Navalny poisoning?
Ultimately, that sounds like an issue the Russian government needs to fix. Telegram bots are also trivial to launch and duplicate so … actually detecting and shutting that down without it being a massive expensive money pit is difficult.
It’s easy to say “oh they’re hosting it, they should just take it down.”
https://www.washingtonpost.com/politics/2018/10/16/postal-service-preferred-shipper-drug-dealers/
Should the US federal government hold themselves liable for delivering illegal drugs via their own postal service? I mean there’s serious nuance in what’s reasonable liability for a carrier … and personally holding the CEO criminally liable is a pretty extreme instance of that.
Telegram is in the news often for public groups with lots of crime
“The news” is too vague a source to dispute.
deleted by creator
Signal can very clearly see all the messages you send if they just add a bit of code.
deleted by creator
I am going to quote myself here:
Allow me to quote myself too, then:
I do not disagree with your remarks (I do not use Telegram), I simply consider it’s not the point or that it should not be.
Obviously, laws should be enforced. What those laws are and how they are used to erode some stuff that were considered fundamental rights not so long ago is the sole issue, once again, im(v)ho ;)
It IS the point. If Telegram was designed and set up as a pure carrier of encrypted information, no one could/should fault them for how the service is used.
However, this is not the case, and they are able to monitor and control the content that is shared. This means they have a moral and legal responsibility to make sure the service is used in accordance with the law.
The point is that if you’re going to keep blackmail material, you have to share it with the government.
The easy answer is to stop keeping blackmail material.
Signal fans being edgy cool kids
Signal has its own issues. At least it has proper encryption
Yay, let’s all hate on the one crypto messenger that is independently verifiably secure.
If Telegram wasn’t good for privacy, Western governments would not be trying to shut it down.
E2EE is nice, but doesn’t matter if the government can just seize or hack your phone. Much better to use non-Western social media and messaging apps.
deleted by creator
Did you miss the entire Snowden revelations? Western governments are hostile to online privacy and freedom.
They are not trying to shut down Telegram, they are trying to control it.
What kind of argument is this supposed to be? Governments can seize your phone anywhere … oh wait … lemmy.ml … yeah, I see…
They like to poke fun at the “west,” but Russia, China and others are all worse somehow. At least in most countries it is controversial to attack journalists and encryption
deleted by creator
In case you are serious: Lemmy.ml is known for being a tankie instance. So a nonsensical anti-west statement makes a lot more sense considering the instance the user chose.
If it were a good tool for privacy, Russia would be trying to shut it down the same way they did with Signal.
Russia tried for years to ban Telegram. They stopped after Telegram managed to keep itself alive through proxies.
they did ban it, and everyone still used it (Telegram was good at evading the bans back then, but eventually Roskomnadzor became decent at banning it), and then they unbanned it, whatever that means