If the answer is yes then law enforcement can too.
https://www.forbes.com/sites/anthonykosner/2012/08/05/how-se...
Is it technically possible for them to see it: yes
Does Telegram let them see it: I don't think so. That seems to be the core issue around Durov being arrested.
They probably should implement E2EE for everything. Then they will have a good excuse not to cooperate, because they simply don't have the data.
This is exceptionally naive. Even if he was arrested for not sharing with the French, what about other countries? Was he arrested for never sharing, or for not sharing enough? And even if he, personally, has never shared, that says nothing about his employees, who have the same access to these systems.
Your data is not private with Telegram. You are trusting Telegram. It is a trust-based app, not a cryptographically secure app.
If you trust Telegram, that’s your choice, but just because a person says the right words in interviews doesn’t mean your data is safe.
Also, you could just send a notification instructing the app to fetch a new message from your server.
From the docs:
Encryption for data messages
The Android Transport Layer (see FCM architecture) uses point-to-point encryption. Depending on your needs, you may decide to add end-to-end encryption to data messages. FCM does not provide an end-to-end solution. However, there are external solutions available such as Capillary or DTLS.
https://firebase.google.com/docs/cloud-messaging/concept-opt...
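The "notification that tells the app to fetch" pattern amounts to sending a data-only FCM message with no content in it. A sketch of what the FCM v1 request body could look like (the `type` and `msg_id` fields are made-up application-level names, not part of FCM):

```python
# Data-only FCM v1 message: no "notification" block, so the OS shows
# nothing and the app's handler decides what to do. No message content
# crosses Google's servers - just a nudge to fetch from your own server.
payload = {
    "message": {
        "token": "<device-registration-token>",
        "data": {
            "type": "fetch",     # hypothetical: tells the app to pull
            "msg_id": "12345",   # hypothetical: which message to pull
        },
    }
}
```

On receipt, the client would call its own backend over TLS to retrieve the actual (possibly E2E-encrypted) message, so FCM only ever learns that *something* arrived.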
And if they are so off base on this, they must either be incompetent or liars. Neither of which builds trust.
The UAE requires decryption keys as part of their Telco regulations.
If Telegram can operate in the UAE without VPN (and it can), then at the very least the UAE MoI has access.
They (and their shadow firms like G42 and G42's shadow firms) were always a major buyer for offensive capabilities at GITEX.
On that note, NEVER bring your personal phone to DEFCON/Blackhat or GITEX.
Edit: cannot reply below so answering here
Cybersecurity conferences.
DEFCON/Blackhat happen during the same week, so you have a lot of script kiddies who lack common sense trying to pwn random workloads. They almost always get caught (and charged - happens every year), but it's a headache.
GITEX is MENA and Asia's largest cybersecurity conference. You have intelligence agencies from most of the Middle East, Africa, Europe, and Asia attending, plus a lot of corporate espionage because of politically connected MSSPs as well as massive defense tenders.
I'm genuinely interested.
He explained in his blog why he doesn't like E2EE:
https://telegra.ph/Why-Isnt-Telegram-End-to-End-Encrypted-by...
Why Isn’t Telegram End-to-End Encrypted by Default?
Pavel Durov August 15, 2017
They know what is being said and that’s what they want to arrest, that information can be sent and received. And by “they” I mean more than just the French. That was just coincidental and pragmatic.
The French state does not operate that quickly on its own, to get an arrest warrant five minutes after he landed and execute on it immediately. That has other fingerprints all over it in my view.
I do think so: https://archive.is/M5zw4
Also, 'exile' https://istories.media/en/news/2024/08/27/pavel-durov-has-vi...
Certainly not, because then Telegram would lose a lot of the functionality that makes it great. One thing that I really enjoy about Telegram is that I can have it open and synced across many independent devices. Telegram also has e2e as an option on some clients, but those chats can't be synced.
I don't see anywhere saying he's been arrested for anything to do with encryption or cooperating with investigations.
eg https://www.bbc.co.uk/news/articles/ckg2kz9kn93o but pretty much all the sources I have read say the same
To me it's a good tradeoff, of course I wouldn't use Telegram for anything illegal or suspect.
Besides Slack and Discord and Teams and whatever the heck Google has these days and iMessage and...
I think you mean it's the only messaging app that purports to have a focus on security where messages are stored in the cloud, which is true, but also sus. There's a reason why none of the others are doing it that way, and Telegram isn't really claiming to have solved a technical hurdle that the E2E apps didn't, it's just claiming that you can trust them more than you can trust the major messaging apps.
Maybe you can and maybe you can't, the point is that you can't know that they're actually a safer choice than any of the other cloud providers.
Granted, it can be clunky at times, but the properties are there, and decentralised end-to-end encrypted messaging is quite an incredible thing. (Yes, Matrix nerds, it's not messaging per se, it's really state replication, I know :))
All the cool kids on the block eliminated the need to trust the provider decades ago: PGP 33 years ago, OTR 20 years ago, Signal 14 years ago.
Worse, iPhones immediately start backing up to iCloud when set up for a new user - the only way to keep your network passwords and all manner of other stuff from hitting iCloud servers is to set the phone up with no network connection or even a SIM card installed.
Did I mention there's no longer a SIM slot, so you can't even control that?
And that iPhones by default if they detect a 'weak' wifi network will switch to cellular, so you can't connect the phone to a sandboxed wifi network?
You shouldn't have to put your phone in a faraday cage to keep it from uploading plaintext versions of your private communications and network passwords.
Yet Apple tries to create an image that the iPhone is a "secure" device, but if you use iCloud, they can give your contact list to the government any time they want.
Apple by default doesn't use E2E for cloud backups, and Telegram doesn't use E2E for chats by default. So Telegram has comparable level of security to that of the leaders of the industry.
So it's not so much a trade-off as it is half-assed security design.
Yeah, it's a bit of a joke.
Unreal. Please share how you came to this world view.
Wrong, Matrix does it too, but fully e2ee.
> and allows you to access your chats from any device.
No it doesn't, because it is possible with e2ee as well
Instagram. FB Messenger. Skype. LINE. KakaoTalk. Discord. Slack. Teams. iMessage.
So do all the others with the exception of something like IRC.
Of course you can send your backup to Google for WhatsApp and signal but that's optional. You can keep it locally too. And it's encrypted too. With WhatsApp you can even choose to keep the key locally only.
Once that’s set, after the SMS code, then (assuming you don’t have access to an existing logged in device because then you are already in…), you can either reset the password via an email confirmation _or_ you can create a new account under that phone number (with no existing history, contacts, etc).
If you set a password and no recovery email, there is no way for them to get access to your contacts or chat history barring getting them from Telegram themselves.
I upload encrypted backups to a cloud service provider (AWS, Google Cloud). I go to another computer, download them, use a key/password to decrypt them.
Sure, I get it, you're typing in something that decrypts the data into their app. That's true of all apps including WhatsApp, etc... The only way this could really be secure is if you used a different app to the encryption that you wrote/audited such that the messaging app never has access to your password/private key. Otherwise, at some point, you're trusting their app to do what they claim.
> use a key/password
The previous poster intentionally mentioned the password recovery flow. If you can gain access without your password, then law enforcement can too. If you could only gain access with your password, you could consider your data safe.
Client creates a Public Private key pair used for E2EE.
Client uses the 'account password (raw)' as part of the creation of a symmetric encryption key, and uses that to encrypt and store the SECRET key on the service's cloud.
NewClient signs in, downloads the encrypted SECRETKeyBlob and decodes using the reconstructed symmetric key based on the sign in password. Old messages can then be decoded.
-- The part that's insecure. -- If the password ever changes the SAME SECRET then needs to be stored to the cloud again, encrypted by the new key. Some padding with random data might help with this but this still sounds like a huge security loophole.
-- Worse Insecurity -- A customer's device could be shipped a compromised client which uploads the SECRET keys to requesting third parties upon sign-in. Those third parties could be large corporations or governments.
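For concreteness, the wrap/unwrap flow described above can be sketched like this. This is toy crypto under stated assumptions: the "cipher" is a SHA-256 counter keystream standing in for real AES-GCM, and no actual service works exactly this way.

```python
import hashlib, os

def derive_key(password: str, salt: bytes) -> bytes:
    # Slow KDF so the raw account password isn't used directly
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher: XOR against a SHA-256 counter keystream.
    # Applying it twice with the same key round-trips the data.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# Client side: wrap the E2EE secret key under a password-derived key
secret_key = os.urandom(32)                      # the E2EE private key material
salt = os.urandom(16)
wrapped = xor_stream(derive_key("hunter2", salt), secret_key)
# (salt, wrapped) is what gets stored on the service's cloud

# NewClient: the same password reconstructs the wrapping key
unwrapped = xor_stream(derive_key("hunter2", salt), wrapped)
assert unwrapped == secret_key
```

Note how both insecurities above show up here: a password change forces re-uploading the same `secret_key` under a new wrap, and a malicious client build could simply ship `unwrapped` anywhere after the user types the password.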
I do not see how anyone expects to use a mobile device for any serious security domain. At best average consumers can have a reasonable hope that it's safe from crooks who care about the average citizen.
You can't use your password as input to the mud puddle test.
I found it interesting that countries like Singapore haven’t introduced requirements for backdoors. They are notorious for passing laws for whatever they want as the current government has a super majority and court that tends to side with the government.
Add on top Telegram is used widely in illegal drug transactions in Singapore.
What’s the reason? They just attack the human factor.
They just get invites to Telegram groups, or they bust someone and force them to handover access to their Telegram account. Set up surveillance for the delivery and boom crypto drug ring is taken down. They’ve done it again and again.
One could imagine this same technique could be used for any Telegram group or conversation.
Edit: Actually, yeah that proves your point.
- locally create a recovery key and use it to wrap any other essential keys
- Split that or wrap that with two or more keys.
- N - 1 goes to the cloud to be used as MFA tokens on recovery.
- For the other, derive keys from normalized responses to recovery questions, use Shamir's secret sharing to pick a number of required correct responses and encrypt the Nth key.
You can recover an account without knowing your original password or having your original device.
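The Shamir step in the scheme above is the part that lets "any k of n correct answers" unlock the Nth key. A toy version over a prime field (illustrative only, not constant-time, not production crypto):

```python
import random

P = 2**127 - 1  # a Mersenne prime; field must be larger than the secret

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = 123456789
shares = split(secret, n=5, k=3)
assert combine(shares[:3]) == secret   # any 3 shares suffice
assert combine(shares[2:]) == secret
```

In the recovery flow, each "share" would itself be a key derived from a normalized answer to a recovery question, so answering any k questions correctly reconstructs the wrapping key.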
As an alternative, Signal or Jami conversations are always e2e encrypted.
Maybe in the future, creators of encrypted messaging apps will get locked up. I certainly hope not. But this case doesn’t indicate anything one way or another.
I dunno man, kinda seems like you ought to either have a right to privacy or not. Surely there's other ways to make a case, without extraordinarily abusable legal strong-arming.
Why should a wealthy person be able to legally afford encrypted communication on a secure device, when 90+% of people can't because they're poor and tech illiterate?
Does our historically unequal society need more information and rights asymmetry between rich and poor? Between privileged and marginalized?
We should also not forget that, in the time when all social media (Reddit, X, Instagram etc.) close their APIs, Telegram is one of the only networks that still has a free API.
Telegram would be fine if it advertised itself as a public square of the internet, like Twitter does. Instead, it lures people into false sense of security for DMs and small group chats, which is what Green's post and thus this thread is ultimately about.
Free API doesn't mean anything until they fix what's broken, i.e. provide meaningful security for cases where there's reasonable expectation of it.
Most social media platforms don't support e2ee.
Some chat apps do support e2ee but also require a god damn phone number to log in (yeah, so does Telegram), which makes the "encryption" useless because authorities can just ask the telco to hand over the login SMS code.
Telegram has E2E encryption, but only in Secret Chats: https://telegram.org/faq#secret-chats
There is even that freqtrade bot that runs on telegram, even RSS bots. It really is amazing. So easy to use for chat ops.
I don't know what else you would use the API for.
For instance 2 days ago my partner wanted to show me a message her friend sent, went to whatsapp and couldn't find it then realized said friend had used instagram DM for that. Most people don't care enough.
Do you want to say that social networks must implement E2E? Personally I think it is a good idea, but existing social networks and dating apps do not implement it so Telegram is not obliged to do it as well.
As for promises of security, everybody misleads users. Take Apple. They advertise that cloud backups are encrypted, but what they don't like to mention is that by default they store the encryption keys in the same cloud, and even if the user opts into "advanced" encryption, the contact list and calendar are still not E2E encrypted, under a silly excuse (see the table at [1]). If you care about privacy and security you probably should never use iCloud in the first place, because it is not fully E2E encrypted. Also note that Apple doesn't even mention E2E in the user interface and instead uses misleading terms like "standard encryption".
This is not fair. Apple doesn't do E2E cloud backups by default and nobody cares, phone companies do not encrypt anything, Cloudflare has disabled Encrypted Client Hello [2], but every time someone mentions Telegram, they are blamed for not having E2E chats by default. It looks like the bar is set differently for Telegram compared to other companies.
[1] https://support.apple.com/en-us/102651
[2] https://developers.cloudflare.com/ssl/edge-certificates/ech/
Telegram is fast, responsive, gets frequent updates, has great group chat, tons of animated emojis, works flawlessly on all desktop and mobile platforms, has great support for media, bots, and a great API, allows edits and deleting messages for all users, and I really like the sync despite it not being e2e.
I use both on desktop for different people and the desktop Signal client doesn’t hold up well in comparison. In some ways it feels more clunky than the iMessage ancestor iChat did 20 years ago.
Really? I don't see any real difference between the UX of WhatsApp and Signal for example. And they're really on-par feature wise.
The only things in your list that are not available on Signal are "tons of animated emojis" and "bots". Recently they also introduced usernames to keep your phone number private. And Signal have had all the other things for a few years now, and with actual security.
https://signal.org/blog/phone-number-privacy-usernames/
Signal doesn't require sharing of phone numbers
And it has those little features like masked text and whatnot; feature-wise, Telegram is just the best. I hadn’t used Signal for a long time - you can’t edit the messages there!?
Not anymore. It took them a while because you can't just slap on features like that. It's not just a string in a database like with Telegram.
Telegram has great UX because you can build things fast and easy when you don't have to give two shits about the security side of things. You can cover that part with a grass-roots marketing department and volunteer shills.
# No smooth animations - that's what makes Telegram stand out from everything else here, but maybe not everyone is happy when 6-core phones can deliver something more than 60fps in 2024...
That's what I remember and yes - mostly those are probably easy to fix UI/UX features/bugs, but even being open-source - they aren't.
Signal is excellent for tiny groups of known participants. I prefer it over anything else for this use case. The group permissions Signal introduced a few years ago are well suited for that purpose. I've recently started running small groups on Signal with about 100 participants who mostly know each other, but not tightly. The recent addition of phone number privacy makes this feasible.
Once you start moving up in scale you really need moderation tools, and Signal doesn't do so well there. When you have thousands of people and it's open to the public you need to moderate or else bad actors will cause your valuable contributors to leave. Basic permissions like having admins who can kick people out and restricting how new members can join only gets you so far.
The issue is that in Signal there is no group as far as the server is concerned: The state of the group exists only on client devices and is updated in a totally asynchronous manner. As a consequence it is more difficult for Signal to provide such features. For example, Signal currently has no means to temporarily mute users, to remove posts from all group members, easy bots to deal with spam, granting specific users special privileges like ability to pin messages, transferable group ownership as opposed to a flat "admin" privilege, etc.
Think about the consequences of Signal's async nature with no server state: What does it mean to kick someone out? An admin sends a group update message that tells other clients to stop including that user in future messages. Try this: Have a group member just delete Signal and then re-register. Send a message to the group. They're still in the group. You get an identity has changed message. These are really only actionable with people who you know... that is, in tiny groups.
And then, the biggest strengths of Signal, which are its end to end encryption and heroic attempts to avoid giving the server metadata, are less valuable in the context of a large public group: Anyone interested in surveilling the group can simply join it, so you have to assume you're being logged anyway. Signal lacks strong identities as a design choice, so in big groups it's harder to know who you're really talking to like you know that "Joe Example, founder of Foo Project" is @Foo1988 on Telegram and @FooOfficial on X and u/0xFooMan on Reddit.
What's the difference between a Fiat and a Ferrari? What's the difference between CentOS and Linux Mint? What's the difference between a McDonald's and a Michelin burger?
I have friends and groups on both platforms. On Signal, I'm basically just sending messages (and only unimportant one, like, when are we meeting. Sending media mostly sucks so I generally only have very dry chats on Signal).
Whereas on Telegram, I'm having fun. In fact it's so versatile, that my wife and I use it as a collaborative note-taking system, archiver, cvs, live shopping list, news app (currently browsing hackernews from telegram), etc. We basically have our whole life organised via Telegram. I lose count of all the features I use effortlessly on a daily basis, and only realise it when I find myself on another app. This is despite the fact that both Signal and whatsapp have since tried to copy some of these features, because they do so badly. A simple example that comes to mind: editing messages. It took years for whatsapp to be able to edit a message (I still remember the old asterisk etiquette to indicate you were issuing a correction to a previous message). Now you can, but it's horrible ux; I think you long press and then there's a button next to copy which opens a menu where you find a pencil which means edit, or sth like that. In telegram I don't even remember how you do it, because it's so intuitive that I don't have to.
Perhaps that's why I find the whole "Telegram encryption" discussion baffling, to be honest. For me, it's just one of Telegram's many extra features. You don't have to use it, but it's there if you want to. I don't feel like Telegram has ever tried to mislead its users that its raison d'être is to be a secret platform only useful if you're a terrorist (like the UK government seems to want to portray it recently).
I get the point about "encryption by default", but this doesn't come for free; there are usability sacrifices that come with it, and not everyone cares for it. Insisting that not having encryption by default mars the whole app sounds similar to me saying that not having a particular set of emojis as the default mars the whole app. It feels disingenuous somehow.
It’s more like they implemented it to check a box …
I guess I fail to see the need for having fun in a messaging app. Signal covers all my major requirements, Telegram, while fun, does not.
The first feature that comes to mind for me is being able to use multiple devices. Signal only allows using it with one phone. If you add a second device, the first one stops working. You can use a computer and a phone, but not multiple phones. Telegram supports this without any issues. I still struggle to understand this limitation.
Honestly it would be better if Telegram dropped the facade of having E2EE. It's generally very low on the priority list of most people anyway, as much as it would hurt anyone reading this, but that's the truth. People are not using it for secure messaging, but for a better UX and reliability.
EDIT: Telegram does require a phone number to sign up.
Do they still not require ID when you buy a SIM card in Ukraine?
It's only on HN I ever see people set up Telegram as some supposed uber-secure private app for Tor users and then demolish that strawman gleefully.
Today, on the same topic, another tech site which generally gets a lot of things right (but whoever is responsible for writing about Telegram, or maybe their internal KB, is consistently wrong and doesn't care about feedback) wrote that it is an encrypted chat service: https://tweakers.net/nieuws/225750/ceo-en-oprichter-telegram... ("versleutelde-chatdienst" means that, for those fact-checking at home)
The average person I know that uses Telegram ("non-techie" as GP comment put it) certainly doesn't. People join telegram because it has a group they want to join, or via word-of-mouth of a friend recommending it. Normal people don't read tech news, and if they do they don't give it much weight.
Maybe that sucks, maybe they'd be better off somehow if they did, but the reality is that most people live in a different universe from those of us who care about e2ee security or read tech news with interest.
Also, people tend to state they have nothing to hide, when they feel they have nothing to fight with. But I can't count the number of times I've seen a stranger next to me on a bus cover their chat the second I sit next to them. Me, a complete random person with no interest in their life is a threat to them.
I just did it to gather anecdotal evidence and the answer was, the founder is in jail to protect their privacy.
the perceived secure nature of telegram has been memorialized in mainstream rap, courtesy kendrick lamar in 2017 (https://genius.com/11665524).
There is so much misinformation around Telegram that that alone made me trust it more (if a known liar tries to discredit something, it increases the chances of it being good -- I'm talking about the comments here on HN).
https://telegram.org/faq#q-do-you-process-data-requests
> To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.
> Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression.
> Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.
> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
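The "decryption keys split into parts" claim presumably amounts to something like n-of-n secret splitting, where every part is required to reconstruct the key. That much is trivial to sketch (illustrative only, not Telegram's actual scheme):

```python
import os

def split_key(key: bytes, parts: int):
    """XOR n-of-n splitting: no subset smaller than all parts learns anything."""
    shares = [os.urandom(len(key)) for _ in range(parts - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def recombine(shares):
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

key = os.urandom(32)
parts = split_key(key, 3)      # e.g. one part per jurisdiction
assert recombine(parts) == key
```

Of course, the math only binds whoever holds the parts; if one company ultimately controls all the holders, the legal separation is what's doing the work, not the cryptography.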
That's true.
You need to run your own platform, people. XMPP is plenty simple, plenty powerful, and plenty safe -- and even your metadata is in your control.
Just self host. There's no excuse in 2024.
Wake up people!
Why should the arrest of someone else affect YOU?
I'm someone who's been on the business end of a subpoena for a platform I ran, and narcing on my friends under threat of being held in contempt is perhaps the worst feeling I'm doomed to live with.
"XMPP is ..." not the solution I'd recommend, even with something like OMEMO. Is it on by default? Can you force it to be turned on? The answer to both of those is, as it turns out, "no," which makes it less than useful. (This is notwithstanding several other issues OMEMO has.)
Gung-ho evangelists rarely convert like a reasonable take on the subject does
> Just self host. There's no excuse in 2024.
I hate to break it to you, but there are plenty of excuses. We live in a bubble on HN. May I remind you what the average person is like, with this recently famous reddit post:
If you want self-hosting to happen, with things like Matrix and so on, the hard truth is that it has to be not merely easy for someone who can program, but trivial for someone who says "wow, can you hack into <x>" when they see you use a terminal
Self-hosting is terrible in that it gives Mike, the unwittingly trusted creepy tech guy in the group, 100% control over the metadata of his close ones: who talks to whom, when, etc. It's much better to either get rid of that with a Tor-only p2p architecture (you'll lose offline messaging), or to outsource hosting to some organization that has no interest in your metadata.
The privacy concern Green made was confidentiality of messages. There is none for Telegram, and because of that Telegram should have moderated content for illegal stuff. They made a decision to become a social media platform like Facebook, but they also chose not to cooperate with the law. Durov was asked to stop digging his hole deeper back in 2013, and now he's reaping what he sowed.
Simply not cooperating with law enforcement is technically moderately difficult, but politically and legally impossible.
Between a difficult and an impossible option, the rational decision is to pick the difficult one.
Given that users can access their messages without any interaction with people at Telegram, the cloud data must already be aggregated automatically for each endpoint.
In consequence, the data can be accessed from a single jurisdiction anyway.
We may know nothing about cryptography, but handing over both the secret and the key to this secret to the very same party is quite a trustful step, even when they say 'I promise I will not peek or let others peek, pinky promise!' - with an 'except if we have to or if we change our mind' in the small print or between the lines.
> Translated: Contrary to what has been publicly stated so far, the operators of the messenger app Telegram have released user data to the Federal Criminal Police Office (BKA) in several cases.
https://torrentfreak.com/telegram-discloses-user-details-of-...
> Telegram has complied with an order from the High Court in Delhi by sharing user details of copyright-infringing users with rightsholders.
Anyways just some examples in which their structure doesn't matter. In the end, user data is still given away. It's also why e2ee should be the sole focus. Everything else is "trust me bro it's safe" levels of security.
This is utter bullshit I debunked back in 2021.
https://security.stackexchange.com/questions/238562/how-does...
Or the CEO and owner, staring down the barrel of a very long time in prison, obtains the keys from his employees and provides them to the authorities.
Would he do this? To me, it matters little how much I trust someone and believe in their mental fortitude. I could instead rely on mathematical proofs to keep secrets, which have proven to be far better at it than corporations.
Also
> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
Didn't they conclude an agreement with the Russian government in 2021?
That's all you need to know. Matrix and Signal can't be forced in any way.
Many rooms are not encrypted because they are public rooms, where there would be no point in it. Encryption has been the default for quite a while now.
"At St. Petersburg State University, Mr. Durov studied linguistics. In lieu of military service, he trained in propaganda, studying Sun Tzu, Genghis Khan and Napoleon, and he learned to make posters aimed at influencing foreign soldiers."
https://www.nytimes.com/2014/12/03/technology/once-celebrate...
You really think the FBI would casually go to Durov and start telling him which libraries to deploy in his software?
This "they're trying to influence me, that means it's working" 5D-chess is the most stupid way to assess the security of anything.
There's nothing to backdoor because it's already backdoored:
Code does not lie about what it does. And Telegram clients' code doesn't lie: it doesn't end-to-end encrypt the data it outputs to Telegram's servers. That's the backdoor. It's there. Right in front of you. With a big flashing neon light that says backdoor. It's so obvious I can't even write a paper about it, because no journal or conference would accept me stating the fucking obvious.
For desktop secret chats you may use Unigram client (although it’s hard for me to justify a potentially non-mobile secret chat).
The rest is trivial and isn’t that hard unless you contact hundreds of new people a day. In that case, I’d already thought of using ahk automation or a full-blown telethon bot.
Telegram serves one set of goals, Signal serves another. E.g. I would not trust Signal to remember that recipe photo a friend posted from their old phone last year in my group chat, or for the wedding photos they shared in the 'wedding' group to survive until I want to make an album for them on their 10-year anniversary. But I would 100% trust Telegram for this. Specifically because I know Telegram is not an encrypted communications service but a social platform, and that group chats are stored unencrypted on a server. Which is literally what I want. Just because it also happens to separately offer encryption functionality for 1on1 chats doesn't mean it needs the whole security shebang (which necessarily has consequences on other functions that people already use and rely on).
It's like I said "macdonalds does indeed have salads" and you reply "Great I'll go to macdonalds and ask for soylent green OH WAIT THEY DON'T HAVE ANY, HUR HUR HUR". Well, no, it's true, they don't. If you want soylent green go somewhere else. Specifically don't go to macdonalds and have a tantrum.
Is Discord end to end encrypted, is IRC? Nope, does it make them useless? Again no.
Same with Telegram, it's a chat tool where you can select your audience and have a good UX with native bot support. (like Discord and IRC).
That's what I want, nothing more.
If I want to plan a coup, I'd use something else of course.
it’s their own fault. a better question might be:
why do they keep crying, over and over, when people call them out for endangering their users? it’s super odd.
Neither Discord nor any of the popular IRC clients (HexChat, WeeChat, mIRC) even mentions the words "security" or "privacy" to promote their products.
Moreover, as Matthew Green mentioned in his blog post, there are many instances where Telegram (or Pavel Durov) has gone out of his way to attack the encryption offered by Signal and WhatsApp. If he were pitting his messenger against Discord, why would he be worried about Signal or WhatsApp?
> I am not specifically calling out Telegram for this, since the same problem [with metadata] exists with virtually every other social media network and private messenger.
Notably, Signal offers a feature called Sealed Sender[0]. While it doesn't solve the metadata problem entirely, it does at least reduce it a bit.
* https://www.ndss-symposium.org/wp-content/uploads/ndss2021_1...
Generally you need something like TOR to hide who is talking to who.
As for TOR, that wouldn't really help much, would it, given that the described attack is at the application level of Signal. Or are you talking about not using Signal altogether?
The arrest cites that he was not cooperating with authorities to crack down on various illegal drug activities on Telegram. None of the other social networks have had their CEOs arrested. Is it simply that Telegram is the only one without backdoors for Five Eyes?
It seems to me the secret chat feature actually works too well?
He's under arrest precisely because Telegram is in a position to share data with law enforcement, but it chooses not to.
It's probably not enough for French authorities to know that some other country's equivalent is getting a copy of all messages and metadata when they want it themselves.
Even so, most messages sent on Telegram are plaintext to Telegram: they're encrypted only at the transport layer, and Telegram's servers see them in full. Secret Chats (the only E2EE chats on Telegram) are hidden away from users, hence the original link.
But that's why it's good. With all the mainstream media censoring stuff, telegram was a (good for the people) exception.
On the other hand, that's probably why they arrested him.
you contradict yourself in the same sentence
Your browser sends a clear message over an encrypted pipe, and the server on the other side, sees this clear message.
Telegram channels are public, unencrypted web shops for all kinds of illegal goods. I guess the French government alleges that Durov is not doing enough to stop these activities on his platform.
It doesn't necessarily have anything to do with encryption.
(At least at the moment, in most countries) it's not illegal to not ship a backdoor in your end-to-end-encrypted software upon government request, but in most it is illegal to not share data you're holding in a form accessible to you when you receive a warrant for it.
Personally I find Telegram kind of refreshing in nowadays internet landscape where everything is so sanitized. You can discover all kinds of niches you never knew existed.
Do you honestly think that any backdoor would be used for such mundane crimes? Even more so, it being in any way acknowledged that there might be a backdoor?
On that topic, it's highly likely Telegram is cooperating with Russian LE. Services and people that don't cooperate get thrown out of Russia quickly.
> The arrest cites that he was not cooperating with authorities to crack down on various drug illegal activities on telegram. [...] None of the other social networks have their ceos arrested.
Because if you want to operate in any country, you're either cooperating with the authorities or you'll get shut down or arrested. Hiding evidence you have is not tolerated anywhere.
and eventually even became a major propaganda tool for the Russian army.
- Telegram is not encrypted from Putin's perspective
- Telegram is encrypted from everyone else's perspective
We had a nice scandal of sorts here in Denmark where a bunch of young men shared pictures of young women without consent. If you're old enough to remember those old "rate this girl" web pages from the '90s, you'll know what the pictures were used for. Basically it was a huge database of hot girls in Denmark and where they went to school. Today around 1000 young men have that on their permanent record, as Facebook worked with law enforcement to catch the criminals. Telegram doesn't do that. This was even a little more innocent than it may sound, considering the men were at least similar in age to the women whose pictures they were sharing. Disgusting and illegal, but Telegram houses far worse and refuses to deal with it.
I know a lot of tech-minded people are up in arms over this, but it's really mainly about not wanting an unmoderated social network. Not because Big Brother is angry, but because people use it to organise bullying, share revenge porn, sell drugs and far, far worse. There are also political factions within the EU who want to kill encryption (though they were severely weakened when the Brits left), but the anger against SoMe platforms is much more "European". In that we (and I say this as the EU culture in general, not as in 100% of us) tend to view the people who enable bad behaviour as participating in that behaviour. Platforms like Facebook, Twitter, Instagram and YouTube have been sort of protected by being early movers with mass adoption. Being American companies probably helps as well, considering EU/US relations. Telegram never had such advantages, and is further disadvantaged by how it's almost exclusively used for crime in Western Europe.
Obviously banning the platform won't help. There will just be another platform. But then, we've also been losing a drug war for 50+ years, and we can't even keep drugs out of our prisons.
Fapping on? And what's the problem with that, exactly?
Because it was so bad: he had access to all that content, and because he had access to it he should have moderated it, and because he didn't, he's now under arrest.
>Is it simply that telegram is the only one without backdoors for five eyes?
Telegram doesn't have a backdoor. Its open-source client can be used to verify that it leaks every group message, and every desktop message you ever send, to the service provider without ever applying secret-chat-grade encryption.
>It seems to me the secret chat feature actually works too well?
Well, Signal can be used to verify its end-to-end encryption is actually used everywhere, but nobody's calling for arresting Moxie or Meredith. So maybe playing 5D-chess over the news isn't working, unless you're here just to amplify this ridiculously fallacious line of thinking.
“They practically detained the head of communication of the Russian army,”
This is exactly the problem with Telegram. Telegram defaults to client-server encryption for everything, and you can't enable end-to-end encryption for anything on desktop, or for group chats, ever. Only 1:1 chats and calls on mobile have end-to-end encryption. Client-server encryption is exactly the "100% secure encrypt in wire". When that data arrives at the server, it's no longer encrypted, and Telegram can do whatever it wants with it, including leaking it to some state actor (like the FSB/SVR).
>Anything you type into the chat box is only encrypted by the app after you type and probably storing it in the clear in some local SQLite db.
If endpoint security is of concern, your options with networked TCBs are quite limited. Are you sure the malware doesn't have a chance to escalate its privileges and read messages in clear from RAM?
>It gives them a whole bunch of options to mess with that plain text data.
I'm looking forward to hearing about how you managed to fix this. Should we implement memory as eFuses (https://en.wikipedia.org/wiki/EFuse) to prevent editing logs? What if the user wants to delete his messages?
>Even if the app source code is published as you don’t know if they backdoored it before they submitted to App Store.
E.g. with Signal for Android, you can pull the APK off the device and compare its hash against the client that was reproducibly built from the source code in your possession. Been there, done that: https://imgur.com/a/wXYVuWG
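The comparison step described above is simple to script. A minimal sketch in Python (hypothetical file names; assumes you've already pulled the APK with adb and produced the reproducible build yourself):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def apks_match(pulled_apk: str, rebuilt_apk: str) -> bool:
    """True if the APK pulled from the device and the reproducibly
    built APK are byte-identical (so no hidden backdoor was inserted
    between the published source and the shipped binary)."""
    return sha256_of(pulled_apk) == sha256_of(rebuilt_apk)
```

The real reproducible-build process has more steps (normalizing signatures, build metadata, etc.), but the trust argument boils down to this final byte-for-byte comparison.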
>I am amazed at the low quality comments here.
Too bad you're not exactly improving them with your nonsense.
Um, surely you understand the difference between piping random-looking bytes uselessly to whoever and having a readable copy of all data readily available to whoever hacks the system or applies for a sysadmin role? Or are you making the assumption that people use a closed-source client and the server can push malicious code?
> Even if the app source code is published as you don’t know if they backdoored it before they submitted to App Store.
Doesn't work if you have third parties also working with the system or forking the code to work with it. It gets noticed. Your concept of "e2ee can be 100% leaked anyway" only works if you don't know what code you're running. You need to trust the community in general to uncover issues you've overlooked (in the code or build process) but that's not the same as not having encryption at all. You can't audit the servers but you can audit the client code.
My point is that this community could just be your friendly CIA operatives running the show with a veneer of open source. Also this “community” has no liability unlike the closed platform companies.
Even worse than Apple. They at least have some e2ee options.
Sure, it may not be on the same level as Signal when it comes to security but it simply is leagues above others in terms of usability, stability and bells&whistles. It's like comparing a Ford Zephyr with a Volvo EX30.
Obviously if your phone is compromised your e2ee chat is not safe.
Pretty much. A lot of people think that seeing E2EE means everything is safe, which I believe gives a false sense of security. You can have your phone compromised (especially when I know your phone number; Signal, I'm looking at you) or be subject to other attacks, exposing everything. I would rather know that an app is not secure so I don't share anything important on it, while keeping secure communication to other means.
And I think WhatsApp probably does it; otherwise, why have the authorities never complained that WhatsApp did not let them see the conversations?
Rule of thumb: never trust anything Facebook. I’m sure sending your messages through mail is more secure and private than WhatsApp these days.
Matrix is far better in terms of security than Signal, but Matrix is far behind compared to Telegram features.
Knowing someone's phone number doesn't automatically let you compromise their device. This is such a ridiculous argument.
>I would rather know that this app is not secure so I don’t share anything important, while keeping secure communication to other means.
This is the nirvana fallacy. It's essentially saying "We should not talk about Telegram lying about its security, when in reality nothing is 100% secure". Yeah, nothing is; there's always an attack. That doesn't contribute anything of interest to the topic, it just tries to kill the criticism. And I'm saying this as someone who has worked on this exact topic for ten years: https://github.com/maqp/tfc
One way or another, phone numbers are like home addresses in the digital world. Once exposed, it's just a matter of time and resources dedicated to that. Not to mention, sometimes it's just needed to cross-reference an identity, that's it.
> This is a nirvana fallacy. It's essentially saying
I didn’t say that. As I mentioned in the other comment to you, some or a lot of people just don’t care about security, and as long as this info is known, it should be treated just like any social media.
Great project with TFC, I never heard of it, but it looks interesting. I would definitely give it a try! I have a question though: does your project require a phone number? If not, why? And would you recommend Signal to anyone who is after security, privacy, and anonymity?
No idea how secure the encryption is, but calling someone on Telegram is safer than sending texts.
Yes, and that's where the 'practical' argument pops up. With all the E2EE buzz, is it really helping in the scenarios where it's supposed to work the best?
This thread gives an overview on why Signal and other apps are not really practical: https://x.com/Pinboard/status/1474096410383421452
> The broader problem of ephemeral or spur of the moment protest activity leaving a permanent data trail that can be forensically analyzed and target individuals many years after the fact is unsolved and poses a serious risk to dissent. But E2E is not the solution to it.
> I feel like Moxie and a lot of end-to-end encryption purists fall into the same intellectual tarpit as the cryptocurrency people, which is that it should be possible to design technical systems that require zero trust, and that the benefits of these designs are self-evident
Is this true for Signal too? I thought it wasn’t.
Signal does indeed use an architecture (at least for chats with contacts, or optionally for everyone, when you enable the "sealed sender" option that makes you a bit more prone to receiving spam) where Signal doesn't know who's sending a given message from a given IP address, only which account it's destined for.
But any entity in position to globally correlate traffic flows into and out of Signal's servers can just make correlations like "whenever Alice, as identified by her phone's IP, sends traffic to Signal, Bob seems to be getting a push notification from Apple or Google, and then his phone connects to Signal, so I think they're talking".
Also, Signal relies on AWS, which could also perform such an attack it seems.
The Internet Is Broken: https://secushare.org/broken-internet
The Hitchhiker’s Guide to Online Anonymity: https://anonymousplanet.org/guide.html
Pointers to more resources: https://discuss.grapheneos.org/d/15005-books-or-sources-on-p...
It is, because you cannot use Signal without giving them your mobile phone number, and from that point onward they (and anyone they might be sharing data with) know the who/what/when, and more. My gut feeling, notwithstanding any apologist and their weak arguments, is that the design choice is exactly about the who/what/when because it's mandatory despite being entirely unnecessary from a technical perspective.
>Many systems use encryption in some way or another. However, when we talk about encryption in the context of modern private messaging services, the word typically has a very specific meaning: it refers to the use of default end-to-end encryption to protect users’ message content. When used in an industry-standard way, this feature ensures that every message will be encrypted using encryption keys that are only known to the communicating parties, and not to the service provider.
>From your perspective as a user, an “encrypted messenger” ensures that each time you start a conversation, your messages will only be readable by the folks you intend to speak with. If the operator of a messaging service tries to view the content of your messages, all they’ll see is useless encrypted junk. That same guarantee holds for anyone who might hack into the provider’s servers, and also, for better or for worse, to law enforcement agencies that serve providers with a subpoena.
>Telegram clearly fails to meet this stronger definition for a simple reason: it does not end-to-end encrypt conversations by default. If you want to use end-to-end encryption in Telegram, you must manually activate an optional end-to-end encryption feature called “Secret Chats” for every single private conversation you want to have. The feature is explicitly not turned on for the vast majority of conversations, and is only available for one-on-one conversations, and never for group chats with more than two people in them.
It's a walled-garden system, which is fine for private chats between groups of friends, but Discord is increasingly being used as a place to report bugs and share information. Telegram furthermore requires signing up with a phone number, which Discord did not (now you often need to in order to participate, when an admin of a community (aka guild, aka the misnomer "server") has turned on that requirement).
https://xkcd.com/979/ This comic will not be understood by gamers growing up today... (Except in many cases someone posted a solution or nudged DenverCoder9 in the right direction at least; with Discord, Slack, or Telegram you'd simply never find the thread in a search engine to begin with.)
So is Telegram. I'm in numerous groups with developers of Linux distros and other apps. Many developers use Telegram channels to post updates about their work.
I was recently very curious about this question and asked similar ones here:
https://news.ycombinator.com/item?id=41267877
https://news.ycombinator.com/item?id=41270863
On a side note, I was just recommending Telegram as an alternative to WhatsApp (but I did mention that you need to enable Secret Chats for E2EE). It is definitely not an ideal UX.
As for applications in use today that address the metadata problem, have a look at Signal's Sealed Sender feature: https://signal.org/blog/sealed-sender/
As for recommending Telegram for secure messages, I side with the sibling comments ("Don't").
If you care about privacy and security, please don't. Defaults matter, and private chats are effectively unusable for anyone using more than one device or needing group chats. And that's not even considering their strange home-baked cryptography.
So why recommend Telegram over Signal?
I don't believe they lost any credibility with this. I think people don't know about it for the most part, or don't care, for the majority of the remaining part.
For metadata, you first want to remove the obvious identifiers: phone numbers, names. You'd want to use something like anonymous@jabbim.pl for your IM account.
Next, you'd want to keep IP addresses away from the server, so you'd want to connect exclusively through Tor. So you'd set the IM client's proxy settings to SOCKS5 localhost:9150 and run a Tor client to force your client to connect that way. This is error-prone and stupid, but let's roll with it for a second.
Now jabbim.pl won't be able to know who you are. But if you registered your XMPP account without Tor Browser, you're SoL: they already know your IP.
A better strategy is to use a Tor onion-service-based XMPP server, say 4sci35xrhp2d45gbm3qpta7ogfedonuw2mucmc36jxemucd7fmgzj3ad.onion (not a real one), and register to it via your IM client. Now you can't connect to the domain without Tor, so misconfiguration can't really hurt.
So that covers name and IP. We'll assume the content is already end-to-end encrypted, so that leaks no data.
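To make the proxy step concrete: the reason SOCKS5 via Tor avoids DNS leaks is that the client hands the hostname to the proxy instead of resolving it locally. A toy sketch of the CONNECT request (per RFC 1928) an IM client would send to localhost:9150, using the DOMAINNAME address type that .onion addresses require:

```python
import struct

def socks5_connect_request(host: str, port: int) -> bytes:
    """Build a SOCKS5 CONNECT request (RFC 1928) using the DOMAINNAME
    address type, so the hostname (e.g. a .onion address) is resolved
    by the proxy -- here, the local Tor client -- and never hits DNS."""
    name = host.encode("ascii")
    if len(name) > 255:
        raise ValueError("hostname too long for SOCKS5 DOMAINNAME")
    # VER=5, CMD=1 (CONNECT), RSV=0, ATYP=3 (domain name),
    # then length-prefixed hostname and the port in network byte order.
    return b"\x05\x01\x00\x03" + bytes([len(name)]) + name + struct.pack(">H", port)
```

A misconfigured client that resolves the name itself (or falls back to a direct connection) is exactly the error-prone failure mode mentioned above, which is why Tor-only onion services are the safer design.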
Next, we want to hide the social graph, and that requires getting rid of the server. After all, a server requires you to always route your messages through it and the service can see this account talks to this account, then to these ten accounts, and ten minutes later, those ten accounts talk to ten accounts. That sounds like a command structure.
So for that you want to get rid of the server entirely, which means going peer-to-peer. Stuff like Tox isn't Tor-only, so you shouldn't use it.
For Tor-only p2p messaging, there are a few options:
https://cwtch.im/ by Sarah Jamie Lewis (great, really usable, beautiful)
https://briarproject.org/ (almost as great, lots of interesting features like forums and blogs inside Tor)
https://onionshare.org/ by Micah Lee. Also has chats between user and hoster
https://github.com/maqp/tfc by yours truly, crude UX but the security is unparalleled.
>On a side note, I was just recommending Telegram as alternative to WhatsApp
Don't. Telegram and WhatsApp both leak metadata, but WhatsApp is always end-to-end encrypted; Telegram practically never is. I'd use WhatsApp over Telegram any day. And given that, unlike WhatsApp, Signal is open source, so you know the encryption works as advertised, it's the best everyday platform. The metadata-free ones I listed above are for people in more precarious situations, but I'm sure a whistleblower is mostly safe contacting journalists over Signal. Dissidents and activists might find Cwtch the best option, however.
The fact that you can create huge groups and channels without sharing your phone number and contacts is what made Telegram big.
You couldn't do that on WhatsApp until a few months ago, and it has been on Telegram for years. Why did Hong Kong protesters use Telegram and not WhatsApp? Read this: https://x.com/Pinboard/status/1474096410383421452
The fact that Telegram is massively used in both Ukraine and Russia shows that its model cannot be ignored.
>I am not specifically calling out Telegram for this, since the same problem exists with virtually every other social media network and private messenger.
In fact, https://simplex.chat/ is the messenger with the least amount of metadata.
Again, the company lies about queues (a programming technique) being a privacy feature.
The application cannot get rid of the metadata of the server knowing which IPs are conversing, unless the clients explicitly connect to the service via Tor. The server must always know from which connection to which connection it routes packets. It's not a network hub, it's a switch, after all.
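The switch-not-hub point can be shown with a toy relay (an illustrative sketch, not any real server's code): even when every payload is opaque ciphertext, routing forces the server to learn who talks to whom, and when.

```python
import time

class Relay:
    """Toy message relay. Payloads stay opaque (ciphertext in, ciphertext
    out), but delivering them requires knowing source and destination,
    so who-talks-to-whom metadata accumulates on the server regardless
    of how good the end-to-end encryption is."""
    def __init__(self):
        self.metadata_log = []  # (src, dst, timestamp): what any operator can retain
        self.mailboxes = {}     # dst -> list of undelivered ciphertexts

    def route(self, src: str, dst: str, ciphertext: bytes) -> None:
        self.metadata_log.append((src, dst, time.time()))
        self.mailboxes.setdefault(dst, []).append(ciphertext)

    def fetch(self, dst: str) -> list:
        return self.mailboxes.pop(dst, [])
```

This is why the fix is removing the server (peer-to-peer over Tor), not adding more encryption on top.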
https://cwtch.im/ and https://briarproject.org/ route everything through Tor, always, and they don't have a server in the middle, which means there is no centralized authority to collect metadata. It's light-years ahead of what SimpleX pretends to offer.
Though it's old hat, it's better to recycle this often so more people know.
>Many systems use encryption in some way or another. However, when we talk about encryption in the context of modern private messaging services, the word typically has a very specific meaning: it refers to the use of default end-to-end encryption to protect users’ message content. When used in an industry-standard way, this feature ensures that every message will be encrypted using encryption keys that are only known to the communicating parties, and not to the service provider. From your perspective as a user, an “encrypted messenger” ensures that each time you start a conversation, your messages will only be readable by the folks you intend to speak with.
So an "encrypted messaging app" means, to people, the security that an end-to-end encrypted app provides.
He then explains how Telegram is not end-to-end encrypted.
* No end-to-end encryption by default
* No end-to-end encryption for groups, not even small groups.
To add, there's no end-to-end encryption for desktop chats either. And no end-to-end encrypted cross-platform chats either.
Your post reads like a dollar-store damage-control-team post from someone who didn't even read the article they're trying to discredit.
> 99.95% of messages on Telegram stored as plain text on their servers and only encrypted between client and telegram server.
Wrong, and the OP doesn't even mention plain text. The non-E2EE client-server data is stored encrypted, spread across servers in different countries. https://telegram.org/privacy#3-3-1-cloud-chats
> End-to-end encryption only working for 1on1 chats, not available half of their clients and have terrible UX.
Wrong again. I actually checked this for myself recently: their official clients on Android and Linux desktop have support for MTProto 2.0. Feel free to check whether other OSes lack this feature. The only clients I know of where this is not enabled are the web clients.
Telegram has had its own history of really weird issues with its encryption protocol, like the IGE, 2^64 complexity pre-computation attacks, IND-CCA vulnerability and whatever the hell this was https://words.filippo.io/dispatches/telegram-ecdh/
But these are not the big issues here. The issues Green's blog post highlighted were
* Telegram doesn't default to end-to-end encryption.
* It makes enabling end-to-end encryption unnecessarily hard
* It has no end-to-end encryption for groups
Those matter a gazillion times more than e.g. a slightly older primitive would.
End-to-end encryption matters because Telegram is not just a social media feed or a Twitter wall. It's used for purposes that deserve privacy, and Telegram isn't providing it.
It's no wonder why WhatsApp and other apps don't face much heat from the government, they're already with the government.
https://core.telegram.org/reproducible-builds
Their custom encryption is questionable, but since it's open source, someone would have found out by now if there were obvious backdoors.
you use it because you can use disposable phone number
nobody ever cares about encryption, it's a false flag
people care about no footprints
that's exactly why it was used to create civil unrest in Iran
https://www.wsj.com/articles/iranians-turn-to-telegram-app-a...
Does the cloud server store the message and the key?
If the answer is yes, IT'S NOT FULLY ENCRYPTED!
Sounds contrary, right?
If the key and the message are on the server, any LEO org can get them... for it to be fully encrypted, the cloud server should never store the keys.
So how many services claiming encryption have this flaw? All...
Why do you think Telegram has shell companies to avoid gov subpoenas?
Because it knows that its encryption is faulty with respect to real-world LEO and laws: it stores the keys in the cloud, which means it can be subpoenaed for those keys and messages.
Telegram is actually one of the only apps I've seen defend their super-duper secure storage of keys online. All lies, of course.
The overwhelming majority of secure messaging apps have no way to recover user data if you drop your phone in the ocean. This includes Signal, Wire, Threema, Session, Element, iMessage etc.
Also, it's not like Telegram doesn't have censorship. During the last 3-4 years there were many cases where Durov blocked bots and channels that belonged to protests and the opposition in Russia, marked them as "fake" or just plain removed them without a trace.
So it's just another case where some rich guy tries to sell his own platform as a "freedom of speech" one even though it's censored to his liking.
Of course, for Telegram it's much more convenient not to have end-to-end encryption. They store everything on their servers: years of chat history that probably weighs GBs for each user, contrary to what WhatsApp/Signal do. And of course, if 10 million people send each other the same meme, it's stupid to keep 10 million copies of the same image on their servers just because it's end-to-end encrypted. They probably have a store that indexes each media file by its hash and avoids keeping multiple copies, and that is fine. This is the reason Telegram can offer to store all your messages, including media files that can be up to 1 GB each, in the cloud for free.
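The hash-based dedup described above can be sketched as a toy content-addressed store (an assumption about how Telegram might do it, not their actual implementation). The point: this only works because the server sees the plaintext bytes; if each user encrypted the meme with their own key, every copy would hash differently.

```python
import hashlib

class MediaStore:
    """Toy content-addressed blob store: identical files are stored
    once, keyed by their SHA-256 digest. With per-user end-to-end
    encryption, identical plaintexts become distinct ciphertexts,
    and this dedup is impossible."""
    def __init__(self):
        self.blobs = {}

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)  # re-upload of same bytes is a no-op
        return digest

    def get(self, digest: str) -> bytes:
        return self.blobs[digest]
```

So the "free unlimited cloud" feature and the lack of default E2EE are two sides of the same design choice.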
As a user I prefer Telegram just because it's the only app that works perfectly synchronized among multiple devices (Android, Linux, macOS) with good-quality native clients, without wasting space on my phone for data.
By the way, end-to-end encryption is not as safe as they claim. Sure, the conversation cannot be intercepted; however:
- you can put a backdoor on endpoints, that is, compromise the user's phone (something they do)
- you can make a MITM attack on the server (don't know if they do that, but technically possible)
- you can access the data that is backed up on other platforms (e.g. WhatsApp by default makes backups to Google Drive or Apple iCloud, through which you can access all the conversations in clear text).
> - you can make a MITM attack on the server (don't know if they do that, but technically possible)
No, it's not technically possible, by definition. The fundamental principle behind E2EE is that the server can be as malicious or compromised as you like, without impacting message confidentiality or integrity.
Privacy is a human right. Everyone needs it. And Telegram advertises itself as an encrypted messenger. For every non-expert, that means end-to-end encryption: only me and the recipient can read the message. Users expect Telegram to be more secure than WhatsApp. Telegram claims it's more secure than WhatsApp, and Telegram has attacked WhatsApp over its security. WhatsApp is always end-to-end encrypted; Telegram is not. So don't go putting words into people's mouths.
>Given that they store everything on their servers, it means years of chat history that probably weighs GBs for each user
It could be stored there with client-side encryption; Telegram doesn't need access to that data. Also, who says chats that are ephemeral in nature need to be forever accessible? I save what I need from Signal or Telegram.
>This is the reason Telegram can offer you to have all your messages, including medias that can be up to 1Gb each, stored on a cloud for free.
It's not free. It comes with the price of your human right to privacy. You should get a job at Facebook with this marketing pitch.
>As a user I prefer Telegram just because it's the only app that works perfectly synchronized among multiple devices
It doesn't sync secret chats at all with multiple devices, not even desktop. Signal does.
>good quality native clients
Your script is seven years old https://signal.org/blog/standalone-signal-desktop/
>You can put a backdoor on endpoints, that is, compromise the user's phone (something they do)
Nirvana fallacy. Why is Telegram offering Secret Chats if all endpoints are compromised? If they're not always compromised, then it should offer end-to-end encryption for everything, always. Like Signal, WhatsApp, Wire, Threema, iMessage, Cwtch, Briar, Element, Session...
>you can make a MITM attack on the server
Which is why every messaging app worth its salt offers safety numbers https://support.signal.org/hc/en-us/articles/360007060632-Wh...
Even Telegram has them, although their initial implementation, babby's first QR code, was a joke. How do you compare shades of a color matrix over the phone?
https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSUnBRB...
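For contrast, the idea behind a proper safety number is that both parties derive the same short digit string from the conversation's public keys, so they can read it aloud and detect a MITM swapping keys. A toy sketch of that idea (NOT Signal's actual derivation, which iterates SHA-512 over identity keys and identifiers):

```python
import hashlib

def safety_number(pub_a: bytes, pub_b: bytes, digits: int = 60) -> str:
    """Illustrative fingerprint in the spirit of Signal's safety numbers.
    Sorting the key pair makes the result symmetric, so both sides
    compute the identical digit string; a MITM substituting either key
    changes the digits, which a verbal comparison catches."""
    material = b"".join(sorted([pub_a, pub_b]))
    h = hashlib.sha512(material).digest()
    num = int.from_bytes(h, "big")
    s = str(num)[:digits].zfill(digits)
    # group into chunks of 5 digits for reading over the phone
    return " ".join(s[i:i + 5] for i in range(0, len(s), 5))
```

Digits read aloud are unambiguous in a way "is your third square more teal or more cyan?" never will be.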
>you can access the data that is backed up on other platform
Oh, that would be horrible. Good thing Telegram doesn't have its data backed up in the cloud... no wait, sorry, it does. ~Everything you ever do with the app is permanently stored in an ecosystem built by the Mark Zuckerberg of Russia and his bro Nikolai, who has a PhD in geometry.
Shill harder.
If it is encrypted, then it aids terrorists and can be banned. So it is encrypted, whatever the technological details. It's a political decision.
This is why we have modern encryption. It converts the most beautiful poem in the world to complete noise and back with no loss of meaning. It allows sending images, books, videos -- culture -- without spycraft that requires hours of learning. It's also more secure, given that humans aren't nearly as good at coming up with randomness as a computer's hardware RNG.
Moderation is what happens here on HN: Admins have some policies to keep the conversation on track, users voluntarily submit to them.
Censorship is when a third party uses coercion to force admins to submit to them and remove posts against their will.
Durov has been arrested for refusing to implement censorship, not for anything concerning moderation.
Censorship is when someone else dictates how we can run our respective groups.
You open the "Telegram nearby" feature anywhere and it's full of people selling drugs and scams. When I mistyped something in the search bar I ended up in some ISIS propaganda channel (which was straight up calling for violence/terrorism). All of this on unencrypted public groups/channels ofc (I'm pretty sure it's the same with CP, although I'm afraid to check for obvious reasons).
I think there is a line between "protecting free speech" and being complicit in crime. This line has been crossed by Telegram.
just turn off any discovery and suggestion features
What a weird hill to die on, given the whole context of this situation.
Do you see public recruitment of people into terrorist cells as a freedom of speech? Do you see publicly selling drugs as a freedom of speech? It isn't about censorship at all, it's about actual *illegal* activity.
Now it's up to Durov and his lawyers to prove that Telegram actually dealt with that. So far France doesn't seem convinced.
The problem I have is with requiring the chat service to police that or making its operators liable for the illegal conduct of its users.
It shouldn't be up to Durov to prove he did or didn't do anything; it's up to France to prove that he or his company actively participated in such conduct. And no, people using the service to engage in illegal acts isn't nearly enough, any more than Google's CEO should be liable for a drug dealer using Maps to navigate to a drug-deal location, or Venmo should be liable for the buyer paying the seller with it.
The reason it's worth defending this "hill" is because allowing governments to use censorship as a convenient means of solving these problems always leads to more control and restrictions that infringe on the legitimate rights of everyone.
I understand the appeal of these tactics. Since we know that terrorist groups operating abroad will use chat services to incite locals to commit violence, it's tempting to search the chat service and stop that from happening by censoring the communication, preventing the radicalization. Since we know that drug sellers organize the sale of the contraband using the chat app, it's tempting to search the chat app and censor that speech, thus preventing the buyer from learning where to meet the seller. Or wait for enough speech to cross the line into conduct and then arrest them for it. Sounds great. If it would work, I'd support it.
The problem is that it won't work, and the only way to "fix it" will be to push more and more and more surveillance and control. It's already being pushed. Look at this chat control nonsense. Do you support that?
So what I'm saying is: let's recognize that it's a basic human right for people to communicate freely, and that operators of communication services shouldn't be held liable for the actions of their users.
Hacker news may 'moderate' illegal content on this website, but they don't have a choice in the matter, US or State authorities will shut them down if they do not, so it's technically censorship. Your view on whether this is good or bad will depend on many factors, one of which may be how you view the legal structure of your government, which is substantially different in France, the US, or Dubai (where Telegram is located).
As is mentioned in the article, Telegram is not simply a 'secure messaging app'. It also serves a role similar to Facebook, Twitter, Instagram, or TikTok: it hosts publicly accessible channels and public group chats with thousands of members, which are all (apparently) unencrypted and accessible to the Telegram company. It may be reasonable (both legally and socially) to expect a company that has knowledge of public, illegal speech to take steps to remove that content from its platform.
And Durov, by choosing to be a media company and not E2E encrypt all of his users' private communications, has walked right into a situation where he needs to abide by local laws on moderating/censoring illegal content, everywhere.
What do you mean by users voluntarily submitting to these policies? This distinction seems key in your argument, but I don't see what alternatives to submitting I have here, making it involuntary, right?
If HN decided to ban all posts about Donald Trump that is moderation. Users voluntarily submit to this policy by participating in the site, and if they do not, they will be banned.
If the State of California required that all web sites run from their state are REQUIRED to ban all posts about Donald Trump, that is censorship.
Moderation is "your house, your rules" while censorship is someone else imposing their rules in your house.
Do you see what I'm saying? When France is talking about "moderation" of Telegram, what they actually mean is censorship.
So let's say a few child molesters create a chat service and use it to send the worst, most horrible child pornography among themselves. By that definition, removing it is censorship, not moderation.
Look, I'm not trying to argue for legalization of child pornography here. That is illegal contraband, full stop. The intent of my comment is to say "let's just call it what it is."
I think the overwhelming consensus is that child pornography is so horrible that mere possession of it must be CENSORED.
I'm not arguing that censorship is always wrong. For instance, I don't want to see public billboards of graphic sex or violence. I think it's good that we censor that, so that we aren't forced to look at things like that when we don't want to.
What is bothering me is that proponents of censorship, and especially certain proponents of it who want to use it as a tool to suppress ideas they don't like, have recently started using the word "moderation" in order to sneak their plans into policy without raising objections. The reason is because when we hear the word "censorship" we immediately think, "Whoa, hold on there, censorship is very harsh, let's take a hard look and make sure this is serious enough that resorting to censorship is justified and appropriate", whereas when we hear the word "moderation" we think, "Of course, we all appreciate someone deleting the spam and trolls who annoy us", and we're less likely to think critically about exactly what kind of expression is being legally prohibited.
UPDATE: anyone who downvotes, I invite you to check for yourself.
Just a few known media:
1. https://www.aljazeera.com/amp/news/2024/8/25/telegram-messag...
2. https://www.washingtonpost.com/technology/2024/08/25/durov-t...
3. https://www.businessinsider.com/telegram-ceo-pavel-durov-arr...
4. https://www.theguardian.com/media/article/2024/aug/24/telegr...
However, indeed, I've seen a few media outlets that call it encrypted. These include France24, POLITICO, and The Times.
https://www.thetimes.com/world/europe/article/pavel-durov-te... “Chief executive of the encrypted messaging app reportedly detained at an airport near Paris over alleged failure to stop criminal activity on the platform”
https://www.tf1info.fr/high-tech/telegram-qui-est-pavel-duro... (one of the largest French news outlets) “Qui est Pavel Durov, le fondateur de la messagerie cryptée Telegram arrêté samedi en France ?” (“Who is Pavel Durov, the founder of the encrypted messaging app Telegram, arrested Saturday in France?”)