WhatsApp vs Telegram

Since the arrest of Pavel Durov, is Telegram worse than WhatsApp?
And is WhatsApp's message encryption real?

Telegram has always been bad; long before Durov was arrested, WhatsApp was already much better than Telegram.

8 Likes

WhatsApp is much safer than Telegram, especially now that Durov got arrested and agreed to provide IP addresses.

WhatsApp has better encryption than Telegram.

That being said, if you value privacy, go for Signal.

4 Likes

WhatsApp uses the Signal Protocol, so the encryption is fine.

The bigger problem is that you’re surrendering your social graph metadata to Meta, the company that owns Facebook. If that’s a concern to you, you may want to avoid WhatsApp.

But don’t choose Telegram.

8 Likes

In terms of privacy in transit, Telegram was arguably always worse than WhatsApp (WhatsApp has end-to-end encryption by default; Telegram does not). In general, I don't have high trust in either app, or company.

2 Likes

Never EVER use Telegram. They have far too many issues.

If you must, use WhatsApp as the others here previously recommended. Otherwise, the best options are to just use Signal or, if you're "adventurous", SimpleX.

May I ask why you are considering these two even though privacy-respecting encrypted messengers exist?

5 Likes

One more for your list:

3 Likes

I personally use SimpleX, but I have a friend who is starting to worry about his privacy. I thought Telegram was better, but from what you've all told me it is not. He's a beginner and doesn't have WhatsApp, and people without much knowledge tend to think Telegram is better, so I wanted to corroborate it by asking in this wonderful forum. Thank you all so much.

4 Likes

The New York Times revealed that the founder, Pavel Durov, has military training in information warfare:

At St. Petersburg State University, Mr. Durov studied linguistics. In lieu of military service, he trained in propaganda, studying Sun Tzu, Genghis Khan and Napoleon, and he learned to make posters aimed at influencing foreign soldiers.


Like much of the Russian software industry, his career is built on copycats. Durov started with VKontakte, which was a copy of Facebook. He made his money by spying on mostly Russian citizens.

As the founder of VKontakte, Durov got the nickname “Mark Zuckerberg of Russia”.

VKontakte made him a millionaire and an oligarch. When social media sites like Facebook started to decline, Durov used this blood money to build a new social media platform that looks like a messaging app. The selling point of his WhatsApp clone was that it would be secure and private. To go global, he needed a narrative to distance himself from the Russian oligarchy.


The main narrative was that Durov was on the run, living in exile. To reinforce the lie over and over, Durov needed a bunch of sockpuppets.

To do that, Telegram openly started recruiting sockpuppets https://tsf.telegram.org/

And lo and behold, Reddit has countless examples of posts stating that Durov escaped Russia and lives in exile.

Open recruiting gives plausible deniability to state actors, such as the Russian troll army, shilling Telegram.

Then in 2024 we learned that Durov visited Russia over 50 times between 2014 and 2021, 40 of those visits between 2015 and 2017, which works out to roughly once every 2.6 weeks. That doesn't sound like an exile to me, and it doesn't sound like someone on the run from the Russian government. It's amazing how free of polonium his tea has been during those visits, given that the narrative is that he's an enemy of Putin.


But Telegram is open source and heavily encrypted, so surely Russia can't access that data! It even has reproducible builds!

In messaging apps:

  1. Reproducible builds exist to verify binaries were created from the available source code.
  2. Open source exists to verify that the encryption protects the user from all third parties.
  3. The type of encryption that protects from all third parties is called end-to-end encryption.

And this leads to the core issues:

Telegram isn’t end-to-end encrypted by default.

With Signal, WhatsApp, Wire, Threema, Element, Session, and SimpleX, every single message is end-to-end encrypted by default. There is no reason not to do this.

Durov wrote a fallacious post about his “reasons” here which I have debunked here.

Telegram’s end-to-end encryption is not available for groups

The sockpuppets have argued that groups are large and that, as when talking in a public square, there is no expectation of privacy. This is stupid, given that most groups are small and their members have a reasonable expectation of privacy. Besides,

  1. Telegram already has distinction between normal groups and larger supergroups.

    1. Normal groups have between 3 and 200 members.
    2. Super groups have between 201 and 200,000 members.
  2. Many always-end-to-end-encrypted messaging apps support larger groups than Telegram's normal groups:

    1. Threema: 256 members
    2. Signal: 1,000 members
    3. Wire: 2,000 members
    4. Matrix: no upper limit

Thus, Telegram has zero reason to not offer E2EE normal groups.

Except for the fact that they don't employ any cryptographers who could implement such a protocol. Their chief engineer for the cryptographic protocol is the CEO's brother, Nikolai Durov, who is a mathematician focused on geometry, not cryptography. When nepotism wins out over hiring actual experts, you know Telegram doesn't have its users' interests at heart. It's the zeitgeist and flavor of corruption in Russia: Favouritism and Nepotism in an Organization: Causes and Effects - ScienceDirect.

Telegram’s end-to-end encryption is not cross-platform

The sockpuppets have argued that "desktop clients are not as secure", which is ridiculous. If Telegram thought desktop clients weren't safe, why offer them in the first place? Endpoint exploitation of an Android phone is not harder. Desktop operating systems mostly keep getting security updates even after the hardware vendor stops supporting the BIOS; Android phones, not so much. iPhones tend to be more secure and have longer support, but Telegram isn't pointing that out. They don't actually care about this.

What then happens is this: if you manage to get a secret chat enabled with the recipient, you're forcing yourself, and them, to whip out the phone, unlock it, open the app, reply, lock the phone, and put it back in the pocket, hundreds of times per day, just to cater to your need for privacy. And you're bothering everyone in your peer network the same way.

Compare this with cross-platform E2EE, where the people sitting on their computers during the day can just alt-tab themselves to reply to you.

So who are those people? Teens are exclusively on their phones, sure, but in the UK that's 81% of people of working age. It's also ~100% of college students sitting in lectures typing their coursework, and ~100% of IT workers.

Whatever novelty factor there is to having a secure chat exclusively on your iPhone, it wears off in days when it doesn't integrate seamlessly into your workflow. Telegram knows this. They know that what you'll eventually do is give up using secret chats.

So the only end-to-end encryption Telegram has is effectively not there. I'm inclined to say Telegram is not end-to-end encrypted: in practice it's sending ~100% of your messages to Telegram's servers in a form the servers can read.


But those servers use keys distributed across various jurisdictions! Even Telegram can’t access them!

I debunked this back in 2021. Tl;dr: there's no way to design a server-side database management system with encryption such that the CPU encrypting the data on the fly doesn't have access to the encryption key. The key is nothing but a value used in an algorithm that scrambles data in a unique way.
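
To make that concrete, here is a minimal Python sketch (hypothetical code, using the `cryptography` package) of server-side encryption at rest. However the keys are "distributed across jurisdictions", they must end up in the memory of the process that encrypts and decrypts the data:

```python
# Minimal sketch of server-side "encryption at rest" (illustrative names only).
# The structural point: to encrypt or decrypt a message, the key has to be
# loaded into the same process that also handles the plaintext.
from cryptography.fernet import Fernet  # symmetric authenticated encryption

# Wherever the key is fetched from, it ends up here, in the server's RAM.
key = Fernet.generate_key()
cipher = Fernet(key)

def store_message(plaintext: bytes) -> bytes:
    # At this moment the server holds both the plaintext and the key.
    return cipher.encrypt(plaintext)

def read_message(ciphertext: bytes) -> bytes:
    # Whoever controls this process (operator, intruder, court order)
    # can call this and recover the plaintext.
    return cipher.decrypt(ciphertext)

token = store_message(b"hello")
assert read_message(token) == b"hello"
```

Splitting key shares across data centers changes where the key is stored, not the fact that the serving process has to reassemble and use it.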


Prove Telegram is a Russian op then

Russian intelligence operates in a way where there's not even a paper trail for something like this. Whatever does have a paper trail is typed on mechanical typewriters and stored in the Lubyanka. And I don't work for the FSB. I can't prove shit.

What I can do is try to find negative proof of Telegram not being a Russian op.

  1. Is it 100% end-to-end encrypted? No.
  2. Is it open source, so the end-to-end encryption can be verified? It's not end-to-end encrypted, so open source doesn't matter.
  3. Does it have reproducible builds? It's not end-to-end encrypted, so verifying that the binaries match the open source doesn't matter.

Even if it’s not a Russian op, Telegram’s shit security makes it a lucrative target to EVERY intelligence establishment hacking team, including Russia. You have an ordinary server, running ordinary server-side OS, running ordinary database management system. All of those have zero days. All of those zero days can be exploited. Many of those exploits can be chained in a way that allows privilege escalation, which in turn allows establishing persistence with rootkits.

When you have persistent access to the server, the communications of 800 million Telegram users are open to you.

It is the job of Pavel Durov to know this.

It is the job of Pavel Durov to know only ubiquitous end-to-end encryption prevents this.


This is how I would design a backdoor. The secret chat is there to shut up anyone complaining that Telegram has no end-to-end encryption at all. But the secret chat is just the right amount of inconvenient that you'll eventually give up using it. And then you can only blame yourself for not using it, and once your messages reach the server in a form it can read, Russian intelligence hacks the server to get the data out. Data breaches happen every day; nobody will suspect foul play.


The rest of the security nightmare

2013

  1. Telegram hosted a cracking contest back in 2013. Everyone in the industry knows such contests are bullshit, and this was discussed back in 2013: The Fallacy of Cracking Contests (1998) | Hacker News. The tl;dr is that Moxie issued a counter-challenge to Telegram with the same goals but using already-broken primitives like MD5, and Telegram never proved the challenge could be won even under those conditions. (Also, again, given Telegram's built-in backdoor of "people are lazy", the cracking contest was pointless. It doesn't matter how good the encryption is if the adversary wears you down into handing over the keys.)

  2. http://unhandledexpression.com:8081/crypto/general/security/2013/12/17/telegram-stand-back-we-know-maths.html

2015

  1. https://eprint.iacr.org/2015/1177.pdf

  2. A 2^64 Attack On Telegram, And Why A Super Villain Doesn't Need It To Read Your Telegram Chats. - Discourse

2021

  1. The Most Backdoor-Looking Bug I’ve Ever Seen (Soatok linked this already)

2023

  1. https://mtpsym.github.io/ (and https://mtpsym.github.io/paper.pdf)

2024

  1. Is Telegram really an encrypted messaging app? – A Few Thoughts on Cryptographic Engineering (not an attack but good summary)

tl;dr Don’t use Telegram.

3 Likes

WhatsApp for sure. Telegram is a little scary and an easy target for malware attackers.

Telegram is open source; WhatsApp has E2EE by default. If you enable that for Telegram, it's better than WhatsApp, but otherwise it's a hard choice.

This is false. Exploitation of mobile devices is significantly more difficult than exploiting desktop OSes, not least because of mandatory sandboxing and strict permission models. See: Why phones are more secure than desktops | The Hated One (especially the sources in the description).

I would argue a group being public is far more important than the size of the group. You could have a public group with only 10 members, and you wouldn’t (or at least shouldn’t) expect to have any privacy. E2EE in public spaces is not as useful as you think.


Do you have a source? I’m not aware of WhatsApp or Telegram making any effort similar to Apple’s BlastDoor to protect against exploitation.

1 Like

No, the choice is simple: Telegram is worse than bad. Its encryption scheme is known to be worse than bad, and it's surely one of the worst messaging applications. WhatsApp at least has the merit of using encryption by default and using the Signal protocol.
Open source doesn't mean privacy-respecting either; you have to distinguish between the two.

3 Likes

I wrote my off-topic response on Android vs desktop to a pastebin to save space here. All in all, I admit my take that Android is less secure than desktops in general was too narrow and, in some respects, wrong.


To repeat the segment on Telegram: ignoring the benefits of hardened OSes (GrapheneOS vs. Qubes), there's very little the Telegram desktop client is missing. It stores received attachments in ~/Downloads/Telegram_Desktop, sure, but that's the app's choice. Signal doesn't just dump everything on disk: Signal Desktop isolates data by encrypting what sits on the disk, and the decryption key sits in the OS keyring. If you need to operate on attachments, you can export them to the desktop OS file system, the same way you do on Android. Telegram could do that too on desktop (and Android).
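
For illustration, a toy version of that pattern in Python (assuming the third-party `keyring` and `cryptography` packages; this is not Signal's actual implementation, just the shape of it): the data key lives in the OS keyring, and only ciphertext touches the disk.

```python
# Toy sketch: attachments encrypted at rest, data key kept in the OS keyring.
# Names are hypothetical; real clients use a proper database and key handling.
import keyring
from cryptography.fernet import Fernet

SERVICE, ACCOUNT = "my-messenger", "attachment-key"  # hypothetical identifiers

def get_data_key() -> bytes:
    stored = keyring.get_password(SERVICE, ACCOUNT)
    if stored is None:
        stored = Fernet.generate_key().decode()
        keyring.set_password(SERVICE, ACCOUNT, stored)  # key stays in the keyring
    return stored.encode()

def save_attachment(path: str, data: bytes) -> None:
    with open(path, "wb") as f:
        f.write(Fernet(get_data_key()).encrypt(data))  # only ciphertext on disk

def load_attachment(path: str) -> bytes:
    with open(path, "rb") as f:
        return Fernet(get_data_key()).decrypt(f.read())
```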

To steer platform security back to the topic: the problem is that Telegram is not giving you the option to use end-to-end encrypted messaging on desktop, where it could run in e.g. a Qube. They use dark patterns to wear you down, which eventually leads you to use client-server encryption even on the more secure phone. Even if you don't tire, your peers will.

If the user knows their threat model requires them to only use the most secure endpoint available, they will then only use the end-to-end encryption on the secure device, and they most certainly won’t use Telegram because of its appalling track record.

Also, enabling E2EE with your buddy leaks metadata to Telegram: it tells Telegram you're having conversations with that buddy that you explicitly don't want Telegram to know about. That kind of metadata is quite valuable in and of itself, so that's another reason why opt-in secret chats are a horrible idea, and why you don't want to use Telegram on any platform.

If Telegram offered something like an enabled-by-default X25519+Kyber key exchange with XChaCha20-Poly1305 and best-practice cryptography for everything, required key verification beforehand, and limited platform support to phones exclusively while strongly advocating for GrapheneOS, I'd understand why E2EE wasn't available at all for desktop. But that's obviously not what they're doing.
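
For reference, a rough sketch of what such a hybrid construction looks like. Python's `cryptography` package has neither Kyber nor XChaCha20, so below the post-quantum share is a placeholder bytestring and ChaCha20-Poly1305 stands in for XChaCha20-Poly1305; the point is the shape of the scheme, not a production design.

```python
# Hybrid key agreement sketch: classical X25519 share plus a (placeholder)
# post-quantum share fed into one KDF, so both must be broken to recover the key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_share = alice.exchange(bob.public_key())  # X25519 shared secret
pq_share = os.urandom(32)                           # stand-in for a Kyber shared secret

# Both shares feed a single KDF that derives the message key.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"hybrid-demo").derive(classical_share + pq_share)

aead = ChaCha20Poly1305(key)                        # XChaCha20 variant not in this library
nonce = os.urandom(12)
ciphertext = aead.encrypt(nonce, b"secret chat message", None)
assert aead.decrypt(nonce, ciphertext, None) == b"secret chat message"
```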

Practically all end-to-end encrypted messaging apps are fine with the desktop client having access to the end-to-end encrypted conversations, so surely desktop clients are, if not as secure as smartphones in every respect, secure enough.

I wasn't talking about public groups, but about private groups, which lose their expectation of privacy as they grow. A 10-member private group has a higher expectation of privacy than a 200-member private group. It's hard to pin down the exact number at which the confidentiality of what is being said starts to lose its meaning; it's probably more of a spectrum and a subjective feeling for each member. Each added member makes it slightly more likely that what's discussed there is no longer considered private.

A supergroup is usually a public group anyway; I don't think anyone would consider a Telegram supergroup to be private. My point was,

  1. Telegram forces even a very small, closed, normal group consisting of very close friends to share its group messages with Telegram. Those groups absolutely have an expectation of privacy.
  2. The computational overhead of E2EE in groups the size of normal groups isn't so high that E2EE can't be implemented there. Other messengers have already proven that.

Finally, end-to-end encryption in a public group does have the nice property that Telegram isn't passively collecting everything the group says. The surveillance has to start somewhere, and if the protocol is well designed enough to have re-keying, active surveillance can also stop at some point. It's not nearly as robust a security guarantee as a closed group gets, but it's not meaningless. Of course, it usually destroys the public message history, which many public communities benefit from.

Telegram already has the notion of private and public groups. The app could make groups private and end-to-end encrypted by default, and give the group admin an option to make the group public, which would then stop end-to-end encrypting messages for that group. New group members could then read the backlog from that point onwards.

4 Likes

Spot on. Open source doesn't always equate to higher security.

3 Likes

Only the TG client is open source, I think?

And the creepy part about TG is not the client but the server.

Yes.

The server is an application running on a set of computers you don't own. You have zero control over what happens there. You cannot verify the binary of the code that's running on the server, at least not without proprietary Intel SGX remote attestation, which Telegram is not using.

Content privacy

This is done either via privacy by policy (the server doesn’t log anything) or privacy by design (end-to-end encryption).

Wrt content, users rarely want to take the risk of the server secretly logging every message and picture they send. Thus, protecting content is the job of end-to-end encryption, which ensures the server cannot eavesdrop on the communication. When proper end-to-end encryption is in place, it doesn't matter if the server is not trustworthy.
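
A toy example of the difference (using PyNaCl; purely illustrative, not any real app's protocol): with end-to-end encryption the relay only ever handles opaque ciphertext, so its logging policy stops mattering.

```python
# Toy end-to-end encryption: Alice encrypts to Bob's public key on her device,
# the server only stores/forwards ciphertext, and only Bob's private key decrypts.
from nacl.public import PrivateKey, SealedBox

bob_private = PrivateKey.generate()
bob_public = bob_private.public_key

# On Alice's device, before anything is sent to the server:
ciphertext = SealedBox(bob_public).encrypt(b"meet at noon")

# The server sees only `ciphertext`: opaque bytes, no matter what it logs.

# On Bob's device:
assert SealedBox(bob_private).decrypt(ciphertext) == b"meet at noon"
```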

The server can mount MITM attacks against end-to-end encryption. This is also addressed by the client, with public key fingerprints (known as safety numbers in Signal) that you can verify with your contacts to confirm the server did not replace your contacts' public keys with its own during the key exchange.
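
A rough sketch of the idea behind such fingerprints (this is not Signal's actual safety-number derivation): both clients hash the two public keys in a canonical order and show the result, so the users can compare it out of band; a server that swapped in its own key changes the number.

```python
# Illustrative fingerprint: a canonical hash over both parties' public keys.
import hashlib

def fingerprint(pubkey_a: bytes, pubkey_b: bytes) -> str:
    material = b"".join(sorted([pubkey_a, pubkey_b]))  # same result on both ends
    digest = hashlib.sha256(material).hexdigest()
    return " ".join(digest[i:i + 5] for i in range(0, 60, 5))  # readable blocks

# Alice and Bob each compute this locally and compare over a trusted channel.
print(fingerprint(b"alice-public-key-bytes", b"bob-public-key-bytes"))
```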

When these two are in place, the content is safe.

Metadata privacy

As for metadata, you have again privacy by policy, and privacy by design.

WhatsApp uses the Signal protocol to end-to-end encrypt messages, but it has no privacy by policy: Meta exploits your contact network to extract as much value from your chats as possible. Telegram also collects this metadata and can do the same. Signal is slightly different: it has access to the metadata, but it uses privacy by policy, in that it chooses not to collect anything, and court documents show that is indeed the case. Cwtch and Briar use privacy by design for metadata: there is no centralized server to aggregate metadata; instead, all communications are routed peer-to-peer over Tor circuits.


The role of open source in all of this

When the client is open source, you can verify that the end-to-end encryption works as intended: keys are generated with a cryptographic RNG; the ciphers, hash functions, etc. are implemented correctly and without side channels; and the public key fingerprints are correctly calculated from the correct data.
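
As a trivial example of what that review looks like, consider checking where key material comes from (illustrative Python; `random.randbytes` needs Python 3.9+):

```python
# One of the simplest things source review catches: keys drawn from the OS
# CSPRNG versus keys drawn from a predictable, non-cryptographic generator.
import random
import secrets

good_key = secrets.token_bytes(32)  # OS CSPRNG: what you want to see in a client
bad_key = random.randbytes(32)      # Mersenne Twister: predictable, a red flag

# With an open-source client an auditor can search for patterns like the latter
# feeding key generation; with a closed-source client this cannot be checked.
print(len(good_key), len(bad_key))
```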

Open source also allows the user to verify Tor is being used correctly. Any and all backdoors are visible without painful reverse engineering.

So with open source you can verify that the end-to-end encryption in the client you're using, and the anonymity network you're relying on for ciphertext routing, are doing everything they can to protect you from a malicious server.

1 Like