Yes, I’m well aware of that. I’m a long-time privacy activist, cypherpunk, and the author of TFC. I’ve also made small contributions to OnionShare, Tor, Cwtch, Tails, and (possibly) Signal, and I’ve called out a ton of snake oil products.
It is not rage bait. Allow me to elaborate.
Apps that are E2EE by default cannot reliably scan, and should not scan, their users’ messages. Like I said yesterday in this comment, you cannot reliably perform client-side scans in FOSS programs: anyone can disable the scanning. Trying to do so is attempting a technical solution to a societal problem, and thus sweeping the actual problem under the rug.
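To make this concrete, here is a minimal sketch (all names hypothetical, not taken from any real client) of why a client-side scan in an open-source messenger is unenforceable:

```python
import hashlib

# Hypothetical blocklist of known-bad content hashes (illustrative only).
BLOCKED_HASHES = {hashlib.sha256(b"known-bad-sample").hexdigest()}

def passes_scan(payload: bytes) -> bool:
    """Client-side scan: reject payloads whose hash is on the blocklist."""
    return hashlib.sha256(payload).hexdigest() not in BLOCKED_HASHES

def send_message(payload: bytes) -> bool:
    # Anyone can fork the client and replace passes_scan with `lambda _: True`.
    # Under E2EE the server only ever sees ciphertext, so it has no way to
    # verify the scan actually ran on the plaintext.
    if not passes_scan(payload):
        return False
    # ... encrypt and transmit ...
    return True
```

The scan lives entirely in code the user controls; deleting the `if` statement is all it takes to bypass it, and the server cannot tell the difference.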
E2EE programs cannot have backdoors that only the good guys can use. Case in point: this article from two days ago, where a government backdoor was compromised by the Chinese.
Apps that are free public image/file hosting platforms in the form factor of a messaging app are different. You probably wouldn’t object to Facebook, Instagram, Imgur, Discord etc. scanning for CSAM in public groups. The same goes for Telegram: it’s a social media platform that looks like a messaging app.
This wouldn’t have been a problem if they had not become a social media platform, and if all chats had instead had E2EE enabled. But Telegram chose to effectively backdoor, by default, their group chats, channels, and all 1:1 chats on desktop and across platforms, and in my opinion they thus became liable to monitor those segments of their platform.
Yet Telegram facilitated CSAM content for a decade without doing anything meaningful to remove it, even though it could have.
Is it though? Telegram claimed they couldn’t comply with any LEA requests because the keys were “distributed across various jurisdictions”, a stupid argument with no technical foundation, which I debunked back in 2021.
I’ve seen a ton of grassroots marketers across social media, mostly on Reddit, tout this key-distribution claim, and I don’t remember anyone contesting the SO answer in which I debunked it.
Telegram was always capable of scanning the non-E2EE messages. Instead, they chose to lie about their capabilities to make their business easy, to create a false sense of security, and thus to aggregate a ton of user data.
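For illustration: splitting a key across servers in different jurisdictions is trivial, but the moment the service needs to decrypt a cloud chat (to render it, search it, or sync it to a new device), the full key has to be reassembled in one process anyway. A minimal sketch using XOR secret sharing (my own illustrative names; this is not Telegram’s published scheme, because they never published one):

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int = 3) -> list[bytes]:
    """Split key into n shares; any subset of n-1 shares alone reveals nothing."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for share in shares:
        last = xor_bytes(last, share)
    return shares + [last]

def reconstruct(shares: list[bytes]) -> bytes:
    key = shares[0]
    for share in shares[1:]:
        key = xor_bytes(key, share)
    return key

# Every time a server decrypts a non-E2EE "cloud chat", it runs the
# equivalent of reconstruct(): the complete key exists in that server's
# memory, and the jurisdictions of the individual shares are irrelevant.
```

The splitting is real cryptography; the security claim built on it is not, because the service that holds the shares is also the service that recombines them on every request.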
And they were willing to lie to the extent they allowed CSAM to fester on the platform.
They created an illusion of security, and offered secret chats to appear to care about privacy by design. Yet their secret chats are so narrowly available, and so inconvenient to use, I’d argue they exist only to shut down debate about Telegram having no meaningful, by-default E2EE. And this is ignoring all the insane teething issues of Telegram’s homebrew protocol that their PhD-geometrician nepo-“cryptographer” Nikolai yoloed in.
Thus, with nothing to show for proper privacy by design, I’m having a really, really hard time finding anything tangible to show Telegram is not a Russian op spying on Russian dissidents and the West. I’ve pointed this out multiple times in the past, and I’ve been told more times than I can count that “Pavel [the Telegram CEO] was ousted from Russia, and he’s fighting against Putin and living in exile.” And then we learn he has been to Russia more than 50 times since 2014, and yet his tea has been amazingly free of polonium for an oligarch who seemingly betrayed Russian interests.
Even if Telegram wasn’t a Russian op, they sure as hell have made a lot of data conveniently accessible to nation states’ hacking teams, who know enough to have never believed for a second that all the juicy non-E2EE data wasn’t available to the first chain of zero-days that managed to inject a rootkit into Telegram’s servers.
The huge implications of the lie about key splitting raise the question: why didn’t Pavel close the backdoor and, at the same time, rid everyone of the liability for hosting CSAM? To which I would argue: either he didn’t give a damn, or he intentionally allowed it.