I spent the 3-day weekend reviewing the cryptography used by Signal. You can find my write-up here:
TL;DR - no vulnerabilities were found, but I think being able to follow along and understand what was looked at, how it was approached, and why each target was selected is more valuable than an empty list of security vulnerabilities.
In my previous foray into encrypted messaging apps (indexed here), I typically took issue with the other apps’ use of cryptography in much less time than I dedicated to this review.
That said, happy to answer any questions anyone in the Privacy Guides community has about this write-up, about Signal, or about my approach to cryptography audits.
Thank you for your very detailed review of Signal. I thoroughly enjoyed your criticism of Matrix; it was insightful and taught me a lot.
I would love to know what you think of Signal’s MobileCoin integration, though.
I know it’s been very controversial (since hardly anyone uses it and it’s arguably unnecessary bloat), but does its inclusion in the app package alone have any demonstrable negative impact on Signal’s security?
There have been claims that MobileCoin’s inclusion in Signal has increased its attack surface. Since Signal is the gold standard of secure messaging apps, having this stain on its quality seems worrying.
Generally, more code, more problems. Every feature adds additional risk of a vulnerability being introduced.
However, I didn’t look at MobileCoin at all. I’m not personally very interested in cryptocurrency (except perhaps as a way for sex workers to thrive in spite of the anti-porn and anti-sex lobby in the US).
This is one of the things I love about open source… most people don’t have the skills to audit any kind of code, but the small few who do can look at the code and audit it because it’s open for anyone to see.
I’ve only read the first 3 parts so I don’t have many immediate questions yet, but thank you for explaining how audits work so thoroughly, because I feel like they are often misunderstood, for all of the reasons you described in your post.
Unfortunately, sometimes you will see encrypted messaging apps proudly proclaim, “We were audited” when facing criticism, except:
Their last audit was 5+ years (and/or over 1000 commits) ago.
They only have the one public audit report.
The company and/or person that did the audit has no other online footprint, including other audits, and only seemed to pop up to opine about this one vendor.
The timebox for the audit is tiny compared to the quantity and complexity of the software in question.
It isn’t important that the company providing the audit be one of the more recognizable names (e.g., for cryptography: Cure53, Kudelski Security, Least Authority, NCC Group, Trail of Bits, and myriad blockchain / smart contract security firms that sometimes demonstrate real cryptography chops).
The next month or so of my life is quite busy, so I can’t make any promises about when I might be able to see if Molly’s changes improve or reduce security at all, but I’ll take that into consideration.
If a vendor doesn’t have any audits from the past 2 years, but their software has changed a lot since then, don’t fucking trust it, especially if their audit report is from a no-name security vendor and it claims the vendor’s product is perfect.
I find it ironic that you said that, then recommended Signal, which doesn’t publish all of their security audit reports, and their last public formal audit by some known name company was from more than a decade ago. Since then, Signal has changed A LOT.
their last public formal audit by some known name company was from more than a decade ago
Neither Jean-Philippe Aumasson (2016) nor Cas Cremers (2023) are no-names. Just because you’re ignorant of cryptographic work doesn’t mean their work doesn’t matter.
If you want an example of a no-name auditor, see my recent fediverse thread warning about xPal, which includes an “audit” from an unqualified source as evidence of it being “audited”.