I work on web apps a lot. But the writing in the first link in OP is a little hard to follow.
So using SSL with a valid cert eliminates some concerns? Who is the “attacker” in these scenarios? Browser vendors?
I guess I would need to see a full example of how an attacker would succeed at this when the app is used in an open-source browser over https.
Privacy Guides seems to agree with this: Common Threats - Privacy Guides
No. Consider Proton Mail, for example. They use SRP on the login form (in one-password mode) so that when you log in your password is not sent to their servers.
Proton claims that this is to prevent someone who has compromised their servers from attacking you; however, as the OP points out, that is not true:
If you don’t trust the network to deliver a password, or, worse, don’t trust the server not to keep user secrets, you can’t trust them to deliver security code. The same attacker who was sniffing passwords or reading diaries before you introduce crypto is simply hijacking crypto code after you do.
If an attacker compromises Proton’s servers, all they would have to do is swap out the code that implements SRP for code that grabs your password when you log in. You’d have virtually no way of knowing whether this happened. This is why E2EE in websites and Progressive Web Apps is not adequate to protect you from service providers.
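To make that concrete, here is a hedged sketch of the swap. The endpoint, function names, and the derivation itself are made up for illustration and are not Proton’s actual code; the point is only that the legitimate and compromised versions are indistinguishable to the user, differing by a single field in the request.

```ts
// Stand-in for the client-side SRP derivation; hypothetical and greatly
// simplified -- the attacker never needs to touch or break this function.
async function deriveProof(username: string, password: string): Promise<string> {
  const data = new TextEncoder().encode(`${username}:${password}`);
  const digest = await crypto.subtle.digest("SHA-256", data);
  return btoa(String.fromCharCode(...Array.from(new Uint8Array(digest))));
}

// What the page is supposed to ship: the password stays in the browser and
// only the derived proof is sent to the (hypothetical) login endpoint.
async function legitimateLogin(username: string, password: string): Promise<void> {
  const proof = await deriveProof(username, password);
  await fetch("/api/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, proof }),
  });
}

// What a compromised server can serve instead. Same form, same URL, same
// padlock; the only difference is one extra field in the request body.
async function compromisedLogin(username: string, password: string): Promise<void> {
  const proof = await deriveProof(username, password);
  await fetch("/api/login", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ username, proof, password }), // raw password exfiltrated
  });
}
```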
If server compromise isn’t a concern to you (i.e. you don’t deem it to be a realistic threat), then it’s possible to deploy secure in-browser encryption, although, as the post says, doing so is very difficult because JavaScript does not really have native security tooling.
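If in-browser encryption does fit your threat model, the standard Web Crypto API at least provides vetted primitives. A minimal sketch with AES-GCM is below; it only shows the primitive, and key management plus the code-delivery problem above remain the hard parts it does not address.

```ts
// Minimal in-browser encryption sketch using the standard Web Crypto API.
async function encryptNote(plaintext: string): Promise<{
  key: CryptoKey;
  iv: Uint8Array;
  ciphertext: ArrayBuffer;
}> {
  const key = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    true,
    ["encrypt", "decrypt"],
  );
  const iv = crypto.getRandomValues(new Uint8Array(12)); // 96-bit nonce for GCM
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext),
  );
  return { key, iv, ciphertext };
}

async function decryptNote(
  key: CryptoKey,
  iv: Uint8Array,
  ciphertext: ArrayBuffer,
): Promise<string> {
  const plaintext = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
  return new TextDecoder().decode(plaintext);
}
```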
So, there are a few separate issues with web apps.
- If there is no TLS/SSL involved or the TLS/SSL is vulnerable to MITM attack (by TLS proxy, rogue CA or otherwise), software transmitted from server to browser can be swapped with a malicious version.
- Assuming a secure HTTPS tunnel, a browser receives whatever software the server chooses to deliver, without any software integrity or version checking possible by the browser or its user (see the sketch after this list for what such a check could look like).
- Security of any software implemented in JavaScript is questionable, and this issue is critical for cryptographic code. JavaScript runtimes may be malleable or expose information to side-channel attacks. (Can WebAssembly mitigate these issues?)
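On the second point, here is a hedged sketch of the kind of pinned check a browser could perform on a page’s own code if such a mechanism existed. The detached `.sig` endpoint and the pinned key are assumptions for illustration only; no standard browser feature does this.

```ts
// Hypothetical: verify a fetched JS bundle against a public key pinned
// out-of-band before running it. Browsers do not do this natively for the
// page's own code, which is exactly the gap in the second bullet above.
async function fetchAndVerifyBundle(
  url: string,
  pinnedSpkiBase64: string, // ECDSA P-256 public key, obtained out-of-band
): Promise<string> {
  const [bundle, signature] = await Promise.all([
    fetch(url).then((r) => r.arrayBuffer()),
    fetch(`${url}.sig`).then((r) => r.arrayBuffer()), // assumed detached signature
  ]);

  const spki = Uint8Array.from(atob(pinnedSpkiBase64), (c) => c.charCodeAt(0));
  const key = await crypto.subtle.importKey(
    "spki",
    spki,
    { name: "ECDSA", namedCurve: "P-256" },
    false,
    ["verify"],
  );

  const ok = await crypto.subtle.verify(
    { name: "ECDSA", hash: "SHA-256" },
    key,
    signature,
    bundle,
  );
  if (!ok) throw new Error("Bundle does not match the pinned key");
  return new TextDecoder().decode(bundle);
}
```

Of course, in a plain web app this verification code would itself be delivered by the same server, which is why such checks have to live somewhere else, e.g. in a browser extension like the one mentioned below.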
This was also mentioned years ago by Signal and Nextcloud (also referring to Signal’s decision) when they were explaining why they would not release a web version of their (encrypted) services (though Nextcloud’s E2EE is still not usable).
WhatsApp does have an extension that can verify WhatsApp’s web app code. It’s also open source. And here is Engineering at Meta’s blog about it. I still wouldn’t use a web app, but it’s cool that this is available.
CSP + SRI (Subresource Integrity), both widely supported standards, are a good start, though they are not as comprehensive as “app signing” unless other web services do something similar to WhatsApp’s Code Verify (mirror).
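As a rough illustration of those two mechanisms (filenames, CDN host, and header values are made up for the example), SRI pins a specific script hash in the HTML, and CSP restricts where scripts may be loaded from at all:

```ts
// Node sketch: compute the SRI digest for a built script so the HTML can pin it.
import { createHash } from "node:crypto";
import { readFileSync } from "node:fs";

const file = readFileSync("dist/app.js");
const integrity = "sha384-" + createHash("sha384").update(file).digest("base64");
console.log(integrity);

// The page then references the script with that digest, e.g.:
//   <script src="https://cdn.example.com/app.js"
//           integrity="sha384-..." crossorigin="anonymous"></script>
//
// and a CSP header can restrict script sources, e.g.:
//   Content-Security-Policy: script-src 'self' https://cdn.example.com
//
// The limitation the OP is about still applies: the HTML carrying the
// integrity attribute and the CSP header are themselves served by the
// first-party server, so these help against compromised third-party hosts,
// not against the first-party server itself.
```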