I have a lot of thoughts on this (which should probably culminate in a blog post sometime), but I’ll try to be brief.
Software attestations (which typically include a digital signature from the developer or from a commercial redistributor of open source software) are important for any software supply chain solution. (I don’t believe that random open source devs should bear the responsibility of being treated as “critical infrastructure”, of course. That’s why commercial redistributors are mentioned.)
One useful type of attestation answers the question, “Is this binary reproducible from the source code?”, which is much easier to accomplish with open source software. See also: reproducible builds.
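The core check behind a reproducibility attestation is simple: two independent builds of the same commit should be bit-for-bit identical. A minimal sketch (the artifact bytes here are hypothetical stand-ins for real build outputs):

```python
import hashlib

def artifact_digest(data: bytes) -> str:
    """SHA-256 hex digest of a built artifact's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-ins: in practice these would be the bytes of two
# binaries built independently from the same source commit.
build_a = b"deterministic build output"
build_b = b"deterministic build output"

# "Reproducible" means independent rebuilds are bit-for-bit identical,
# so their digests match and anyone can re-derive the binary from source.
print(artifact_digest(build_a) == artifact_digest(build_b))  # True
```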
Another type of attestation is the witness co-signature (at least in Sigsum parlance). You can have witnesses that merely audit the transparency log that artifact hashes and signatures are committed to, or you can have attestations that look like:
> I, ${security_vendor}, have reviewed the software at ${git_commit_hash} on ${date} and did not identify any obvious malware or crypto-miners.
Or more ambitiously:
> I, ${security_vendor}, have audited the software at ${git_commit_hash} on ${date}; the report is available at ${url}.
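For such a claim to be signed and committed to a log, it has to be reduced to a deterministic byte string first. A sketch of that step, with hypothetical field values (a real witness co-signature, e.g. in Sigsum, is an Ed25519 signature over a log checkpoint rather than over raw JSON):

```python
import hashlib
import json

# Hypothetical attestation; field names mirror the template above.
attestation = {
    "vendor": "ExampleSec Ltd.",   # ${security_vendor}
    "commit": "a1b2c3d4",          # ${git_commit_hash}, hypothetical
    "date": "2025-01-01",          # ${date}
    "claim": "reviewed; no obvious malware or crypto-miners found",
}

# Serialize deterministically (sorted keys, no whitespace) so every
# verifier hashes exactly the same bytes.
payload = json.dumps(attestation, sort_keys=True, separators=(",", ":")).encode()
digest = hashlib.sha256(payload).hexdigest()

# `digest` is what the vendor would sign and submit to the transparency log.
print(digest)
```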
And that obviously gives you a higher level of assurance than proprietary software, in a way that’s provable. Additionally, such infrastructure provides a tangible mechanism for Linus’s Law (usually paraphrased as, “given enough eyeballs, all bugs are shallow”).
Instead of putting faith into Linus’s Law with open source software, such a software supply chain actually provides visibility into whether software is even being spot-checked or not. And if a security vendor is lying about their spot checks, they’ve already staked their reputation on it by publishing it on the same append-only transparency log that other attestations are distributed through.
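The reason a published attestation stakes the vendor’s reputation is that the log is append-only: retroactively altering or dropping an entry changes everything downstream, which auditors can detect. A toy sketch of that property (real transparency logs use Merkle trees, not a flat hash chain):

```python
import hashlib

def chain_head(entries: list[bytes]) -> bytes:
    """Fold entries into a running hash; a toy stand-in for a Merkle root."""
    head = b"\x00" * 32
    for entry in entries:
        head = hashlib.sha256(head + entry).digest()
    return head

log = [b"attestation from vendor A", b"attestation from vendor B"]
head = chain_head(log)

# Altering any earlier entry changes every later head, so auditors who
# compare the heads they have seen will notice the tampering.
tampered = [b"attestation from vendor A (retracted)", b"attestation from vendor B"]
print(chain_head(tampered) != head)  # True
```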
If we had such a system in place, using closed-source software would be an objectively stupid idea.
We don’t live in that world today. We might someday soon. (After Python’s ecosystem has succeeded at adopting PEP 740, focus can expand towards other programming languages and package managers.)
Privacy tools are high-value targets for governments and corporations that want to spy on people. If a tool isn’t open source, we can never get it into that world.
Thus, in the long term, I believe that prioritization should increasingly favor transparency from developers. Whether we’re already past the tipping point, I can’t say for sure. But I hope this provides a bit of insight from the perspective of a cryptography nerd focused on real-world problems.
(Originally posted in the wrong thread.)