Basically the title. I have been trying to find as many FOSS and open-source apps with good privacy policies as I can, but some essential pieces of software for me (mainly iOS, which I use every day, plus Steam and pretty much all videogames) are closed source with little to no alternatives. How could I possibly trust closed source software? How do you go about it?
Free software can spy. Non-free software can spy. The openness of the code does not, by itself, prevent software from spying.
How could I possibly trust closed source software?
You can’t. There is one exception: you wrote this closed code yourself and therefore trust it.
I think there is a kind of legal exception as well: if you have a big reputation, strong legal standing, and big bucks, your closed-source code might be worthwhile to people who don’t like this clause (taken from the GPLv3):
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.
I tried to get Claude 3 Opus to write a story about open vs closed banking apps and I think this kinda makes sense (see below)
Imagine a major global bank, BigBank, that offers a popular mobile banking app. While the app’s code is closed-source, BigBank has a strong privacy policy, touts its commitment to security in its marketing, and is subject to extensive financial regulations.
One day, a security researcher discovers a vulnerability in BigBank’s app that could allow attackers to access user account data. Before disclosing it publicly, she reaches out to BigBank.
Recognizing the severity of the issue, the potential regulatory penalties, and the damage to their reputation, BigBank quickly patches the vulnerability and issues an update. They also announce a compensation program for any users who can demonstrate their account was improperly accessed due to the vulnerability.
While no breaches are confirmed, a few users come forward with evidence suggesting their data may have been compromised. BigBank investigates and ends up paying out tens of thousands of dollars to these users based on their compensation policy. They are able to do this thanks to their substantial financial resources as an established bank.
In contrast, consider a small open-source fintech project developing an alternative mobile banking app. While their code is transparent, they have limited resources and operate in a regulatory gray area. If a vulnerability is found, they may struggle to patch it quickly. And as a volunteer-driven project, they likely wouldn’t have funds set aside to compensate impacted users.
So in this case, the closed-source app, backed by a regulated, well-resourced financial institution highly motivated to protect its reputation, is actually able to provide stronger privacy and security guarantees to its users. The open source fintech alternative, while well-intentioned, simply doesn’t have the means.
I look at settlements like this and think maybe this makes sense?
I think that if ALL ELSE IS EQUAL and you have MEGACORP A vs MEGACORP B offering the same software with the same guarantees from the same country, then yes, Open should win every time. But when it’s a megacorp vs. a gang of kid programmers, the megacorp can still make more sense sometimes.
So… you can trust it if you are sure you’re going to get paid if shit goes sideways… maybe?! idk man.
Here’s another one, a little esoteric.
I’m not sure if this is still the case, but Signal had (in 2021 at least) closed-source backend code for filtering out spam.
From Signal >> Blog >> Improving first impressions on Signal (emphasis mine):
As our spam-fighting capabilities expand, so does the complexity and size of our spam-specific software. To prevent spam on Signal, we need to build this spam-battling logic in a separate server component. The interfaces to this code will be public, but the implementation will not be shared. We’re working hard to keep the amount of non-public code as small as we possibly can, and all other parts of the server software and all of its interactions with spam-fighting code remain available in the open. Keeping this small piece of software private helps us stay one step ahead of spammers and doesn’t change the fundamentals of Signal’s security model.
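For what it’s worth, here’s a tiny sketch of what that “public interfaces, private implementation” split can look like in code. To be clear, this is something I made up to illustrate the idea; the class and method names are hypothetical and have nothing to do with Signal’s actual server code.

```python
# Made-up illustration of "public interface, private implementation".
# None of this is Signal's real code.
from abc import ABC, abstractmethod


class SpamChecker(ABC):
    """Public interface: everyone can see what the open server asks of the spam component."""

    @abstractmethod
    def is_spam(self, sender_id: str, recipient_id: str, envelope_size: int) -> bool:
        """Return True if the message should be treated as spam."""


class OpenServer:
    """The open-source part of the server only ever talks to the interface above."""

    def __init__(self, spam_checker: SpamChecker):
        self.spam_checker = spam_checker

    def handle_message(self, sender_id: str, recipient_id: str, envelope_size: int) -> str:
        if self.spam_checker.is_spam(sender_id, recipient_id, envelope_size):
            return "dropped"
        return "delivered"


class PrivateSpamChecker(SpamChecker):
    """The closed piece lives elsewhere and implements the interface; its heuristics stay secret."""

    def is_spam(self, sender_id, recipient_id, envelope_size) -> bool:
        return False  # stand-in for the secret heuristics


if __name__ == "__main__":
    server = OpenServer(PrivateSpamChecker())
    print(server.handle_message("alice", "bob", 512))  # -> delivered
```

The point is that you can audit everything the open part does with the spam component; you just can’t see how the spam decision itself is made.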
Do you think a Signal.org with a fully open backend would be more private or more trustworthy? I’m not sure.
Also consider Molly, which has a fully FOSS frontend for Signal, do you trust it more than the official app? I’m not sure.
This is a fun question. I hope we get some strong opinions.
The only reason for software to be proprietary is moneyyyyyyyyyy. So if you see a proprietary app, know that money is its priority, not you as a user or customer.
Do I trust software that prioritizes money over software freedom and the customer? No.
Apple is a publicly traded corporation, and the only goal of all of these corporations is to increase shareholder value.
Apple doesn’t care about your privacy; they just market themselves as privacy-respecting to try to stand out from the others and sell more of their products and services.
Apple also doesn’t care about you as a consumer. They don’t care about your right to repair, they don’t care about your right to do whatever you want with the hardware you purchased, etc.
As a user of an iPhone, do you feel like you own the hardware that you have in your hands?
You can’t properly sideload like you can on Android, regardless of whether you’re in the EU or outside of it. When the corporation tries this hard to restrict your freedom and ownership even while the EU is holding them by the balls, it’s just crazy and shows their true face.
You can’t install a different operating system like you can on basically any Android device; the best example of doing this right is Google Pixel devices.
You can’t modify or root your OS; jailbreaking literally means exploiting your own device and voiding its warranty just so you can gain a tiny bit more control over it.
Your phone is pretty much useless without an Apple ID, and you need to give Apple a lot of information to create one, including your phone number. The moment you create an Apple ID, it will be tied to your unique hardware identifiers forever.
One thing that I quickly need to throw in is that you can’t update the firmware of your AirPods if you have an Android device. This is one of the biggest “fuck you, you didn’t give us enough money” moments ever.
Would I trust such a corporation? No.
Some thoughts about games:
I think that games being proprietary is fine. They take an absolutely massive amount of money, time, and other resources to develop, and the only way to recoup that or profit from them is if people buy them.
There are some (smaller) freeware programs, mostly for Windows, that have good privacy policies. The reason the code is/was not open is usually the developers’ decision to prevent their work from being repackaged and sold by someone else (e.g. Paint.NET, IrfanView).
There is plenty of Windows software that I really like: Sumatra PDF, JDownloader 2, etc.
Really this thread doesn’t make sense.
I think I know what you’re asking.
One big benefit that sticks out to me is that closed-source apps have a clear price. When you pay the app author, the author has more reason to want to protect your data…
Apple is a decent model for this. You’re paying for both closed-source hardware and closed-source software, and Apple is well known for protecting its users’ privacy.
I’m not a programmer and cannot read code.
So while I prefer open-source software, if a piece of closed-source software has been audited by a reliable third-party organization, then it’s functionally the same to me, since I can’t read the code anyway. This is why I feel fine using Tresorit.
Additionally, there are ways to verify that software, whether closed or open source, is or isn’t doing what you want. I use a closed-source email client that has a good enough privacy policy, and I reviewed all of its inbound and outbound connections for a few months to verify it wasn’t connecting anywhere unexpected. It wasn’t. It connects to my mail and contacts servers and that’s it.
(I hope to fully switch to Thunderbird if they ever improve how they handle email chains and conversations. Even after the recent updates it’s a mess).
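If you want to script that kind of connection check instead of eyeballing a firewall log, here’s a rough sketch of the idea in Python using the psutil library. The process name and the allowed hosts below are placeholders; swap in whatever client and mail/contacts servers you actually use, and run it periodically while the client is active.

```python
# Rough sketch: list the remote endpoints a given app is connected to and
# flag anything outside an allowlist. Names and hosts are placeholders.
import socket

import psutil  # third-party: pip install psutil

APP_NAME = "MyMailClient"        # hypothetical process name of the email client
ALLOWED_HOSTS = {                # hypothetical mail/contacts servers
    "imap.example.com",
    "smtp.example.com",
    "contacts.example.com",
}

# Resolve the allowlist to IP addresses so we can compare against live sockets.
allowed_ips = set()
for host in ALLOWED_HOSTS:
    try:
        for info in socket.getaddrinfo(host, None):
            allowed_ips.add(info[4][0])
    except socket.gaierror:
        pass  # host didn't resolve; it simply won't match anything

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] != APP_NAME:
        continue
    for conn in proc.connections(kind="inet"):
        if not conn.raddr:       # skip listening / unconnected sockets
            continue
        status = "ok" if conn.raddr.ip in allowed_ips else "UNEXPECTED"
        print(f"{APP_NAME} -> {conn.raddr.ip}:{conn.raddr.port} [{status}]")
```

It’s not as thorough as a proper packet capture, but it’s enough to catch an app phoning home somewhere it shouldn’t.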
There are questions of necessity. I have to run a closed-source operating system because the software and hardware I rely upon don’t run on Linux. So I do my best to improve what I can and otherwise am forced to accept the trade off.
Then there’s the threat model angle. I don’t consider email a private medium anyways. I use it for appointments, shopping, business, life logistics, etc. Personal communication goes into Signal. Same with the conversation that happens sometimes around music streaming. I don’t consider my music tastes to be highly sensitive personal information. Guarding it at all costs is not part of my threat model. My priorities lie elsewhere (messages, calendar, photos, location, etc.).
Exclusively using private, open-source software 100% of the time isn’t viable for most people, so it’s okay to focus our efforts on what’s most important for our threat model.
Unfortunately most of Apple’s pro-privacy marketing is B.S. I suppose relative to other big tech companies, Apple is the least invasive? But the bar is in hell. This sort of abuse and privacy invasion often goes hand in hand with proprietary software.
In practice there can be caveats where a proprietary product has some benefits over a free/open product, but that is largely down to economic reasons, the details of which would be off-topic for this forum. As a general rule of thumb, though, free and open beats proprietary on principle if you care about your privacy or security.
Free and open software still benefits non-programmers, as it gives programmers among the general public the ability to view and modify the code. There will likely always be programmers whose interests align with yours, and oftentimes community-focused projects want to take feedback from their non-programmer users to make a better product.
Right - open source is always preferable to closed. But it isn’t the only consideration and there isn’t always a viable open source option available.
Ah I might’ve misinterpreted your “functionally same” comment to mean that audited proprietary software is equal to FOSS.
Nah, it’s not as good, but for us luddites who can’t read code anyway it can still be “good enough”.
I understand that, but to OP’s point, I personally do not understand why privacy/security companies don’t make their software open source. Their reasons are not convincing to me.
I am thinking specifically here of 1Password.
I’ve asked them many times if they ever plan to go open source, and they said no. They said that they get audited regularly and therefore it’s not necessary.
With Proton Pass moving on their territory with actual innovations, I am curious as to how 1Password will step its game up. Going open source will not be enough, but it would certainly make a positive difference.
Probably because they don’t want competitors to use their code/take inspiration from it.
I used to think that too, but I’ve never seen people do that with popular FOSS apps to the point where the fork is more popular than the original. Although I could be wrong about this, I would even argue that most FOSS apps don’t even get forked. I’ve never seen a fork of VLC, for example.
Also, if something is open source, that doesn’t mean it’s not copyrighted, right?