I know that a bunch of us use PrivacyTests.org to help decide which browsers to adopt.
That said, do you feel that the picture it paints is fair?
For example, Brave, Mullvad, and LibreWolf come out of the box with protections that make them score better on some of those tests than the others, but normally people would add uBO and/or Arkenfox to some of the other options (Firefox, Ungoogled Chromium, Vivaldi, etc.), which would bring them closer to the leading options.
Is there perhaps a comparison table that takes into consideration some obvious manual “hardening” that people would apply to the options that aren’t as strong out of the box?
It would be difficult for any checklist to provide nuance. However, this is the first resource of its kind I have seen which uses standardised tests to check for privacy and security features. I think it is far more useful than the promotional copy you will see on Edge, Brave, and Firefox’s websites, which tends to be more obviously biased.
Out-of-the-box comparisons are definitely a good choice for the project. The less niche the browser is, the more reflective the default experience is of everyday browsing for most people.
There are indirect resources on browser privacy, but high-quality direct comparative ones are very rare, and the work involved in doing that while also taking individual hardening changes into consideration would be infeasible. I haven’t seen anything that does better than PrivacyTests yet.
Me neither; I just feel that it somewhat exaggerates the advantages/disadvantages. The other day I was watching a YT video where the person was using this table to point out the obvious advantages of the Brave, Mullvad, and LibreWolf browsers because they had entire sections that passed, without mentioning that the other browsers could pass some of those tests if they were using things like uBO and Arkenfox.
Other than looking flashy with the checkmarks and crosses, I do not see much value it provides in picking a browser. Even having Mullvad and Tor Browser in the comparison does not make much sense, because they are in a completely different category from the rest.
I don’t trust PrivacyTests as it was created by a Brave employee and has a very clear bias towards Brave. That said, Brave is still my daily-driver browser.
I don’t feel this is a very compelling reason to invalidate this project’s work, and I know that you are an avid believer in Brave’s capabilities (I have followed some of your previous comments), so I know you are not trying to invalidate the methodology here.
The tests can be replicated, so it is not like trusting a VPN provider.
I just wish we had some sort of more real-world combination tests.
For accuracy’s sake, PrivacyTests was created by a Mozilla employee. After its creation, they were hired by Brave. At the moment they work for neither. There may have been changes at any point during that period that biased it toward Brave’s implemented features/protections, but the argument should always be about whether the implemented tests/sections are relevant to privacy.
Even so, you are free to distrust it as a resource.
My concern is more about the site’s bias; I can’t comment on its accuracy. Even the way it presents data, always listing Brave as the first option, gives me pause. I haven’t done a deep dive into its testing methodology. I’m only suspicious that it could be presenting data in a way that leads the user to a certain conclusion, and it could, for example, omit other comparison criteria that would make Brave look not as good.