Accuracy of Sam Bent video criticizing Tor (and Privacy Guides) over HTTP header OS spoofing removal?

I don’t have the background knowledge here. I read the other Sam Bent thread, but it’s not clear to me that the fundamental criticism has been addressed: i.e., if you are using Tor with JS disabled (HTML + CSS only), can your OS still be fingerprinted?

There’s also a sub-criticism here: even if you can’t easily be fingerprinted by other means, putting the OS in the header means it will likely be logged by regular servers, handing some entropy reduction / fingerprinting capability to servers that wouldn’t have that information otherwise. Might this be a concern for certain threat models?

Can anyone give an in-depth explanation about the technical nuance here? (I got the sense that Sam seemed to not steelman the opposing argument as much as he could have, so I’m hoping for someone who is an expert in the subject to weigh in.)

1 Like

Your OS could always be broadly fingerprinted (think buckets like: Windows vs macOS vs Linux vs Android) regardless of this change.

1 Like

Skimmed around his video, too busy to watch it all currently, but the guy seems just butthurt to me that people have a different view on this particular topic, and is now resorting to name-calling and conspiracies, which is not really a good look.

My personal view on the topic of OS spoofing is that it is so trivially easy to detect, both via known and unknown methods, that the paper-thin protection this header may or may not provide is simply not worth the user experience hit. Heck, it is even quite easy to detect the usage of a VM. People already get turned off of Tor because they get blocked a lot and because of the Tor network’s speed; there is not too much of a buffer before people just give up and use something else, which will likely provide worse protection :).

2 Likes

Let’s say that you are utilizing Tails to visit the clearnet website of a banned newspaper.

If a server raid occurs, they can indeed view the user agent. Assuming that you have disabled JavaScript and enabled HTTPS, the OS “category” available to them is not particularly useful compared to other metrics. You are not sending the exact version of Tails to be stored for everyone to see (that information is spoofed).
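To make that concrete, here is a minimal sketch of what actually ends up in a server’s logs (Python standard library only; the user-agent strings in the comments are illustrative of what Tor Browser sends, and the exact version numbers will differ):

```python
# Minimal sketch: what a web server actually records about your user agent.
# Run it, then visit http://127.0.0.1:8080/ with any browser.
from http.server import BaseHTTPRequestHandler, HTTPServer

class LoggingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # This is all the "OS information" a raided server would have on disk:
        # an OS bucket embedded in an otherwise spoofed user-agent string, e.g.
        #   Mozilla/5.0 (Windows NT 10.0; rv:128.0) Gecko/20100101 Firefox/128.0
        #   Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0
        # (illustrative; Tails reports a generic Linux string, not its own
        # name or exact version)
        print(self.client_address[0], self.headers.get("User-Agent"))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

HTTPServer(("127.0.0.1", 8080), LoggingHandler).serve_forever()
```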

Now, let’s add another scenario. The banned newspaper implements some anti-DDoS functionality because it is targeted by cyberattacks. Normally, visiting this website would lead to annoying bot checks, discouraging users from adopting Tor.

Would you think it is worth breaking functionality for casual Tor users over a hypothetical scenario mostly mitigated by disabling JavaScript?

1 Like

I think it is probably good to preserve functionality for casual Tor users, but to allow features that strengthen fingerprinting resistance for those who may need it, ideally with an easy toggle to switch between a better-usability mode and a better-anonymity mode. I think much of the core of Sam’s complaints comes from the removal of the config toggle that allowed for total spoofing of the user agent.

@jonah @Niek-de-Wilde You may be right about the UX hit, but what are the specific methods by which this is possible? (Do you have paper links?) How might either a regular website or a hidden service fingerprint the OS of a Tor Browser user with JavaScript disabled and spoofed headers? And I will reiterate: if it’s not easy to fingerprint (i.e., it requires involved CSS tricks or something), does it not make sense to allow the user to opt in to total spoofing if their cost-benefit analysis of the reduced log-scraping risk suggests it would be a reasonable change to make?
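For example, would a no-JavaScript font probe like the one sketched below even work against Tor Browser, which as I understand it bundles its own standardized fonts? (A rough Python sketch of the server side only; the font name and URL path are made up for illustration.)

```python
# Hypothetical sketch of a no-JS font probe. The served page declares a
# @font-face whose src prefers a local() OS font and falls back to a URL on
# this server: if the browser exposes the local font, the fallback is never
# fetched; if it doesn't, the request shows up in our log. Tor Browser ships
# a standardized font set specifically to blunt this kind of probing.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<!doctype html>
<style>
@font-face {
  font-family: probe-segoe;                       /* "Segoe UI" ships with Windows */
  src: local("Segoe UI"), url("/probe/segoe-ui"); /* fallback hits our log */
}
.p { font-family: probe-segoe, sans-serif; }
</style>
<p class="p">probe text</p>
"""

class ProbeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/probe/"):
            # Reaching this branch suggests the local font was NOT visible.
            print("fallback fetched:", self.path)
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

HTTPServer(("127.0.0.1", 8080), ProbeHandler).serve_forever()
```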

1 Like

I will go into more detail later as it’s late and I am traveling. But personally I would argue that people with high threat models are even better off without this, because it would lead to a false sense of security.

3 Likes

They do still spoof your user agent like they always have, they just don’t let you choose which operating system you’re appearing to be.

It does not make sense to allow users to choose this, because any change like this would make you appear significantly more unique. Yes, you can certainly argue that standardizing on Windows for all Safest mode users might make sense, but @TorProject did not answer my question asking about that, so I do not know their specific reason for not doing so.

What I do know is that the actual increase in entropy here is so insignificant that it does not make a difference for the end user either way, which is why I’m not going to hound Tor Project for an answer or drama-post about them “gaslighting” their users, even though I am indeed curious. Your operating system is simply not an identifying trait, there’s only like 4 of them.
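To put rough numbers on that (the OS share figures below are invented placeholders, not real Tor metrics):

```python
# Back-of-the-envelope: how identifying is an OS "bucket"?
# The share figures are invented for illustration only.
import math

# Hypothetical distribution of OS buckets among Tor Browser users.
shares = {"Windows": 0.55, "Linux": 0.25, "macOS": 0.15, "Android": 0.05}

# Upper bound: with 4 buckets you can never reveal more than log2(4) = 2 bits.
print("max possible:", math.log2(len(shares)), "bits")

# Surprisal (bits revealed) for each bucket under the assumed shares.
for os_name, p in shares.items():
    print(f"{os_name}: {-math.log2(p):.2f} bits")

# Even the rarest bucket here (~4.3 bits) leaves an enormous anonymity set,
# compared to the dozens of bits a full browser fingerprint can leak.
```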

It is a significantly higher danger to include a function that promises some privacy benefit while being unable to deliver it in all contexts, than it is to not have that function in the first place. It creates a false sense of privacy/security that will land people in serious danger, which is why whether it is “easy to fingerprint” is not relevant.

At the end of the day, the Tor Project clearly believes they are unable to completely hide what OS you are using, and as a result they made the (IMHO) correct decision to not pretend like they sometimes can. I am confident that if they did believe they could spoof any OS to look like Windows then they would do so.


YouTubers like Sam Bent are of course financially incentivized to drum up drama (because calling out Tor Project gets clicks) and position themselves as the One True Source of educational information about privacy by promoting uncertainty about other educational sources like Privacy Guides.

Since we lack those incentives, we prefer to focus on the real mountains of privacy problems out there rather than closely inspect every proverbial molehill of a problem that shows up in projects like Tor.

11 Likes

Another thing I will add is that it is by design that the Tor Browser security slider does not enable privacy features; it only disables browser features that could make you more susceptible to browser exploits.

I think that putting privacy features like user agent spoofing in the mix would overly complicate how that slider is used without much actual gain. If a privacy feature is essentially foolproof (e.g. letterboxing which is applied in all 3 security modes) then Tor Browser should enable it for all users, and if it is not (like I said before, in this case) then I don’t think it makes sense to potentially over-promise privacy features, particularly to the users who may need the highest levels of privacy and security (safest-level users).
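(As an aside, here is a rough sketch of the letterboxing idea, assuming the commonly cited 200 × 100 px rounding steps; the exact stepping Tor Browser uses may differ.)

```python
# Sketch of letterboxing: instead of exposing your exact window size (which is
# high-entropy), the content viewport is rounded down to a coarse grid and the
# leftover space is filled with gray margins. Step sizes assumed 200 x 100 px.

def letterboxed_viewport(window_w: int, window_h: int,
                         step_w: int = 200, step_h: int = 100) -> tuple[int, int]:
    """Round the window size down to the nearest step, never below one step."""
    return (max(step_w, window_w - window_w % step_w),
            max(step_h, window_h - window_h % step_h))

# Many slightly different real window sizes collapse into the same bucket:
for size in [(1366, 768), (1372, 779), (1399, 799)]:
    print(size, "->", letterboxed_viewport(*size))
# (1366, 768) -> (1200, 700)
# (1372, 779) -> (1200, 700)
# (1399, 799) -> (1200, 700)
```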

So… that probably answers my own question :slight_smile:

2 Likes

Didn’t watch the video, but was PG explicitly called out? And for what specifically?

The part where we apparently did “damage control”.

I’m disappointed that he opted to say it’s dumb rather than engage with the community. Every community has its warts, but if there are valid concerns, he should come here and raise them so there is a healthy exchange to address them. The privacy community is already fractured enough; I don’t see the benefit of deepening the divide.

8 Likes

Agreed, the dramatic cover has a bit of a sensationalist marketing appeal to it. We shouldn’t forget that although Sam seems like a cool dude with some scars to show, that doesn’t make him all-knowing or immune to using his credibility to stir up drama for the views so he gets paid.

It feels like it’s aligning with the adversarial nature of partisan politics. He cites random comments on YouTube to counter claims from a dedicated group of privacy advocates at a nonprofit. We really need the privacy community to avoid petty wars like this and just acknowledge that there are different opinions that concern different audiences.

I would absolutely love for Sam to come and make his claims here, with the ground rule of avoiding sweeping conclusions and instead just sharing concerns and challenges for either side.

5 Likes

Having just watched the Privacy Guides-specific section of his video, a lot of it is him saying we made claims which we did not, like that we supposedly claimed HTTPS somehow mitigates this “issue,” which doesn’t make a lot of sense in the first place. I don’t see any messaging from us which could be construed this way, but maybe we got our wires crossed and he misunderstands the issue.

In this section of the video he also appears to claim that .onion sites using http:// is a security issue, which I know most people here will recognize as a nonsensical claim, but it makes me think he just fundamentally does not understand the browser and security issues he is talking about, like I said from the beginning.

I did originally invite him to the forum thread to clear these things up directly back when he made that first video a few months ago, but it seems doubtful that’s happening. I assume he just has a different audience from us and is catering to them, given his reactions to some of our other (non-Tor-related) posts on Twitter last month.

7 Likes

I just noted that later on in this video he gets mad at Tor Project thanking us for calling out an actual problem with Tor Browser security that we’d like to see fixed. What this tells me is that there’s no winning: defending legitimate privacy projects like Tor against trivial matters like this user agent spoofing thing is bad, and calling out Tor Project for larger security concerns is also bad, somehow. I don’t know what he wants from us :slight_smile:

Although thanks to Sam Bent this is the first I’d heard of Tor Project thanking us in their newsletter, so that’s very cool to know :slight_smile:

8 Likes

His video coverage of that issue also came conveniently days after PG posted about it, so he likely learned about the issue from PG himself but never mentioned that.

2 Likes

Indeed to my knowledge nobody had posted anything publicly about that behavior before we did, although we did coincidentally get beaten to the punch by a few hours by someone here on our own forum (either way, PG :flexed_biceps:) while we were waiting to see if Tor Project would comment via email :laughing:

4 Likes

I don’t know their reasons, but I can think of this one:

1 Like

@TrippingCrash
That doesn’t apply to Tor.

You can see this by visiting this page and clicking the new circuit button repeatedly: TCP/IP Passive Fingerprinting - BrowserLeaks

It changes based on the exit node in use; if it were affected by your client, it would stay the same.

1 Like

Yeah, that is the best-known non-browser fingerprinting method, and it would certainly apply to Mullvad Browser and other browsers.

I didn’t mention it because I’m not completely sure how applicable that is to Tor, since Tor essentially reimplements the TCP/IP stack in the browser. In theory your host networking stack is only used to connect to your guard node, and I would think the service/website you’re visiting would see a genericized Tor connection, or perhaps would be able to fingerprint your exit node’s OS with this method, which is not a real problem for you at all.
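For anyone wondering what that BrowserLeaks page is actually looking at, here’s a crude sketch of the idea behind passive TCP/IP fingerprinting; real tools like p0f use many more fields (window size, MSS, TCP option order), and against Tor it is the exit node’s packets being examined, not yours:

```python
# Crude sketch of passive TCP/IP OS fingerprinting: infer the sender's OS
# family from the IP TTL of a received packet. Common initial TTLs are
# 64 (Linux/macOS/Android), 128 (Windows), 255 (some network gear); the
# observed TTL is the initial value minus the number of hops traversed.

def guess_os_family(observed_ttl: int) -> str:
    # Pick the nearest common initial TTL at or above the observed value.
    for initial, family in [(64, "Linux/macOS/Android-like"),
                            (128, "Windows-like"),
                            (255, "router/other")]:
        if observed_ttl <= initial:
            hops = initial - observed_ttl
            return f"{family} (assumed initial TTL {initial}, ~{hops} hops away)"
    return "unknown"

print(guess_os_family(52))   # -> Linux/macOS/Android-like, ~12 hops away
print(guess_os_family(113))  # -> Windows-like, ~15 hops away
```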

You’re right; after looking into it, it seems that Tor extracts the payload from TCP packets and sends only that payload through the circuit, with the TCP/IP headers being added back at the exit node.