Privacy at a cost: the dark web’s main browser helps pedophile networks flourish

Additional info on Reddit.

It is astonishing, even comical, how they portray Tor and the Tor Project. They make it seem like a centralized app, like Telegram or something.

Look:

The platform’s design makes it virtually impossible to remove harmful posts or illegal content, they say, and the organization behind Tor has resisted pressure to implement even basic safeguards.

The Canadian Centre for Child Protection (C3P) told the Guardian it had submitted more than 19,000 notifications to the Tor Project – the privacy-focused US non-profit that develops and maintains the network – flagging child sexual abuse material (CSAM) detected on its system. Yet the organization has taken no action, according to C3P.

“Tor is designed in a way such that they can’t remove anything and they refuse to change it, because the idea of any sort of censorship on there is bad,” said Lloyd Richardson, C3P’s director of technology. “Tor has a board of directors that makes decisions surrounding this, and I don’t think they’ve ever been held to account for any of this. They’re the only people who can essentially intervene, and they refuse to do so.”

Neither of Tor’s products has systems to detect, moderate or report CSAM, or mechanisms for processing user-generated content in the way mainstream social networks do.

The experts interviewed urged Tor to act to remove CSAM from its sites and stem the formation of pedophile groups.

8 Likes

Now The Guardian is spreading the “protect the children” narrative that Western governments are peddling.

The Guardian may claim to have written a balanced piece, and the article notes that the paper operates its own onion service, but through its headline, headings, focus, and positioning of content, it overall brings criticism and stigma to the Tor Project and the Tor network more than it supports Tor's goals and benefits. Its main aim was to draw attention to illicit use of Tor for CSAM.

I agree the article's wording is poor; however, it looks like some of that poor wording comes straight from Tor's critics. The article also says:

Tor is not considered an [electronic service provider] since it is engineered for privacy and anonymity, routing internet traffic to conceal user identities rather than hosting or moderating content.

To me, the important points for us in the privacy community are:

  1. We should assume Tor is, or very soon will be, in the crosshairs of the "protect the children" agenda, as VPNs currently are in the UK.
  2. We should acknowledge and discuss the problem of illicit use of privacy technologies, not ignore or downplay it, while standing firm that privacy is a human right and that crackdowns ought to target illicit activities themselves, not the technologies we use and depend on.
2 Likes

It’s as absurd as saying DNS is responsible for the distribution of CSAM, but unfortunately in Europe that exact line of thinking has been successfully used in courts by media companies against torrent sites.

The experts interviewed

I guess “experts” means “some randos with an opinion” to The Guardian. Very useful info to know when reading their content!

6 Likes

No privacy for anyone = the children are finally safe

Privacy for anyone = everyone is a pedo

But seriously, yes, this is how privacy (or rather, anonymity) works.

You (as a government) give up some control over your people so that they have the right to privacy.

This means bad people can have privacy too, which is obviously a problem when it comes to CSAM.

But what’s the alternative?

Well, the government has complete control over its people. Every citizen's digital message, conversation, photo, and written thought is accessible to the government so that it can potentially find some baddies.

These agencies wrongly believe the only way to 100% eradicate CSAM is for every single person in the world to hand over every shred of privacy to the government.

There will always be another aspect of privacy to give up for “the children’s safety”.

What’s next? Cameras in our homes to stop child abuse? ID to connect to any Wi-Fi network?

It sounds a bit far-fetched. But no government that truly wants to eradicate CSAM would be satisfied if Tor simply vanished. There will always be something else to lean on.

6 Likes

I don’t want people to be fooled into thinking this could even theoretically work. Even if we give up all privacy, there will still be CSAM and other problems in the world. These problems are not caused by having privacy.

13 Likes

Yep, totally agree. Edited it to clarify that it’s a bogus theory, but a theory that most governments hold.

I have sympathy for the activists and agencies whose goal is to protect children from CSAM, but I don't think throwing privacy out the window is even a remotely good solution.

5 Likes

Instead of privacy causing crime, I think the argument is that privacy allows criminals to cover their tracks and avoid being caught?

I hear they’re using something called “internet protocol” to send CSAM these days, maybe the government should ban that :thinking:

5 Likes

The IETF has never been held accountable

1 Like

Okay, so here’s my opinion: we shouldn’t dismiss those concerns. However, let’s think about solutions and what they would do in practice.

  1. Ban Tor entry nodes

Everyone who wants to use Tor will now use a VPN, hiding the fact that they use Tor. Congrats, now law enforcement is even less capable of identifying those awful criminals.

  2. Ban the Tor organisation from developing Tor

Development would likely continue underground, and Tor would become less accessible. The latter would mostly hurt legitimate users of Tor.

  3. Mandate that Tor ban domains

I'm not sure here; someone more knowledgeable could pitch in on whether this is even possible and how. (A rough sketch of the difficulty follows below.)
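
For what it's worth, here is a minimal sketch of why there may be nothing to "ban" in the usual sense: a v3 onion address is derived entirely from the service's own ed25519 public key (per Tor's v3 rendezvous spec), so no registrar issues it and no central directory hosts it. In Python, roughly:

```python
import base64
import hashlib

def onion_v3_address(ed25519_pubkey: bytes) -> str:
    """Derive a v3 .onion address from a 32-byte ed25519 public key.

    The address is computed entirely from the service's own key material;
    there is no registrar or central zone file the Tor Project could
    delete it from.
    """
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + ed25519_pubkey + version).digest()[:2]
    return base64.b32encode(ed25519_pubkey + checksum + version).decode().lower() + ".onion"

# Anyone holding a keypair can compute their own address offline,
# e.g. with a dummy all-zero key (not a real service):
print(onion_v3_address(bytes(32)))
```

Blocking a specific address would presumably require something like shipping blocklists to relays or clients, which is exactly the kind of design change the Tor Project has so far refused to make.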

In summary, I think we shouldn't dismiss those concerns, but let's have constructive discussions, not hit pieces like this article. Often, privacy gets suppressed to counter evil actor X. Let's not fall into the trap of binary choices either; we can do better.

1 Like

This just sucks for the actual issue of CSAM, as it is constantly used as a bad-faith argument against legitimate tools and takes away from real efforts to curb the issue. That's all without even considering the harm these bad-faith efforts do to privacy and other rights.

3 Likes

The reality is that people who are actually involved in internet safety and privacy advocacy have been thinking about this problem for decades, and the consensus is that these proposals are untenable. The only people who disagree fall into two camps:

  1. People who have not thought about these problems for decades, but rather read about them and formulated a gut opinion in the last 5 minutes; and
  2. Lobbyists and politicians who have ulterior motives, like ultra-religious think tanks who know that once the regulations are in place it will be much easier to brand anything they don’t like as pedophilic.

These concerns are dismissed because the opposing viewpoint is not a logical opinion; it's a belief no less absurd than believing the world is flat. The only way you could believe it is if you form the belief in your mind and then neither follow it to its logical conclusion nor seek out any evidence that contradicts your worldview.

You came up with some solutions, and then immediately followed them up with reasons they would not work. That is the line of logical thinking we should encourage people in the first camp to follow as well, and that would be more helpful than entertaining a belief system in which the only way crime can be solved is with an omnipresent government :slight_smile:

16 Likes

I am not sure what the solution is. I don't think messing with Tor is it. At the same time, I was pretty startled to read that 1 out of every 10 searches on Tor is for CSAM.

2 Likes

Not on Tor. On Tor-specific search engines.

Remember the default search engine in Tor Browser is DuckDuckGo, and Tor is largely used to browse clearnet sites. Some stats place hidden service traffic as low as 3%-8% of total Tor traffic, and you are talking about at most 1 in 10 of that.[1]
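
To put rough numbers on that, a back-of-the-envelope sketch using the figures above. Both inputs are rough estimates, and it mixes a share of traffic with a share of searches, so treat it as an order-of-magnitude illustration only:

```python
# Rough upper bound using the estimates quoted above:
#   - hidden services account for ~3-8% of total Tor traffic
#   - "at most 1 in 10" searches on Tor-specific search engines seek CSAM

hidden_service_share = 0.08  # high end of the 3-8% estimate
csam_search_share = 0.10     # "at most 1 in 10"

upper_bound = hidden_service_share * csam_search_share
print(f"~{upper_bound:.1%} of overall Tor activity")  # -> ~0.8%
```

Even taking the high end of both figures, that comes out to well under 1% of overall Tor activity, not 10%.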


  1. Someone will read this and think “well, why don’t we just scrap hidden services entirely then?” and if that is you reading this then you should reply so I can argue against that point too lol ↩︎

6 Likes

According to The Guardian, "A study published by Nature last year determined that CSAM is easily available using 21 out of the 26 most-used Tor search engines, and that at least 11% of users' searches seek CSAM." The Nature article itself contains paragraphs about overcoming CSAM addiction: "When we study the searches, we discover that there are a few hundred queries from people who want to cease viewing CSAM…" Even if it's just a small fraction compared to the number of queries from people who are looking for CSAM, Tor would still be the best online platform for these people.

I have never typed "how do I overcome my attraction to children" into Google and I don't want to find out what would happen if I did, but I will assume the worst. Best case scenario, I would find resources or therapists, but using Tor is the best way of avoiding false positives.

As for the "solution": the Guardian article talks about Tor as a network for CSAM distribution, but it doesn't talk about how that CSAM was created in the first place. "Research published in the Journal of Online Trust and Safety indicates that more than 40% of dark web users accessing CSAM go on to seek contact with children with the aim of pursuing sexual acts." How are they contacting these children? Through Tor? That seems extremely unlikely.

They will most likely try to contact minors on online platforms with a large user base of children, pretend to be a child, and groom them into creating CSAM. These platforms supposedly have "systems to detect, moderate or report CSAM, or mechanisms for processing user-generated content" (something that's lacking on Tor), but when these platforms are in the business of maximizing returns for their investors, their principles on user safety become less of a priority (if not abandoned altogether). I would say that's worse than not having these systems in place at all.

If the Guardian is serious about child safety, they (and other publishers) should push for more accountability from the online platforms that are facilitating the abuse of children, and the general public should rally around this issue as well instead of allowing their governments to slap on nonsensical regulation like the UK Online Safety Act. Of course, this is easier said than done. I hear Roblox is a haven for child predators too.

3 Likes

Good correction.

Overall, it cuts both ways: it is a smaller number of people relative to the rest of the internet, yet as a ratio of user searches, my (uneducated) guess is that 10% is more than 10x the rate on Google et al.

Hypothetically, if someone waved a magic wand and Tor had Chrome's market share overnight, would we be OK with 10% of Tor search-engine traffic looking up CSAM? What about 20%? 50%?

…I can see the slippery slope that leads to requiring IDs and banning VPNs, and I don't like it, but I can also see the slow emboldening of 'niche' interests when they can grow unchecked in the recesses of the internet. Is there a threshold where the cost of privacy is too great?

I do not have an answer, but it does remind me of the conversation around censorship in the US over the last decade… Whenever someone tries to deplatform hate speech or obvious misinformation (e.g., RT News), there are people who decry it as 1984/Big Brother; they have a clear example of a worst-case scenario. OTOH, the 'worst-case scenario' for unchecked misinformation and hate speech doesn't have a clear analogy that we can point to. So instead, there is just a slow erosion of norms: a steady increase in hate crimes, uncertainty about what is real (further complicated by growing deepfakes/AI slop), and a mistrust of expert opinion around science, government, health, etc.

1 Like

No. [Edit]: If this happens on Tor, then it's likely they are contacting other adults on Tor who have children, not children directly.

My question for you is: how would this be implemented differently than the Online Safety Act?

2 Likes

Maybe I'm blissfully ignorant, but how does this path to grooming compare with my Roblox example? Are the parents allowing their child to talk to other adults through Tor? Or is this more about trafficking?

As for my Roblox example: they don't have to do anything differently. They just have to demonstrate that they value the safety of their users by using the safeguarding tools they already have. Do they respond to reports amid growing concern about child predators?

That would be my assumption.

Research published in the Journal of Online Trust and Safety indicates that more than 40% of dark web users accessing CSAM go on to seek contact with children

I see now that you are interpreting "contact" as communication, but I think the study is actually referring to direct physical contact.

Edit: What the study actually says is that 42% of respondents who viewed CSAM on the dark web went on to seek "direct contact with children through online platforms" afterward. "Online platforms" in this case does not necessarily mean Tor; to your point, it could very well be Roblox, etc.

Maybe you knew that and I misunderstood what you were originally asking this whole time lol. All I meant to say was that if this is happening on Tor, then it is likely between adults arranging this trafficking, but I agree it is more likely to just be happening on children's games and social media in many cases.

4 Likes