Privacy at a cost: the dark web’s main browser helps pedophile networks flourish

Additional info on Reddit.

It is astonishing and even comical how they portray Tor and the Tor Project. They are making it seem like it’s a centralized app, like Telegram or something.

Look:

The platform’s design makes it virtually impossible to remove harmful posts or illegal content, they say, and the organization behind Tor has resisted pressure to implement even basic safeguards.

The Canadian Centre for Child Protection (C3P) told the Guardian it had submitted more than 19,000 notifications to the Tor Project – the privacy-focused US non-profit that develops and maintains the network – flagging child sexual abuse material (CSAM) detected on its system. Yet the organization has taken no action, according to C3P.

“Tor is designed in a way such that they can’t remove anything and they refuse to change it, because the idea of any sort of censorship on there is bad,” said Lloyd Richardson, C3P’s director of technology. “Tor has a board of directors that makes decisions surrounding this, and I don’t think they’ve ever been held to account for any of this. They’re the only people who can essentially intervene, and they refuse to do so.”

Neither of Tor’s products has systems to detect, moderate or report CSAM, or mechanisms for processing user-generated content in the way mainstream social networks do.

The experts interviewed urged Tor to act to remove CSAM from its sites and stem the formation of pedophile groups.

3 Likes

Now The Guardian is spreading the “protect the children” narrative that Western governments are peddling.

The Guardian may claim to have written a balanced piece, and it does note that it operates its own onion service, but through the headline, subheadings, focus, and placement of content, the article overall brings criticism and stigma to the Tor Project and the Tor network more than it supports Tor’s goals and benefits. Its main aim was to draw attention to illicit use of Tor for CSAM.

I agree the article’s wording is poor; however, it looks like some of the poor wording comes straight from Tor’s critics. The article also says:

Tor is not considered an [electronic service provider] since it is engineered for privacy and anonymity, routing internet traffic to conceal user identities rather than hosting or moderating content.

To me, the important points for us in the privacy community are:

  1. We should assume Tor is or will very soon be in the crosshairs of the “protect the children” agenda, like VPNs currently are in the UK.
  2. I believe we should acknowledge and discuss the problem of illicit use of privacy technologies, neither ignoring nor downplaying it, while standing firm that privacy is a human right and that crackdowns ought to target illicit activities themselves, not the technologies we use and depend on.

It’s as absurd as saying DNS is responsible for the distribution of CSAM, but unfortunately in Europe that exact line of thinking has been successfully used in courts by media companies against torrent sites.
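
To make the “routes traffic rather than hosts content” point concrete: each relay in a circuit only peels off its own layer of encryption and forwards the rest, so no single party (and certainly not the Tor Project, which runs neither the relays nor the onion services) ever holds content it could take down. Here is a toy sketch of the layering idea in Python; Fernet is just a stand-in for Tor’s actual circuit cryptography, not how Tor really does it:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Three hops, each with its own key (Tor negotiates these per circuit).
hops = [Fernet(Fernet.generate_key()) for _ in range(3)]

# The client wraps the payload once per hop, innermost layer first.
cell = b"GET /index.html"
for hop in reversed(hops):
    cell = hop.encrypt(cell)

# Each relay strips exactly one layer and forwards what is left;
# only after the last hop does the plaintext exist again.
for hop in hops:
    cell = hop.decrypt(cell)

print(cell)  # b'GET /index.html'
```

There is simply no point in that chain where the Tor Project sits with a “delete” button.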

The experts interviewed

I guess “experts” means “some randos with an opinion” to The Guardian. Very useful info to know when reading their content!

2 Likes

No privacy for anyone = the children are finally safe

Privacy for anyone = everyone is a pedo

But seriously, yes, this is how privacy (or rather, anonymity) works.

The state gives up some control over its people so that they have the right to privacy.

This means bad people can have privacy too, which obviously doesn’t help the fight against CSAM.

But what’s the alternative?

Well, the government has complete control over its people. Every citizen’s digital message, conversation, photo, and written thought is accessed by the government so that they can potentially find some baddies.

These agencies wrongly believe the only way to 100% eradicate CSAM is for every single person in the world to hand over every shred of privacy to the government.

There will always be another aspect of privacy to give up for “the children’s safety”.

What’s next? Cameras in our homes to stop child abuse? ID to connect to any Wi-Fi network?

It sounds a bit far-fetched. But no government that truly wants to eradicate CSAM will be satisfied if Tor just vanished. There will be something else to lean on.

3 Likes

I don’t want people to be fooled into thinking this could even theoretically work. Even if we give up all privacy, there will still be CSAM and other problems in the world. These problems are not caused by having privacy.

7 Likes

Yep, totally agree. Edited it to clarify that it’s a bogus theory, but a theory that most governments hold.

I have sympathy for the activists or agencies with the goal to protect children against CSAM, but I don’t think throwing privacy out the window is even a remotely good solution.

Rather than privacy causing crime, I think the argument is that privacy allows criminals to cover their tracks and avoid being caught?

I hear they’re using something called “internet protocol” to send CSAM these days, maybe the government should ban that :thinking:

1 Like

IETF has never been held accountable

Okay, so here’s my opinion: we shouldn’t dismiss those concerns. However, let’s think about solutions and what they would do in practice.

  1. Ban Tor entry nodes

Everyone who wants to use Tor will now use a VPN, hiding the fact that they use Tor at all. Congrats, now law enforcement is even less capable of identifying those awful criminals.

  2. Ban the Tor Project from developing Tor

Development would likely continue underground, and Tor would become less accessible. The latter would mostly hurt legitimate users of Tor.

  3. Mandate Tor to ban domains

Not sure here; someone knowledgeable could pitch in on whether this is even possible, and how (see the sketch below).
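
For what it’s worth on point 3: there is no registry the Tor Project could delist an onion service from. A v3 .onion address is derived directly from the service’s own ed25519 public key (that’s the rend-spec-v3 derivation, as far as I understand it), so anyone who generates a keypair “has” a domain without ever contacting the Tor Project. A rough sketch of the derivation in Python, where the random 32 bytes just stand in for a real service key:

```python
import base64
import hashlib
import os

# Stand-in for the service's ed25519 master public key (32 bytes),
# which the service generates itself and never registers anywhere.
pubkey = os.urandom(32)

version = b"\x03"  # v3 onion services
checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]

# base32(pubkey | checksum | version) -> the familiar 56-character address
address = base64.b32encode(pubkey + checksum + version).decode().lower() + ".onion"
print(address)
```

Because the address is self-authenticating, “banning a domain” would have to mean something very different from delisting it from a registry, which is why I doubt this one is workable as stated.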

In summary, I think we shouldn’t dismiss those concerns, but let’s have constructive discussions, not hit pieces like this article. Too often, privacy gets suppressed to counter evil actor X. Let’s not fall into the trap of binary choices either; we can do better.

This just sucks for the actual issue of CSAM, as it is constantly used as a bad-faith argument against legitimate tools and takes away from real efforts to curb the problem. That’s all without even considering the harm these bad-faith efforts do to privacy and other rights.

1 Like