It is astonishing, even comical, how they portray Tor and the Tor Project. They make it sound as if it were a centralized app, like Telegram or something.
Look:
The platform’s design makes it virtually impossible to remove harmful posts or illegal content, they say, and the organization behind Tor has resisted pressure to implement even basic safeguards.
The Canadian Centre for Child Protection (C3P) told the Guardian it had submitted more than 19,000 notifications to the Tor Project – the privacy-focused US non-profit that develops and maintains the network – flagging child sexual abuse material (CSAM) detected on its system. Yet the organization has taken no action, according to C3P.
“Tor is designed in a way such that they can’t remove anything and they refuse to change it, because the idea of any sort of censorship on there is bad,” said Lloyd Richardson, C3P’s director of technology. “Tor has a board of directors that makes decisions surrounding this, and I don’t think they’ve ever been held to account for any of this. They’re the only people who can essentially intervene, and they refuse to do so.”
Neither of Tor’s products has systems to detect, moderate or report CSAM, or mechanisms for processing user-generated content in the way mainstream social networks do.
The experts interviewed urged Tor to act to remove CSAM from its sites and stem the formation of pedophile groups.
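The design point the article stumbles over is real, though: a v3 onion address is not an entry in any directory the Tor Project controls, it is just an encoding of the service's own ed25519 public key (plus a checksum and version byte, per Tor's rendezvous spec). There is nothing centrally registered that could be "removed". A minimal sketch of the derivation, using a dummy all-zero key rather than any real service:

```python
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    """Derive a v3 .onion address from a 32-byte ed25519 public key.

    The address is purely a function of the key itself, so there is
    no central registry entry that anyone could edit or delete.
    """
    assert len(pubkey) == 32
    version = b"\x03"
    # checksum = first 2 bytes of SHA3-256(".onion checksum" || pubkey || version)
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    # address body = base32(pubkey || checksum || version), 56 characters
    body = base64.b32encode(pubkey + checksum + version).decode().lower()
    return body + ".onion"

# Dummy key for illustration only, not a real service:
print(onion_v3_address(bytes(32)))
```

Anyone holding the matching private key can announce the service to the distributed hidden-service directory; "taking it down" would mean redesigning this scheme, which is what the quoted experts are actually demanding.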
Additional info on Reddit: