They should ban https to protect the children /s
Well, yeah, that’s common knowledge, but we’re talking about meeting people online.
That would be different. I’m not sure about trafficking, but the Guardian article was more about distribution of CSAM.
Ah, the Guardian article worded it a bit differently, but I did guess it would be a popular online platform like Roblox.
Defaulting the accounts to private maybe? Restricting who can DM you by default?
Explaining the dangers to children (during setup, when opening DMs, in tutorials)?
Disallowing phone number discovery for people who say they are under 18?
Encouraging parents (and users in general) not to post pictures of their children (or of anyone who does not consent)?
Enforcing end-to-end encrypted DMs worldwide?
Not encouraging users as much to post pictures?
Not asking for users’ real names, and especially warning against using their real name in the username?
These are just some ideas!
I would bet money they’re experts in opinion-holding, which is why they’re not named. Some windbag that had a headshot and not a resume.
Well, that the author is confident enough to hold a strong opinion about a technology she doesn’t understand — an opinion which uses fear and ignorance to give the oligarchy total control over communication — speaks to something dark. We are so far removed from true tyranny in the West that we, apparently, are going to have to suffer through it again (and overcome it again and again and again).
It’s the Anti-Enlightenment, boys and girls! The cycle begins again, I’m afraid. Sadly, I’m not old so I’m going to have to suffer through it.
Disappointing to see this yellow journalism from The Guardian.
I’m going to highlight a few of the ways this piece misrepresents research or is otherwise misleading:
Katie McQue, the article’s author, refers to “A study published by Nature” and “Research published in the Journal of Online Trust and Safety”. This makes it sound like two independent studies, but in reality it’s the same research project, conducted by the same people, in both journals. The study published in 2022 in the Journal of Online Trust and Safety is based on a subset of the data used in the study published in Nature in 2024.
The study’s lead author is Juha Nurmi, the operator of ahmia.fi. Ahmia is a special-purpose search engine - it only indexes, and returns results for, onion sites. The study’s data on search queries is drawn entirely from searches performed on Ahmia. As such, this data is not representative of most Tor usage. The study authors specifically acknowledge this fact, stating, “Tor is mainly used to access legal clear web content, as only 6.7% of Tor users access hidden onion services (Jardine, Lindner, and Owenson 2020)”.
When McQue says “at least 11% of users’ searches seek CSAM” she’s creating the impression that 11% of Tor users are searching for CSAM, but in reality what the study indicates is that 11% of search sessions on Ahmia include CSAM terms. The searches on Ahmia represent at most a single-digit percentage of all Tor searches, and given their exclusive focus on onion sites, the portion of their searches aimed at CSAM is certain to be vastly higher than searches performed elsewhere using Tor. The study authors do not attempt to estimate how many unique individuals, and how many bots, are responsible for the traffic on Ahmia.
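A back-of-envelope sketch makes the distinction concrete. The 5% figure below is a hypothetical assumption (the comment only establishes that Ahmia handles at most a single-digit share of Tor searches); the 11% is the study’s reported figure:

```python
# Hypothetical upper-bound calculation: even if Ahmia handled as much as
# 5% of all Tor searches (an assumption, likely generous), the CSAM share
# of *all* Tor searches implied by the study would be far below 11%.
ahmia_share_of_tor_searches = 0.05   # assumption, not from the study
csam_share_on_ahmia = 0.11           # figure reported by the study

upper_bound = ahmia_share_of_tor_searches * csam_share_on_ahmia
print(f"{upper_bound:.4f}")  # 0.0055 -> well under 1% of all Tor searches
```

And that is before accounting for bots or repeat visitors, which the study does not attempt to separate out.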
Later in the article, McQue quotes Nurmi as a “cybersecurity expert and postdoctoral research fellow at Tampere University in Finland”, but makes no mention of the fact that he’s the operator of Ahmia and the author of the two studies she referenced earlier.
McQue uses the header “Millions of pedophiles”, apparently based on Europol reporting that Kidflix had 1.8 million users and another site’s claims of having a million users. But the anonymous nature of Tor makes it very difficult to tell whether two different page loads, or even two different accounts on a darknet site, belong to two different individuals, or whether they are in fact the same person. If it were possible to accurately distinguish between unique visitors using Tor, then Tor wouldn’t be blocked by Cloudflare on half the web like it is today.
Good journalism would give readers an accurate understanding of the statistics.
Good journalism would make the reader aware of all the caveats and limitations in the methodologies.
Good journalism would clarify what Tor is, what “dark web” means, and how those are different. It would make clear that Tor is a transport, not a hosting platform, and would help the reader understand that the statement, “experts interviewed urged Tor to act to remove CSAM from its sites” makes about as much sense as “experts interviewed urged the highway to act to remove illegal substances from shops and storefronts”.
Ending child abuse is a worthy cause. Using the issue as a bogeyman to undermine civil liberties is not.
McQue’s next article might well be titled: “Security at a cost: door locks help human traffickers imprison victims”.
Wow. This is straight up propaganda, then. Not surprising, but still.
This article is a pile of disinformation. Let’s go into detail:
The Guardian:
Millions of child predators
No. There are only approximately 2 million daily Tor users, and only a very small portion of them connect to hidden services at all — remember the 6.7% figure quoted above:
_____
News publications similarly inflate statistics on CP forums because average viewers don’t know better:
The Guardian:
The platform, which was hosted on Tor, had attracted 1.8 million users globally between April 2022 and March 2025, according to German law enforcement.
the larger sites have estimated user bases of between 300,000 and 600,000 each, said Richardson.
No.
Police typically count the number of registered accounts on websites, calling each one a user. However, the vast majority of registered accounts on anonymous and hidden forums are one-time signups (from LEA bots and people alike, mostly bots), which leads to extreme numbers. One-time signups are often encouraged as a measure to prevent identification. As such, a “moderately consistent” lurker can accumulate hundreds of accounts in a year. The communities are actually far, far smaller. Hidden services, unlike clearnet sites, typically do not require email confirmation or otherwise force users to identify themselves.
For instance, the popular Pokemon forum pokecommunity.com has fewer than 50 actual users online at the time of writing, but has 1,133,740 registered accounts. Even for a clearnet forum where you don’t need an account to lurk and signups require email verification, the ratio of users to accounts is minuscule. Imagine how much smaller this ratio is for intentionally anonymous forums that require little or no identification and are bombarded by bot accounts.
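To put the pokecommunity numbers from the comment above into a quick ratio (both figures are from that comment, taken at face value):

```python
# Rough illustration of how registered-account counts overstate active
# users, using the pokecommunity.com figures cited in the comment.
accounts = 1_133_740   # registered accounts
users_online = 50      # actual users online at time of writing

ratio = users_online / accounts
print(f"{ratio:.6f}")  # 0.000044 -> roughly 4 active users per 100,000 accounts
```

If even a clearnet forum with email verification sits at that ratio, “1.8 million users” on an anonymous onion site tells you almost nothing about the number of real people behind it.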
___
The Guardian:
“The notion that just because the web addresses aren’t searchable by the surface web doesn’t mean that the administrators of Tor can’t find wrongdoing that is going on in their systems and remove it and report it to authorities,” said Peters, of ACCO.
That’s not how Tor works, lol.
___
The Guardian:
According to (unnamed) experts, these anonymous communities normalise child abuse, making it more likely that participants will go on to commit contact offenses against children they can access in real life.
No. This has been contradicted by many studies: researchers have not found actual evidence that CP makes people more likely to commit contact offenses. In fact, the availability of child pornography has been shown to coincide with a somewhat lower rate of illegal sexual relations with minors:
"Following the effects of a new law in the Czech Republic that allowed pornography to a society previously having forbidden it allowed us to monitor the change in sex related crime that followed the change. As found in all other countries in which the phenomenon has been studied, rape and other sex crimes did not increase. Of particular note is that this country, like Denmark and Japan, had a prolonged interval during which possession of child pornography was not illegal and, like those other countries, showed a significant decrease in the incidence of child sex abuse. […]
Diamond, Milton; Jozifkova, Eva; Weiss, Petr (2011). “Pornography and Sex Crimes in the Czech Republic”, Archives of Sexual Behavior, 40(5), pp. 1037-1043
"Mere possession of child pornography (which includes images of sexually mature adolescents) – with charges often upgraded to “distribution” if one freely shares the images, or to “production” if one in any way alters or captions them – is one of the most rapidly growing offenses prosecuted in the U.S. federal system, earning those convicted an average sentence of 12.5 years (according to Department of Justice statistics). Often, the images in question are old and feature mere nudity, little different from what is found in naturist literature and legitimate art photography. This severe punishment is not based on the actual harm done to the minor portrayed, but the widespread perception that interest in such images is a mark of “pedophilic” (or hebephilic or ephebephilic – which categories together include most males) orientation and thus useful in detaining would-be sex criminals before they commit a contact offense. In this brief, a well-known criminologist and expert witness reviews the most recent scholarship on the question, and concludes that there is no credible link between viewing such images and committing hands-on offenses against minors.
Thompson, W. (2018). “Child pornography, pedophilia, and contact offending: the empirical research.”
Only about 0.8% (Endrass et al., 2009) to 1.3% (Seto & Eke, 2005) of adults convicted of watching child pornography “escalated” to sexual contact with actual minors. By contrast, around 5% of schoolteachers (both genders) have engaged in sexual behavior with students. Imprisoning a child porn viewer is significantly less likely to prevent real-life abuse than imprisoning a random teacher, or just any person who is around hormone-charged teenagers a lot.
Also consider reading:
"The policy and the initiative rely on a presumption that child pornography consumers are in reality undetected pedophiles and child molesters who are at high risk of sexually abusing children. This article challenges the presumption by comprehensively analyzing certain of the most commonly cited studies that purport to empirically support correlations between child pornography, pedophilia, and child molestation. It also highlights other empirical evidence, as well as some practical considerations, that instead tend to show that most child pornography offenders are at low risk of committing contact sexual offenses. In sum, the concentration on child pornography crimes appears to be a misinformed policy that fails to directly protect real children from harm.
Hamilton, M. (2012) The Child Pornography Crusade and its Net-widening Effect, Cardozo Law Review
In summary, mounds of evidence show that viewing child porn does not inherently cause harm, does not “turn people into pedophiles”, and is not an indicator that someone will abuse children. Rather, it’s usually just used the way regular porn is, and in the case of the minority of it that contains young children, it serves as an outlet for pre-existing desires that are best not acted on in real life. Pedophiles often use anime porn (lolicon/shotacon) as a safe outlet, but as even this becomes more criminalized due to the moral panic, some people are being pushed into watching real CP. In the aftermath of crackdowns on Tor porn communities, some pedophiles have even started recommending seeking nonsexual relationships with real children, because possessing CP/lolicon/AI CP has become more dangerous.
__
The core of the privacy debate has shifted from terrorism to child pornography. As long as the majority of people buy into the child pornography crusade and the pedophilia panic, everyone will continue to lose their rights. The moral panic is fueled by disinformation, fear, and ignorance about what actually happens with child porn. The majority of people don’t know how CP works, and are convinced that millions of people are paying crypto to satanic human traffickers who kidnap and abuse kids. That’s not remotely what happens.
For those who don’t know:
- The vast majority of child/teen porn is essentially “pirated”: shared and viewed without payment and without anyone knowing.
- Child/teen porn can largely be split into two categories:
  - “90% of CP” is of teenagers, posing erotically or nude, typically teens who voluntarily decided to “sext” or post their nudes online.
  - The rest is mostly decades-old posing material from Eastern European or Japanese modeling studios that stopped existing a long time ago. (see the above quote)
- Most CP is not easily distinguishable from legal erotica (good luck telling a 17-year-old from a 20-year-old), which is part of the reason why tech companies employ hash-matching.
- The vast majority of CP does not include sex between adults and children.
- Teen-teen sexting is often prosecuted as child pornography, and teenagers are the main demographic of people arrested for it. Approximately 40% (over 15,000) of the tens of thousands of people charged for CP in the US in the last decade were teenagers themselves. (Source: FBI UCR)
- CP where children were actually harmed is rare, but garners the most media attention and public outrage.
The pedophilia media panic deeply misrepresents what actually causes children to be abused or harmed. If you’re a parent - the chances of your child being blackmailed or kidnapped are thankfully exceedingly slim, but the chances of your child getting auto-reported by AI for consensual sexting with another teen and labeled a predator for life are much higher. Realistically, the child porn crusade is a greater threat to children than the people it prosecutes.
Fundamentally, child porn is information, and any attempt to mass-prosecute people for sharing information will necessitate that everyone lose their right to privacy. This is an uncomfortable fact that has to be faced. If the population were better informed, they would understand that anti-privacy laws do not actually help children. If the child porn crusade were stopped, the anti-privacy push would lose its main justification.
In the anonymous survey, they only asked the following question: «How often after viewing CSAM/illegal violent material have you sought direct contact with children online?» (p. 11). They did not specify the methods, so the contact could have been outside Tor, or respondents could have lied. In that same report, they acknowledge this limitation: «Seto, Hanson, and Babchishin (2011) also point out that some self-reported offenses may be false confessions.» (p. 10).
She later claims that these findings are consistent with other studies that correlate CSAM consumption with child sexual abuse: «This is similar to Seto, Hanson, and Babchishin (2011)'s meta-analysis, which found that 55% of CSAM offenders admitted to contact sexual offense in self-report studies.» (p. 17). However, these studies do not control for the order in which events occur. The same author, Babchishin, later conducted a study finding that it is not common for CSAM offenses to be the first in a criminal career; in many cases, offenders actually escalate from child sexual abuse to CSAM consumption. This would be consistent with a recent meta-analysis that found that those convicted of CSAM very rarely (1.5%) reoffend with a child sexual abuse crime; they are more likely to reoffend with non-sexual crimes.
This does not imply that such content should be tolerated, but rather that we should reflect on the approach currently used to combat this problem. Funding prioritizes strategies to combat individuals who have already committed the crime, which is the lowest level of crime prevention and ignores other ways to combat this crime:
«Consequently, it has been argued that effective prevention initiatives require the development and implementation of interventions at the primary, secondary, and tertiary levels. Adopted from the field of public health, these three levels of prevention have been described as follows: (1) Primary prevention involves wide-scale initiatives aimed at the general public and implemented before the occurrence of sexual violence to prevent even initial incidents of CSA (e.g. general crime deterrence, public education, adequate sex education in schools); (2) Secondary prevention involves more targeted interventions for those at-risk of engaging in CSA, which address issues known to increase the risk of offending (e.g. anonymous helplines for people with sexual interest in children); and (3) Tertiary prevention is a reactionary approach after a sexual offence has occurred, which aims to prevent sexual recidivism (e.g. treatment programs for those who have engaged in CSA) (Laws, 2000; Wortley & Smallbone, 2006).» (P. 3-4)
Ironically, several of the people prosecuted for child pornography are the very victims these efforts claim to help. Instead of giving authorities more power to abuse by funding measures that have not been proven effective at deterring crime (for example), we should invest more in raising awareness among the general public and in comprehensive sex education for young people, so that they can navigate the web more safely:
This review found strong evidence for the effectiveness of child sex abuse prevention efforts in elementary school. Such programs typically use behavioral practice and role-play [80] and encourage parental involvement [81,82]. They teach about body ownership and children’s right to control their bodies [82] and about communication and self-protection [81,82]. A strong meta-analysis of 27 preschool through Grade 5 programs [80] and a systematic review of 24 K-5 programs [83] demonstrate significant effects on a wide range of outcomes, including behaviors in simulated at-risk situations. Another large systematic review concluded that, in general, parental involvement, opportunities for practice, repeated exposure, and sensitivity to developmental level were key characteristics of effective child sex abuse programs [81].
You should complain to the Guardian. I agree that this is disappointing, as most of the time the Guardian seems to take facts seriously.
Do you know how many pedos are on the clearnet? I was groomed on the clearnet. Plenty of pedos abuse kids on Roblox and Discord while those platforms do nothing about it. But no one cares, because it doesn’t sound as “scary” as the dark web.
These corporations steal your data, use the children as an “excuse”, then do nothing to protect those children when they get harmed.
It’s sensationalist and designed to pull at the heartstrings. You can usually tell with this topic when it’s vague and makes assertions without an informed position.
As others have said, the “darknet” is not most of the internet, and most pedophiles will take the path of least resistance and stay on mainstream things like regular social networks.
I think this mission of trying to close down and lock down everything so they will somehow be identified and excluded is never going to work. Others will of course argue that no cost should be spared, but that is not a realistic position, as all resources are limited. Nobody works for free in this world, and that includes law enforcement.
I think at this point more focus should be on what to do with them once they’re identified, and on how it even happens in the first place. Many would say “lock them all up forever”, but that costs money too and would only be shifting the problem out of sight and out of mind. Those offenders are someone’s children too, and I’m not entirely convinced that warehousing people and forgetting about them is a solution either.
With young people producing material of themselves or discovering something online, a better approach would be harm reduction. Giving young people a criminal record for that serves absolutely no purpose but to send their lives potentially off-track and into isolation from the community, with the added potential for further offending.
Sorry, but I lost faith in them since one of their journalists leaked the password to an archive of Snowden leak material. It was a genuine mistake, but still.