Concerns About Using Private and Secure Software for Bad Things

What are your thoughts regarding the legitimate concerns people have about others abusing private software to do awful things like selling illicit substances, t–ture, t–rorism, etc.?

There has always been the case of eroding _____ for the sake of _____.

It's a tool available primarily to normal people. The norm should be to allow and enable access for the general public, for their own privacy and security benefit. If you take it away from normal people out of fear of the ____ people, the bad people will be wiser anyway and move on to things that protect their privacy and security, legal or not, and the innocent will be left alone without protection.

It is the same way a surgeon and a serial killer will use a blade differently. We should focus more on bringing more (scalpel) blades to surgeons rather than taking away kitchen knives in fear of serial killers.

All good technology has always been meant for us.


Everything and anything in the world can be used in “bad” and “good” ways. Privacy software is no different.


Agreed; however, a gun, for example, can enable more “bad” to be done compared to an unarmed man. My argument against the one I postulated above is that governments are much more dangerous because of the scale of their power. So in that regard privacy software is more akin to an unarmed man; that is, it will likely do less harm than good.

But the counterargument is that such private platforms may attract a disproportionate share of the bad people I am talking about, which seems to be the case currently due to their lack of adoption by the majority of “normal” people.

The only solution I can see to this issue is for platforms like Signal to somehow moderate malicious usage by selectively identifying and blocking these specific usage cases. In your own words, this would be taking away the scalpel from the bad people.

Some people may suggest this presents an issue of censorship over what someone like Signal considers ‘bad’. However, this is BS: only objectively awful stuff would be blocked, not political opinions; i.e., t–rorism would be blocked. Moderation should be done independently and without pressure from governments or financial incentives. Nonetheless, these two pressures are always present, regardless of whether moderation is a feature or not.

You’re asking a social sciences question in a very STEM-y forum; you’re going to get answers that exclusively go “nuh uh!” in different forms lmao

but actual answer, yeah it’s just a balancing act of trying to foster the good uses of this kind of tech (e.g., whistleblowing to journalists, for a kinda extreme example) while discouraging the bad uses. Whether that relies (or should rely) on some technological method like Signal moderating their platform in a more heavy-handed way or a social behaviour change method to discourage people doing antisocial things is something up for debate (I like the latter method and it’s more effective usually anyway; STEMbros often eschew the human element as if you can just “facts and logic” your way out of every situation)


Bad people will do bad things regardless of whether there are tools available to do them. Even if the available tools were illegal, they would use them anyway.

There is your social sciences answer.


Strange, it’s almost as if criminals don’t care about the law. :thinking:


I appreciate your response, however, taking your comment at face value, I am not sure how it addresses the fact that some tools allow bad people to do more bad than they could otherwise do without them. Of course bad people will always try and find alternatives, however, these alternatives may make it harder for them to achieve their goals.

In the theoretical case that you could effectively remove these tools from bad hands, your comment would be invalidated.

However you mention:

Which is more reflective of the real world: there is unfortunately an enormous supply of tools, whether they be guns, software, etc. In this case, if the attackers were sophisticated, they could just find alternatives to Signal, or even create their own software, which is a genuine concern. So that is why the last part of what you said is very valid.

However, at the end of the day moderation is still better than no moderation, in the same way that fewer guns or [insert bad thing, e.g., drugs] are better than more.

Is this a fact? By which I mean: is there evidence that increased availability of such tools has led to increased crimes (or is there even any positive correlation at all)?

Because my understanding is that most statistics show many types of crime on a downward trend since the 80s/90s, at least in the US, which makes me doubt the advent of personal computing has made crime any easier or more efficient for most criminals.

This of course is in contrast to gun ownership which has a very significant and positive correlation with violent crime and other gun-related deaths.

On the other hand, privacy tools have certainly enabled journalists/whistleblowers/etc to hold their respective governments and companies to far greater levels of accountability and transparency than ever before, which I will always argue is good.


Moderation can only happen in a public discussion like this forum, Reddit, X/Twitter, etc. I think a private conversation between two people is a sacred thing to uphold and does not need moderation at all, because a moderator would be a third party, and with a third party it is no longer a private discussion.

Yes, you should be allowed to talk about anything in a private conversation, even the horrible, terrible things, because it is private and you should be allowed, or rather have the freedom, to truly speak your mind between two intimate people in an intimate discussion.


I agree with your conclusion, but just note that the correlation you cited is meaningless because there are so many factors that impact crime rates.

I’m not sure if it was mentioned on this forum, but I think the greatest defense for privacy and security focused software is that privacy is a civil right, and governments shouldn’t take away rights simply because of a few bad actors



I suppose I’m just positing that technology and privacy tools are not allowing crime to flourish regardless of those other factors.

as opposed to…

…guns, which continue to be an increasing problem in the US despite gun control laws being stricter than ever, because in that case nothing short of a total ban would be able to overcome the negative effects of widespread gun ownership on society. That’s an example of something increasing crime regardless of other factors, I mean. But now we’re venturing off topic lol


I deliberately used the vague term ‘bad’. Nonetheless, I have not been clear enough. What I mean is very simple: if you want to hurt someone, using a gun, for example, unarguably makes it easier. My argument is nothing more and nothing less.

Here is my two cents on this. I am also a social scientist, btw. Why do you think there are none here?

States have always been thought of as a Leviathan: a monster or beast with many heads. Its different heads already penetrate different aspects of our lives, such as finance, travel, documentation, etc.

Breaking encryption between people is like putting a camera in everyone’s home just because you may commit a crime. We do not need another head of the Leviathan in our personal lives.

Most of the time the problem is not about tech tools; it’s loopholes, cover-ups, and lobbies. Just consider that they did not lock up Epstein for years, but they extradited Assange and put him in jail.

Low-level officials are generally not fully aware of this, or they are too focused on small cases, and so they complain about tech tools.

Edit: Damn typos


I disagree, I think software should not make it easier for people to conduct awful things.

I am not arguing that at all; I am arguing for only very specific things to be banned. A lot of Americans swear by freedom of speech, but awful things that I will not specify also fall under freedom of speech, and that doesn’t make them okay.

Basically what I am saying is that since we have the technological means to identify specific abuse cases, we should revoke these people’s right to use a tool for bad, irrespective of whether they will find alternate means, because something is better than nothing.


Yes, but it depends on which software and tech tool (private messengers or AI deepfakes). What about the back doors allowing governments to commit bad things?
We all blame the US, but believe me, it’s one of the better places in the world. There are many notorious regimes hoping that encryption or privacy will be broken by tech companies so that they can easily conduct their witch hunts.


The thing is, I may be completely wrong, because it may not be technologically possible at the moment to encrypt chats and simultaneously moderate them. Actually, it probably is possible, but you would only want one party, Signal for example, to view them.
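For what it’s worth, the tension here can be sketched concretely. In an end-to-end encrypted design, the server only ever relays ciphertext, so there is literally nothing readable for it to moderate; any moderation party would need the key, which breaks the end-to-end property. The toy Python below illustrates this with a deliberately insecure XOR stream cipher (illustration only, not Signal’s actual Double Ratchet protocol; all names are made up for the example):

```python
import hashlib
import secrets

def keystream(key: bytes, n: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. NOT secure crypto,
    # just enough to show the shape of the argument.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each plaintext byte with the keystream.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are their own inverse

# Alice and Bob share a key; the relay server does not have it.
shared_key = secrets.token_bytes(32)
message = b"meet at noon"
ciphertext = encrypt(shared_key, message)

# The server's entire view of the conversation is `ciphertext`:
# unreadable bytes, hence nothing to apply content moderation to.
assert ciphertext != message
assert decrypt(shared_key, ciphertext) == message
```

The point of the sketch: “Signal views the chats to moderate them” is equivalent to handing Signal a copy of the key, at which point the conversation is no longer end-to-end encrypted between the two participants.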


I agree, there are countries much, much, much worse off than the US.


Not to mention, when anything is made “illegal” it actually has to be enforceable too. Banning or regulating something that nobody abides by requires the state to expend a huge amount of resources to make the requirement actually mean something. This costs productivity and ultimately hurts the state more in the end.

What tends to happen in the case of China is that arbitrary barriers (their favorite being resetting TCP connections with the GFW) are put in place to keep out international competitors, so domestic solutions are left to work as intended. As a result, they’re more usable to the end user. Those local products then come with specific requirements (WeChat, for example). I doubt Tencent would ever be permitted to add E2EE to that product.

In the case of encryption, it’s used widely to protect financial transactions, which means the technology has to be generally available in order to be implemented. It is simply impossible to have a solution that nobody knows about and that can only be used in certain circumstances.