ChatGPT storing information

I was chatting on a privacy subreddit and came across someone's comment. Their claims about ChatGPT (particularly its use for therapy or discussing personal things) actually motivated me to export my data and delete my accounts (at least I got some insights from them?):

As I just said in another thread: don't forget that ALL of the data is stored forever, is being used to profile you, and could plausibly be leveraged against you in the future. This huge push to use AI as a therapist is all over reddit for a reason.

Do you think it's true? I'm torn between it sounding very plausible and questioning the reliability of someone so privacy-obsessed that they think anyone who says the opposite is a "bot".

LLMs need all the training data they can get, it’s basically the fuel that makes them better.
I strongly assume that almost all of them never delete anything, despite any “do not use my data to train” or deletion requests.

3 Likes

Whether it is true or not makes little difference when you are not supposed to share any PI or PII with any AI anyway.

Cautiously use AI for help you actually need but never for mental health or anything personal like that. Use it as a tool for utilitarian purposes only.

And yes, any AI will indeed store any info fed into it. There are (or may be) some exceptions via the API, if you use another front end for the service and select the option of not using your queries as training data, only to provide answers. But even then, I wouldn't trust it fully.

Too late for that, I’ve already done so. On a few different platforms.

Well don’t do it any more. It’s never too late to get better.

If someone is lacking support, reaching out to an AI chatbot is better than nothing and can help point them in the right direction.
If you can't run a model locally, at least use anonymous/no-account websites, do so strictly through Tor Browser, and don't share any identifiers (locations/names/etc).

4 Likes

Exceptions always exist. Yes, I would of course want a person in crisis to use it.

I think that is an extreme case; someone doesn't need to be in crisis or actively suicidal to talk or learn about whatever issues they're having.

1 Like

I mean.. as long as no PI or PII is shared, you should be good.

I’ve heard about local models. Would they be better from a data privacy perspective?

Also for some reason I thought Tor would break ChatGPT.

Yes, tools like Ollama, Llamafile, and PocketPal process responses entirely on device.
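To make "entirely on device" concrete: with Ollama, for example, the model is served over a local HTTP API (port 11434 by default), so the prompt and the response never leave your machine. A minimal Python sketch, assuming Ollama is installed, `ollama serve` is running, and a model such as `llama3` has been pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint; the request stays on localhost,
# so nothing in this exchange is sent to a remote server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    # stream=False asks for one complete JSON reply instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(model: str, prompt: str) -> str:
    """Send a prompt to the locally running Ollama server, return its reply."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Usage (requires a running Ollama server with the model pulled):
#   ask_local_model("llama3", "What are the privacy benefits of local LLMs?")
```

The model name and the question are just examples; swap in whatever model you have pulled locally.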

3 Likes

Absolutely!

You can also use and pay for NanoGPT anonymously, and it supports a pay-as-you-go model, so no subscriptions. Any model is available on it. It's a privacy-focused front end for all the commercial generative AI models.

Didn't know about PocketPal. Good to know! Thanks

Duck.ai if you don’t need to share any personal info.

I don’t even know where using personal info would be useful. I replace names with random ones when I need to.
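The name-swapping habit described above can be automated so you don't slip up when pasting longer text. A minimal sketch (the name list and placeholder format are illustrative, not a real library):

```python
import re

def redact_names(text: str, names: list[str]) -> str:
    """Replace each listed name with a consistent placeholder like <NAME_1>,
    so the same person stays the same placeholder throughout the text."""
    for i, name in enumerate(names, start=1):
        # \b keeps "Ann" from matching inside "Annual"; IGNORECASE
        # catches lowercase occurrences too.
        text = re.sub(
            rf"\b{re.escape(name)}\b",
            f"<NAME_{i}>",
            text,
            flags=re.IGNORECASE,
        )
    return text

# Example:
#   redact_names("Alice told Bob about Alice's job.", ["Alice", "Bob"])
#   -> "<NAME_1> told <NAME_2> about <NAME_1>'s job."
```

This only covers names you list explicitly; it won't catch other identifiers like locations or dates, so it complements rather than replaces the habit of not sharing personal details in the first place.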