ChatGPT storing information

I was browsing a privacy subreddit and came across someone’s post. Their claims about ChatGPT (particularly its use for therapy or discussing personal things) actually motivated me to request my data and delete my accounts (at least I got some insight from them?):

As I just said in another thread: don’t forget that ALL of the data is stored forever, is being used to profile you, and could plausibly be leveraged against you in the future. This huge push to use AI as a therapist is all over Reddit for a reason.

Do you think it’s true? I’m torn between it sounding very plausible and questioning the reliability of someone so privacy-obsessed that they think anyone who says otherwise is a “bot”.

LLMs need all the training data they can get; it’s basically the fuel that makes them better.
I strongly suspect that almost all of them never delete anything, despite any “do not use my data to train” settings or deletion requests.

3 Likes

Whether it is true or not makes little difference when you are not supposed to share any PI or PII with any AI anyway.

Cautiously use AI for help you actually need, but never for mental health or anything personal like that. Use it as a tool for utilitarian purposes only.

And yes, any AI will indeed store any info fed into it. There are, or may be, some exceptions via the API: if you use another front end for the service, you can often select an option so your queries are used only to generate answers, not as training data. But even then, I’d not trust it fully.
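For anyone curious what that looks like in practice, here’s a minimal sketch of routing requests through an OpenAI-compatible front end via the API. The base URL, key, and model name below are placeholders, and whether your prompts actually stay out of training data still depends entirely on the provider’s policy, not on anything in this code:

```python
# Minimal sketch: using an API front end instead of a consumer chat app.
# ASSUMPTIONS: the endpoint below is hypothetical; any "no training on
# your data" guarantee comes from the provider's policy, not this code.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-frontend.invalid/v1",  # placeholder front end
    api_key="YOUR_API_KEY",                          # placeholder key
)

reply = client.chat.completions.create(
    model="some-model",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize this article: ..."}],
)
print(reply.choices[0].message.content)
```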

Too late for that, I’ve already done so. On a few different platforms.

Well, don’t do it anymore. It’s never too late to get better.

If someone is lacking support, reaching out to an AI chatbot is better than nothing and can help point them in the right direction.
If you can’t run a model locally, at least use anonymous/no-account websites, do so strictly through Tor Browser, and don’t share any identifiers (locations/names/etc.).

5 Likes

Exceptions always exist. Yes, I would of course want a person in crisis to use it.

I think that’s an extreme case; someone doesn’t need to be in crisis or actively suicidal to talk or learn about whatever issues they’re having.

1 Like

I mean… as long as no PI or PII is shared, you should be good.

I’ve heard about local models. Would they be better from a data privacy perspective?

Also for some reason I thought Tor would break ChatGPT.

Yes, tools like Ollama, Llamafile, and PocketPal process everything entirely on-device.
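To make “entirely on device” concrete, here’s a minimal sketch of querying a local Ollama server over its loopback REST API. It assumes Ollama is installed and running on its default port and that you’ve already pulled a model (e.g. `ollama pull llama3`); the conversation never leaves localhost:

```python
# Minimal sketch: talking to a local Ollama server (assumes Ollama is
# running on its default port 11434 and a model has already been pulled).
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",   # whichever model you pulled locally
    "prompt": "I've been feeling overwhelmed lately. Any grounding techniques?",
    "stream": False,     # return one complete response instead of chunks
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # Ollama's local REST endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```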

3 Likes

Absolutely!

You can also use and pay for NanoGPT anonymously, and it supports a pay-as-you-go model, so no subscriptions. Any model is available on it. It’s a privacy-focused front end for all the commercial generative AI models.

1 Like

Didn’t know about PocketPal. Good to know, thanks!

Duck.ai is another option if you don’t need to share any personal info.

I don’t even know where using personal info would be useful. I replace names with random ones when I need to.
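That habit is easy to make systematic with a quick scrub pass before pasting anything in. A toy sketch (the names and mapping below are invented for illustration):

```python
# Toy sketch of the "swap real names for random ones" habit.
# The names and mapping here are made up for illustration.
import re

PSEUDONYMS = {
    "Alice Johnson": "Person A",
    "Bob": "Person B",
    "Springfield": "Hometown",
}

def scrub(text: str) -> str:
    """Replace known names/places with placeholders before pasting into a chatbot."""
    for real, fake in PSEUDONYMS.items():
        # \b word boundaries avoid mangling words that merely contain the name
        text = re.sub(rf"\b{re.escape(real)}\b", fake, text)
    return text

print(scrub("Bob and Alice Johnson had a falling out back in Springfield."))
# -> "Person B and Person A had a falling out back in Hometown."
```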

Thanks for the tip, gonna try them out. If you have any more info on NanoGPT, care to share? 🙂

1 Like

From personal experience, some of it was just a lack of discretion: no one else to talk to about this or that, and AI is/was a readily available receptacle for whatever was bothering me or whatever I wanted to try talking about. (I just remembered I had some… inappropriate chats with an AI on an account I’ve since scrubbed and deleted. That service claims to be more careful about deleting things, but who really knows. There’s nothing I can do now; I just have to live with it, be more careful in the future, and maybe try a data deletion request.)

For the more “therapeutic” uses, it’s not about personal info like names and birthdays, but about the details of your life and thoughts, the common example being using AI as a therapist. I’ve chatted about medical stuff, for example. Sure, you CAN say “my friend has [insert generic problem here]” and get the same generic advice a Google search can give you. It just isn’t the same as hashing something out in a personal conversation.

I know I’m far from the only person in this position, so I guess I can serve as an example.

Hell, my last chat with GPT before giving up on it was to blatantly paste in a personal vent post about something and ask it to analyze it.

Granted, I’m… a lot more conscious and careful now. Idk if my laptop will run something local, but I was going to give Alpaca a shot since llamafile won’t work. I’m thinking of giving that “non-account via Tor” setup a try for public services, just for the heck of it.

1 Like