You can pay for it and use it privately and anonymously. Plus, going through NanoGPT ensures your queries are not stored by the provider of the AI model being used. These benefits are worth it, if you ask me.
Well, I conflated usage with access. As long as you aren't going to share any PII (which you won't), private usage is effectively private access here. Technically the distinction only matters if you do plan to share PII (which is never a good idea).
I mean, you're choosing to use generative AI LLMs. You should know, and it seems like you do, what you're getting into: any data you input may be used by the company for whatever purpose. But as long as you don't share any PII, I see no reason not to use it if it helps you out.
Is it a deal breaker? Depends on what deal you're expecting it to keep. From a privacy POV, this is about the best you can expect, because AI is inherently privacy-invasive; Lumo and Apple (with their Private Cloud Compute) are the best solutions I can think of for such a tool.
If you have the hardware, you could also consider running a local model. While local models may not match the full prowess of ChatGPT 5.1, they are good for privacy on more common queries, and in situations that require the raw power of NanoGPT you can fall back to it.
Otherwise, if cloud is the preferred option, then Lumo, Apple, or NanoGPT are likely fine as long as you don't share PII. The caveat is that you have to stay on top of yourself so nothing slips out to these LLMs.