Private online AI (Venice, Duck.ai, etc.)

No problem, because they’re not sensitive documents. They’re book pages I need to transcribe, both from PDFs and from photos taken with my phone. I’ve tried different tools but haven’t found any local or web LLM that handles images and documents the way ChatGPT does. Duck.ai, for example, lets you upload images, but the resolution is awful.

And I know, I’ve tried Claude and DeepSeek for text transcription, but neither of them works as well as ChatGPT 5 or 4o.
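For reference, this is roughly the kind of call I’m hoping to automate: a minimal sketch that sends one page photo to a vision-capable model through an OpenAI-compatible chat endpoint. The base URL, API key, model name, and file path below are placeholders, not any specific provider’s values.

```python
# Minimal sketch: send one page photo to a vision-capable model through an
# OpenAI-compatible chat endpoint and print the transcription.
# BASE_URL, API_KEY, MODEL, and the file path are placeholders (assumptions).
import base64
import requests

BASE_URL = "https://api.example.com/v1"  # placeholder: whichever provider you use
API_KEY = "sk-..."                       # placeholder
MODEL = "gpt-4o"                         # placeholder: any vision-capable model

# Read the page photo and encode it as base64 for an inline data URL.
with open("page_001.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": "Transcribe this book page verbatim."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```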

NanoGPT should have many other LLMs for all kinds of purposes and specific use cases too. I feel you’d have better luck there.

Thanks for the replies! What makes NanoGPT superior to Duck.ai, for instance?

Well, it’s a pay-as-you-go model. No subscription. Any AI model is available, including all the premium ones. You can pay for it, access it, and use it as privately as you want, even anonymously. You don’t even need an email and password to sign in; you can use a single sign-in token, the way Mullvad or IVPN handle their accounts by default.

Having access to any and all premium AI models gives you immense freedom to use the one that fits the purpose and use case you’re using it for.

Superior results, plus the other privacy-respecting things about it (which you can read up on yourself instead of me listing them all), make it superior to the Duck.ai front end.

Thanks for your insightful answer! From what I’m reading, it would be superior in terms of flexibility, since you can pick any models you want, but strictly in terms of privacy it seems to accomplish the same results as Duck.ai, right?

I would say it does more, or at least it feels like more. Check it out and give it a shot. Read their FAQs; they should give you a better idea of the privacy of NanoGPT as a front end for all models.

It’s more or less the same, but NanoGPT is just better at it all, if you ask me. I mean, I could use local LLMs and claim they’re even more private than Duck or Nano, but that doesn’t mean much when they’re useless for the quality of results you want.

So… you have to evaluate it accordingly.

While not a private option, some services (such as vast.ai) allow you to rent GPU usage. These platforms provide access to isolated, unprivileged Linux containers where you can run inference tasks. Billed by the hour at relatively affordable prices, this setup is supposedly designed to offer the same level of data protection that enterprises typically require.
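As a rough sketch of what that looks like in practice, assuming you start an OpenAI-compatible inference server (e.g., vLLM or llama.cpp’s llama-server) inside the rented container and forward its port to your machine over SSH; the port, model name, and prompt are placeholders:

```python
# Minimal sketch, assuming an OpenAI-compatible server (e.g. vLLM or llama.cpp's
# llama-server) is already running inside the rented container and its port has
# been forwarded locally, e.g.:  ssh -L 8000:localhost:8000 user@<rented-instance>
# The port, model name, and prompt are placeholders (assumptions).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",            # forwarded port of the rented box
    api_key="not-needed-for-a-self-hosted-server",  # dummy value; self-hosted server ignores it
)

resp = client.chat.completions.create(
    model="llama-3.1-8b-instruct",  # placeholder: whatever model you loaded
    messages=[{"role": "user", "content": "Transcribe the attached page text verbatim."}],
)
print(resp.choices[0].message.content)
```

Pointing the standard client at a forwarded local port keeps the traffic inside the SSH tunnel instead of sending it to a third-party API.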

Could you elaborate? If you don’t use PII, it’s fine, no?

Also, the ones recommended in this thread all say stuff like:

Maple AI:

Your security is our primary concern. We’ve engineered Maple to ensure your data remains yours alone, protected by cutting-edge encryption:

  • All communications are encrypted locally on your device before being transmitted, ensuring your data is secured from the start.
  • Our servers can’t read your data. We use secure enclaves in confidential computing environments to verify our infrastructure integrity.
  • Even during AI processing, your data remains encrypted. The entire pipeline through to the GPU is designed with privacy as the priority.

Venice:

  • Private AI: Our privacy architecture keeps your AI prompts 100% private. All data stays on your device, not our servers.
  • Privately Access Leading Open-Source Models
  • Venice was founded on the principle that civilization is best served by powerful machine intelligence when it respects the sovereignty of those who use it. Therefore, it must be private by default, it must permit free and open thought, and it must be based on the world’s leading open-source technologies. Here, your conversations and creations belong to you alone, not to corporations, not to governments, and not to us. In this, Venice stands alone among its peers.
  • We’ve created a platform that’s faster, simpler, and more responsive to the latest advancements in AI. No accounts required, no downloads, no data collection – just powerful AI at your fingertips.
  • We believe AI should enhance human capability while respecting human dignity. It should be a tool for unrestricted exploration and creation, not surveillance and control. Venice exists to make this vision a reality.
  • All your content - prompts, responses, images, document uploads - is encrypted in your local browser and never stored on Venice’s servers.

NanoGPT makes similar claims.

Basically, from my understanding, these are all like Duck.ai?

Meaning, if you don’t use PII, it’s fine to use them, correct?

I think this would depend on where you stand in how you think about privacy.

I would say no. NanoGPT is objectively superior given all that it provides at higher quality, of course with a pay-per-use structure.

I am currently paying €23 for ChatGPT Plus and have been satisfied with its service for the past two months. But after looking at some options like Duck.ai and NanoGPT (on the subscription model), I was surprised to see that they charge half the price while offering more AI models (including ChatGPT 5, which I like the most).

So I am curious whether there are hidden fees or caveats to using these “AI aggregators” compared to paying for ChatGPT directly.

Not really. For example, NanoGPT is a pay-as-you-go model. Plus, you get to pay for it and use it as privately and anonymously as you want. Subscribing directly to any LLM through the company providing it is never private. Give it a shot with a $5 purchase or something to see how it works and how much you end up using all the LLMs (free, open source, premium, bleeding edge, etc.) it makes available.

I’m very confident you’ll be better off using this instead.

Using it through the API actually has some privacy advantages. At least under the Zero Data Retention policy, no data is stored. This is what DuckDuckGo uses. It isn’t clear what NanoGPT uses; we don’t even know what their free model is.

For all other access methods, including normal API calls, your data is stored indefinitely (even deleted chats) thanks to a court order: OpenAI confronts user panic over court-ordered retention of ChatGPT logs - Ars Technica

Thanks, both! I’ll give it a shot. But I’m also curious how they manage to charge half of what OpenAI charges. Are there more limited calls to OpenAI, longer response times, etc.? Curious to hear your experience with it. In the meantime, I will also scour the net to see what others’ experiences are.

Well, ChatGPT Plus has an arbitrary price.

Here, they use the API, so they probably set a price based on average user usage.
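As a back-of-the-envelope illustration (the per-token rates and monthly usage figures below are assumptions made up for the example, not any provider’s actual prices):

```python
# Back-of-the-envelope comparison of a flat subscription vs. pay-per-token API
# pricing. All rates and usage figures are assumptions for illustration only.
SUBSCRIPTION_PER_MONTH = 23.00  # flat plan price mentioned in the thread

INPUT_RATE = 2.50            # assumed cost per 1M input tokens
OUTPUT_RATE = 10.00          # assumed cost per 1M output tokens
INPUT_TOKENS = 2_000_000     # assumed monthly input tokens (~2M)
OUTPUT_TOKENS = 500_000      # assumed monthly output tokens (~0.5M)

api_cost = (INPUT_TOKENS / 1e6) * INPUT_RATE + (OUTPUT_TOKENS / 1e6) * OUTPUT_RATE
print(f"pay-per-token: {api_cost:.2f}  vs  flat subscription: {SUBSCRIPTION_PER_MONTH:.2f}")
# -> pay-per-token: 10.00  vs  flat subscription: 23.00
```

Under assumptions like these, a reseller paying per token can charge a typical user less than a flat subscription and still keep a margin; heavy users would tip the comparison the other way.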

The only downside I see is that ChatGPT Plus sometimes gets early access to new models, and the API gets them last.

I just use the free options on Duck.ai and it works perfectly for all my intents and purposes (mostly Linux tech support).
