Alternatives to ChatGPT and Bard

This has been discussed several times on the forum, but I want to collect the more privacy-friendly alternatives in one post. Keep in mind that ChatGPT usually requires a phone number, especially when you register over a VPN.

:pushpin: GPT4All: This app lets you run AI models locally, even without an internet connection. (See the sketch after this list for what that looks like in practice.)

:pushpin: FreeChat: A simple, macOS-only alternative to GPT4All.

:pushpin: LM Studio: A very user-friendly app with a better UI than the others above, but it is not open source. The models themselves are still open source and run locally.

:pushpin: Perplexity.ai and Perplexity Labs: Perplexity lets you use its platform even WITHOUT an account (it can still collect data). It also generates answers with citations to the original sources.

http://perplexity.ai/

They also have a Labs page, which offers advanced models for testing, such as Llama 70B or Perplexity's own 70B model, which I found to work very well.

https://labs.perplexity.ai/

(some models are experimental)

:pushpin: Leo: Leo is a feature built into the Brave browser, powered by Llama. The free version can be used without an account.

:pushpin: HuggingChat: Hugging Face offers various models on its page without an account (it can still collect data). However, they have recently introduced limits and may require an account after some use.

(some models are experimental)

:pushpin: Poe: If you want to test different models on one platform, you can use Quora's Poe by registering with only an email address. (Still better than using ChatGPT directly, since it does not require a phone number.)
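For GPT4All specifically, here is a minimal sketch of what local use looks like with its Python bindings. The model file name is just an example; the first run downloads the model, and after that no internet connection is needed:

```python
# Minimal sketch: running a model locally with the gpt4all Python bindings.
# The model name is only an example; GPT4All downloads the file on first use,
# and after that everything runs offline on your own machine.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # example small model (~2 GB)

with model.chat_session():
    reply = model.generate(
        "In two sentences, why is running an LLM locally better for privacy?",
        max_tokens=200,
    )
    print(reply)
```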

Edit: missing links added.

8 Likes

Still waiting for an easy-to-use open-source program that downloads AI models to use locally AND can optionally connect to Brave AI, ChatGPT, and Claude AI.

Like the one software to rule them all :upside_down_face:

I made an XMPP chatbot for LLaMA months ago.
It can do 1:1 chats or group chats.
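Roughly, the wiring looks as follows. This is a simplified sketch rather than the actual project code: it uses slixmpp for the XMPP side, and ask_llama() is just a placeholder for whatever local LLaMA call you use (llama.cpp server, GPT4All bindings, etc.). Group chats would additionally need the MUC plugin (xep_0045).

```python
# Simplified sketch of an XMPP bot that answers 1:1 chats with a local LLaMA model.
# slixmpp handles the XMPP side; ask_llama() is a placeholder for the actual
# local inference call (llama.cpp server, gpt4all bindings, etc.).
import slixmpp


def ask_llama(prompt: str) -> str:
    # Placeholder: call your locally running model here and return its reply.
    return "model reply to: " + prompt


class LlamaBot(slixmpp.ClientXMPP):
    def __init__(self, jid, password):
        super().__init__(jid, password)
        self.add_event_handler("session_start", self.start)
        self.add_event_handler("message", self.on_message)

    def start(self, event):
        # Announce presence and fetch the roster once the session is up.
        self.send_presence()
        self.get_roster()

    def on_message(self, msg):
        # Answer plain 1:1 chat messages that have a body.
        if msg["type"] in ("chat", "normal") and msg["body"]:
            msg.reply(ask_llama(msg["body"])).send()


if __name__ == "__main__":
    bot = LlamaBot("bot@example.org", "changeme")  # example credentials
    bot.connect()
    bot.process()
```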

5 Likes

I don't know if you have tried them or not, but GPT4All, FreeChat, and LM Studio do exactly what you expect. The question is whether you have enough CPU or GPU power. You can discover many open-source models with them.

Edit: The free version of Brave AI uses Llama 2 13B, and I can run that model locally on my MacBook. For Llama 70B, you can use Perplexity Labs or HuggingChat, or run it locally if you have the hardware. You can use ChatGPT with an API key in GPT4All, but that does not make much difference compared to the website. Leo (Brave AI) also offers a paid option with Claude Instant, but that model is already free if you use Poe.
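One more note on LM Studio: besides the chat UI, it can expose the loaded model through an OpenAI-compatible local server, so existing OpenAI-client code can point at your own machine instead of the cloud. A rough sketch, assuming the local server is enabled on LM Studio's default port (1234) and a model is already loaded:

```python
# Sketch: querying a local LM Studio server through its OpenAI-compatible API.
# Assumes the server is enabled in LM Studio (default port 1234) and a model
# is loaded; the request never leaves your machine.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local endpoint
    api_key="not-needed-locally",         # placeholder; the local server does not check it
)

response = client.chat.completions.create(
    model="local-model",  # LM Studio answers with whichever model is currently loaded
    messages=[
        {"role": "user", "content": "In one paragraph, why is local inference more private?"},
    ],
)
print(response.choices[0].message.content)
```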

That’s really cool!

I wanted to give Brave's AI a quick try, since it seems to be private (it supposedly runs locally and doesn't collect any data), but then you have to accept a warning that appears in the sidebar.

This sentence in particular caught my attention:

Don’t submit sensitive or private info

If it's local and doesn't collect or share any data, why the warning? I mean, it's obviously good advice anyway, and if Brave were really going after our data, there'd be little reason to "scare" people away. But it seems out of place; it almost feels like a license agreement rather than a "use responsibly" notice.

It also stops working when I disconnect from the network…

Leo does not run locally. You would notice if it did: the model is many GB in size and needs a lot of RAM and computing power.

4 Likes

Well, that's embarrassing. I understood "the AI assistant built natively in the Brave browser" to mean that it runs in the browser locally… Thanks for clearing that up!

1 Like

I really wish Brave Leo AI didn't run in the sidebar. I never use the sidebar, as it takes up so much space. Just let me activate it with an icon in the address bar or something!

1 Like

When you type anything in the address bar, the bottom-most choice is to "Ask Leo".

1 Like

It cannot run locally in the browser, because running a model locally requires appropriate hardware. I tried that model on a laptop with 8 GB of RAM, and it was too slow. It sometimes even froze when RAM usage was high, unlike using it through Brave.

2 Likes

It did sound too good to be true; I'm not sure why I was so fixated on believing that it would run 100% offline on my machine… I would love for PG to write some recommendations or articles on how to achieve that, though.

1 Like