Disclaimer: Firefox Nightly is the alpha (pre-beta) release of Firefox. Not all Nightly experimental features make it into Firefox, and the feature I’m going to talk about is still hidden even in Nightly (though exposed in about:config), so consider it pre-pre-beta and a rough draft.
TL;DR for those who don’t want to read my walls of text: an opt-in AI integration in Firefox Nightly lets you plug in your own LLM, offline, on-device, and privately.
Good news today for Firefox Nightly testers interested in AI/ML.
Mozilla developers have introduced an optional, opt-in experiment for Nightly testers that integrates an AI chatbot into the browser sidebar. This wouldn’t be good privacy news if not for one sentence mentioned briefly at the end of the announcement:
Nightly can be configured by advanced testers to use custom prompts and any compatible chatbot, such as llamafile, which runs on-device open models, including open-source ones. We are excited for the community to share interesting prompts, chatbots, and models as we make this a better user experience. We are also looking at how we can provide an easy-to-set-up option for a private, fully local chatbot as an alternative to using third-party providers.
If the above isn’t clear: Firefox is making it possible to integrate your own locally hosted, private LLM into the browser, running on your own hardware!
This is exciting news to me, and it’s one example of why I’ve supported and defended Mozilla’s choice to get involved in the AI space early on. I hope other browsers will follow suit!
This works with llamafile, Mozilla’s own format for packaging offline LLMs, as well as whatever other offline (and possibly custom online) LLM software you use; you just have to point the browser at the right ip:port. I’m currently testing it with Ollama/Open WebUI, the software I already use, and Llama-3-8B.
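For reference, here is roughly what that looks like on my end. The pref names below are the ones exposed in about:config on my Nightly build (they may change or disappear as the experiment evolves), and the provider URL assumes Open WebUI on the port it commonly uses when run in Docker; adjust it to wherever your local backend actually listens:

```
browser.ml.chat.enabled        true                    (turn on the sidebar chatbot experiment)
browser.ml.chat.hideLocalhost  false                   (allow localhost providers to be selected)
browser.ml.chat.provider       http://localhost:3000   (point this at your local frontend)
```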
Screenshot from my testing (updated to correct my mistake):
more screenshots of the UI
Because it works with whatever backend you already use or set up, this approach offers a lot of flexibility and potential, and much more user control and privacy.
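As a sanity check before pointing Firefox at anything, it helps to confirm the local backend is actually answering. Here is a minimal Python sketch that queries Ollama’s HTTP API directly, assuming Ollama is running on its default port 11434 and has a llama3 model pulled; Open WebUI or a llamafile would be checked the same way against their own ports:

```python
import json
import urllib.request

# Ollama's default local endpoint; adjust host/port if your backend differs.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = json.dumps({
    "model": "llama3",          # assumes `ollama pull llama3` has been run
    "prompt": "Say hello in one short sentence.",
    "stream": False,            # return a single JSON response instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())

# If this prints a sensible reply, the local LLM is up and reachable.
print(reply["response"])
```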
There are other potentially cool things this approach enables. It appears possible to pair this setup with SearXNG, integrating locally hosted search into the locally hosted LLM, all accessible right from the browser!
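I haven’t wired that part up fully yet, but as a rough, untested sketch (the images and ports are the stock containers for each project; the search hookup itself is an assumption, not something I’ve verified), a local stack might look like this:

```yaml
# Untested sketch: Ollama (LLM runtime), Open WebUI (the chat frontend Firefox
# points at), and SearXNG (local metasearch) running side by side.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama

  searxng:
    image: searxng/searxng
    ports:
      - "8888:8080"            # local search UI/API on http://localhost:8888

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"            # the URL to give Firefox: http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
      - searxng

volumes:
  ollama:
```

The idea would be to point the frontend’s web-search settings at the SearXNG container, so the chatbot in the sidebar can pull in results from your own search instance rather than a third-party engine.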