Ente made a local LLM chat

Ente built a bunch of projects as part of its hackathon (most were fun one-offs), but the one that stood out was its local LLM chat - https://ensu.ente.io/

It downloads a Llama model on first visit, making it easy for anyone to have on-device AI chat.

The full announcement from Discord follows:

Hey @everyone,

We held our first hackathon at Ente today. Sharing some of the projects that were built.

drip.ente.io

Swap fits, shades, and vibes to make one of 18,144 duckies.

:fire: roastmy.photo

You vs 4 AIs. No mercy, no filters. Vote for the roast that hurt the most.

:crossed_swords: versus.space

It’s no longer a poll, it is a WAR.

:speech_balloon: ensu.ente.io

Run LLMs locally in your browser. 100% private, on-device AI chat powered by WebGPU.

:framed_picture: ente offline

Android app that lets you experience all of Ente’s machine learning goodies, without an account.

:selfie: befa.ke

Post unhinged captions about your most mundane photos. Destroy Instagram.


Happy holidays!

PS: You can add custom models

It’s super cool that WebGPU exists (I didn’t know about it), but it’s not very practical for most users. The ensu team was nice enough to leave the app’s source unminified, so I can see exactly what they’re doing, and it was not meant to be anything serious. They quickly vibe coded the site, and it’s a very simple wrapper around mlc-ai/web-llm (https://github.com/mlc-ai/web-llm), a high-performance in-browser LLM inference engine. The web-llm folks have a much nicer demo site built with their library here: https://chat.webllm.ai/
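For a sense of how thin such a wrapper can be, here is a minimal sketch assuming web-llm’s OpenAI-style API (`CreateMLCEngine` and `engine.chat.completions.create`). The function names are mine, and the model ID is just one example from web-llm’s prebuilt list, so check the web-llm docs before copying:

```typescript
// Sketch of a minimal web-llm wrapper. Helper names and model ID are
// illustrative, not taken from the ensu source.

// Pure helper: build an OpenAI-style message array for the chat call.
function buildMessages(userText: string) {
  return [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: userText },
  ];
}

// Browser-only: downloads (and caches) the model weights on first visit,
// compiles them for WebGPU, then answers the prompt. Requires a
// WebGPU-capable browser to actually run.
async function chatInBrowser(userText: string): Promise<string> {
  const pkg = "@mlc-ai/web-llm"; // loaded lazily so this file parses anywhere
  const { CreateMLCEngine } = await import(pkg);
  const engine = await CreateMLCEngine("Llama-3.2-1B-Instruct-q4f16_1-MLC", {
    initProgressCallback: (p: { text: string }) => console.log(p.text),
  });
  const reply = await engine.chat.completions.create({
    messages: buildMessages(userText),
  });
  return reply.choices[0].message.content ?? "";
}
```

The heavy lifting (model download, caching, WebGPU compilation) all happens inside `CreateMLCEngine`, which is why the wrapper around it can stay so small.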

I think the problem with WebGPU-based apps like this is that handheld devices are too weak for local inference, and if you’re on a desktop you may as well use a more integrated native tool, which has real advantages. I say this because I built my own local LLM desktop app (Gerbil). What I personally do is run an LLM locally through my app with the “Remote Tunnel” (via cloudflared) setting enabled, so I can chat with it when I’m away.