Whether or not you decide to trust Apple is certainly a question worth asking, but this project is still worth knowing about.
Apple devices come with a built-in feature called Apple Intelligence, which runs on-device LLM inference to power Siri and other Apple AI features. It includes a ~3B-parameter SLM, and someone built a CLI tool and API to access it directly.
The API is OpenAI-compatible, which means you can use it alongside Open WebUI, Jan, or any other private LLM app. You can think of it as an Ollama alternative that gives you access to Apple's on-device models.
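Since the API speaks the OpenAI chat-completions format, any client that lets you override the base URL should work. Here is a minimal stdlib-only sketch; the host, port, and model name are placeholders I made up, so check the project's README for the real values:

```python
import json
import urllib.request

# Assumptions: the base URL and model name are placeholders, not the
# project's actual defaults -- consult its documentation.
BASE_URL = "http://127.0.0.1:8080/v1"
MODEL = "apple-on-device"

def build_payload(prompt: str) -> dict:
    """Build a standard OpenAI-style chat-completions request body."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str) -> str:
    """POST one user message and return the assistant's reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]

# Requires the local server to be running:
# print(chat("Summarize this note in one sentence."))
```

Because the request/response shape matches OpenAI's, the same endpoint should also work when plugged into Open WebUI or Jan as a custom OpenAI-compatible connection.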
Edit: Just for clarity, Apple's LLM is extremely limited, so Ollama with an open-weights model will be superior in almost every way. I'm mostly sharing it because I think it's a cool project (not mine) and maybe someone else wants to play with it as well.