- ddalcu: Native Zig server that runs MLX-format language models on Apple Silicon and exposes an OpenAI-compatible HTTP API. No Python.
Optional companion app: MLX Claw, a macOS menu bar app with built-in chat, agent mode, and model management.
No dependencies, a 34 MB binary, and very low RAM usage compared to other LLM runners.
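Since the server speaks the OpenAI chat-completions schema, any existing OpenAI client should work against it unchanged. A minimal sketch of what a request looks like, assuming the standard `/v1/chat/completions` route; the host, port, and model id below are placeholders, not the server's actual defaults:

```python
import json
import urllib.request

# Assumed endpoint: host, port, and route are illustrative guesses,
# not documented defaults of the server announced above.
BASE_URL = "http://localhost:8080/v1/chat/completions"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat completion request."""
    body = {
        "model": "default",  # placeholder model id
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Hello from Apple Silicon!")
# To actually send it (requires the server to be running):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
print(req.full_url)
```

Because the request body follows the OpenAI schema, the official `openai` SDK pointed at this base URL should also work as a drop-in client.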