
Claude Code.
No API key.

Run Claude Code and Aider with a 122B model running entirely on your Mac. No OpenAI bill, no cloud, no data leaving your device. Install and go.

macOS 14+ Sonoma · Apple Silicon · Free & open source

🔒 100% Local ⚡ 42+ tok/s 🧠 Up to 122B params 🌐 P2P Mesh
Everything runs on your Mac

No API keys, no cloud bills, no data leaving your device. Just your hardware doing what it was designed for.

🔒

Zero Cloud

Your prompts, your data, your models. Nothing ever touches a server. No accounts, no telemetry, no tracking.

⚡

MLX + llama.cpp

Auto-benchmarks both runtimes and picks the fastest for your hardware. M5 Max hits 42 tok/s on 122B MoE.

🌐

Distributed Mesh

Bonjour auto-discovery connects your Macs. The big GPU becomes the brain, laptops become relays. Zero config.

🛠

OpenAI-Compatible

Drop-in replacement API on localhost:4001. Works with Claude Code, Aider, Cursor, Continue, and any OpenAI client.
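As a sketch of what "OpenAI-compatible" means in practice: the snippet below builds a standard chat-completions payload and POSTs it to localhost:4001 using only the Python standard library, so no SDK is required. The model name qwen3.5-122b is an assumption for illustration; NOU advertises its own model IDs, and any OpenAI client pointed at this base URL sends the same request shape.

```python
import json
import urllib.request

# Hypothetical model ID for illustration; NOU exposes its own model names.
payload = {
    "model": "qwen3.5-122b",
    "messages": [{"role": "user", "content": "Write a haiku about local AI."}],
}

req = urllib.request.Request(
    "http://localhost:4001/v1/chat/completions",  # NOU's local endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-local",  # any non-empty key works locally
    },
)

print(json.dumps(payload, indent=2))  # the standard OpenAI request shape

try:
    with urllib.request.urlopen(req, timeout=2) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
except OSError:
    print("NOU is not running on localhost:4001")
```

The same payload works from curl, Aider, Continue, or any other OpenAI client; only the base URL changes.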

🧠

Smart Router

Routes simple prompts to the fast 35B model, complex ones to the 122B. You just use "auto" as the model name.
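From the client's point of view, smart routing is invisible: every request names "auto" as the model, and NOU decides server-side whether the 35B or the 122B handles it. A minimal sketch (the helper function is ours, not part of NOU):

```python
import json

def chat_payload(prompt: str) -> dict:
    # With NOU's smart router the client always asks for "auto";
    # the server picks the 35B or 122B model based on prompt complexity.
    return {
        "model": "auto",
        "messages": [{"role": "user", "content": prompt}],
    }

simple = chat_payload("Rename this variable to snake_case.")
complex_ = chat_payload("Refactor this 2,000-line module into services.")

# Both requests are identical on the wire except for the prompt text;
# model selection happens inside NOU, not in the client.
print(json.dumps(simple, indent=2))
```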

📱

iOS Remote

QR-code pairing. Use your Mac's GPU from your iPhone. Voice input, chat UI, all powered by your own hardware.

Two lines to get started

Point any OpenAI-compatible tool at localhost:4001 and start coding.

# 1. Set two env vars
export ANTHROPIC_BASE_URL=http://localhost:4001
export ANTHROPIC_API_KEY=sk-local

# 2. Use Claude Code as usual
claude
✓ Connected to NOU (🌲 Root, 128GB)
✓ Qwen3.5-122B-A10B via MLX — 42 tok/s

# Works with Aider too
OPENAI_API_BASE=http://localhost:4001/v1 aider
Every Mac has a role

NOU auto-detects your hardware tier and configures itself. Install on all your Macs — they find each other.

🌱

Seed

≤ 8 GB

iPhone, iPad. Chat with AI using your Mac's power — just pair and talk.

🌿

Branch

≤ 24 GB

MacBook Air. Run small tasks locally, offload heavy prompts to your bigger Mac.

🌳

Trunk

≤ 96 GB

MacBook Pro. Run most models yourself — code completion, analysis, chat.

🌲

Root

> 96 GB

Mac Studio / Pro. GPT-4-class output entirely on your desk. Serve your whole team.
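The thresholds above suggest a simple mapping from unified memory to role. A minimal sketch of that mapping, using the boundaries from the table (function name and exact boundary handling are assumptions; NOU's real detection may differ):

```python
def nou_tier(ram_gb: float) -> str:
    """Map unified memory (GB) to a NOU tier, per the table above."""
    if ram_gb <= 8:
        return "🌱 Seed"      # iPhone, iPad
    if ram_gb <= 24:
        return "🌿 Branch"    # MacBook Air
    if ram_gb <= 96:
        return "🌳 Trunk"     # MacBook Pro
    return "🌲 Root"          # Mac Studio / Pro

for gb in (8, 16, 64, 128):
    print(f"{gb:>3} GB → {nou_tier(gb)}")
```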

Why NOU over Ollama?

NOU

  • Menu bar app — zero CLI needed
  • Auto-discovers other Macs on your network
  • MLX + llama.cpp with auto-benchmark
  • Smart routing (auto-picks model by complexity)
  • iOS companion with QR pairing
  • Built-in dashboard & DePIN network
  • Anthropic API compatible (Claude Code works)

Ollama

  • CLI + API only (GUI requires third-party apps)
  • Single machine — no multi-node networking
  • One runtime (no MLX, no auto-benchmark)
  • No smart routing — you pick the model manually
  • No iOS companion app
  • No dashboard or real-time stats UI
  • No Anthropic-format API (Claude Code needs extra proxy)

Local AI, your way

NOU and Koe are part of the Enabler family. Both run 100% locally, respect your privacy, and are free and open source. Use them together for a complete local AI experience.

More from EnablerDAO →

Get NOU

Free, open-source, yours forever. Drop into /Applications and go.

Also available: Koe v2.9.0