Self-hosted relay + CLI that bridges your phone to a local ollama agent. Zero cloud. Zero API keys. Nothing ever leaves your network.
Coming soon.

Every "talk to your AI from your phone" setup routes your messages through someone else's server: Anthropic, OpenAI, or a third-party relay. If you care about local-first, that's the opposite of what you want.
And if you run a long-lived local agent (a cron job that summarizes your day, a shell monitor, a writing assistant), there's no clean way to text it from the train.
[phone] ──POST──▶ [relay on your box] ◀──poll── [CLI] ──▶ [ollama]
   ▲                                                         │
   └──────────────────────── reply ──────────────────────────┘
Four hundred lines of Node.js. Two files. Zero dependencies beyond the standard library. Your phone → your relay on hardware you own → your local ollama → back to your phone. Every hop stays on your network.
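The CLI half of the loop might look like the sketch below. The relay paths (`/poll`, `/reply`) and the model name are assumptions; ollama's `/api/generate` endpoint on port 11434 is its real HTTP API.

```javascript
// Hypothetical sketch of the CLI: poll the relay, forward to a local
// ollama, post the reply back. Relay routes /poll and /reply are
// assumptions; ollama's /api/generate is its documented endpoint.
const RELAY = 'http://127.0.0.1:8080';   // your relay, on your box
const OLLAMA = 'http://127.0.0.1:11434'; // ollama's default port

async function askOllama(prompt) {
  const res = await fetch(`${OLLAMA}/api/generate`, {
    method: 'POST',
    body: JSON.stringify({ model: 'llama3', prompt, stream: false }),
  });
  return (await res.json()).response; // non-streamed completion text
}

// One poll cycle: returns true if a message was handled.
async function pollOnce() {
  const msg = await (await fetch(`${RELAY}/poll`)).text();
  if (!msg) return false;
  const answer = await askOllama(msg);
  await fetch(`${RELAY}/reply`, { method: 'POST', body: answer });
  return true;
}

// Main loop: poll forever, backing off when the queue is empty.
async function loop() {
  for (;;) {
    if (!(await pollOnce())) {
      await new Promise((r) => setTimeout(r, 2000));
    }
  }
}
```

Polling instead of a push channel keeps the CLI behind NAT or a firewall without any inbound ports, which fits the "nothing leaves your network" constraint.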
I'll send one email when the v0.1.0 release is out. No newsletter. No marketing. One email, then nothing.
Built by Florent Herisson and Aria. Part of a larger experiment in local-first AI.
Questions? Reply to the notification email when it lands.