Message your local AI from anywhere.

Self-hosted relay + CLI that bridges your phone to a local ollama agent. Zero cloud. Zero API keys. Nothing ever leaves your network.

coming soon — get notified

The problem

Every "talk to your AI from your phone" setup routes your messages through someone else's server — Anthropic, OpenAI, a third-party relay. If you care about local-first, that's the opposite of what you want.

And if you run a long-lived local agent — a cron job that summarizes your day, a shell monitor, a writing assistant — there's no clean way to text it from the train.

The shape

[phone] ──POST──▶ [relay on your box] ◀──poll── [CLI] ──▶ [ollama]
                         │                       │
                         └────── reply ──────────┘

Four hundred lines of Node.js. Two files. Zero dependencies beyond the standard library. Your phone → your relay on hardware you own → your local ollama → back to your phone. Every hop stays on your network.
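The CLI hop could look something like this sketch: poll the relay, run each message through ollama's /api/generate endpoint, post the answer back. The relay endpoint names, the token, and the model name are illustrative assumptions; only the ollama request/response shape is the real API.

```javascript
// Hypothetical sketch of the CLI's poll loop. Needs Node 18+ for the
// built-in fetch; no other dependencies.
const RELAY = process.env.RELAY_URL || 'http://127.0.0.1:8377';
const TOKEN = process.env.RELAY_TOKEN || '';
const OLLAMA = process.env.OLLAMA_URL || 'http://127.0.0.1:11434';

function generatePayload(model, prompt) {
  // Request body for ollama's non-streaming /api/generate endpoint,
  // which responds with { "response": "..." }.
  return JSON.stringify({ model, prompt, stream: false });
}

async function askOllama(prompt) {
  const res = await fetch(`${OLLAMA}/api/generate`, {
    method: 'POST',
    body: generatePayload('llama3', prompt), // model name is a placeholder
  });
  return (await res.json()).response;
}

async function tick() {
  // Drain whatever the phone has posted since the last poll...
  const res = await fetch(`${RELAY}/poll`, {
    headers: { authorization: `Bearer ${TOKEN}` },
  });
  for (const msg of await res.json()) {
    // ...run each message through the local model, then post the
    // answer back for the phone to fetch.
    await fetch(`${RELAY}/reply`, {
      method: 'POST',
      headers: { authorization: `Bearer ${TOKEN}` },
      body: await askOllama(msg.text),
    });
  }
}

// To run: setInterval(tick, 3000);
```

Plain short-interval polling keeps the relay dead simple; long-polling or websockets would cut latency at the cost of holding connections open.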

What's in the box

Who it's for

Get notified when it ships

I'll send one email when the v0.1.0 release is out. No newsletter. No marketing. One email, then nothing.

◇ $5 — full source: CLI + server + systemd units + README
◇ source-available — read every line, can't redistribute

Honest limitations

Built by Florent Herisson and Aria. Part of a larger experiment in local-first AI.

Questions? Reply to the notification email when it lands.