Argo is a local-first, minimalistic desktop client for threaded conversations with your own local LLMs via Ollama. Built with Tauri, Rust, React, and SQLite, Argo runs entirely offline — prioritizing speed, privacy, and control.
This project assumes Ollama is installed and running locally with at least one model available (e.g., LLaMA3, Qwen).
- ⚡ Streaming responses from any Ollama-supported local model
- 🧵 Threaded conversations with persistent local history
- 🧠 Local model picker (choose between your installed Ollama models)
- 📝 Markdown rendering for rich responses
- 🌗 Light/dark mode support
- 💾 SQLite persistence for threads and messages — fully offline
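Streaming responses arrive from Ollama as newline-delimited JSON, where each line is a chunk object and the final one carries `"done": true`. As a minimal sketch (field names follow Ollama's documented chat response shape; the helper name is illustrative, not Argo's code), accumulating the assistant text looks like:

```typescript
// Each NDJSON line is a JSON object; partial assistant text lives in
// message.content, and the terminal chunk has done: true.
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

function collectStream(ndjson: string): string {
  let text = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const chunk: ChatChunk = JSON.parse(line);
    if (chunk.message) text += chunk.message.content;
  }
  return text;
}
```

In the real app the chunks would be consumed incrementally as they stream in, rather than joined into one string first.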
- Download the latest release for your OS from the Releases page.
- Unzip or install the app as appropriate:
  - `.dmg` for macOS (Intel recommended)
  - `.exe` or `.msi` for Windows
  - `.AppImage`, `.deb`, or `.rpm` for Linux
- Ensure Ollama is installed and running locally with at least one model (e.g., `llama3`, `qwen`).
- Launch the Argo app — you're ready to chat.
⚠️ macOS Apple Silicon builds (aarch64) may currently be flagged as "damaged" due to lack of code signing. Use the Intel build (x64) if needed.
- Install Ollama and ensure it's running locally.
- Install Tauri dependencies.
- Clone this repo and install JavaScript dependencies: `pnpm install`
- Run the app in dev mode: `pnpm tauri dev`
- To build a local desktop binary: `pnpm tauri build`
- Follow Tauri's OS-specific setup instructions to install the built app.
| Layer | Stack |
|---|---|
| Frontend | React + TypeScript + Mantine + TanStack Query (inside Tauri) |
| Backend | Rust + Tauri commands + `sqlx` for async SQLite |
| Model API | Ollama — local LLMs like LLaMA3, Qwen, etc. |
| Storage | SQLite — for threads, messages, and (soon) vector embeddings |
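Threads and messages persist in SQLite and surface in the React layer. As an illustration of that data model (the type and field names below are assumptions for the sketch, not Argo's actual schema), the frontend might group fetched rows like this:

```typescript
// Hypothetical shapes mirroring the threads/messages tables
// (illustrative names, not Argo's real schema).
interface Thread {
  id: number;
  title: string;
}
interface Message {
  id: number;
  threadId: number;
  role: "user" | "assistant";
  content: string;
}

// Group a flat list of message rows by thread id, as a UI layer
// might do after querying the backend.
function byThread(messages: Message[]): Map<number, Message[]> {
  const map = new Map<number, Message[]>();
  for (const m of messages) {
    const list = map.get(m.threadId) ?? [];
    list.push(m);
    map.set(m.threadId, list);
  }
  return map;
}
```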
What's next? These features are planned or in progress:
- Downloadable app releases (installers per OS)
- Chat with uploaded `.txt` and `.md` files
- RAG (Retrieval-Augmented Generation) on uploaded content
- Settings sidebar to tune model parameters (temperature, top_p, etc.)
- MCP integration for tool calls using the Model Context Protocol
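A settings sidebar would presumably map UI controls onto Ollama's per-request `options` object. `temperature` and `top_p` are documented Ollama option keys; the helper and its defaults below are a hypothetical sketch, not Argo's implementation:

```typescript
// Sketch: assemble a request body with tunable sampling options,
// following Ollama's request format. Defaults here are illustrative.
interface ModelOptions {
  temperature: number;
  top_p: number;
}

function buildRequest(
  model: string,
  prompt: string,
  opts: Partial<ModelOptions> = {}
) {
  return {
    model,
    prompt,
    options: { temperature: 0.8, top_p: 0.9, ...opts },
  };
}
```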
Argo is designed to be:
- Small — fast and lean with no server-side dependencies
- Private — everything runs locally; your data never leaves your machine
- Transparent — SQLite, open APIs, no tracking, no mystery boxes
It's a thoughtful tool designed to help you explore ideas, reflect, and create — entirely on your own terms.
Contributions are welcome! See the roadmap or open an issue to suggest features or improvements.
MIT