# oli - Open Local Intelligent assistant
oli is an open-source alternative to Claude Code, built in Rust to provide powerful agentic capabilities for coding assistance. It features:

- A flexible TUI interface for working with code
- Support for both cloud APIs (Anthropic Claude 3.7 Sonnet and OpenAI GPT-4o) and local LLMs (via Ollama)
- Strong agentic capabilities, including file search, editing, and command execution
- Tool use support across all model providers (Anthropic, OpenAI, and Ollama)
> ⚠️ This project is at a very early stage and is prone to bugs. Please report any issues you encounter.
## Installation

### Using Cargo

```shell
cargo install oli-tui
```

### Using Homebrew (macOS)

```shell
brew tap amrit110/oli
brew install oli
```

### From Source

```shell
# Clone the repository
git clone https://github.com/amrit110/oli
cd oli

# Build and run
cargo build --release
cargo run
```
## Environment Setup

### Cloud API Models

For API-based features, set up your environment variables:

```shell
# Create a .env file in the project root
echo "ANTHROPIC_API_KEY=your_key_here" > .env
# OR
echo "OPENAI_API_KEY=your_key_here" > .env
```
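If you prefer not to keep keys in a file, the same variable can be exported directly in the shell before launching oli. A minimal sketch (the key value is a placeholder):

```shell
# Export the key for the current shell session instead of writing a .env file.
# "your_key_here" is a placeholder, not a real key.
export ANTHROPIC_API_KEY="your_key_here"

# Sanity-check that the variable is set before launching oli.
[ -n "$ANTHROPIC_API_KEY" ] && echo "ANTHROPIC_API_KEY is set"
```

Exported variables last only for the current shell session, so a `.env` file remains the more convenient option for regular use.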
### Using Anthropic Claude 3.7 Sonnet (Recommended)

Claude 3.7 Sonnet provides the most reliable and advanced agent capabilities:

1. Obtain an API key from Anthropic
2. Set the `ANTHROPIC_API_KEY` environment variable
3. Select the "Claude 3.7 Sonnet" model in the UI

This implementation includes:

- Optimized system prompts for Claude 3.7
- JSON schema output formatting for structured responses
- Improved error handling and retry mechanisms
### Using Ollama Models

oli supports local models through Ollama:

1. Install Ollama if you haven't already
2. Start the Ollama server:

   ```shell
   ollama serve
   ```

3. Pull the model you want to use (we recommend models with tool use capabilities):

   ```shell
   # Examples of compatible models
   ollama pull qwen2.5-coder:14b
   ollama pull qwen2.5-coder:3b
   ollama pull llama3:8b
   ```

4. Start oli and select the Ollama model from the model selection menu
> Note: For best results with tool use and agent capabilities, use models like Qwen 2.5 Coder, which support function calling.
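Before launching oli, you can confirm that the Ollama server is up and see which models have been pulled. This sketch assumes the default server address (`localhost:11434`) used by `ollama serve`:

```shell
# Query Ollama's /api/tags endpoint, which lists locally pulled models.
# Adjust the address if you have changed OLLAMA_HOST.
if curl --silent --fail http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama server is reachable; pulled models:"
  curl --silent http://localhost:11434/api/tags
else
  echo "Ollama server not reachable; start it with: ollama serve"
fi
```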
## Usage

1. Start the application:

   ```shell
   cargo run
   ```

2. Select a model:
   - Cloud models (Claude 3.7 Sonnet, GPT-4o) for full agent capabilities
   - Local models via Ollama (Qwen, Llama, etc.)

3. Make your coding query in the chat interface:
   - Ask for file searches
   - Request code edits
   - Execute shell commands
   - Get explanations of code
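Putting the steps together, one way to launch a one-off session with a cloud key supplied inline rather than via `.env` (a sketch; the key value is a placeholder, and it assumes you are inside the cloned oli checkout):

```shell
# Supply the API key only for this invocation, without exporting it.
# "your_key_here" is a placeholder; run from the root of the oli checkout.
ANTHROPIC_API_KEY="your_key_here" cargo run
```

The inline assignment scopes the variable to that single command, which keeps the key out of your shell profile and history of exported state.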
## Examples

Here are some example queries to try:

- "Explain the codebase and how to get started"
- "List all files in the project"
- "Summarize the Cargo.toml file"
- "Show me all files that import the 'anyhow' crate"
## License
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.
## Acknowledgments

- This project is inspired by Claude Code and similar AI assistants
- Uses Anthropic's Claude 3.7 Sonnet model for optimal agent capabilities
- Built with Rust and the Ratatui library for terminal UI
- Special thanks to the Rust community for excellent libraries and tools