-
genai
Multi-provider AI library for Rust (OpenAI, Gemini, Anthropic, xAI/Grok, Ollama, Groq, DeepSeek)
-
llm
unifying multiple LLM backends
-
devcat
A micro-version control system for your AI development loop
-
ollama-file-find-cli
A command-line interface for the Ollama file find library
-
llm-stack
Core traits, types, and tools for the llm-stack SDK
-
deepsearch
A CLI for AI-powered research assistance, using local LLMs to decompose questions, search the web, and synthesize answers
-
ollama-kernel
Ollama Jupyter Kernel
-
weavex
Weave together web search and AI reasoning - an autonomous research agent powered by local LLMs
-
octoroute
Intelligent multi-model router for self-hosted LLMs
-
tenere
TUI interface for LLMs written in Rust
-
immich-analyze
AI-powered image description generator for Immich photo management system
-
mermaid-cli
Open-source AI pair programmer with agentic capabilities. Local-first with Ollama, native tool calling, and beautiful TUI.
-
api_ollama
Ollama local LLM runtime API client for HTTP communication (the underlying call is sketched after this list)
-
ollama2llama
constructing and managing a llama-swap configuration file with Ollama models
-
ai-mate
Audio AI conversation system
-
ochat
A chatbot CLI that uses Ollama models
-
worksplit
CLI tool that orchestrates local Ollama LLM for code generation and verification
-
a3s-power
A3S Power - Local model management and serving with OpenAI-compatible API
-
siumai-provider-ollama
Ollama provider implementation and protocol standard for siumai
-
cooklang-import
importing recipes into Cooklang format
-
ocommit
Quickly create a Git commit message with Ollama and commit it, locally
-
howfast
Small CLI tool that measures token metrics and completion tokens per second for an Ollama model response
-
ariste
An AI Agent framework with tool calling and multi-agent collaboration support
-
ggufy
Unified GGUF wrapper for llama.cpp and Ollama
-
mojentic
An LLM integration framework for Rust
-
ratatalk
A terminal chat client for Ollama, built with Rust and ratatui
-
zoey-core
ZoeyAI core runtime and types — privacy-first AI agent framework optimized for local models
-
mirror
unifying multiple LLM backends
-
ollana
Ollama over LAN: auto-discover your Ollama server on your local network, hassle-free
-
mono-ai
Provider-agnostic Rust AI library
-
lazyllama
A lightweight TUI client for Ollama with markdown support and smart scrolling
-
agentic
Support library for building Agentic MCP and Agent2Agent based systems
-
llm-link
universal LLM proxy supporting 10 providers (OpenAI, Anthropic, Zhipu, Aliyun, Volcengine, Tencent, Longcat, Moonshot, Minimax, Ollama) with dynamic model discovery API, hot-reload configuration…
-
bunnysql
🐰 Bunny SQL Assistant is a CLI tool that converts natural language into SQL
-
machi
A Web3-native AI Agent Framework
-
reagent-rs
building AI agents with MCP & custom tools
-
product-os-models
Product OS: Models provides direct implementations of various models using Candle, or indirect implementations via Ollama, with a focus on LLMs
-
protopolis
A multi-agent Ollama simulation in Rust
-
omniference
A multi-protocol inference engine with provider adapters
-
simple_llama_rs
Ollama API
-
hiramu
AI engineering toolbox for accessing Ollama and AWS Bedrock
-
piramid
Vector database for agentic applications
-
ollama-proxy
Proxy server for Ollama
-
mecha10-nodes-llm-command
Natural language command parsing via LLM APIs (OpenAI, Claude, Ollama)
-
ghost-lib
Ghost Librarian — ultra-lightweight local-LLM RAG engine with Context Distillation
-
rllm
Unifies multiple LLM backends in Rust
-
mecha10-ai-llm
Large Language Model integration for Mecha10 - Claude, GPT-4, Gemini, and local models
-
yammer
Ollama-compatible client library
-
nexus-orchestrator
Distributed LLM model serving orchestrator - unified API gateway for heterogeneous inference backends
-
ait
Terminal-based chat interface for interacting with large language models from various providers
-
gcomm
Generate AI-powered Git commit messages from staged changes using a local Ollama model
-
yo
Ask your terminal anything using AI (OpenAI or Ollama)
-
prime
A CLI tool for interacting with LLMs
-
aether-ai
AI provider integrations for aether-codegen
-
llm-stack-anthropic
Anthropic Claude provider for the llm-stack SDK
-
llm-stack-openai
OpenAI GPT provider for the llm-stack SDK
-
bashelp
Natural language to shell commands. Local-first, provider agnostic.
-
ollama-native
A minimalist Ollama Rust SDK that provides the most basic functionality for interacting with Ollama
-
ollama-proxy-rs
A lightweight Rust proxy for Ollama that intelligently adjusts request parameters to match each model's training configuration
-
llm-stack-ollama
Ollama local model provider for the llm-stack SDK
-
oli-tui
Blazingly fast TUI-based AI coding assistant
-
ollama-inquire
Query any LLM found on Ollama from the terminal!
-
ollama-sdk
An idiomatic, unofficial Rust client for the Ollama API with support for streaming, tool calling, and custom transports
-
rip-video
Terminal UI pipeline to download media audio, transcribe it with ffmpeg-whisper, and generate minutes locally
-
kowalski
Rust-based agent for interacting with Ollama models
-
pdf-renamer-ai
Intelligently rename PDF files using local LLMs. Supports multiple languages, automatic translation, and meaningful name generation.
-
agentic_optio_rs
Production-grade multi-agent framework with minimal dependencies - Rust implementation
-
ask_ai
interacting with various AI frameworks
-
enki-llm
LLM integration for the Enki agent framework
-
aleph_ollama
Aleph Ollama Code Translator
-
vex-llm
LLM provider integrations for VEX
-
modelfile
A parser for Ollama Modelfiles
-
ochat-iced
A chatbot application that uses Ollama models
-
easy_llm
Provides an asynchronous LLM caller that can integrate with various LLM services, including Ollama
-
ask-ollama
Query any LLM found on Ollama from the terminal!
-
caliber-llm
Vector abstraction layer and LLM provider traits for CALIBER
-
moxie-ai
Bold AI chatbot API for website integration - unified interface to Ollama, OpenAI, and Anthropic
-
ulm
AI-powered manpage assistant using local LLM
-
ai_rs
One SDK for all AI platforms
-
ollamars
Ollama SDK using async Rust
-
ochat-types
Types used between ochat packages and binaries
-
ollama-file-find
Ollama model file inspection and discovery
-
rexis-llm
Rexis LLM - Multi-provider LLM client with streaming, tool calling, and JSON schema support
-
vibelang
Programmatically instantiate Web Agents from Vibelang files
-
ruchat
Ollama/Chroma command-line AI chat tool
-
orch_response
Models for orch Executor responses
-
ochat-common
Common functionality for frontend ochat apps
-
commitgenius
An AI-powered CLI tool that generates conventional commit messages using local LLMs via Ollama
-
rmcp-ollama
MCP server for Ollama local LLM management
-
kproc-llm
Knowledge Processing library, using LLMs
-
ollama_code
Coding assistant with an Ollama backend
-
ricecoder-providers
AI provider abstraction and integration
-
kowalski-code-agent
Kowalski Code Agent: A Rust-based agent for interacting with Ollama models
-
kowalski-web-agent
Kowalski Web Agent: A Rust-based agent for interacting with Ollama models
-
ai-shell
A CLI tool that turns natural language into shell commands!
-
ollama_td
Ollama CLI tool downloader
-
promptbox
A CLI tool for managing and executing LLM prompt templates
-
orch
Language model orchestration library
-
agent-chain
Agent chain library
-
regula-llm
LLM client integrations for REGULA framework
-
mcpcrs
MCP client library in pure Rust
-
kowalski-academic-agent
Kowalski Academic Agent: A Rust-based agent for interacting with Ollama models
-
kowalski-data-agent
Kowalski Data Agent: A Rust-based agent for interacting with Ollama models
-
rsllm
Rust-native LLM client library with multi-provider support and streaming capabilities
-
lib-client-ollama
Ollama API client library
-
mini-openai
An OpenAI API client with minimal dependencies
-
rtwo
CLI interface for Ollama written in Rust
-
prosa-ollama
ProSA processor for Ollama
-
kowalski-federation
Kowalski Federation: A Rust-based agent for interacting with Ollama models
-
samvadsetu
LLM API for commonly used LLM services including Gemini, ChatGPT, and Ollama. The name implies a bridge for dialogue since the library facilitates communication and interaction between…
-
richard
modular chatbot
-
kowalski-agent-template
Kowalski Template of Agent: A Rust-based agent for interacting with Ollama models
-
mood-msg
Generate witty, mood-based git commit messages
-
ollama_models_info_fetcher
Ollama models information fetcher
-
kowalski-tools
Kowalski Tooling: A Rust-based agent for interacting with Ollama models
-
mybinder
build API
-
llm-stack-core
Core traits, types, and tools for the llm-stack SDK
-
kowalski-cli
Kowalski CLI Interface: A Rust-based agent for interacting with Ollama models
-
korah
A CLI utility for natural language queries
-
ofc
A command-line Ollama Function Caller
-
kazama
An Ollama wrapper in Rust
-
sollama
A CLI tool to search and summarize results with Ollama models in your terminal
-
mono-ai-macros
Procedural macros for tools in mono-ai
-
omama_manager
Omama manager for managing conversations with Ollama models
-
ricecoder-local-models
Local model management for RiceCoder (Ollama integration)
-
vibeland-conf
Shared configuration and LLM client harness for Vibeland project
-
ollama_translator
Ollama Translator for natural language
-
kowalski-core
Kowalski Core Module: A Rust-based agent for interacting with Ollama models
-
chatti
Terminal-based chat application that interfaces with Ollama
-
ollama-rs-macros
Procedural macros for ollama-rs
-
ollama-bash-command-error
Calls Ollama when you misspell a Bash command
-
olinker
natively linking Ollama and Rust code
-
hunnigan
An intelligence agent providing assistance directly where you need it. A CLI Swiss Army knife for interacting with an Ollama-hosted LLM.
-
ollama-sdk-macros
An idiomatic, unofficial Rust client for the Ollama API with support for streaming, tool calling, and custom transports
-
bunny-sql-assistant
🐰 Bunny SQL Assistant is a CLI tool that converts natural language into SQL using local LLMs like Ollama
-
orch_response_derive
Derive macros for orch Executor responses
-
ollama-oxide
integrating with Ollama's native API, providing low-level primitives and high-level conveniences
-
mimir-dm-ai
AI integration with Ollama for Mimir
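Nearly every client and tool in the list above ultimately wraps the same Ollama HTTP API. For orientation, here is a minimal sketch of a non-streaming /api/generate call made directly with reqwest; the default server address http://localhost:11434 and the model name "llama3" are illustrative assumptions, not taken from any crate above.

```rust
// Minimal sketch of the raw Ollama call that most of the clients above wrap.
// Assumptions: a local Ollama server on the default port 11434 and an
// already-pulled model named "llama3".
//
// [dependencies]
// reqwest = { version = "0.12", features = ["blocking", "json"] }
// serde_json = "1"

use serde_json::{json, Value};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // "stream": false asks the server for one JSON object instead of NDJSON chunks.
    let request = json!({
        "model": "llama3",
        "prompt": "Why is the sky blue? Answer in one sentence.",
        "stream": false
    });

    let reply: Value = client
        .post("http://localhost:11434/api/generate")
        .json(&request)
        .send()?
        .json()?;

    // The completed text arrives in the "response" field of the reply.
    println!("{}", reply["response"].as_str().unwrap_or_default());
    Ok(())
}
```

With streaming enabled (the default), the server instead returns newline-delimited JSON chunks; the dedicated client crates listed above layer typed requests, streaming, and tool calling on top of these raw endpoints.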