# Enki LLM
LLM (Large Language Model) integration layer for the Enki agent framework.
This crate provides a unified interface for interacting with various LLM providers including OpenAI, Anthropic, Ollama, Google, and many others.
## Supported Providers
- OpenAI (GPT-4, GPT-4o, GPT-3.5)
- Anthropic (Claude 3, Claude 3.5)
- Ollama (local models: Llama, Mistral, Gemma, etc.)
- Google (Gemini models)
- DeepSeek
- xAI (Grok)
- Groq
- Mistral
- Cohere
- Phind
- OpenRouter
## Quick Start
```rust
use enki_llm::{UniversalLLMClient, LLMConfig};

// Create client with model name (API key from environment)
let client = UniversalLLMClient::new("openai::gpt-4o", None)?;

// Or with explicit configuration
let config = LLMConfig::new("anthropic::claude-3-sonnet-20240229")
    .with_temperature(0.7)
    .with_max_tokens(1000);
let client = UniversalLLMClient::new_with_config(config)?;
```
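The Quick Start above only constructs a client; the crate's request API is not shown here. As an illustration only, a single-turn completion might look roughly like the sketch below. The `chat` method, its string-in/string-out signature, and the tokio async runtime are assumptions made for the sketch, not the crate's documented API.

```rust
use enki_llm::UniversalLLMClient;

// Illustrative sketch only: `chat` and its signature are assumed names, and an
// async tokio runtime is assumed; consult the crate docs for the actual
// request methods and response types.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // API key is read from the environment, as in the Quick Start.
    let client = UniversalLLMClient::new("openai::gpt-4o", None)?;

    // Hypothetical single-turn request.
    let reply = client.chat("Explain the Enki agent framework in one sentence.").await?;
    println!("{reply}");

    Ok(())
}
```

Check the crate documentation for the request and response types actually exposed by `UniversalLLMClient`.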