
vil_embedder

High-performance text embedding engine for VIL.

Supports API-based providers (OpenAI) and local model inference (ONNX, planned). Features concurrent batch processing and SIMD-friendly vector similarity functions.
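The batching idea is simple: split the input into fixed-size chunks and embed each chunk in parallel, stitching the results back together in input order. A minimal synchronous sketch using scoped threads (with a stand-in `fake_embed` function; the crate's real `BatchEmbedder` wraps an async provider instead) could look like:

```rust
use std::thread;

// Stand-in for a real provider call (hypothetical; the crate's
// BatchEmbedder delegates to an async EmbedProvider instead).
fn fake_embed(text: &str) -> Vec<f32> {
    vec![text.len() as f32]
}

// Split the input into fixed-size batches, embed each batch on its
// own thread, and concatenate results in input order.
fn embed_batches(texts: &[String], batch_size: usize) -> Vec<Vec<f32>> {
    thread::scope(|s| {
        let handles: Vec<_> = texts
            .chunks(batch_size)
            .map(|chunk| {
                s.spawn(move || {
                    chunk.iter().map(|t| fake_embed(t)).collect::<Vec<_>>()
                })
            })
            .collect();
        handles
            .into_iter()
            .flat_map(|h| h.join().expect("batch thread panicked"))
            .collect()
    })
}
```

The scoped-thread version keeps the ordering guarantee the real API needs: chunk N's embeddings always land before chunk N+1's, regardless of which batch finishes first.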

Quick start

use std::sync::Arc;
use vil_embedder::{OpenAiEmbedder, BatchEmbedder, EmbedProvider};
use vil_embedder::similarity::cosine_similarity;

// embed_all is async, so the example needs an async runtime (Tokio here).
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Share the provider via Arc so the batcher can use it across tasks.
    let provider = Arc::new(OpenAiEmbedder::new("sk-..."));
    let batcher = BatchEmbedder::new(provider);

    let texts = vec!["hello world".to_string(), "goodbye world".to_string()];
    let embeddings = batcher.embed_all(&texts).await?;

    let sim = cosine_similarity(&embeddings[0], &embeddings[1]);
    println!("similarity: {sim}");
    Ok(())
}
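For reference, `cosine_similarity` computes the standard dot-product-over-norms measure. A minimal scalar sketch of that computation (without the crate's SIMD-friendly layout, and not its actual implementation) is:

```rust
/// Cosine similarity between two equal-length vectors:
/// dot(a, b) / (|a| * |b|). Returns a value in [-1, 1].
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    assert_eq!(a.len(), b.len(), "vectors must have the same dimension");
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}
```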


Part of VIL

This crate is part of VIL — a process-oriented language and framework for building zero-copy, high-performance distributed systems.

License

Licensed under either the Apache License, Version 2.0 or the MIT License, at your option.
