5 stable releases

| Version | Date |
|---|---|
| 2.3.5 | Apr 25, 2026 |
| 2.1.1 | Apr 16, 2026 |
| 2.1.0 | Apr 15, 2026 |
| 2.0.1 | Apr 13, 2026 |
# lip-core
Rust library crate for LIP — Linked Incremental Protocol, a persistent, incremental code intelligence daemon.
This crate contains the daemon runtime, query graph, WAL journal, Tier 1/2/3 indexers, MCP/LSP wire protocol, semantic embedding layer, and registry client. It is the engine behind the `lip-cli` binary.
## Usage

Add to your `Cargo.toml`:

```toml
[dependencies]
lip = { package = "lip-core", version = "2.0.1" }
```
The package alias `lip` keeps import paths clean:

```rust
use lip::daemon::LipDaemon;
use lip::query_graph::{ClientMessage, ServerMessage};
use lip::schema::OwnedSymbolInfo;
```
## License

MIT
## LIP — Linked Incremental Protocol
Rust reference implementation of LIP v1.3.
LIP is a language-agnostic protocol for streaming, incremental code intelligence. It is designed as a successor to LSP (runtime queries) and SCIP (static snapshots), combining lazy query graphs with a content-addressed dependency registry.
### Quick start

```sh
# Start the daemon (watches files, persists to the WAL journal)
lip daemon --socket /tmp/lip.sock

# Index a directory (Tier 1, tree-sitter)
lip index ./src

# Query the definition at a position
lip query definition file:///src/main.rs 42 10

# Start the LSP bridge (standard LSP, no editor plugin needed)
lip lsp --socket /tmp/lip.sock

# Start the MCP server (AI agents: Claude Code, CKB, Cursor, …)
lip mcp --socket /tmp/lip.sock

# Semantic nearest-neighbour search (requires LIP_EMBEDDING_URL)
lip query embedding-batch file:///src/auth.rs
lip query nearest file:///src/auth.rs --top-k 5
```
### Crate layout

| Module | Spec ref | Description |
|---|---|---|
| `schema` | §2 | Owned types mirroring `schema/lip.fbs`; `LipUri` |
| `query_graph` | §3.1 | Revision-based incremental query database |
| `indexer` | §3.3 | Tree-sitter Tier 1 indexer (Rust, TS, Python, Dart) |
| `daemon` | §6, §7.1 | Unix-socket IPC daemon + session loop |
| `daemon::embedding` | — | OpenAI-compatible HTTP embedding client |
| `bridge` | §10.1 | LIP-to-LSP bridge (tower-lsp) |
| `registry` | §3.4, §11 | Dependency slice cache + registry HTTP client |
### Confidence tiers
| Tier | Score | Source |
|---|---|---|
| 1 | 1–50 | Tree-sitter (this crate, indexer module) |
| 2 | 51–90 | Compiler / type-checker (external, fed in via daemon) |
| 3 | 100 | Federated CAS registry (registry module) |
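The score ranges above map directly to tiers. As an illustration only (a hypothetical helper, not part of the `lip-core` API), the mapping can be written as:

```rust
/// Map a confidence score to its tier, following the ranges in the
/// table above. Scores of 0 or 91-99 fall outside every tier.
/// Hypothetical sketch; lip-core's real representation may differ.
fn tier_for_score(score: u8) -> Option<u8> {
    match score {
        1..=50 => Some(1),  // Tree-sitter
        51..=90 => Some(2), // compiler / type-checker
        100 => Some(3),     // federated CAS registry
        _ => None,
    }
}
```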
### Semantic embeddings (v1.3)

When `LIP_EMBEDDING_URL` points to an OpenAI-compatible endpoint (e.g. Ollama,
OpenAI, Together AI), the daemon can store dense embedding vectors per file and
answer cosine nearest-neighbour queries:

```sh
LIP_EMBEDDING_URL=http://localhost:11434/v1/embeddings
LIP_EMBEDDING_MODEL=nomic-embed-text  # optional; defaults to text-embedding-3-small
```
Use `ClientMessage::EmbeddingBatch` to populate vectors, then
`ClientMessage::QueryNearest` or `ClientMessage::QueryNearestByText` to search.
Embedding is optional — all other LIP features work without it.
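Nearest-neighbour search ranks files by cosine similarity between their stored vectors and the query vector. A minimal sketch of that scoring (standalone helper functions for illustration, not the crate's internal API):

```rust
/// Cosine similarity between two equal-length embedding vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    assert_eq!(a.len(), b.len());
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 { 0.0 } else { dot / (norm_a * norm_b) }
}

/// Return the top-k files by similarity to a query vector,
/// highest score first (assumes no NaN scores).
fn nearest(query: &[f32], files: &[(&str, Vec<f32>)], k: usize) -> Vec<(String, f32)> {
    let mut scored: Vec<(String, f32)> = files
        .iter()
        .map(|(name, vec)| (name.to_string(), cosine_similarity(query, vec)))
        .collect();
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    scored.truncate(k);
    scored
}
```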
### Wire format

All IPC uses 4-byte big-endian length-prefixed JSON over a Unix domain
socket. The FlatBuffers zero-copy path (`daemon::mmap`) is implemented
but unused until v0.2.
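The framing described above (4-byte big-endian length prefix, then the JSON bytes) can be sketched as a pair of helper functions. These are illustrative, not the crate's exported API:

```rust
/// Frame a JSON payload for the socket: 4-byte big-endian length
/// prefix followed by the UTF-8 JSON bytes.
fn frame(json: &str) -> Vec<u8> {
    let body = json.as_bytes();
    let mut out = Vec::with_capacity(4 + body.len());
    out.extend_from_slice(&(body.len() as u32).to_be_bytes());
    out.extend_from_slice(body);
    out
}

/// Parse one frame from a read buffer, returning (payload, bytes
/// consumed), or None if the buffer does not yet hold a full frame.
fn deframe(buf: &[u8]) -> Option<(&[u8], usize)> {
    if buf.len() < 4 {
        return None;
    }
    let len = u32::from_be_bytes([buf[0], buf[1], buf[2], buf[3]]) as usize;
    if buf.len() < 4 + len {
        return None;
    }
    Some((&buf[4..4 + len], 4 + len))
}
```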