

Documentation: Getting Started · README · Configuration · IDE Clients · MCP API · ctx CLI · Memory Guide · Architecture · Multi-Repo · Kubernetes · VS Code Extension · Troubleshooting · Development


Context-Engine

The open-source, self-improving code search that gets smarter every time you use it.


Why Context-Engine?

AI coding assistants are only as good as the context they retrieve. Most solutions chunk your code into large blocks and hope for the best—returning whole files when you need a single function, or missing the relevant code entirely. Context-Engine takes a different approach: ReFRAG-inspired micro-chunking returns precise 5-50 line spans, hybrid search combines semantic and lexical signals with cross-encoder reranking, and the system adapts to your codebase over time. No cloud dependency, no vendor lock-in—just a Docker Compose stack that works with any MCP-compatible tool.

What makes it different

| Feature | What it does |
| --- | --- |
| Precision Retrieval | Returns exact code spans (5-50 lines), not whole files |
| Hybrid Search | Dense vectors + lexical matching + cross-encoder reranking |
| MCP Native | Dual transport (SSE + HTTP) for any AI coding tool |
| Works Locally | Docker Compose, runs on your machine |
| Adaptive (optional) | Enable learning mode to improve ranking from usage patterns |

Quick Start

1. Start the stack

```bash
git clone https://github.com/m1rl0k/Context-Engine.git && cd Context-Engine
docker compose up -d
```

That's it. The stack is running.
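Optionally, a quick sanity check: the containers should report as running, and Qdrant should answer on its default port (6333, as listed under Endpoints below). This is just a sketch of one way to verify the deployment, not a required step.

```bash
# Optional sanity check: services should be up, and Qdrant's REST root
# should return its name and version.
docker compose ps
curl -s http://localhost:6333
```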

2. Index your code

Option A: VS Code Extension (recommended)

Install Context Engine Uploader from the VS Code Marketplace. Open your project, click "Upload Workspace". Done.

The extension auto-syncs changes and configures your MCP clients.

Option B: CLI

```bash
# Index any project
HOST_INDEX_PATH=/path/to/your/project docker compose run --rm indexer
```
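Indexed data lands in Qdrant, so listing its collections is a rough way to confirm the run produced something. Collection names depend on your configuration, so this only checks that a collection now exists.

```bash
# After indexing, Qdrant should report at least one collection.
curl -s http://localhost:6333/collections
```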

3. Connect your IDE

HTTP endpoints — for Claude Code, Windsurf, Qodo, and other RMCP-capable clients:

```json
{
  "mcpServers": {
    "qdrant-indexer": { "url": "http://localhost:8003/mcp" },
    "memory": { "url": "http://localhost:8002/mcp" }
  }
}
```

stdio via npx (recommended) — unified bridge with workspace awareness:

```json
{
  "mcpServers": {
    "context-engine": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "@context-engine-bridge/context-engine-mcp-bridge",
        "mcp-serve",
        "--workspace", "/path/to/your/project",
        "--indexer-url", "http://localhost:8003/mcp",
        "--memory-url", "http://localhost:8002/mcp"
      ]
    }
  }
}
```

See docs/IDE_CLIENTS.md for Cursor, Windsurf, Cline, Codex, Augment, and more.
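Before wiring the bridge into an IDE, you can run the same npx command by hand as a smoke test. This is just the command from the stdio config above run directly; since it speaks MCP over stdio, expect it to start and then wait for a client (Ctrl+C to exit). If it launches without errors, the IDE config should work as well.

```bash
# Smoke test: launch the stdio bridge manually with the same arguments
# the IDE config uses.
npx @context-engine-bridge/context-engine-mcp-bridge mcp-serve \
  --workspace /path/to/your/project \
  --indexer-url http://localhost:8003/mcp \
  --memory-url http://localhost:8002/mcp
```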


Supported Clients

| Client | Transport |
| --- | --- |
| Claude Code | SSE / RMCP |
| Cursor | SSE / RMCP |
| Windsurf | SSE / RMCP |
| Cline | SSE / RMCP |
| Roo | SSE / RMCP |
| Augment | SSE |
| Codex | RMCP |
| Copilot | RMCP |
| AmpCode | RMCP |
| Zed | SSE (via mcp-remote) |

Endpoints

| Service | URL |
| --- | --- |
| Indexer MCP (SSE) | http://localhost:8001/sse |
| Indexer MCP (RMCP) | http://localhost:8003/mcp |
| Memory MCP (SSE) | http://localhost:8000/sse |
| Memory MCP (RMCP) | http://localhost:8002/mcp |
| Qdrant | http://localhost:6333 |
| Upload Service | http://localhost:8004 |
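If a client fails to connect, one rough troubleshooting sketch is to check whether the SSE transports accept connections at all. SSE responses hold the stream open, so bound the request time and look only at the HTTP status; exact behavior may vary, and this is not part of normal setup.

```bash
# Cap each request at 2 seconds and print only the HTTP status code;
# a 200 means the SSE endpoint accepted the connection and began streaming.
curl -s -o /dev/null -m 2 -w 'indexer sse: %{http_code}\n' http://localhost:8001/sse
curl -s -o /dev/null -m 2 -w 'memory sse:  %{http_code}\n' http://localhost:8000/sse
```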

VS Code Extension

The Context Engine Uploader extension provides:

  • One-click upload — Sync your workspace to Context-Engine
  • Auto-sync — Watch for changes and re-index automatically
  • Prompt+ button — Enhance prompts with code context before sending
  • MCP auto-config — Writes Claude/Windsurf MCP configs for you

See docs/vscode-extension.md for full documentation.


MCP Tools

Search (Indexer MCP):

  • repo_search — Hybrid code search with filters
  • context_search — Blend code + memory results
  • context_answer — LLM-generated answers with citations
  • search_tests_for, search_config_for, search_callers_for

Memory (Memory MCP):

  • store — Save knowledge with metadata
  • find — Retrieve stored memories

Indexing:

  • qdrant_index_root — Index the workspace
  • qdrant_status — Check collection health
  • qdrant_prune — Remove stale entries

See docs/MCP_API.md for complete API reference.
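For orientation, the message an MCP client ultimately sends for a tool like repo_search is a standard JSON-RPC tools/call request. The sketch below shows the payload shape only: the argument names (query, limit) are illustrative assumptions, and an RMCP server normally expects the MCP initialize handshake (and possibly a session header) first, which your MCP client library handles for you. See docs/MCP_API.md for the real parameters.

```bash
# Illustrative payload shape only; argument names are assumptions, and a
# spec-compliant server may reject the call without prior initialization.
curl -s http://localhost:8003/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d '{
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
          "name": "repo_search",
          "arguments": { "query": "where are auth tokens validated", "limit": 5 }
        }
      }'
```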


Documentation

| Guide | Description |
| --- | --- |
| Getting Started | VS Code + dev-remote walkthrough |
| IDE Clients | Config examples for all supported clients |
| Configuration | Environment variables reference |
| MCP API | Full tool documentation |
| Architecture | System design |
| Multi-Repo | Multiple repositories in one collection |
| Kubernetes | Production deployment |

How It Works

```mermaid
flowchart LR
  subgraph Your Machine
    A[IDE / AI Tool]
    V[VS Code Extension]
  end
  subgraph Docker
    U[Upload Service]
    I[Indexer MCP]
    M[Memory MCP]
    Q[(Qdrant)]
    L[[LLM Decoder]]
    W[[Learning Worker]]
  end
  V -->|sync| U
  U --> I
  A -->|MCP| I
  A -->|MCP| M
  I --> Q
  M --> Q
  I -.-> L
  I -.-> W
  W -.-> Q
```

The VS Code extension syncs your workspace to the stack. Your IDE talks to the MCP servers, which query Qdrant for hybrid search. Optional features include a local LLM decoder (llama.cpp), cloud LLM integration (GLM, MiniMax M2), and adaptive learning that improves ranking over time.


Language Support

Python, TypeScript/JavaScript, Go, Java, Rust, C#, PHP, Shell, Terraform, YAML, PowerShell


License

MIT
