# llm-inference

Here are 1,376 public repositories matching this topic...

🚀 Detect anomalies in structured datasets with this AI-driven ETL pipeline, ensuring data quality through seamless ingestion and machine learning insights.

  • Updated Dec 17, 2025
  • Jupyter Notebook
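The repository's actual detector is not shown on this page; as a minimal, self-contained sketch of the kind of check such a pipeline might run over a numeric column, here is a simple z-score rule (the function name, threshold, and sample data are illustrative, not from the project):

```python
import statistics

def zscore_anomalies(values, threshold=2.0):
    """Flag indices whose value lies more than `threshold` population
    standard deviations from the mean. A deliberately simple stand-in
    for an ML-based detector; the low threshold compensates for the
    outlier inflating the standard deviation in small samples."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > threshold]

readings = [10.1, 9.8, 10.3, 10.0, 9.9, 42.0, 10.2]
print(zscore_anomalies(readings))  # flags index 5 (the 42.0 reading)
```

A real pipeline would run a model such as an isolation forest per column or per row, but the ingest-then-score shape is the same.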

Context-Engine: MCP retrieval stack for AI coding assistants. Hybrid code search (dense + lexical + reranker), ReFRAG micro-chunking, local LLM prompt enhancement, and dual SSE/RMCP endpoints. One command deploys Qdrant-powered indexing for Cursor, Windsurf, Roo, Cline, Codex, and any MCP client.

  • Updated Dec 17, 2025
  • Python
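Hybrid code search of the kind described above typically blends a dense (embedding) score with a lexical score before handing the top candidates to a reranker. A toy sketch of that blending step, with made-up vectors and a token-overlap stand-in for BM25 (all names and weights here are illustrative, not Context-Engine's API):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def lexical_overlap(query, doc):
    """Toy lexical score: fraction of query tokens present in the doc
    (a stand-in for a real BM25 scorer)."""
    q = Counter(query.lower().split())
    d = set(doc.lower().split())
    hits = sum(c for tok, c in q.items() if tok in d)
    return hits / sum(q.values())

def hybrid_rank(query, query_vec, docs, alpha=0.5):
    """Blend dense and lexical scores with weight `alpha`; a real stack
    would then pass the top candidates to a cross-encoder reranker."""
    scored = []
    for text, vec in docs:
        score = (alpha * cosine(query_vec, vec)
                 + (1 - alpha) * lexical_overlap(query, text))
        scored.append((score, text))
    return [t for _, t in sorted(scored, reverse=True)]

docs = [("parse json config", [1.0, 0.0]),
        ("open file handle", [0.0, 1.0])]
print(hybrid_rank("json config", [1.0, 0.0], docs))
# the json doc ranks first on both signals
```

In production the dense scores come from a vector store such as Qdrant and the two ranked lists are usually fused (e.g. by weighted sum or reciprocal-rank fusion) rather than computed in a single loop.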

✈️ Plan your trips effortlessly with the AI Travel Planner Agent, which generates detailed itineraries from natural language queries using multiple APIs.

  • Updated Dec 17, 2025
  • Python

Bud AI Foundry: a comprehensive inference stack for compound AI deployment, optimization, and scaling. Bud Stack provides intelligent infrastructure automation, performance optimization, and seamless model deployment across multi-cloud, multi-hardware environments.

  • Updated Dec 17, 2025
  • Python
