# local-llm

Here are 291 public repositories matching this topic...

A fully offline NLP pipeline for extracting, chunking, embedding, querying, summarizing, and translating research documents using local LLMs. Inspired by the fictional mystery of Dr. X, the system supports multi-format files, local RAG-based Q&A, Arabic translation, and ROUGE-based summarization — all without cloud dependencies.

  • Updated Dec 14, 2025
  • Python
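The pipeline described above (chunk, embed, retrieve, then answer with a local model) can be reduced to a small, fully offline loop. Below is a minimal sketch assuming an Ollama server on http://localhost:11434 with the `nomic-embed-text` and `llama3` models already pulled; the chunk texts, model names, and helper functions are illustrative and not taken from the repository itself.

```python
import requests
import numpy as np

OLLAMA_URL = "http://localhost:11434"

def embed(text: str) -> np.ndarray:
    # Ollama's embeddings endpoint returns {"embedding": [...]}
    resp = requests.post(f"{OLLAMA_URL}/api/embeddings",
                         json={"model": "nomic-embed-text", "prompt": text})
    resp.raise_for_status()
    return np.array(resp.json()["embedding"])

def generate(prompt: str) -> str:
    # Non-streaming generation against a local model
    resp = requests.post(f"{OLLAMA_URL}/api/generate",
                         json={"model": "llama3", "prompt": prompt, "stream": False})
    resp.raise_for_status()
    return resp.json()["response"]

# Illustrative document chunks; a real pipeline would produce these from
# the extraction and chunking stages of the local document store.
chunks = [
    "Dr. X's last known experiment involved protein folding simulations.",
    "The lab notebooks were digitized as PDF and DOCX files.",
    "All translations were produced offline with a local model.",
]
chunk_vecs = np.stack([embed(c) for c in chunks])

def answer(question: str, top_k: int = 2) -> str:
    # Rank chunks by cosine similarity to the question, then prompt the LLM
    q = embed(question)
    sims = chunk_vecs @ q / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q))
    context = "\n".join(chunks[i] for i in np.argsort(sims)[::-1][:top_k])
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)

if __name__ == "__main__":
    print(answer("What was Dr. X working on?"))
```

Everything here runs against localhost, so no request ever leaves the machine; swapping in a vector database or a different local model is a drop-in change.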

Local Deep Research achieves ~95% on the SimpleQA benchmark (tested with GPT-4.1-mini). Supports local and cloud LLMs (Ollama, Google, Anthropic, ...). Searches 10+ sources, including arXiv, PubMed, the web, and your private documents. Everything stays local and encrypted.

  • Updated Dec 13, 2025
  • Python
