Harness LLMs with Multi-Agent Programming
Updated Jul 20, 2025 - Python
Local Deep Research achieves ~95% on the SimpleQA benchmark (tested with GPT-4.1-mini) and includes benchmarking tools so you can test on your own setup. It searches 10+ sources: arXiv, PubMed, GitHub, the web, and your private documents. Everything runs locally.
Free, high-quality text-to-speech API endpoint to replace OpenAI, Azure, or ElevenLabs
Your fully proficient, AI-powered, local chatbot assistant 🤖
A local, OpenAI-compatible text-to-speech (TTS) API using Chatterbox, enabling users to generate voice-cloned speech anywhere the OpenAI API is used (e.g. Open WebUI, AnythingLLM)
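Because such a server mirrors the OpenAI audio API, any client only needs to change the base URL. A minimal sketch of the request shape follows; `build_speech_request` is a hypothetical helper, and the host/port of the local server is an assumption (Chatterbox-style servers choose their own):

```python
import json

def build_speech_request(text, voice="alloy", model="tts-1", response_format="mp3"):
    """Build the JSON body for an OpenAI-style /v1/audio/speech request.

    The field names (model, voice, input, response_format) follow the
    OpenAI audio API; an OpenAI-compatible local server accepts the same
    shape, so existing OpenAI clients work by pointing at its base URL.
    """
    return {
        "model": model,
        "voice": voice,
        "input": text,
        "response_format": response_format,
    }

# POST this body to http://<local-server>/v1/audio/speech (host/port
# depend on how the local TTS server is configured).
body = build_speech_request("Hello from a local TTS endpoint")
print(json.dumps(body, sort_keys=True))
```

In practice you would send this with any HTTP client, or simply configure an OpenAI SDK with `base_url` pointing at the local server and call its audio-speech method unchanged.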
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include multi-server support, dynamic model switching, streaming responses, tool management, human-in-the-loop controls, thinking mode, full model-parameter configuration, custom system prompts, and saved preferences. Built for developers working with local LLMs.
An LLM story writer focused on high-quality, long-form output from a user-provided prompt.
A Python package for developing AI applications with local LLMs.
Custom TTS component for Home Assistant. Utilizes the OpenAI speech engine or any compatible endpoint to deliver high-quality speech. Optionally offers chime and audio normalization features.
OpenAI-style, fast & lightweight local language-model inference with documents
A Python script designed to translate large amounts of text with an LLM via the Ollama API
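Translating large texts against a local model usually means splitting the input so each piece fits the model's context window, then sending pieces to Ollama's `/api/generate` endpoint one at a time. A minimal sketch of the splitting step, assuming a paragraph-aligned chunker (`chunk_text` is a hypothetical helper, not part of any listed project):

```python
def chunk_text(text, max_chars=2000):
    """Split text into paragraph-aligned chunks of at most ~max_chars,
    so each chunk fits comfortably in the model's context window."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

# Each chunk would then be sent to a local Ollama server, e.g. a POST to
# http://localhost:11434/api/generate with a body like
# {"model": "<model name>", "prompt": "Translate to English:\n" + chunk,
#  "stream": False} and the "response" fields concatenated in order.
print(chunk_text("First paragraph.\n\nSecond paragraph.", max_chars=20))
```

Paragraph-aligned splitting keeps sentences intact, which matters for translation quality; the exact chunk size to use depends on the model's context length.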
Recipes for on-device voice AI and local LLM
Project Jarvis is a versatile AI assistant that integrates various functionalities.
A Python library for instructing LLMs and reliably validating their structured outputs (JSON), built on Ollama and Pydantic, enabling deterministic work with LLMs.
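The core idea behind this pattern is that the model's raw JSON reply is parsed through a Pydantic schema, so you either get a fully typed object or an explicit validation error, never silently malformed data. A minimal sketch, independent of any particular library (the `Task` model and the raw string are illustrative assumptions):

```python
from pydantic import BaseModel, ValidationError

class Task(BaseModel):
    """Schema the LLM is instructed to emit as JSON."""
    title: str
    priority: int

# A raw JSON string as an LLM might return it when prompted with the
# schema; validation yields a typed object or raises a clear error.
raw = '{"title": "write docs", "priority": 2}'
task = Task.model_validate_json(raw)
print(task.priority)  # prints 2

try:
    Task.model_validate_json('{"title": "missing priority"}')
except ValidationError:
    print("rejected")  # prints rejected
```

With Ollama this typically means passing the model's JSON schema along with the request and retrying on `ValidationError`, which is what makes the round trip deterministic.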
A lo-fi AI-first note taker running locally on-device
A flexible, free, and unlimited PDF translator for humans, using a local LLM or ChatGPT
Test your local LLMs on the AIME problems