local-llm
Here are 112 public repositories matching this topic...
mychatgpt is a small Python package that provides utilities for building conversational agents on top of OpenAI's GPT models. It lets users chat interactively with GPT models and keeps track of the chat history, which makes it useful as a copilot-style agent in Python projects. A minimal sketch of this chat-with-history pattern appears after this entry.
Updated Feb 1, 2025 - Jupyter Notebook
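For context, the pattern a package like this wraps is a running message list that is sent back to the chat endpoint on every turn. The sketch below uses the official OpenAI Python SDK rather than mychatgpt's own API; the model name and system prompt are illustrative assumptions.

```python
# Minimal sketch of the chat-with-history pattern such a package wraps.
# Uses the official OpenAI Python SDK; this is NOT mychatgpt's actual API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "system", "content": "You are a helpful copilot."}]

def chat(user_message: str, model: str = "gpt-4o-mini") -> str:
    """Send a message, keep the running history, and return the reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model=model, messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Summarize what a context manager does in Python."))
```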
ToolAgents is a lightweight and flexible framework for creating function-calling agents with various language models and APIs. A generic sketch of the function-calling pattern such frameworks automate appears after this entry.
Updated Feb 1, 2025 - Python
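As a point of reference, the sketch below shows the bare function-calling loop that frameworks like this automate, written against an OpenAI-compatible chat endpoint. It is not ToolAgents' own API: the local base URL, model name, and weather tool are illustrative assumptions.

```python
# Generic OpenAI-style function-calling loop, shown only for illustration.
import json
from openai import OpenAI

# Assumption: any OpenAI-compatible local server can sit behind base_url.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

def get_weather(city: str) -> str:
    return f"Sunny and 21 C in {city}"  # stand-in for a real tool

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What is the weather in Oslo?"}]
response = client.chat.completions.create(
    model="local-model", messages=messages, tools=tools
)
# Assumes the model chose to call the tool on this turn.
call = response.choices[0].message.tool_calls[0]

# Execute the requested tool and feed the result back to the model.
result = get_weather(**json.loads(call.function.arguments))
messages.append(response.choices[0].message)
messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
final = client.chat.completions.create(
    model="local-model", messages=messages, tools=tools
)
print(final.choices[0].message.content)
```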
Harness LLMs with Multi-Agent Programming
Updated Feb 1, 2025 - Python
Prism Chat (local) is a web UI that lets you easily use a local DeepSeek R1 model served by Ollama. A sketch of the underlying Ollama API call appears after this entry.
Updated Feb 1, 2025 - TypeScript
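For orientation, a front end like this ultimately issues requests against Ollama's local HTTP API. The sketch below makes that call directly in Python rather than the project's TypeScript; the model tag is an assumption and should match whatever `ollama list` reports on your machine.

```python
# Minimal sketch of the Ollama chat call a UI like this sits on top of.
# Assumes Ollama is running locally with a pulled deepseek-r1 model.
import requests

response = requests.post(
    "http://localhost:11434/api/chat",   # Ollama's default local endpoint
    json={
        "model": "deepseek-r1",          # assumption: use the tag shown by `ollama list`
        "messages": [{"role": "user", "content": "Explain tail recursion briefly."}],
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["message"]["content"])
```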
Workday Copilot: A privacy-focused Chrome extension that automates job applications on Workday using local LLMs. Built with WXT, it intelligently fills forms, manages resume data, and handles navigation - all while keeping your data on your machine.
Updated Feb 1, 2025 - TypeScript
The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
Updated Feb 1, 2025 - JavaScript
The TeamAI application allows users to create a team of AI-powered assistants, each with its own capabilities and persona. The assistants solve the user's task as a team effort, each bot contributing its respective capabilities. Supported providers are Ollama and OpenAI.
Updated Jan 31, 2025 - JavaScript
A high-level Rust interface for language models powered by the Candle ML framework. It provides ergonomic and efficient APIs for intuitive language model interactions.
Updated Jan 31, 2025 - Rust
Unified C# client for LLM providers.
Updated Jan 31, 2025 - C#
C# client for Oobabooga.
Updated Jan 31, 2025 - C#
C# client for LM Studio.
Updated Jan 31, 2025 - C#
C# client for KoboldCpp. A Python sketch of the KoboldCpp generation endpoint such a client wraps appears after this entry.
Updated Jan 31, 2025 - C#
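For reference, the sketch below shows the KoboldCpp text-generation endpoint such a client wraps, written in Python rather than C#. The default port and parameter names follow KoboldCpp's KoboldAI-compatible API; treat them as assumptions to verify against your local instance.

```python
# Rough sketch of a call to KoboldCpp's KoboldAI-compatible generate endpoint.
# Port 5001 and the parameter names are assumptions; check your local server.
import requests

payload = {
    "prompt": "Write a haiku about local inference.",
    "max_length": 80,       # number of tokens to generate
    "temperature": 0.7,
}
response = requests.post("http://localhost:5001/api/v1/generate", json=payload, timeout=120)
response.raise_for_status()
print(response.json()["results"][0]["text"])
```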
Chat offline with open-source LLMs like deepseek-r1, nemotron, qwen, llama, and more, all through a simple R package powered by Shiny and Ollama. 🚀
Updated Jan 30, 2025 - R
Chrome extension that summarizes web page contents using Gemini Nano. Features include secure in-browser summarization, markdown output, and translation options.
Updated Jan 30, 2025 - TypeScript
A simple Unity script that sends a request to an LM Studio server, extracts the content from the response, and logs the result. A Python sketch of the same flow appears after this entry.
Updated Jan 30, 2025 - C#
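The same request/extract/log flow can be sketched outside Unity. The snippet below targets LM Studio's OpenAI-compatible local server in Python rather than C#; the default port 1234 and the model name are assumptions, so check your LM Studio server settings.

```python
# Request/extract/log flow against LM Studio's OpenAI-compatible local server.
# Port 1234 and "local-model" are assumptions; adjust to your server settings.
import requests

response = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",
        "messages": [{"role": "user", "content": "Say hello from LM Studio."}],
    },
    timeout=120,
)
response.raise_for_status()
content = response.json()["choices"][0]["message"]["content"]
print(content)  # the Unity script logs this with Debug.Log instead
```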