local-llm
Here are 8 public repositories matching this topic...
Chrome extension to summarize or chat with web pages and local documents using locally running LLMs. Keep all of your data and conversations private. 🔐
Updated Sep 2, 2024 - TypeScript
Code with AI in VSCode, but you get to choose the AI.
Updated Nov 15, 2024 - TypeScript
Structured inference with Llama 2 in your browser
Updated Nov 1, 2024 - TypeScript
Run a local LLM from Hugging Face in React Native using onnxruntime.
Updated Sep 21, 2024 - TypeScript
Chrome extension that summarizes web page contents using Gemini Nano. Features include secure in-browser summarization, markdown output, and translation options.
Updated Nov 23, 2024 - TypeScript
Simple Ollama web UI
Updated Nov 20, 2024 - TypeScript