22 Nov 25
Run Qwen LLMs locally in your browser with WebGPU. Zero installation, instant AI chat.
03 Nov 25
https://news.ycombinator.com/item?id=45798193
02 Nov 25
Web Search MCP Server for use with Local LLMs. A TypeScript MCP (Model Context Protocol) server that provides comprehensive web search capabilities over direct connections (no API keys required), with multiple tools for different use cases.
Features:
- Multi-Engine Web Search: prioritises Bing > Brave > DuckDuckGo for optimal reliability and performance
- Full Page Content Extraction: fetches and extracts complete page content from search results
- Multiple Search Tools: three specialised tools for different use cases
- Smart Request Strategy: switches between Playwright browsers and fast axios requests to ensure results are returned
- Concurrent Processing: extracts content from multiple pages simultaneously
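The engine-priority fallback described above can be sketched as follows. This is not the server's actual TypeScript implementation; the engine functions here are stubs standing in for real search clients:

```python
from typing import Callable

# Hypothetical stand-ins for real engine clients: each engine is a
# callable that returns a list of result URLs, or raises on failure.
Engine = Callable[[str], list]

def search_with_fallback(query: str, engines: list) -> list:
    """Try engines in priority order (e.g. Bing > Brave > DuckDuckGo),
    returning the first non-empty result set."""
    for name, engine in engines:
        try:
            results = engine(query)
            if results:
                return results
        except Exception:
            continue  # engine blocked or timed out; fall through to the next
    return []

# Stubbed engines: Bing "fails", Brave answers, DuckDuckGo never consulted.
def bing(q): raise TimeoutError("blocked")
def brave(q): return [f"https://example.com/?q={q}"]
def duckduckgo(q): return []

hits = search_with_fallback("local llms",
                            [("bing", bing), ("brave", brave), ("ddg", duckduckgo)])
```

The same shape extends naturally to the Playwright-vs-axios switch: treat "fast HTTP request" and "full browser" as two strategies per engine and fall back in the same loop.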
27 Oct 25
Hyperlink is a local-first AI agent that understands your files privately—PDFs, notes, transcripts, and more. No internet required. Data stays secure, offline, and under your control. A Glean alternative built for personal or regulated use.
24 Oct 25
Psst, kid, want some cheap and small LLMs? This blog post is a comprehensive guide to setting up and using llama.cpp, a C/C++ inference library, to run large language models (LLMs) efficiently on consumer hardware.
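As a rough illustration of why llama.cpp's quantised models fit on consumer hardware, here is a back-of-the-envelope memory estimate. It counts weights only (ignoring KV cache and runtime overhead), and the bits-per-weight figures are approximations, not exact llama.cpp quantisation sizes:

```python
def weight_memory_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GiB: params * bits / 8 bytes per param."""
    return n_params * bits_per_weight / 8 / 2**30

# A 7B-parameter model at 16-bit vs a ~4.5-bit quantisation (approximate):
fp16 = weight_memory_gib(7e9, 16)   # roughly 13 GiB: needs a big GPU
q4   = weight_memory_gib(7e9, 4.5)  # under 4 GiB: fits in 8 GB of laptop RAM
```

This is why a quantised 7B model that is impractical at full precision becomes usable on an ordinary laptop.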
16 Jul 25
10 Apr 24
This is Dot, a standalone open source app meant for easy use of local LLMs, and RAG in particular, to interact with documents and files, similarly to Nvidia's Chat with RTX. Dot is completely standalone and is packaged with all dependencies, including a copy of Mistral 7B, ensuring the app is as accessible as possible and that no prior knowledge of programming or local LLMs is required to use it.
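Dot's internals aren't shown here, but the RAG pattern it relies on can be sketched: split documents into chunks, embed them, and retrieve the chunks most similar to the user's question as context for the LLM. This toy version substitutes bag-of-words overlap for a real embedding model:

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy 'embedding': bag-of-words counts (a real RAG app uses a neural embedding model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list, k: int = 1) -> list:
    """Return the k chunks most similar to the query; these would be
    prepended to the LLM prompt as context."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = ["invoices are due within 30 days",
          "the meeting transcript covers the Q3 roadmap",
          "mistral 7b is the bundled local model"]
top = retrieve("when are invoices due", chunks)
```

In a real pipeline the retrieved chunks are inserted into the prompt, so the local model answers from your documents rather than from memory alone.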
05 Apr 24
18 Dec 23
Our goal is to make open source large language models much more accessible to both developers and end users. We’re doing that by combining llama.cpp with Cosmopolitan Libc into one framework that collapses all the complexity of LLMs down to a single-file executable (called a “llamafile”) that runs locally on most computers, with no installation.
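A running llamafile exposes a local llama.cpp web server with an OpenAI-compatible chat endpoint; a minimal client sketch, assuming the default port 8080 (check your llamafile's startup output, as defaults may differ):

```python
import json
from urllib import request

def chat_payload(prompt: str, model: str = "local") -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {"model": model,
            "messages": [{"role": "user", "content": prompt}]}

def ask_llamafile(prompt: str, base: str = "http://localhost:8080") -> str:
    """POST to the llamafile's local server; only works while it is running."""
    req = request.Request(f"{base}/v1/chat/completions",
                          data=json.dumps(chat_payload(prompt)).encode(),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as r:
        return json.load(r)["choices"][0]["message"]["content"]

payload = chat_payload("What is a llamafile?")
```

Because the endpoint mirrors the OpenAI API shape, existing OpenAI client code can usually be pointed at the local server just by changing the base URL.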