22 Nov 25

Run Qwen LLMs locally in your browser with WebGPU. Zero installation, instant AI chat.

by tmfnk 2 months ago

27 Oct 25

Hyperlink is a local-first AI agent that understands your files privately: PDFs, notes, transcripts, and more. No internet required. Data stays secure, offline, and under your control. A Glean alternative built for personal or regulated use.

by tmfnk 3 months ago

24 Oct 25

Psst, kid, want some cheap and small LLMs? This blog post provides a comprehensive guide on how to set up and use llama.cpp, a C++ library, to efficiently run large language models (LLMs) locally on consumer hardware.
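The kind of setup such a guide covers can be sketched roughly as follows; the model filename is a placeholder, and exact build steps vary by llama.cpp version:

```shell
# Clone and build llama.cpp (assumes git, cmake, and a C++ compiler are installed)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build
cmake --build build --config Release

# Run a quantized GGUF model locally; the model path is a placeholder
./build/bin/llama-cli -m ./models/your-model.gguf -p "Hello" -n 64
```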

by tmfnk 3 months ago

10 Apr 24

This is Dot, a standalone open-source app for easy use of local LLMs, and RAG in particular, to interact with documents and files, similar to Nvidia's Chat with RTX. Dot is completely standalone and packaged with all dependencies, including a copy of Mistral 7B, so the app is as accessible as possible and no prior knowledge of programming or local LLMs is required to use it.

by chrisSt 1 year ago