Making pprof-format profiles ingestable by LLMs
Development Plugin for Avalonia
Fastest, smallest, and fully autonomous AI assistant infrastructure written in Zig
Repository for skills to assist AI coding agents with .NET and C#
Burning through your subscriptions too fast? Paying for stuff you never use? Stop guessing. OpenUsage is free and open source.
Lightweight coding agent that runs in your terminal
MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX.
A portable open-source operating system for agents. ~6 ms cold starts, 32x cheaper than sandboxes. Powered by WebAssembly and V8 isolates.
Claude Code skills and sub-agents for .NET Developers
⚡ Native MLX Swift LLM inference server for Apple Silicon. OpenAI-compatible API, SSD streaming for 100B+ MoE models, TurboQuant KV cache compression, macOS and iOS (iPhone) apps.