Starred repositories
The fastest repo in history to surpass 50K stars ⭐, reaching the milestone just 2 hours after publication. Better harness tools that get real things done. Now written in Rust using oh-my-codex.
OmX - Oh My codeX: Your codex is not alone. Add hooks, agent teams, HUDs, and so much more.
TimesFM (Time Series Foundation Model) is a pretrained time-series foundation model developed by Google Research for time-series forecasting.
Universal CLAUDE.md - cut Claude output tokens by 63%. Drop-in. No code changes.
A framework for collecting and analyzing prediction market data, including the largest publicly available dataset of Polymarket and Kalshi market and trade data.
scikit-learn: machine learning in Python
The absolute trainer to light up AI agents.
A machine learning primer built from first principles. For engineers who want to reason about ML systems the way they reason about software systems.
Native macOS Wayland Compositor written in Rust using Smithay. Experience seamless Linux app streaming on macOS without XQuartz.
Emulate Arduino, ESP32 & Raspberry Pi in your browser. Write code, compile, and run on 19 real boards — Arduino Uno, ESP32, ESP32-C3, Raspberry Pi Pico, Raspberry Pi 3, and more. No hardware, no c…
A high-throughput and memory-efficient inference and serving engine for LLMs
Self-referential self-improving agents that can optimize for any computable task
AI coding workstation: Claude Code + web UI + 5 AI CLIs + headless browser + 50+ tools
Fully automated NixOS CLI installer
The agent harness performance optimization system. Skills, instincts, memory, security, and research-first development for Claude Code, Codex, Opencode, Cursor and beyond.
The fastest macOS package manager. Written in Zig. 3ms warm installs.
Running a big model on a small laptop
Project N.O.M.A.D. is a self-contained, offline survival computer packed with critical tools, knowledge, and AI to keep you informed and empowered—anytime, anywhere.
Examples and guides for using the OpenAI API
tiktoken is a fast BPE tokeniser for use with OpenAI's models.
Light, fluffy, and always free - AWS Local Emulator