Stars
Code for "EgoX: Egocentric Video Generation from a Single Exocentric Video"
GPU-optimized framework for training diffusion language models at any scale. The backend of Quokka, Super Data Learners, and OpenMoE 2 training.
StreamDiffusion: A Pipeline-Level Solution for Real-Time Interactive Generation
📚LeetCUDA: Modern CUDA Learn Notes with PyTorch for Beginners🐑, 200+ CUDA Kernels, Tensor Cores, HGEMM, FA-2 MMA.🎉
slime is an LLM post-training framework for RL Scaling.
TheBoringNotch: A not-so-boring notch that rocks 🎸🎶
Anthropic's original performance take-home, now open for you to try!
Fetch source code for npm packages to give AI coding agents deeper context
Browser automation CLI for AI agents
Minimal Claude Code alternative. Single Python file, zero dependencies, ~250 lines.
~950 line, minimal, extensible LLM inference engine built from scratch.
MoE training for me, you, and maybe other people
My learning notes for ML SYS.
A framework for the evaluation of autoregressive code generation language models.
Code for the paper "Efficient Training of Language Models to Fill in the Middle"
Code for the paper "Evaluating Large Language Models Trained on Code"
This repo contains the source code for RULER: What’s the Real Context Size of Your Long-Context Language Models?
Official repository for LiteTracker: Leveraging Temporal Causality for Accurate Low-latency Tissue Tracking; published at MICCAI 2025.
DeepGEMM: clean and efficient FP8 GEMM kernels with fine-grained scaling
Tensors and Dynamic neural networks in Python with strong GPU acceleration
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
A small protein language model based on nanochat
PyTorch native quantization and sparsity for training and inference
FlashAttention written in metal-cpp headers
The simplest, fastest repository for training/finetuning small-sized VLMs.
A PyTorch native platform for training generative AI models
PyTorch building blocks for the OLMo ecosystem