Stars
fmchisel: Efficient Compression and Training Algorithms for Foundation Models
AdalFlow: The library to build & auto-optimize LLM applications.
GLM-4.5: Agentic, Reasoning, and Coding (ARC) Foundation Models
This repository contains complete research and analysis materials from reverse engineering Claude Code v1.0.33, including an in-depth technical analysis of the obfuscated source code, system architecture documentation, and an implementation blueprint for reconstructing the Claude Code agent system. Key findings include the real-time steering mechanism, the multi-agent architecture, intelligent context management, and the tool execution pipeline. The project serves as a technical reference for understanding the design and implementation of modern AI agent systems.
An extremely fast Python package and project manager, written in Rust.
slime is an LLM post-training framework for RL Scaling.
Official PyTorch implementation for "Large Language Diffusion Models"
Official implementation of "Fast-dLLM: Training-free Acceleration of Diffusion LLM by Enabling KV Cache and Parallel Decoding"
3x Faster Inference; Unofficial implementation of EAGLE Speculative Decoding
Just 1 minute of voice data can be used to train a good TTS model! (few-shot voice cloning)
Robust Speech Recognition via Large-Scale Weak Supervision
Train transformer language models with reinforcement learning.
LightLLM is a Python-based LLM (Large Language Model) inference and serving framework, notable for its lightweight design, easy scalability, and high-speed performance.
Production-tested AI infrastructure tools for efficient AGI development and community-driven innovation
🐳 Efficient Triton implementations for "Native Sparse Attention: Hardware-Aligned and Natively Trainable Sparse Attention"
verl: Volcano Engine Reinforcement Learning for LLMs
High-performance inference framework for large language models, focusing on efficiency, flexibility, and availability.
A fast communication-overlapping library for tensor/expert parallelism on GPUs.
Collection of best practices, reference architectures, model training examples and utilities to train large models on AWS.
Best practices & guides on how to write distributed PyTorch training code
LLM training parallelisms (DP, FSDP, TP, PP) in pure C
Curated coding interview preparation materials for busy software engineers
This project shares the technical principles behind large language models along with hands-on experience (LLM engineering and real-world LLM application deployment).
A Telegram bot to recommend arXiv papers