- SkyWork
- ChengDu
- www.giantpandacv.com
Stars
Autonomous GPU kernel optimization system driven by AI agents.
A PyTorch native platform for training generative AI models
If you want to purchase Panzhihua Mi Yi Pipa, please contact me.
Automated CUDA kernel performance diagnostics from NVIDIA Nsight Compute (NCU) CSV exports.
Autoresearch for GPU kernels. Give it any PyTorch model, go to sleep, wake up to optimized Triton kernels.
VeOmni: Scaling Any Modality Model Training with Model-Centric Distributed Recipe Zoo
Terminal UI for NVIDIA Nsight Systems profiles — timeline viewer, kernel navigator, NVTX hierarchy
A Chinese-localized version of Humanizer, a set of Claude Code Skills designed to remove traces of AI-generated text.
The ultimate space for work and life: find, build, and collaborate with agent teammates that grow with you. We are taking the agent harness to the next level, enabling multi-agent collaboration, e…
An agentic skills framework & software development methodology that works.
FlashInfer: Kernel Library for LLM Serving
《开源大模型食用指南》 (An Open-Source LLM Cookbook): tutorials tailored for Chinese users on quickly fine-tuning (full-parameter/LoRA) and deploying domestic and international open-source large language models (LLMs) and multimodal large models (MLLMs) in a Linux environment.
FlashInfer Bench @ MLSys 2026: Building AI agents to write high performance GPU kernels
High-performance RMSNorm implementation using SM core storage (registers and shared memory).
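For reference, the computation that repo accelerates is RMSNorm: each row is scaled by the reciprocal of its root mean square, then by a learned per-element weight. A minimal pure-Python sketch of the math (not the repo's CUDA code; the function name and epsilon default are illustrative assumptions):

```python
import math

def rms_norm(x, weight, eps=1e-6):
    # Root mean square of the vector; eps guards against division by zero.
    rms = math.sqrt(sum(v * v for v in x) / len(x) + eps)
    # Normalize, then apply the learned elementwise scale.
    return [v / rms * w for v, w in zip(x, weight)]
```

A GPU kernel gets its speed by keeping the per-row reduction and the scaling pass entirely in registers and shared memory, avoiding round trips to global memory.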
Accelerating MoE with IO and Tile-aware Optimizations
A compact implementation of SGLang, designed to demystify the complexities of modern LLM serving systems.
A PyTorch-native inference engine with hybrid cache acceleration and massive parallelism for DiTs.
GPU programming related news and material links
Train speculative decoding models effortlessly and port them smoothly to SGLang serving.
Expert Specialization MoE Solution based on CUTLASS
Utility scripts for PyTorch (e.g., making Perfetto show kernels that would otherwise disappear, a memory profiler that understands more low-level allocations such as NCCL, ...)
LightLLM is a Python-based LLM (Large Language Model) inference and serving framework, notable for its lightweight design, easy scalability, and high-speed performance.