Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, audio, and multimodal tasks, for both inference and training.
🦜🔗 LangChain: a framework for building applications and reliable agents powered by large language models.
Tensors and Dynamic neural networks in Python with strong GPU acceleration
Vim-fork focused on extensibility and usability
Gin is a high-performance HTTP web framework written in Go. It provides a Martini-like API but with significantly better performance—up to 40 times faster—thanks to httprouter. Gin is designed for …
Models and examples built with TensorFlow
📚 Collaborative cheatsheets for console commands
uBlock Origin - An efficient blocker for Chromium and Firefox. Fast and lean.
Lightweight coding agent that runs in your terminal
👩‍💻👨‍💻 Awesome cheatsheets for popular programming languages, frameworks and development tools. They include everything you should know in a single file.
An open-source framework for making universal native apps with React. Expo runs on Android, iOS, and the web.
Claude Code is an agentic coding tool that lives in your terminal, understands your codebase, and helps you code faster by executing routine tasks, explaining complex code, and handling git workflows.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
TensorFlow code and pre-trained models for BERT
Carbon Language's main repository: documents, design, implementation, and related tools. (NOTE: Carbon Language is experimental; see README)
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch.
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
Generative Models by Stability AI
Library for fast text representation and classification.
Fully open reproduction of DeepSeek-R1
Official inference repo for FLUX.1 models
Code for the paper "Language Models are Unsupervised Multitask Learners"
Graph Neural Network Library for PyTorch