Starred repositories
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models across text, vision, and audio, as well as multimodal models, for both inference and training.
VQ-VAEs, Gumbel-Softmaxes, and friends
A minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
Neural Networks: Zero to Hero
Minimal, clean code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API
Neural Network primitives with multiple backends
Julia implementation of various rigid body dynamics and kinematics algorithms
A Control Systems Toolbox for Julia
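The Byte Pair Encoding algorithm named in the minbpe description above can be sketched in a few lines. This is a minimal illustration under assumed structure, not that repository's actual code: start from raw UTF-8 bytes and repeatedly merge the most frequent adjacent pair of token ids into a new id.

```python
from collections import Counter

def most_frequent_pair(ids):
    # Count adjacent id pairs in the token sequence and pick the winner.
    return Counter(zip(ids, ids[1:])).most_common(1)[0][0]

def merge(ids, pair, new_id):
    # Replace every left-to-right occurrence of `pair` with `new_id`.
    out, i = [], 0
    while i < len(ids):
        if i < len(ids) - 1 and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out

def train_bpe(text, num_merges):
    # Base vocabulary is the 256 byte values; each merge adds one new token.
    ids = list(text.encode("utf-8"))
    merges = {}
    for step in range(num_merges):
        pair = most_frequent_pair(ids)
        new_id = 256 + step
        merges[pair] = new_id
        ids = merge(ids, pair, new_id)
    return ids, merges
```

For example, training two merges on `"aaabdaaabac"` first learns the byte pair `(97, 97)` (two `a`s) as token 256, shrinking the sequence with each merge; the learned `merges` table is what a tokenizer would then replay to encode new text.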
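A scalar-valued autograd engine of the kind described above (a tiny engine with a PyTorch-like API) can also be illustrated in miniature. The sketch below is an assumed design, not that library's code: each scalar records its parents and the local derivatives of the op that produced it, and `backward()` applies the chain rule in reverse topological order.

```python
class Value:
    """A scalar that records the op producing it, for reverse-mode autodiff."""
    def __init__(self, data, children=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._children = children        # parent Values in the graph
        self._local_grads = local_grads  # d(self)/d(child) for each child

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically sort the graph, then accumulate gradients in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    visit(c)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for child, local in zip(v._children, v._local_grads):
                child.grad += local * v.grad
```

With `x = Value(3.0)`, `y = Value(4.0)`, and `z = x * y + x`, calling `z.backward()` gives `x.grad == 5.0` (since dz/dx = y + 1) and `y.grad == 3.0`; gradients accumulate with `+=` so a value used twice, like `x` here, collects contributions from both paths.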