21 Lessons, Get Started Building with Generative AI
Updated Nov 10, 2025 - Jupyter Notebook
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
🧑‍🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
Ongoing research training transformer models at scale
Easy-to-use and powerful LLM and SLM library with awesome model zoo.
OpenVINO™ is an open source toolkit for optimizing and deploying AI inference
A collection of CVPR 2025 papers and open-source projects
AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) into pipelines or agents that can interact with your data. With its advanced retrieval methods, it is best suited for building RAG, question answering, semantic search, or conversational agent chatbots.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Private AI platform for agents, assistants and enterprise search. Built-in Agent Builder, Deep research, Document analysis, Multi-model support, and API connectivity for agents.
Semantic segmentation models with 500+ pretrained convolutional and transformer-based backbones.
This repository contains demos I made with the Transformers library by HuggingFace.
A PyTorch-based Speech Toolkit
Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, DeepSeek, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPU such as Arc, Flex and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, vLLM, DeepSpeed, Axolotl, etc.
Tutorials on getting started with PyTorch and TorchText for sentiment analysis
The Hugging Face course on Transformers
An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!