Stars
Your own personal AI assistant. Any OS. Any Platform. The lobster way. 🦞
Tensors and Dynamic neural networks in Python with strong GPU acceleration
Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more (a minimal usage sketch follows this list)
Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.
A playbook for systematically maximizing the performance of deep learning models.
This project aims to share the technical principles behind large language models together with hands-on experience (LLM engineering and real-world application deployment).
Fast and memory-efficient exact attention
Generative Agents: Interactive Simulacra of Human Behavior
Open source code for AlphaFold 2.
LLM Council works together to answer your hardest questions
High-Resolution Image Synthesis with Latent Diffusion Models
An open source implementation of CLIP.
Enjoy the magic of Diffusion models!
PyTorch package for the discrete VAE used for DALL·E.
An MIT-licensed, deployable starter kit for building and customizing your own version of AI town - a virtual town where AI characters live, chat and socialize.
Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
Open-source implementation of AlphaEvolve
PyTorch Lightning + Hydra. A very user-friendly template for ML experimentation. ⚡🔥⚡
Evolutionary Scale Modeling (esm): Pretrained language models for proteins
GLIDE: a diffusion-based text-conditional image synthesis model
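The JAX entry above lists its core transformations (differentiate, vectorize, JIT); the sketch below is a minimal illustration of how jax.grad, jax.vmap, and jax.jit compose. It is not taken from any of the listed repositories, and the loss function, array shapes, and variable names are illustrative assumptions.

```python
# Minimal sketch of JAX's composable transformations (illustrative only).
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Squared-error loss for a toy linear model (assumed for illustration).
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# Differentiate: gradient of the loss with respect to the first argument, w.
grad_loss = jax.grad(loss)

# Vectorize: map the single-example gradient over a batch of (x, y) pairs.
per_example_grads = jax.vmap(grad_loss, in_axes=(None, 0, 0))

# JIT: compile the composed function with XLA for GPU/TPU execution.
fast_grads = jax.jit(per_example_grads)

w = jnp.ones(3)
x = jnp.ones((8, 3))
y = jnp.zeros(8)
print(fast_grads(w, x, y).shape)  # (8, 3): one gradient per example
```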