Stars
Code for the ProteinMPNN paper
Fast and memory-efficient exact attention
Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.
Tensors and Dynamic neural networks in Python with strong GPU acceleration
An open source implementation of CLIP.
PyTorch code and models for V-JEPA 2 self-supervised learning from video.
A high-performance, GPU-accelerated library for key computational chemistry tasks, such as molecular similarity, conformer generation, and geometry relaxation.
Intelligent Router for Mixture-of-Models
A partially latent flow matching model for the joint generation of a protein's amino acid sequence and full atomistic structure, including both the backbone and side chains.
Enjoy the magic of Diffusion models!
ProtTrans provides state-of-the-art pretrained language models for proteins. ProtTrans was trained on thousands of GPUs from Summit and hundreds of Google TPUs using Transformer models.
AI-powered ab initio biomolecular dynamics simulation
This API provides programmatic access to the AlphaGenome model developed by Google DeepMind.
Circular visualization in Python (Circos Plot, Chord Diagram, Radar Chart)
A trainable PyTorch reproduction of AlphaFold 3.
Transformer-based sequence correction method for genome assembly polishing
[ICLR 2025 Oral] Block Diffusion: Interpolating Between Autoregressive and Diffusion Language Models
Open-source implementation of AlphaEvolve
Implementation for SE(3) diffusion model with application to protein backbone generation
Proteina is a new large-scale flow-based protein backbone generator that utilizes hierarchical fold class labels for conditioning and relies on a tailored scalable transformer architecture.
Genome modeling and design across all domains of life
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
Protein Ligand INteraction Dataset and Evaluation Resource