Peking University
Beijing, China
Stars
Your own personal AI assistant. Any OS. Any Platform. The lobster way. 🦞
🍃 MINT-1T: A one trillion token multimodal interleaved dataset.
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
PyTorch code for the paper "Model-Based Imitation Learning for Urban Driving".
Implementation of Alphafold 3 from Google Deepmind in Pytorch
FastKAN: Very Fast Implementation of Kolmogorov-Arnold Networks (KAN)
tiktoken is a fast BPE tokeniser for use with OpenAI's models.
Open-Sora: Democratizing Efficient Video Production for All
Stable Diffusion web UI
A latent text-to-image diffusion model
PyTorch extensions for high performance and large scale training.
SciAssess is a comprehensive benchmark for evaluating Large Language Models' proficiency in scientific literature analysis across various fields, focusing on memorization, comprehension, and analysis.
[ICLR 2023 Spotlight] Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs
A simple, nice, and easily extensible bibliography / citation style for Chinese LaTeX users
An open-source platform for developing protein models beyond AlphaFold.
LaTeX template for dissertations in Peking University
MMseqs2: ultra fast and sensitive search and clustering suite
Foldseek enables fast and sensitive comparisons of large structure sets.
Denoising Diffusion Probabilistic Models
Implementation of Nougat Neural Optical Understanding for Academic Documents
Supplementary code and data to "Improving Attacks on Round-Reduced Speck32/64 Using Deep Learning"
Implementation for SE(3) diffusion model with application to protein backbone generation
Making Protein Design accessible to all via Google Colab!
Code for the ProteinMPNN paper
Universal Structure Alignment of Monomeric and Complex Structure of Nucleic Acids and Proteins
To eventually become an unofficial Pytorch implementation / replication of Alphafold2, as details of the architecture get released