Stars
[NeurIPS 2025 Spotlight] E2Former: An Efficient and Equivariant Transformer with Linear-Scaling Tensor Products
A framework for training energy-based diffusion models for sampling and energy estimation.
CHARMM and AMBER forcefields for OpenMM (with small molecule support)
[ICML 2024] Official implementation for "Beyond ELBOs: A Large-Scale Evaluation of Variational Methods for Sampling".
AIMNet2: Fast and accurate machine-learned interatomic potential for molecular dynamics simulations
Code for computing vibration data and Gibbs corrections from VASP OUTCARs, with support for FAIRChem Open Catalyst machine learning potentials.
A high-performance, GPU-accelerated library for key computational chemistry tasks, such as molecular similarity, conformer generation, and geometry relaxation.
Muon is an optimizer for hidden layers in neural networks
Thermal and photochemical reaction path optimization and discovery
Source code for the paper "Harnessing Machine Learning to Enhance Transition State Search with Interatomic Potentials and Generative Models"
deepprinciple / pyGSM
Forked from ZimmermanGroup/pyGSM
Thermal and photochemical reaction path optimization and discovery
deepprinciple / pysisyphus
Forked from bytedance/pysisyphus
Python suite for optimization of stationary points on ground- and excited-state PES and determination of reaction paths.
React-OT is a generative transition state search model developed by DeepPrinciple, which uses Optimal Transport (OT) methods to generate deterministic transition state structures from reactants and…
Interpolation of molecular geometries through geodesics in redundant internal coordinate hyperspace for complex transformations
Normalizing-flow enhanced sampling package for probabilistic inference in Jax
PyTorch Implementation of Diffusion Schrödinger Bridge Matching
PyTorch Implementation of DSB for Score Based Generative Modeling. Experiments managed using Hydra.
Code for "Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching"
Official Implementation of paper "Training Neural Samplers with Reverse Diffusive KL Divergence"
Code for "FlowMM: Generating Materials with Riemannian Flow Matching" and "FlowLLM: Flow Matching for Material Generation with Large Language Models as Base Distributions"
Reward fine-tuning for Stable Diffusion models based on stochastic optimal control, including Adjoint Matching
[ICLR 2025] Official implementation for "Underdamped Diffusion Bridges with Applications to Sampling".