Fast and embedded solvers for nonlinear optimal control and nonlinear model predictive control
PyTorch-based framework for solving parametric constrained optimization problems, physics-informed system identification, and parametric model predictive control.
A library for differentiable nonlinear optimization
TorchOpt is an efficient library for differentiable optimization built upon PyTorch.
Safe robot learning
Official repo for the paper "SAGE: SLAM with Appearance and Geometry Prior for Endoscopy" (ICRA 2022)
Official code repository for ∇-Prox: Differentiable Proximal Algorithm Modeling for Large-Scale Optimization (SIGGRAPH TOG 2023)
Mathematical Programming in JAX
Implementation and examples from "Trajectory Optimization with Optimization-Based Dynamics" (https://arxiv.org/abs/2109.04928).
Differentiable curve and surface similarity measures.
A library for soft differentiable relaxations of common JAX functions.
[L4DC 2025] Automatic hyperparameter tuning for DeePC. Built by Michael Cummins at the Automatic Control Laboratory, ETH Zurich.
Preliminary code for the paper "Learning Deterministic Surrogates for Robust Convex QCQPs".
Tutorial on Deep Declarative Networks
A fully vectorized PyTorch implementation of BLEU scores optimized for training neural networks.
A library for soft differentiable relaxations of common PyTorch functions.
Decision-Focused Learning (DFL) for day-ahead scheduling of Underground Pumped Hydro Energy Storage (UPHES).
Collection of differentiable methods for robotics applications implemented in PyTorch.
A fully vectorized PyTorch implementation of ROUGE scores optimized for training neural networks.
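The libraries above share one core mechanism: treating an optimization solver as a differentiable function of its parameters, so gradients can flow through the solution itself. A minimal, dependency-free sketch of one such approach, unrolled differentiation, on a toy quadratic inner problem (the function names and the quadratic objective are illustrative assumptions, not any library's API):

```python
def unrolled_solve(theta, x0=0.0, lr=0.2, steps=50):
    """Minimize f(x; theta) = (x - theta)**2 by gradient descent,
    carrying the sensitivity dx/dtheta forward through every iterate."""
    x, dx_dtheta = x0, 0.0  # the initial point does not depend on theta
    for _ in range(steps):
        grad = 2.0 * (x - theta)                 # df/dx at the current iterate
        dgrad_dtheta = 2.0 * (dx_dtheta - 1.0)   # chain rule: d(grad)/dtheta
        x = x - lr * grad                        # ordinary descent step
        dx_dtheta = dx_dtheta - lr * dgrad_dtheta  # differentiate the step itself
    return x, dx_dtheta

x_star, sensitivity = unrolled_solve(theta=3.0)
# For this problem x*(theta) = theta, so the unrolled sensitivity
# dx*/dtheta should converge to 1.
print(round(x_star, 6), round(sensitivity, 6))
```

Frameworks such as TorchOpt or Theseus automate exactly this bookkeeping (or the implicit-function-theorem alternative) with autodiff, for far larger inner problems.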