Massachusetts Institute of Technology (MIT), Cambridge, MA
Stars
Show Julia profiling data in an explorable HTML page
LinearSolve.jl: High-Performance Unified Interface for Linear Solvers in Julia. Easily switch between factorization and Krylov methods and add preconditioners, all in one interface.
Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
A component of the DiffEq ecosystem for enabling sensitivity analysis for scientific machine learning (SciML). Optimize-then-discretize, discretize-then-optimize, adjoint methods, and more for ODEs.
Unofficial MIT thesis template from Overleaf, updated for 2023
18.065/18.0651: Matrix Methods in Data Analysis, Signal Processing, and Machine Learning
High-performance and differentiation-enabled nonlinear solvers (Newton methods) and bracketed rootfinding (bisection, falsi), with sparsity and Newton-Krylov support.
Testing out Nonlinear Solvers that Automatically Switch between Discrete and Continuous Variants
Automatic Finite Difference PDE solving with Julia SciML
Simulation framework for nonsmooth dynamical systems
OptNet: Differentiable Optimization as a Layer in Neural Networks
Various experiments for learning through Linear Complementarity Problems
On efficient training and inference of Neural Differential Equations
Mixed complementarity problems parameterized at runtime, with support for implicit differentiation.