Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

Python · 33,921 stars · 3,234 forks · Updated Nov 8, 2025
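
A minimal sketch of what that description means in practice. The loss function below is illustrative; jax.grad, jax.jit, and jax.vmap are the library's core transformations:

    import jax
    import jax.numpy as jnp

    def loss(w, x):
        # An ordinary NumPy-style function written with jax.numpy.
        return jnp.sum((x @ w) ** 2)

    grad_loss = jax.jit(jax.grad(loss))               # differentiate w.r.t. w, then JIT-compile
    batched_loss = jax.vmap(loss, in_axes=(None, 0))  # vectorize over a batch of x

    w = jnp.ones(3)
    xs = jnp.ones((4, 3))
    print(grad_loss(w, xs[0]))
    print(batched_loss(w, xs))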

Gaussian processes in JAX and Flax.

Python · 549 stars · 69 forks · Updated Nov 8, 2025

A machine learning compiler for GPUs, CPUs, and ML accelerators

C++ · 3,659 stars · 679 forks · Updated Nov 8, 2025
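
JAX's jit targets this compiler. Assuming a recent JAX release with the ahead-of-time lowering API, one way to inspect the program handed to XLA is:

    import jax
    import jax.numpy as jnp

    def f(x):
        return jnp.tanh(x) * 2.0

    lowered = jax.jit(f).lower(jnp.ones(4))
    print(lowered.as_text())      # the HLO/StableHLO module the XLA compiler receives
    compiled = lowered.compile()  # ahead-of-time compilation with XLA
    print(compiled(jnp.ones(4)))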

Flax is a neural network library for JAX that is designed for flexibility.

Jupyter Notebook · 6,900 stars · 754 forks · Updated Nov 8, 2025
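
A minimal Linen module using the usual init/apply pattern; the MLP class and its sizes are illustrative:

    import jax
    import jax.numpy as jnp
    import flax.linen as nn

    class MLP(nn.Module):
        features: int

        @nn.compact
        def __call__(self, x):
            x = nn.Dense(self.features)(x)
            return nn.relu(x)

    model = MLP(features=8)
    x = jnp.ones((2, 4))
    params = model.init(jax.random.PRNGKey(0), x)  # build the parameter pytree
    y = model.apply(params, x)                     # pure function of (params, inputs)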

MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvements in both training algorithms and models.

Python · 400 stars · 77 forks · Updated Nov 8, 2025

Python · 79 stars · 9 forks · Updated Nov 7, 2025

C++ · 121 stars · 23 forks · Updated Nov 7, 2025

Differentiable, Hardware Accelerated, Molecular Dynamics

Jupyter Notebook · 1,321 stars · 225 forks · Updated Nov 7, 2025
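
A rough sketch of the jax-md style, assuming its space and energy modules and a Lennard-Jones pair energy (exact signatures may differ between versions):

    import jax
    import jax.numpy as jnp
    from jax_md import space, energy

    displacement, shift = space.free()                   # non-periodic simulation space
    energy_fn = energy.lennard_jones_pair(displacement)  # pairwise Lennard-Jones potential

    key = jax.random.PRNGKey(0)
    positions = jax.random.uniform(key, (16, 3), minval=0.0, maxval=5.0)

    print(energy_fn(positions))               # total potential energy
    forces = -jax.grad(energy_fn)(positions)  # forces via autodiff, hence "differentiable"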

Hardware accelerated, batchable and differentiable optimizers in JAX.

Python · 1,002 stars · 73 forks · Updated Oct 9, 2025
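
A sketch of the solver pattern: wrap a loss in a solver object and call run. The ridge_loss function and its data are made up for illustration:

    import jax.numpy as jnp
    import jaxopt

    def ridge_loss(w, X, y, lam=0.1):
        # Least squares with an L2 penalty on the weights.
        return jnp.mean((X @ w - y) ** 2) + lam * jnp.sum(w ** 2)

    X = jnp.ones((8, 3))
    y = jnp.zeros(8)

    solver = jaxopt.LBFGS(fun=ridge_loss, maxiter=50)
    params, state = solver.run(jnp.zeros(3), X=X, y=y)  # extra arguments are forwarded to the loss

Since run is itself traceable, it can be wrapped in jax.jit or jax.vmap, which is what "batchable and differentiable" refers to.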

Rax is a Learning-to-Rank library written in JAX.

Python · 334 stars · 12 forks · Updated Sep 4, 2025
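
A minimal example for a single query, assuming Rax's softmax_loss and ndcg_metric entry points; the scores and labels are illustrative:

    import jax.numpy as jnp
    import rax

    scores = jnp.asarray([[2.0, 1.0, 3.0]])  # model scores, shape (queries, items)
    labels = jnp.asarray([[1.0, 0.0, 2.0]])  # graded relevance per item

    loss = rax.softmax_loss(scores, labels)  # listwise ranking loss
    ndcg = rax.ndcg_metric(scores, labels)   # ranking quality metric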

Research language for array processing in the Haskell/ML family

Haskell · 1,647 stars · 114 forks · Updated Jan 25, 2025

Mathematical operations for JAX pytrees

Python · 202 stars · 8 forks · Updated Dec 5, 2024
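
A sketch of the idea, assuming tree_math's Vector wrapper and its .tree accessor: wrap pytrees so ordinary arithmetic applies leaf-wise:

    import jax.numpy as jnp
    import tree_math

    params = {"w": jnp.ones(3), "b": jnp.zeros(2)}
    grads = {"w": 0.5 * jnp.ones(3), "b": jnp.ones(2)}

    p = tree_math.Vector(params)
    g = tree_math.Vector(grads)
    updated = (p - 0.1 * g).tree  # a plain SGD step, unwrapped back to the dict pytree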

Python · 285 stars · 21 forks · Updated Jul 15, 2024

A playbook for systematically maximizing the performance of deep learning models.

29,355 stars · 2,399 forks · Updated Jun 18, 2024
HTML · 12 stars · 5 forks · Updated Apr 27, 2024

Flax is a neural network library for JAX that is designed for flexibility.

Python 1 Updated Jan 31, 2022

Robust Bi-Tempered Logistic Loss Based on Bregman Divergences. https://arxiv.org/pdf/1906.03361.pdf

Python · 147 stars · 27 forks · Updated Dec 22, 2021
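
Not the repository's API, just the two tempered functions the linked paper builds on, sketched from their definitions (they reduce to log and exp as t approaches 1):

    import jax.numpy as jnp

    def log_t(u, t):
        # Tempered logarithm: (u**(1 - t) - 1) / (1 - t), assuming t != 1.
        return (u ** (1.0 - t) - 1.0) / (1.0 - t)

    def exp_t(u, t):
        # Tempered exponential, the inverse of log_t on its domain.
        return jnp.maximum(1.0 + (1.0 - t) * u, 0.0) ** (1.0 / (1.0 - t))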

Concise deep learning for JAX

Python · 183 stars · 14 forks · Updated Oct 21, 2020

Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

Python 3 Updated Aug 2, 2019