12 starred repositories written in Python
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

Python · 33,903 stars · 3,232 forks · Updated Nov 6, 2025
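Not from the repository itself, but a minimal sketch (assuming `jax` is installed) of the three transformations the description names: differentiation, vectorization, and JIT compilation.

```python
import jax
import jax.numpy as jnp

def loss(x):
    return jnp.sum(x ** 2)

# Differentiate: the gradient of sum(x^2) is 2x.
grads = jax.grad(loss)(jnp.arange(3.0))   # -> [0., 2., 4.]

# JIT-compile the same function for the available backend (CPU/GPU/TPU).
fast_loss = jax.jit(loss)

# Vectorize a per-example function over a leading batch axis.
per_example = jax.vmap(lambda x: x ** 2)
batched = per_example(jnp.arange(4.0))    # -> [0., 1., 4., 9.]
```

Because these transformations compose, expressions like `jax.jit(jax.vmap(jax.grad(loss)))` work as-is.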

Hardware accelerated, batchable and differentiable optimizers in JAX.

Python · 1,002 stars · 73 forks · Updated Oct 9, 2025

Gaussian processes in JAX and Flax.

Python · 549 stars · 69 forks · Updated Oct 30, 2025

MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvements in both training algorithms and models.

Python · 400 stars · 76 forks · Updated Nov 6, 2025

Rax is a Learning-to-Rank library written in JAX.

Python · 334 stars · 12 forks · Updated Sep 4, 2025

Python · 285 stars · 21 forks · Updated Jul 15, 2024

Mathematical operations for JAX pytrees

Python · 201 stars · 8 forks · Updated Dec 5, 2024
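A sketch of what elementwise math over pytrees looks like, using core JAX's `jax.tree_util.tree_map` for illustration (the starred library itself is not shown here; its API may differ).

```python
import jax
import jax.numpy as jnp

# A pytree is any nested container of arrays, e.g. model parameters.
params = {"w": jnp.ones((2, 2)), "b": jnp.zeros(2)}
grads = {"w": jnp.full((2, 2), 0.5), "b": jnp.full((2,), 0.1)}

# Apply an elementwise update across every leaf of both trees at once,
# as in a plain SGD step with learning rate 0.1.
updated = jax.tree_util.tree_map(lambda p, g: p - 0.1 * g, params, grads)
# updated["w"] leaves are 0.95; updated["b"] leaves are -0.01
```

Libraries in this space typically wrap pytrees in vector-like objects so that `p - 0.1 * g` can be written directly on whole trees without an explicit `tree_map`.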

Concise deep learning for JAX

Python · 183 stars · 14 forks · Updated Oct 21, 2020

Robust Bi-Tempered Logistic Loss Based on Bregman Divergences. https://arxiv.org/pdf/1906.03361.pdf

Python · 147 stars · 27 forks · Updated Dec 22, 2021

Python · 79 stars · 9 forks · Updated Nov 5, 2025

Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more

Python · 3 stars · Updated Aug 2, 2019

Flax is a neural network library for JAX that is designed for flexibility.

Python · 1 star · Updated Jan 31, 2022