Stars: 12 stars, written in Python
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
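The transformations named in this description compose directly. As a minimal sketch (assuming JAX is installed; the function names `loss` and `per_example` are illustrative, not from the repository):

```python
import jax
import jax.numpy as jnp

def loss(x):
    # A simple scalar-valued function: sum of squares.
    return jnp.sum(x ** 2)

grad_loss = jax.grad(loss)        # differentiate
fast_grad = jax.jit(grad_loss)    # JIT-compile for CPU/GPU/TPU
# vectorize: apply a scalar gradient elementwise over a batch
per_example = jax.vmap(jax.grad(lambda x: x ** 2))

x = jnp.array([1.0, 2.0, 3.0])
print(fast_grad(x))    # gradient of sum(x**2) is 2*x -> [2. 4. 6.]
print(per_example(x))  # elementwise gradients -> [2. 4. 6.]
```

The transformations nest freely, e.g. `jax.jit(jax.vmap(jax.grad(f)))`, which is what "composable" refers to here.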
Hardware accelerated, batchable and differentiable optimizers in JAX.
MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvements in both training algorithms and models.
Robust Bi-Tempered Logistic Loss Based on Bregman Divergences. https://arxiv.org/pdf/1906.03361.pdf
georgedahl / jax
Forked from jax-ml/jax
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
georgedahl / flax
Forked from google/flax
Flax is a neural network library for JAX that is designed for flexibility.