Stars
An open-source AI agent that brings the power of Gemini directly into your terminal.
An open-source library for GPU-accelerated robot learning and sim-to-real transfer.
Nonlinear optimisation (root-finding, least squares, ...) in JAX+Equinox. https://docs.kidger.site/optimistix/
JAX-accelerated Meta-Reinforcement Learning Environments Inspired by XLand and MiniGrid 🏎️
A collection of guides and examples for the Gemma open models from Google.
Simple single-file baselines for Q-Learning in a pure-GPU setting.
Deep-Learning-with-Jax / day_07_exercise_brain_decode
Forked from Machine-Learning-Foundations/day_13_exercise_brain_decode_jax. Exercise on convolutional neural networks and how to use them to decode brain waves.
Deep-Learning-with-Jax / day_06_exercise_cnn
Forked from Machine-Learning-Foundations/day_12_exercise_cnn_jax. Exercise on the convolution operation and convolutional neural networks.
Deep-Learning-with-Jax / day_04_exercise_statistics
Forked from Machine-Learning-Foundations/exercise_03_statistics_prob. Exercise on statistics and distributions: mean and variance, correlation, Gaussians.
Deep-Learning-with-Jax / day_05_exercise_neural_networks
Forked from Machine-Learning-Foundations/exercise_10_neural_networks. Exercise on the MNIST data set, artificial neurons, and the forward and backward pass.
Deep-Learning-with-Jax / day_08_exercise_interpretability
Forked from Machine-Learning-Foundations/day_14_exercise_interpretability_jax. Exercise on interpretability with integrated gradients.
Deep-Learning-with-Jax / day_09_exercise_sequence_processing
Forked from Machine-Learning-Foundations/day_15_exercise_sequence_processing_jax. Exercise on generative language modelling in JAX.
Deep-Learning-with-Jax / day_03_exercise_algebra
Forked from Machine-Learning-Foundations/exercise_02_algebra. Exercise on the basics of algebra, curve fitting, and singular value decomposition.
Deep-Learning-with-Jax / day_02_exercise_optimization
Forked from Machine-Learning-Foundations/exercise_01_optimization. Exercise on gradient descent by hand and via autograd in JAX (see the sketch after this list).
Deep-Learning-with-Jax / day_01_exercise_intro
Forked from Machine-Learning-Foundations/exercise_01_intro. Introductory exercise on the Python development framework.
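
The day-02 exercise above covers gradient descent via autograd in JAX. A minimal sketch of that pattern, purely for illustration (the quadratic objective, learning rate, and step count are assumptions, not taken from the exercise repository):

import jax
import jax.numpy as jnp

def loss(w):
    # Hypothetical quadratic objective, used only to illustrate the pattern.
    return jnp.sum((w - 3.0) ** 2)

grad_loss = jax.grad(loss)  # autograd: a function returning d(loss)/dw

w = jnp.zeros(2)            # initial parameters
learning_rate = 0.1
for _ in range(100):
    w = w - learning_rate * grad_loss(w)  # plain gradient-descent update

print(w)  # approaches [3. 3.], the minimiser of the objective
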
Unsupervised text tokenizer for Neural Network-based text generation.
Gemma open-weight LLM library, from Google DeepMind
The official PyTorch implementation of Google's Gemma models
Lightweight, standalone C++ inference engine for Google's Gemma models.
Linear solvers in JAX and Equinox. https://docs.kidger.site/lineax
Adversarial Attacks on GPT-4 via Simple Random Search [Dec 2023]
Implementation of Forward Laplacian algorithm in JAX
S2FFT: Differentiable and accelerated spherical transforms
RuLES: a benchmark for evaluating rule-following in language models
Security scanner detecting Python Pickle files performing suspicious actions