On the Variance of the Adaptive Learning Rate and Beyond
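As a rough illustration of what this paper (RAdam) proposes, here is a minimal NumPy sketch of the rectified Adam update: the adaptive denominator is only used once the estimated variance of the adaptive learning rate is tractable, otherwise the step falls back to plain momentum. The function name `radam_step`, the hyperparameter defaults, and the added `eps` term are illustrative assumptions, not code from the repository.

```python
import numpy as np

def radam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam-style update: Adam moments plus variance rectification (sketch)."""
    m = beta1 * m + (1 - beta1) * grad            # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2       # second moment estimate
    m_hat = m / (1 - beta1 ** t)                  # bias-corrected first moment
    rho_inf = 2.0 / (1.0 - beta2) - 1.0           # max length of the approximated SMA
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)
    if rho_t > 4.0:                               # variance of the adaptive lr is tractable
        v_hat = v / (1 - beta2 ** t)
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        theta = theta - lr * r_t * m_hat / (np.sqrt(v_hat) + eps)
    else:                                         # early steps: un-adapted momentum update
        theta = theta - lr * m_hat
    return theta, m, v
```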
Educational deep learning library in plain Numpy.
This repository contains the results for the paper: "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers"
CS F425 Deep Learning course at BITS Pilani (Goa Campus)
ADAS (short for Adaptive Step Size) is an optimizer that, unlike optimizers that merely normalize the gradient, tunes the step size itself, aiming to make step-size scheduling obsolete and achieve state-of-the-art training performance.
Classifying the Google Street View House Numbers (SVHN) dataset with a CNN
Lion and Adam optimization comparison
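For context on such comparisons, below is a minimal NumPy sketch of a single Lion update (sign of an interpolated momentum plus decoupled weight decay), which contrasts with Adam's per-coordinate scaling by the second moment. The function name `lion_step` and the defaults are illustrative assumptions.

```python
import numpy as np

def lion_step(theta, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion-style update (sketch): uniform-magnitude sign step, single momentum buffer."""
    update = np.sign(beta1 * m + (1 - beta1) * grad)   # interpolate, then take the sign
    theta = theta - lr * (update + wd * theta)         # decoupled weight decay
    m = beta2 * m + (1 - beta2) * grad                 # momentum tracked with beta2
    return theta, m
```

Unlike Adam, Lion keeps only one state buffer per parameter and applies a fixed-magnitude update, which is typically paired with a smaller learning rate.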
PyTorch/Tensorflow solutions for Stanford's CS231n: "CNNs for Visual Recognition"
Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
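As a sketch of the idea behind Padam, the second-moment denominator is raised to a partial power p in (0, 1/2] instead of the usual 1/2, interpolating between adaptive methods and SGD with momentum. The function name `padam_step`, the defaults, and the `eps` term are illustrative; details such as bias correction vary by implementation.

```python
import numpy as np

def padam_step(theta, grad, m, v, v_max, lr=0.1, beta1=0.9, beta2=0.999,
               p=0.125, eps=1e-8):
    """One partially adaptive (Padam-style) update: denominator is v_max ** p, p in (0, 1/2]."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    v_max = np.maximum(v_max, v)                    # AMSGrad-style running max
    theta = theta - lr * m / (v_max ** p + eps)     # p = 1/2 ~ AMSGrad; p -> 0 ~ SGD with momentum
    return theta, m, v, v_max
```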
Toy implementations of some popular ML optimizers using Python/JAX
A collection of various gradient descent algorithms implemented in Python from scratch
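For the from-scratch implementations listed above, the two simplest variants look roughly like the following NumPy sketch (function names and defaults are illustrative, not taken from any of these repositories).

```python
import numpy as np

def gd(grad_fn, theta, lr=0.1, steps=100):
    """Plain gradient descent."""
    for _ in range(steps):
        theta = theta - lr * grad_fn(theta)
    return theta

def gd_momentum(grad_fn, theta, lr=0.1, beta=0.9, steps=100):
    """Gradient descent with classical (heavy-ball) momentum."""
    velocity = np.zeros_like(theta)
    for _ in range(steps):
        velocity = beta * velocity + grad_fn(theta)   # accumulate a running direction
        theta = theta - lr * velocity
    return theta

# usage: minimise f(x) = ||x||^2, whose gradient is 2x
print(gd(lambda x: 2 * x, np.array([3.0, -2.0])))
print(gd_momentum(lambda x: 2 * x, np.array([3.0, -2.0])))
```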
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library is constantly updated.
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
Modified XGBoost implementation from scratch with Numpy, using the Adam and RMSProp optimizers.
The project aimed to implement a deep NN / RNN based solution to develop flexible methods that can adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for tensorflow
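The Lookahead idea ("k steps forward, 1 step back") is framework-agnostic; below is a minimal NumPy sketch wrapping plain SGD as the inner optimizer, independent of the TensorFlow repository above. The function name `lookahead_sgd` and the defaults are illustrative assumptions.

```python
import numpy as np

def lookahead_sgd(grad_fn, theta0, lr=0.1, k=5, alpha=0.5, outer_steps=200):
    """Lookahead wrapped around plain SGD: k fast steps, then one slow interpolation."""
    slow = theta0.copy()                      # slow ("1 step back") weights
    for _ in range(outer_steps):
        fast = slow.copy()                    # fast weights start from the slow weights
        for _ in range(k):                    # k inner SGD steps forward
            fast -= lr * grad_fn(fast)
        slow += alpha * (fast - slow)         # interpolate slow weights toward fast weights
    return slow

# usage: minimise f(x) = ||x||^2, whose gradient is 2x
print(lookahead_sgd(lambda x: 2 * x, np.array([3.0, -2.0])))  # approaches [0, 0]
```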
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
Implementation of Adam Optimization algorithm using Numpy
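For reference, a single Adam step in NumPy is short enough to sketch here; the function name `adam_step` and the small quadratic usage example are illustrative, with the standard defaults (beta1=0.9, beta2=0.999, eps=1e-8).

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moment estimates plus bias correction."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# usage: minimise f(x) = ||x||^2, whose gradient is 2x
theta = np.array([3.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 1001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.05)
print(theta)  # approaches [0, 0]
```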
From linear regression towards neural networks...