LSTM and QRNN Language Model Toolkit for PyTorch
NFNets and Adaptive Gradient Clipping for SGD implemented in PyTorch. Find explanation at tourdeml.github.io/blog/
Ternary Gradients to Reduce Communication in Distributed Deep Learning (TensorFlow)
Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers
Lua implementation of Entropy-SGD
PyTorch implementation of Federated Learning algorithms FedSGD, FedAvg, FedAvgM, FedIR, FedVC, FedProx and standard SGD, applied to visual classification. Client distributions are synthesized with arbitrary non-identicalness and imbalance (Dirichlet priors). Client systems can be arbitrarily heterogeneous. Several mobile-friendly models are prov…
Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers https://arxiv.org/abs/1802.00124
Distributed Learning by Pair-Wise Averaging
Implementation of key neural network concepts using NumPy
Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient Descent
Amortized version of the differentially private SGD algorithm published in "Deep Learning with Differential Privacy" by Abadi et al. Enforces privacy by clipping and sanitising the gradients with Gaussian noise during training.
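The DP-SGD recipe described above (per-example clipping followed by Gaussian sanitisation) can be sketched in a few lines. This is a minimal NumPy illustration, not code from the listed repository; the function name `dp_sgd_step` and its parameters are hypothetical:

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD update in the style of Abadi et al.: clip each
    per-example gradient to clip_norm, sum, add Gaussian noise
    calibrated to the clip bound, then average."""
    rng = np.random.default_rng(0) if rng is None else rng
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds the clip bound;
        # gradients already within the bound pass through unchanged.
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    total = np.sum(clipped, axis=0)
    # Gaussian noise with standard deviation proportional to the
    # sensitivity (clip_norm) enforces the privacy guarantee.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)
```

With `noise_multiplier=0` the step reduces to plain averaging of clipped gradients, which makes the clipping behaviour easy to verify in isolation.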
Code for paper "On Sampling Strategies for Neural Network-based Collaborative Filtering"
Vector quantization for stochastic gradient descent
EnsLoss: Stochastic Calibrated Loss Ensembles for Preventing Overfitting in Classification
A sentiment classifier on mixed language (and mixed script) reviews in Tamil, Malayalam and English