My Java implementation of scalable online stochastic gradient descent for regularized logistic regression
Updated Nov 22, 2016 - Java
Stagewise training accelerates convergence of test error compared with plain SGD.
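The core update this entry describes — per-sample SGD on L2-regularized logistic regression — can be sketched as follows. This is an illustrative NumPy sketch, not the repo's actual code; all names and hyperparameters are hypothetical.

```python
import numpy as np

def sgd_logistic(X, y, lam=0.01, lr=0.1, epochs=20, seed=0):
    """One-sample-at-a-time SGD for L2-regularized logistic regression.

    X: (n, d) features; y: (n,) labels in {0, 1}.
    Minimizes log-loss + (lam/2) * ||w||^2. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):            # shuffle each epoch
            p = 1.0 / (1.0 + np.exp(-X[i] @ w))  # sigmoid prediction
            grad = (p - y[i]) * X[i] + lam * w   # per-sample gradient
            w -= lr * grad
    return w
```

The L2 penalty appears directly in each per-sample gradient, which is the simplest (if not the most efficient) way to handle regularization in online SGD.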
This repo is for Assignment 1 of the ML class @ IIT Jodhpur, postgraduate semester 1.
Reimplements the "network.py" code from Chapter 1 of the NNDL book, improving its functionality and performance.
R package implementing Gradient Descent and its variants for regression tasks
Set-Get-Do commands allow you to configure the settings on printers. This app demos how to use SGD commands to get and set the configuration variables of the printer using the Zebra Link-OS SDK.
My neural network, made from scratch and with love. It learns to play Flappy Bird flawlessly within 3 to 9 generations!
A Python implementation of a Support Vector Machine from scratch
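A from-scratch linear SVM is commonly trained with SGD on the hinge loss, e.g. Pegasos-style updates. The sketch below is a generic illustration under that assumption, not this repo's code; all names are hypothetical.

```python
import numpy as np

def pegasos_svm(X, y, lam=0.01, epochs=50, seed=0):
    """Pegasos-style SGD for a linear SVM (hinge loss + L2 regularization).

    X: (n, d) features; y: (n,) labels in {-1, +1}. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)              # Pegasos step-size schedule
            if y[i] * (X[i] @ w) < 1:          # margin violated: hinge-loss subgradient
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                              # margin satisfied: only regularizer acts
                w = (1 - eta * lam) * w
    return w
```

The decreasing 1/(lam*t) step size is what distinguishes Pegasos from constant-rate SGD and gives its convergence guarantee for strongly convex objectives.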
A custom example of how the stochastic gradient descent algorithm works, applied to a linear regression model.
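The setup this entry names — SGD applied to a linear regression model — reduces to a very small loop. A minimal NumPy sketch (illustrative, not the repo's code):

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=100, seed=0):
    """Plain SGD for least-squares linear regression with a bias term.

    X: (n, d) features; y: (n,) targets. Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            err = X[i] @ w + b - y[i]   # residual on one sample
            w -= lr * err * X[i]        # gradient of 0.5 * err^2 w.r.t. w
            b -= lr * err               # gradient w.r.t. the bias
    return w, b
```

Each step uses the gradient of the squared error on a single example, which is what makes the method "stochastic" relative to full-batch gradient descent.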
Statistical and Mathematical Methods (UniBo) - PCA, SVD, and gradient descent algorithms
Comparative study of User-Based Collaborative Filtering vs Matrix Factorization for recommendation systems. Performance analysis on MovieLens 100k dataset evaluating RMSE, training time, and prediction latency.
Implementation of the algorithms described in the papers "ZO-AdaMM: Zeroth Order Adaptive Momentum" by Chen et al., "Stochastic first- and zeroth-order methods" by Ghadimi et al., and "SignSGD via zeroth-order oracle" by Liu et al.
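The zeroth-order methods in these papers share one core primitive: estimating a gradient from function values alone, via random finite differences. A generic two-point Gaussian-smoothing estimator can be sketched as follows (illustrative, not this repo's implementation):

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, n_samples=2000, seed=0):
    """Two-point zeroth-order gradient estimate with Gaussian directions.

    Uses only function evaluations:
    E[ ((f(x + mu*u) - f(x)) / mu) * u ] ~ grad f(x) for small mu.
    Illustrative sketch; names and defaults are hypothetical.
    """
    rng = np.random.default_rng(seed)
    g = np.zeros_like(x)
    fx = f(x)                                  # baseline evaluation, reused
    for _ in range(n_samples):
        u = rng.normal(size=x.shape)           # random search direction
        g += (f(x + mu * u) - fx) / mu * u     # directional finite difference
    return g / n_samples
```

Any first-order scheme (SGD, signSGD, Adam-style momentum) can then be run on these estimates, which is essentially what the cited algorithms do with different variance-reduction and momentum choices.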
NLP and Sentiment Analysis
Fake news detection on the LIAR-PLUS dataset using traditional machine learning and deep learning techniques. The deep learning models include a plain LSTM network and an LSTM with contextual attention over the justification text.
Movie rating prediction using collaborative filtering - XGBoost, SVD, SVD++, KNNBaseline on the Netflix Prize dataset (~100M ratings). Includes EDA, feature engineering, model stacking, and a visual results report.
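"SVD" in the Netflix-Prize sense is an SGD-trained matrix factorization rather than a true singular value decomposition. A minimal FunkSVD-style sketch of that idea (illustrative, not this repo's code; names are hypothetical):

```python
import numpy as np

def mf_sgd(ratings, n_users, n_items, k=8, lr=0.01, reg=0.05, epochs=100, seed=0):
    """FunkSVD-style matrix factorization trained with SGD.

    ratings: iterable of (user, item, rating) triples.
    Predicts r_ui ~ P[u] @ Q[i]; minimizes squared error + L2 penalty.
    Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.normal(size=(n_users, k))   # user latent factors
    Q = 0.1 * rng.normal(size=(n_items, k))   # item latent factors
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q
```

SVD++ and the baseline-aware variants extend this with per-user/per-item bias terms and implicit-feedback factors, but the SGD training loop has the same shape.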
Deep learning: analysis of the heavy-tailed distribution of stochastic gradient noise
Comparison of Gradient Based Optimization Methods
A lightweight, bare-metal Go implementation of Stochastic Gradient Descent (SGD) and the Adam optimizer. Built from the ground up with exactly one dependency. Includes a pure-Go MNIST example.
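The Adam update this entry implements in Go follows the standard Kingma & Ba recipe; a compact sketch of the same optimizer (shown in Python here for consistency with the other sketches, not the repo's Go code):

```python
import numpy as np

class Adam:
    """Minimal Adam optimizer for a flat parameter array. Illustrative sketch."""

    def __init__(self, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, beta1, beta2, eps
        self.m = self.v = None
        self.t = 0

    def step(self, params, grad):
        if self.m is None:
            self.m = np.zeros_like(params)
            self.v = np.zeros_like(params)
        self.t += 1
        self.m = self.b1 * self.m + (1 - self.b1) * grad        # 1st-moment EMA
        self.v = self.b2 * self.v + (1 - self.b2) * grad ** 2   # 2nd-moment EMA
        m_hat = self.m / (1 - self.b1 ** self.t)                # bias correction
        v_hat = self.v / (1 - self.b2 ** self.t)
        return params - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

The bias-corrected moment estimates are what distinguish Adam from plain momentum SGD with per-coordinate scaling, and they matter most in the first few steps when the EMAs are still near zero.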