📘 ML Fundamentals – Math to Code

This repository documents my journey of turning Stanford CS229 math notes into working machine learning projects using Python (Google Colab).

Instead of stopping at equations, each notebook:

  • Starts with the math derivation
  • Translates it into step-by-step code
  • Builds visualizations to connect theory with practice

The goal: strengthen ML foundations by seeing how math directly powers real models.

🚀 Completed Projects

Linear Regression with Gradient Descent (Housing Prices)

  1. Derived the Mean Squared Error (MSE) cost function
  2. Implemented the gradient descent update rule from scratch
  3. Plotted Cost vs Iterations to show convergence
  4. Built a simple predictor for housing prices (see the sketch below)

Open In Colab
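
A minimal sketch of the steps above, using synthetic single-feature data; the actual notebook uses a housing-prices dataset, and the variable names and hyperparameters here are illustrative only:

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic single-feature data standing in for the housing dataset
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))              # e.g. scaled house size
y = 3.0 * X[:, 0] + 5.0 + rng.normal(0, 1, 100)    # price with noise

# Prepend a bias column so theta = [intercept, slope]
Xb = np.c_[np.ones(len(X)), X]
theta = np.zeros(Xb.shape[1])

def mse_cost(Xb, y, theta):
    # J(theta) = (1 / 2m) * sum((X theta - y)^2)
    residuals = Xb @ theta - y
    return (residuals @ residuals) / (2 * len(y))

# Gradient descent update: theta := theta - alpha * (1/m) * X^T (X theta - y)
alpha, n_iters = 0.01, 1000
history = []
for _ in range(n_iters):
    grad = Xb.T @ (Xb @ theta - y) / len(y)
    theta -= alpha * grad
    history.append(mse_cost(Xb, y, theta))

# Cost vs. iterations plot to check convergence
plt.plot(history)
plt.xlabel("Iteration")
plt.ylabel("MSE cost")
plt.show()

# Simple predictor built from the learned parameters
def predict(x_new):
    return np.c_[np.ones(len(x_new)), x_new] @ theta

print("theta:", theta)
print("prediction for x = 4.0:", predict(np.array([[4.0]])))
```

The vectorized update in the loop is the batch gradient-descent rule from the CS229 notes, applied to all training examples at once.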

Logistic Regression – Binary Classification Example

  1. Implemented sigmoid function
  2. Derived and coded cross-entropy loss + gradient descent
  3. Visualized loss curve and decision boundary
  4. Compared with scikit-learn's LogisticRegression (see the sketch below)

Open In Colab
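
A minimal sketch of the same steps on toy two-class data; the notebook's dataset, plots, and exact training settings may differ:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy two-class data: two Gaussian blobs in 2D
rng = np.random.default_rng(1)
X = np.r_[rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))]
y = np.r_[np.zeros(50), np.ones(50)]
Xb = np.c_[np.ones(len(X)), X]                     # prepend bias column

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(Xb, y, theta):
    p = sigmoid(Xb @ theta)
    eps = 1e-12                                    # avoid log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Gradient descent on cross-entropy: grad = (1/m) * X^T (sigmoid(X theta) - y)
theta = np.zeros(Xb.shape[1])
alpha, n_iters = 0.1, 2000
losses = []
for _ in range(n_iters):
    grad = Xb.T @ (sigmoid(Xb @ theta) - y) / len(y)
    theta -= alpha * grad
    losses.append(cross_entropy(Xb, y, theta))

# Compare the from-scratch model with scikit-learn's LogisticRegression
clf = LogisticRegression().fit(X, y)
acc_scratch = np.mean((sigmoid(Xb @ theta) >= 0.5) == y)
print("from-scratch accuracy:", acc_scratch)
print("scikit-learn accuracy:", clf.score(X, y))
```

On linearly separable data like this, both models should reach near-perfect accuracy, which makes the comparison with scikit-learn a quick sanity check on the hand-rolled gradient.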

🧠 Learning Method

Each project follows the same structure:

  1. Start with Math – review CS229 derivation
  2. Code Implementation – translate step by step
  3. Visualization – graphs/plots to show error minimization
  4. ML Intuition – how this improves predictions

📖 Topics (✅ = completed; the rest are planned)

  • ✅ Linear Regression
  • ✅ Logistic Regression
  • Regularization (L1 / L2)
  • Softmax Regression
  • Neural Networks (Backprop basics)
  • Support Vector Machines
  • PCA & Dimensionality Reduction

✍️ Author

Ibrahim Arshad – exploring ML theory & practice 🚀

🔗 Connect with Me

Medium LinkedIn GitHub

