This repository documents my journey of turning Stanford CS229 math notes into working machine learning projects using Python (Google Colab).
Instead of stopping at equations, each notebook:
- Starts with the math derivation
- Translates it into step-by-step code
- Builds visualizations to connect theory with practice
The goal: strengthen ML foundations by seeing how math directly powers real models.
**Linear Regression**
- Derived the Mean Squared Error (MSE) cost function
- Implemented the gradient descent update rule from scratch
- Plotted cost vs. iterations to show convergence
- Built a simple predictor for housing prices
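The steps above can be sketched as follows. This is a minimal NumPy version of batch gradient descent on the MSE cost; the synthetic data, learning rate, and iteration count are illustrative placeholders, not the notebook's actual values.

```python
import numpy as np

# Toy housing-style data: one feature (e.g. size), linear target plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(0, 2, size=(100, 1))
y = 4 + 3 * X[:, 0] + rng.normal(0, 0.1, size=100)

# Prepend a bias column so theta = [intercept, slope].
Xb = np.c_[np.ones(len(X)), X]

theta = np.zeros(2)
lr, n_iters = 0.1, 1000
costs = []
for _ in range(n_iters):
    error = Xb @ theta - y
    costs.append((error @ error) / (2 * len(y)))  # J(theta) = MSE / 2
    theta -= lr * (Xb.T @ error) / len(y)         # gradient descent update

# costs is what gets plotted as "cost vs. iterations";
# theta should approach the true parameters [4, 3].
```

Plotting `costs` against the iteration index reproduces the convergence curve described above.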
**Logistic Regression**
- Implemented the sigmoid function
- Derived and coded the cross-entropy loss and its gradient descent update
- Visualized the loss curve and decision boundary
- Compared results with scikit-learn's LogisticRegression
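A compact sketch of the logistic regression steps above, using only NumPy; the separable toy data and hyperparameters are assumptions for illustration.

```python
import numpy as np

def sigmoid(z):
    """Logistic function mapping scores to probabilities."""
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary data: labels determined by a known linear boundary.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

Xb = np.c_[np.ones(len(X)), X]  # bias column
theta = np.zeros(3)
lr = 0.5
for _ in range(2000):
    p = sigmoid(Xb @ theta)
    # Gradient of the average cross-entropy loss: X^T (p - y) / m
    theta -= lr * (Xb.T @ (p - y)) / len(y)

acc = np.mean((sigmoid(Xb @ theta) >= 0.5) == y)
```

The learned `theta` defines the decision boundary `theta[0] + theta[1]*x1 + theta[2]*x2 = 0`, which is what gets visualized, and the same fit can be cross-checked against scikit-learn's `LogisticRegression`.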
Each project follows the same structure:
- Start with the math → review the CS229 derivation
- Code implementation → translate it step by step
- Visualization → graphs/plots showing error minimization
- ML intuition → how this improves predictions
- ✅ Linear Regression
- ✅ Logistic Regression
- Regularization (L1 / L2)
- Softmax Regression
- Neural Networks (Backprop basics)
- Support Vector Machines
- PCA & Dimensionality Reduction
Ibrahim Arshad, exploring ML theory & practice