Repository for the Machine Learning Based Systems Design course, offered as an elective in the Computer Engineering undergraduate program at UFRN.
| Title & Authors | Date | Link |
|---|---|---|
| Muhammad Asad and Iqbal Khan. *NLP with Hugging Face Transformers: Practical Applications using Language Models* | May 2025 | Link |
| Chip Huyen. *AI Engineering: Building Applications with Foundation Models* | Jan 2025 | Link |
| Paul Iusztin and Maxime Labonne. *LLM Engineer's Handbook* | Oct 2024 | Link |
| Jay Alammar and Maarten Grootendorst. *Hands-On Large Language Models: Language Understanding and Generation* | Sep 2024 | Link |
| Chip Huyen. *Designing Machine Learning Systems: An Iterative Process for Production-Ready Applications* | May 2022 | Link |
| Daniel Voigt Godoy. *Deep Learning with PyTorch Step-by-Step: A Beginner's Guide* | Feb 2022 | Link |
| Resource | Description |
|---|---|
| Hugging Face Docs | Model hub and documentation |
| MLflow | Experiment tracking |
| Weights & Biases | Machine learning monitoring |
Week 01
Course Outline
- GitHub Education Pro: Get access to the GitHub Education Pro pack by visiting GitHub Education
- Learning Resources:
  - GitHub Learning Game: Check out the interactive Git learning game at GitHub Learning Game
  - Michael A. Lones. *How to avoid machine learning pitfalls: a guide for academic researchers*. arXiv
Visualizing Gradient Descent
- Understanding and visualizing the five core steps of the Gradient Descent algorithm (sketched in code below):
  - initializing parameters randomly
  - performing the forward pass to compute predictions
  - calculating the loss
  - computing gradients with respect to each parameter
  - updating the parameters using the gradients and a predefined learning rate
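A minimal sketch of those five steps for a simple linear regression, in plain Python/NumPy; the synthetic data, epoch count, and learning rate are illustrative assumptions, not course-mandated values.

```python
import numpy as np

# Illustrative synthetic data: y = 2x + 1 plus noise
rng = np.random.default_rng(42)
x = rng.uniform(0, 1, size=100)
y_true = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(100)

# Step 1: initialize parameters randomly
w, b = rng.standard_normal(2)
lr = 0.1  # predefined learning rate

for epoch in range(1000):
    # Step 2: forward pass to compute predictions
    y_pred = w * x + b
    # Step 3: calculate the loss (mean squared error)
    error = y_pred - y_true
    loss = (error ** 2).mean()
    # Step 4: gradients of the loss with respect to each parameter
    grad_w = 2 * (error * x).mean()
    grad_b = 2 * error.mean()
    # Step 5: update the parameters using the gradients and the learning rate
    w -= lr * grad_w
    b -= lr * grad_b

print(f"w={w:.2f}, b={b:.2f}, loss={loss:.4f}")  # should approach w=2, b=1
```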
Week 02
Rethinking the Training Loop (Part I)
From data generation to making predictions:
- Implement a clear `train()` function with a custom `Dataset` and `DataLoader` (see the sketch after this list).
- Apply mini-batch gradient descent and track performance.
- Add persistence: save checkpoints and enable training resumption/deployment.
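A minimal sketch of such a loop in PyTorch, assuming a toy linear-regression dataset; the model, hyperparameters, and checkpoint file name are illustrative choices.

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Illustrative synthetic dataset wrapped in a Dataset + DataLoader
x = torch.rand(100, 1)
y = 2 * x + 1 + 0.1 * torch.randn(100, 1)
loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

def train(n_epochs):
    for epoch in range(n_epochs):
        # Mini-batch gradient descent over the DataLoader
        for x_batch, y_batch in loader:
            loss = loss_fn(model(x_batch), y_batch)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    # Persistence: save a checkpoint so training can resume or be deployed
    torch.save({"epoch": n_epochs,
                "model_state_dict": model.state_dict(),
                "optimizer_state_dict": optimizer.state_dict()},
               "checkpoint.pth")

train(20)
```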
Going Classy
- Build a dedicated training class with a well-structured constructor.
- Use proper method scoping (public/protected/private).
- Consolidate earlier code into the class.
- Run the full pipeline through the class interface (a skeleton follows below).
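One possible skeleton for such a class; the name `Trainer` and the exact method split are assumptions for illustration, not the course's definitive structure.

```python
class Trainer:
    """Consolidates model, loss function, and optimizer behind one interface."""

    def __init__(self, model, loss_fn, optimizer):
        self.model = model
        self.loss_fn = loss_fn
        self.optimizer = optimizer
        self.losses = []

    def _train_step(self, x, y):
        # Protected helper (leading underscore): one mini-batch update
        self.model.train()
        loss = self.loss_fn(self.model(x), y)
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
        return loss.item()

    def fit(self, loader, n_epochs):
        # Public entry point: runs the full training pipeline
        for epoch in range(n_epochs):
            batch_losses = [self._train_step(x, y) for x, y in loader]
            self.losses.append(sum(batch_losses) / len(batch_losses))

# Usage (assuming model/loss_fn/optimizer/loader defined as in the Week 02 sketch):
# trainer = Trainer(model, loss_fn, optimizer)
# trainer.fit(loader, n_epochs=20)
```

Exposing only `fit()` while keeping `_train_step()` protected mirrors the public/protected method scoping mentioned above.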
Week 03
Week 04
Rethinking the Training Loop (Part II)
A simple classification problem:
- build a model for binary classification
- understand the concept of logits and how they relate to probabilities
- use binary cross-entropy loss to train a model
- weight the loss function to handle imbalanced datasets (sketched in code below)
- understand the concepts of decision boundary and separability
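A short sketch of how logits, probabilities, and a class-weighted binary cross-entropy relate; the tensor values and the 3:1 imbalance ratio are made up for illustration.

```python
import torch

# A logit is the raw model output; sigmoid maps it to a probability
logits = torch.tensor([0.5, -1.2, 2.0])
probs = torch.sigmoid(logits)  # approx. 0.62, 0.23, 0.88

# Binary cross-entropy computed on logits (numerically stabler than on probs)
labels = torch.tensor([1.0, 0.0, 1.0])
loss_fn = torch.nn.BCEWithLogitsLoss()
print(loss_fn(logits, labels))

# For imbalanced data, weight the positive class: with an assumed 3:1
# negative-to-positive ratio, pos_weight=3 rebalances the loss
weighted = torch.nn.BCEWithLogitsLoss(pos_weight=torch.tensor(3.0))
print(weighted(logits, labels))
```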
Challenge (bonus: 2.5 points)
Week 05
Machine Learning and Computer Vision - Part I
From a shallow to a deep-ish classification model:
- data generation for image classification
- transformations using torchvision
- dataset preparation techniques
- building and training logistic regression and deep neural network models using PyTorch
- comparing activation functions such as Sigmoid, Tanh, and ReLU (see the sketch after this list)
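One way to sketch the shallow-to-deep progression in PyTorch; the 28x28 input size, layer widths, and normalization statistics are assumptions.

```python
import torch.nn as nn
from torchvision import transforms

# Typical torchvision preprocessing pipeline (statistics are illustrative)
composer = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.5,), std=(0.5,)),
])

# Shallow: logistic regression over flattened pixels (assumed 28x28 input)
logistic = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 1))

# Deep-ish: hidden layers; swap the activation to compare their behaviors
def make_deep(activation=nn.ReLU):
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 64), activation(),
        nn.Linear(64, 32), activation(),
        nn.Linear(32, 1),
    )

model_relu = make_deep(nn.ReLU)
model_tanh = make_deep(nn.Tanh)
model_sigmoid = make_deep(nn.Sigmoid)
```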
Week 06
Machine Learning and Computer Vision - Part II
Kernels and Convolutions
- This lesson introduces convolutions and related concepts, building a convolutional neural network to tackle a multiclass classification problem (a sketch follows below).
- Activation functions, pooling layers, flattening, LeNet-5
- Softmax and cross-entropy
- Visualizing the convolutional filters, feature maps, and classifier layers
- Hooks in PyTorch
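A sketch of a LeNet-5-style network plus a forward hook for capturing feature maps; the channel counts and 28x28 grayscale input are assumptions.

```python
import torch
import torch.nn as nn

# LeNet-5-style CNN for multiclass classification (sizes are assumptions)
model = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),  # conv + pooling
    nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),                                                # flattening
    nn.Linear(16 * 4 * 4, 10),                                   # logits for 10 classes
)
loss_fn = nn.CrossEntropyLoss()  # applies log-softmax internally

# Forward hook: capture the feature maps produced by the first conv layer
features = {}
def hook(module, inputs, output):
    features["conv1"] = output.detach()

handle = model[0].register_forward_hook(hook)
logits = model(torch.randn(1, 1, 28, 28))  # dummy 28x28 grayscale image
print(features["conv1"].shape)             # torch.Size([1, 6, 24, 24])
handle.remove()                            # detach the hook when done
```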
Week 07
Machine Learning and Computer Vision - Part III
Rock, Paper and Scissors:
- Standardize an image dataset
- Train a model to predict rock, paper, scissors poses from hand images
- Use dropout layers to regularize the model (sketched below)
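A sketch of image standardization plus a dropout-regularized classifier; the image size, normalization statistics, and dropout probability are assumptions (in practice the mean/std would be computed from the training split).

```python
import torch.nn as nn
from torchvision import transforms

# Standardize images with per-channel statistics (illustrative values)
standardizer = transforms.Compose([
    transforms.Resize((32, 32)),
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.5, 0.5, 0.5), std=(0.25, 0.25, 0.25)),
])

# Three-class classifier (rock/paper/scissors) with dropout regularization
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Dropout(p=0.3),           # randomly zeroes activations at train time
    nn.Linear(16 * 15 * 15, 3),  # logits for the three poses
)
```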
Week 08