Explore NLP with hands-on notebooks for RNNs and Transformers, featuring clear experiments and practical tools for your language models.
Updated Dec 17, 2025 - Jupyter Notebook
Daily ML practice notebooks covering tabular data, deep learning, and weekend LLM fine-tuning experiments.
Prompts, notebooks, and tools for generative pre-trained transformers.
A repository containing LLM notebooks with tutorials and projects.
Complete AI/ML curriculum: From Python basics to production systems. 800+ notebooks covering transformers, embeddings, RAG, vector DBs, MLOps, NLP, computer vision & more.
A collection of Deep Learning project Jupyter Notebooks using the PyTorch, TensorFlow, and Transformers frameworks.
Comprehensive TensorFlow 2.15+ learning hub with 22+ hands-on notebooks covering computer vision, NLP, and generative AI. Features production-ready model optimization, multi-format deployment (TFLite, ONNX), distributed training, and complete MLOps pipelines. Includes pre-trained models, Docker support, and automated testing.
Practical course about Large Language Models.
An interactive Jupyter notebook leveraging IPython widgets for the UI and Diffusers to generate Stable Diffusion XL images without Gradio or the Stable Diffusion WebUI.
Parameter-efficient fine-tuning toolkit for LLMs using LoRA and QLoRA. Colab-ready notebooks with detailed theory, code explanations, and production deployment guides. Supports Llama-2, Mistral, and other open-source models.
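To illustrate the idea behind LoRA-style parameter-efficient fine-tuning, here is a minimal sketch of the update rule W' = W + (alpha/r) * B @ A in plain Python lists. The matrix sizes and values are made up for the example; real toolkits (e.g. PEFT) apply this per attention/projection layer on tensors.

```python
# Illustrative sketch of the LoRA merge step: fold the low-rank update
# (alpha/r) * B @ A into a frozen base weight W.
# Shapes: W is d_out x d_in, B is d_out x r, A is r x d_in.

def matmul(X, Y):
    """Naive matrix multiply for lists of lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_merge(W, A, B, alpha, r):
    """Return W + (alpha / r) * B @ A as a new matrix."""
    scale = alpha / r
    delta = matmul(B, A)
    return [[w + scale * d for w, d in zip(wr, dr)]
            for wr, dr in zip(W, delta)]

# Rank-1 toy example: d_out = d_in = 2, r = 1.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight
B = [[1.0], [2.0]]             # low-rank factor (d_out x r)
A = [[0.5, 0.5]]               # low-rank factor (r x d_in)
print(lora_merge(W, A, B, alpha=2, r=1))  # -> [[2.0, 1.0], [2.0, 3.0]]
```

Only A and B (2·d·r values) are trained, which is why the method is parameter-efficient; the merge above is what happens when adapters are baked back into the base model for deployment.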
Comprehensive university-level study guide for LLMs, transformers, RLHF, and generative AI. 335+ pages (actively expanding), 47 visualizations, 4 notebooks. Regular updates with enhanced sections, new implementations, and expanded coverage.
Materials and notebooks for the course "Tópicos Avanzados en Analítica Computacional" (Advanced Topics in Computational Analytics). Covers Deep Learning, NLP, Recommender Systems, MLOps, and Geometric Deep Learning.
This repository guides you through the process of building a GPT-style Large Language Model (LLM) from scratch using PyTorch. The structure and approach are inspired by the book Build a Large Language Model (From Scratch) by Sebastian Raschka.
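One building block any GPT-from-scratch walkthrough covers is the causal (look-ahead) mask, which ensures position i can only attend to positions 0..i. A minimal pure-Python sketch (real implementations build this as a tensor, e.g. with `torch.tril`):

```python
# Causal mask for decoder-style self-attention:
# mask[i][j] is True when query position i is allowed to attend
# to key position j, i.e. when j <= i.

def causal_mask(n):
    """Return an n x n lower-triangular boolean mask."""
    return [[j <= i for j in range(n)] for i in range(n)]

for row in causal_mask(4):
    print(row)
```

During training, positions where the mask is False have their attention scores set to -inf before the softmax, so the model cannot peek at future tokens.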
Natural language processing notebooks.
Deep learning notebooks and model implementations, exploring CNNs, RNNs, LSTMs, Transformers, and more during my AI learning journey.
A walkthrough that builds a Transformer from first principles inside Jupyter notebooks, focusing on clarity, correctness, and intuition.
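The core operation such a first-principles walkthrough derives is scaled dot-product attention, Attention(Q, K, V) = softmax(Q Kᵀ / √d_k) V. A framework-free sketch with plain lists, so every step stays visible (not an optimized implementation):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention over lists of row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Two identical keys -> uniform weights -> output is the mean of V.
print(attention(Q=[[1.0, 0.0]], K=[[1.0, 0.0], [1.0, 0.0]],
                V=[[0.0], [2.0]]))  # -> [[1.0]]
```

The √d_k scaling keeps dot products from growing with dimension, which would otherwise push the softmax into near-one-hot saturation and shrink gradients.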
Corpus building and NLP analysis for Persian Telegram channel messages. Includes a notebook to parse and clean channel_messages.json, stopword normalization, word cloud with a silhouette mask, and CSV outputs (filtered_messages.csv, final_results.csv). Reproducible pipeline for EDA and basic modeling.
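The cleaning step in a pipeline like this boils down to tokenizing, normalizing, and dropping stopwords. A toy sketch with a made-up English stopword set (the repository's actual Persian stopword list and file handling are not reproduced here):

```python
# Hedged sketch of stopword normalization: lowercase, split on whitespace,
# strip surrounding punctuation, and drop tokens in the stopword set.

def clean_tokens(text, stopwords):
    """Return the filtered token list for one message."""
    tokens = (t.strip(".,!?\"'()[]").lower() for t in text.split())
    return [t for t in tokens if t and t not in stopwords]

stop = {"the", "a", "of", "and"}
print(clean_tokens("The cloud of words, and a mask!", stop))
# -> ['cloud', 'words', 'mask']
```

The surviving tokens are what feed frequency counts and the word cloud; for Persian, normalization would additionally unify character variants (e.g. Arabic vs. Persian letter forms) before stopword matching.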
Implementations and Quantization Notebooks of models for Edge AI!
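As a sketch of what such quantization notebooks do, here is symmetric int8 post-training quantization of a weight vector: map floats onto integers in [-127, 127] via a single scale, then dequantize to inspect the rounding error. Values and vector length are illustrative; real toolchains quantize per-tensor or per-channel.

```python
# Symmetric int8 quantization: scale = max|w| / 127, q = round(w / scale).

def quantize(weights):
    """Return (integer codes, scale) for a list of float weights."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from integer codes."""
    return [qi * scale for qi in q]

w = [0.5, -1.0, 0.25]
q, s = quantize(w)
print(q)                  # integer codes in [-127, 127]
print(dequantize(q, s))   # reconstruction, off by at most ~scale/2 per weight
```

Shrinking each weight from 32 bits to 8 cuts model size to a quarter, which is the main lever for fitting models on edge devices; the per-weight error is bounded by half the scale.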
Explore advanced neural networks for crafting captivating headlines! Compare LSTM and Transformer models through interactive notebooks and easy-to-use wrapper classes. Ideal for content creators and data enthusiasts aiming to automate and enhance headline generation.