Awesome Knowledge Distillation
Images to inference with no labeling (use foundation models to train supervised models).
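A minimal sketch of this auto-labeling loop, assuming a hypothetical `teacher_label` stand-in for a zero-shot foundation model and placeholder image batches; a real pipeline would swap in an actual pretrained teacher and unlabeled dataset:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a zero-shot foundation model (e.g. a CLIP-style
# labeler); it returns random labels purely so the sketch runs end to end.
def teacher_label(images: torch.Tensor, num_classes: int) -> torch.Tensor:
    with torch.no_grad():
        return torch.randint(0, num_classes, (images.shape[0],))

# Small supervised student trained only on the teacher's pseudo-labels.
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 128),
                        nn.ReLU(), nn.Linear(128, 10))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for _ in range(100):
    images = torch.randn(32, 3, 32, 32)   # placeholder unlabeled batch
    pseudo = teacher_label(images, 10)    # foundation model supplies labels
    loss = nn.functional.cross_entropy(student(images), pseudo)
    opt.zero_grad(); loss.backward(); opt.step()
```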
🚀 PyTorch Implementation of "Progressive Distillation for Fast Sampling of Diffusion Models" (v-diffusion)
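For context, a heavily simplified sketch of one progressive-distillation round: the student is trained so that a single one of its sampler steps matches two consecutive teacher steps. The `ddim_step` update below is a toy placeholder, not the paper's actual v-parameterized DDIM rule:

```python
import torch
import torch.nn.functional as F

def ddim_step(model, x, t, t_next):
    """Toy placeholder for a DDIM update from time t to t_next."""
    x0_pred = model(x, t)                    # model predicts the clean sample
    return x + (t - t_next) * (x0_pred - x)  # linear move toward the prediction

def distill_step(teacher, student, x, t, dt, opt):
    with torch.no_grad():                    # two teacher steps form the target
        mid = ddim_step(teacher, x, t, t - dt)
        target = ddim_step(teacher, mid, t - dt, t - 2 * dt)
    pred = ddim_step(student, x, t, t - 2 * dt)  # one student step matches both
    loss = F.mse_loss(pred, target)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```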
PyTorch Implementation of Matching Guided Distillation [ECCV'20]
Mechanistically interpretable neurosymbolic AI (Nature Comput Sci 2024): losslessly compressing neural networks to computer code and discovering new algorithms that generalize out-of-distribution and outperform human-designed ones.
Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)
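The core of MiniLMv2 is matching self-attention relations (query-query, key-key, and value-value dot-product distributions) between a teacher layer and the student. A sketch of the query-relation term, assuming `q_t` and `q_s` are already reshaped to a shared number of relation heads with shape `[batch, heads, seq, head_dim]`:

```python
import torch
import torch.nn.functional as F

# KL divergence between teacher and student query-query relation
# distributions; the same function applies to keys and values.
def relation_kl(q_t: torch.Tensor, q_s: torch.Tensor) -> torch.Tensor:
    d_t, d_s = q_t.shape[-1], q_s.shape[-1]
    rel_t = F.softmax(q_t @ q_t.transpose(-1, -2) / d_t ** 0.5, dim=-1)
    rel_s = F.log_softmax(q_s @ q_s.transpose(-1, -2) / d_s ** 0.5, dim=-1)
    return F.kl_div(rel_s, rel_t, reduction="batchmean")

# Total loss sums the three relation terms:
# loss = relation_kl(q_t, q_s) + relation_kl(k_t, k_s) + relation_kl(v_t, v_s)
```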
The Codebase for Causal Distillation for Language Models (NAACL '22)
Repository for the publication "AutoGraph: Predicting Lane Graphs from Traffic"
Few-step diffusion for audio-driven talking-head generation, making diffusion models speak faster without losing their composure.
AI community tutorial covering LoRA/QLoRA LLM fine-tuning, training GPT-2 from scratch, generative model architectures, content safety and control, model distillation techniques, DreamBooth, transfer learning, and more, with real projects for hands-on practice.
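As an illustration of the LoRA idea covered there: freeze the pretrained weight and learn a low-rank additive update. This is a generic sketch, not the tutorial's code; the names and defaults (`r`, `alpha`) are illustrative:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update B @ A."""
    def __init__(self, in_f: int, out_f: int, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = nn.Linear(in_f, out_f)
        for p in self.base.parameters():      # pretrained weights stay frozen
            p.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, in_f) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_f, r))  # zero init: no-op at start
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale
```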
The Codebase for Causal Distillation for Task-Specific Models
LUPI-OD - A novel methodology to improve object detection accuracy without increasing model size or complexity. | 2025 European Workshop on Visual Information Processing (EUVIP) | M.Sc. ICT (By Research) Dissertation | University of Malta
Provides academic paper search and curation across platforms, using the Model Context Protocol for seamless multi-client integration.
Awesome Deep Model Compression
Use AWS Rekognition to train custom models that you own.
GBM-to-GLM distillation for insurance pricing - surrogate factor tables for Radar/Emblem rating engines
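A sketch of that distillation on synthetic data, assuming scikit-learn: a GBM teacher is fit on categorical rating factors, then a log-link Poisson GLM is fit to the teacher's predictions over one-hot-encoded factor levels, so its exponentiated coefficients read off as multiplicative factor-table relativities (Radar/Emblem-style):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import PoissonRegressor
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(10_000, 3))      # 3 categorical rating factors
y = rng.poisson(np.exp(0.1 * X.sum(axis=1)))  # synthetic claim counts

teacher = GradientBoostingRegressor().fit(X, y)
y_teacher = np.clip(teacher.predict(X), 1e-6, None)  # surrogate target

enc = OneHotEncoder(sparse_output=False)
X_levels = enc.fit_transform(X)               # one column per factor level
glm = PoissonRegressor(alpha=1e-4).fit(X_levels, y_teacher)

factor_table = np.exp(glm.coef_)  # one multiplicative relativity per level
```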
Use LLaMA to label data for use in training a fine-tuned LLM.
Autodistill Google Cloud Vision module for use in training a custom, fine-tuned model.
A repo for distilling a large teacher into a small vision-language model for efficient embodied spatial reasoning and action planning.
Train cheap models on expensive ones. Automatically. With receipts.
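For reference, the textbook soft-target recipe that tools like this automate is Hinton-style distillation; a minimal sketch, not this repo's actual pipeline:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend soft teacher targets (temperature T) with the hard-label loss."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * T * T                    # T^2 keeps the soft-target gradient scale
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```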