Stars
OneTrainer is a one-stop solution for all your stable diffusion training needs.
Prodigy and Schedule-Free, together at last.
Optimizer, LR scheduler, and loss function collections in PyTorch
Official Implementation of "ADOPT: Modified Adam Can Converge with Any β2 with the Optimal Rate"
A simple way to keep track of an Exponential Moving Average (EMA) version of your Pytorch model
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Convert every image file in a directory to JXL format
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
Stable Diffusion web UI
Lora beYond Conventional methods, Other Rank adaptation Implementations for Stable diffusion.
[ACL 2023] The official implementation of "CAME: Confidence-guided Adaptive Memory Optimization"
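The EMA idea behind the model-averaging entry above is simple enough to sketch in a few lines. This is a minimal plain-Python illustration of the technique, not the API of any library listed here; the `EMA` class, its parameter dict format, and the decay value are all illustrative assumptions.

```python
class EMA:
    """Keep an exponential moving average (a "shadow" copy) of parameters.

    Update rule for each parameter k:
        shadow[k] = decay * shadow[k] + (1 - decay) * param[k]

    Hypothetical minimal sketch; real implementations track an actual
    model's tensors and often ramp the decay up during warmup.
    """

    def __init__(self, params, decay=0.999):
        self.decay = decay
        self.shadow = dict(params)  # copy of the initial parameter values

    def update(self, params):
        d = self.decay
        for k, v in params.items():
            self.shadow[k] = d * self.shadow[k] + (1 - d) * v


# Toy usage: average a single scalar "weight" that jumps to 1.0.
ema = EMA({"w": 0.0}, decay=0.9)
for _ in range(3):
    ema.update({"w": 1.0})
print(round(ema.shadow["w"], 3))  # → 0.271, i.e. 1 - 0.9**3
```

The shadow copy lags the live parameters, which smooths out per-step noise; at evaluation time the shadow weights are typically swapped in place of the trained ones.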