[CVPR2025] Unveil Inversion and Invariance in Flow Transformer for Versatile Image Editing
Minimal DDPM/DiT-based generation of MNIST digits
(ICCV 2025) 🎨 Lay-Your-Scene: Natural Scene Layout Generation with Diffusion Transformers
This repository implements multiple generative diffusion frameworks (EDM, Consistency Models, etc.) as well as several backbone architectures (U-Net, Diffusion Transformers, etc.).
Official training code for the MUG-V 10B video generation model. Built on Megatron-LM (v0.14.0) with production-ready distributed training for the 10B DiT.
DiffInk: Glyph- and Style-Aware Latent Diffusion Transformer for Text to Online Handwriting Generation
Fine-tuning the Flux Schnell diffusion transformer model across hardware configurations
Derf (Dynamic erf) - Normalization-Free Transformer Activation. Reimplementation of arXiv:2512.10938
Port of TOF-MRA to CTA image-to-image translation in C
InvarDiff: Cross-Scale Invariance Caching for Accelerated Diffusion Models
TQ-DiT: Efficient Time-Aware Quantization for Diffusion Transformers
Torchsmith is a minimalist library that focuses on understanding generative AI by building it using primitive PyTorch operations
[NeurIPS2024 (Spotlight)] "Unified Gradient-Based Machine Unlearning with Remain Geometry Enhancement" by Zhehao Huang, Xinwen Cheng, JingHao Zheng, Haoran Wang, Zhengbao He, Tao Li, Xiaolin Huang
PyTorch and JAX implementation of Scalable Diffusion Models with Transformers | Diffusion Transformers in PyTorch and JAX (a minimal DiT-block sketch follows this list)
Unofficial LeetArxiv Implementation of the paper Scalable Diffusion Models with Transformers
Leverage SANA's capabilities using LitServe.
A diffusion transformer implementation in Flax
Implementation of a Latent Diffusion Transformer model in TensorFlow / Keras
Democratising RGBA Image Generation With No $$$ (AI4VA@ECCV24)
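
Several of the entries above reimplement the DiT architecture from Scalable Diffusion Models with Transformers. As a rough orientation, the snippet below is a minimal sketch of a single DiT block with adaLN-Zero conditioning in PyTorch; the modulate helper, dimensions, and initialization are illustrative assumptions, not code taken from any repository listed here.

import torch
import torch.nn as nn


def modulate(x, shift, scale):
    # Shift-and-scale modulation applied after LayerNorm (hypothetical helper).
    return x * (1 + scale.unsqueeze(1)) + shift.unsqueeze(1)


class DiTBlock(nn.Module):
    # Sketch of one transformer block conditioned on timestep/class embeddings.
    def __init__(self, dim=384, num_heads=6, mlp_ratio=4.0):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim, elementwise_affine=False, eps=1e-6)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim, elementwise_affine=False, eps=1e-6)
        self.mlp = nn.Sequential(
            nn.Linear(dim, int(dim * mlp_ratio)),
            nn.GELU(),
            nn.Linear(int(dim * mlp_ratio), dim),
        )
        # adaLN-Zero: a conditioning MLP regresses shift/scale/gate parameters,
        # zero-initialized so each block starts out as the identity mapping.
        self.adaLN = nn.Sequential(nn.SiLU(), nn.Linear(dim, 6 * dim))
        nn.init.zeros_(self.adaLN[-1].weight)
        nn.init.zeros_(self.adaLN[-1].bias)

    def forward(self, x, c):
        # x: (B, N, dim) patch tokens; c: (B, dim) conditioning embedding.
        shift_a, scale_a, gate_a, shift_m, scale_m, gate_m = self.adaLN(c).chunk(6, dim=-1)
        h = modulate(self.norm1(x), shift_a, scale_a)
        attn_out, _ = self.attn(h, h, h)
        x = x + gate_a.unsqueeze(1) * attn_out
        x = x + gate_m.unsqueeze(1) * self.mlp(modulate(self.norm2(x), shift_m, scale_m))
        return x


if __name__ == "__main__":
    block = DiTBlock()
    tokens = torch.randn(2, 64, 384)   # e.g. 8x8 grid of latent patches
    cond = torch.randn(2, 384)         # timestep + label embedding
    print(block(tokens, cond).shape)   # torch.Size([2, 64, 384])

The gate terms are what make adaLN-Zero distinctive: because the conditioning MLP is zero-initialized, every residual branch contributes nothing at the start of training, which the DiT paper reports stabilizes large-model training.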