1.58-bit FLUX
… Efficiency measurements on the vision transformer component of FLUX and 1.58-bit
FLUX. The measurements are based on generating a single image with 50 inference steps. (a) …
ParetoQ: Improving scaling laws in extremely low-bit LLM quantization
… Our empirical analysis indicates that quantization at 1.58-bit, 2-bit, and 3-bit offers a
superior trade-off between accuracy and effective quantized model size compared to 4-bit, …
Plug-and-play 1.x-bit KV cache quantization for video large language models
… • For value cache, we propose a 1.58-bit quantization scheme while selecting a few … bit
and 1.58-bit precision, with almost no accuracy drop compared to the FP16 counterparts. …
Memory-Efficient Generative Models via Product Quantization
… Additionally, we exclude 1.58-bit FLUX [65] due to the lack of publicly available implementation
details. A key limitation of existing compression methods is that they fall short in reducing …
Quantized DiT with Hadamard transformation: A technical report
Diffusion Transformers (DiTs) combine the scalability of transformers with the fidelity of
diffusion models, achieving state-of-the-art image generation performance. However, their high …
Architectural and Performance Analysis
JMP Sinha, S Choudhary, S Rawat - Proceedings of Data …, 2025 - books.google.com
… Flux is an advanced text-to-image generation model by Black … Considering the example
of 1.58-bit FLUX a pivotal … The Flux model outperformed Big Sleep by achieving a lower FID score…
Architectural and Performance Analysis of Text-to-Image and Text-to-Video Generative Models
J Masiwal, P Sinha, S Choudhary, S Rawat - International Conference on …, 2025 - Springer
… Flux is an advanced text-to-image generation model by Black … Considering the example
of 1.58-bit FLUX a pivotal … The Flux model outperformed Big Sleep by achieving a lower FID score…
Dense2MoE: Restructuring diffusion transformer to MoE for efficient text-to-image generation
… Compared to FLUX.1-lite, our 5.2B activated-parameter FLUX.1-MoE-L model achieves
better … it is the 3.19B activated FLUX.1-MoE-S or the 2.64B activated FLUX.1-MoE-XS, signifi- …
Evaluating the FLUX.1 synthetic data on YOLOv9 for AI-powered poultry farming
S Cakic, T Popovic, S Krco, I Jovovic, D Babic - Applied Sciences, 2025 - mdpi.com
… A hybrid dataset was created by combining real images of chickens with 400 FLUX.1 [dev]
generated synthetic images, aiming to reduce reliance on extensive manual data collection. …