Stars
This paper is currently under review at IEEE TCSVT; the diffusion-framework component of the FedDiff algorithm will be released.
This repository contains the source code for our MICCAI 2024 paper titled 'CAR-MFL: Cross-Modal Augmentation by Retrieval for Multimodal Federated Learning with Missing Modalities'
(AAAI-24) Federated Learning via Input-Output Collaborative Distillation
[CVPR 2024] The official repository of our paper "LEAD: Learning Decomposition for Source-free Universal Domain Adaptation"
Implementation of FedConv: A Learning-on-Model Paradigm for Heterogeneous Federated Clients
[NeurIPS 2023] "FedFed: Feature Distillation against Data Heterogeneity in Federated Learning"
A simple new method for dataset distillation: Randomized Truncated Backpropagation Through Time (RaT-BPTT)
[ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baharan Mirzasoleiman
ICLR 2024, Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching
[AAAI 2024] Official implementation of "Beyond Prototypes: Semantic Anchor Regularization for Better Representation Learning"
High-Resolution Image Synthesis with Latent Diffusion Models
This is the official implementation of the ICCV 2023 paper "No Fear of Classifier Biases: Neural Collapse Inspired Federated Learning with Synthetic and Fixed Classifier".
[ICLR 2023] Multimodal Federated Learning via Contrastive Representation Ensemble
[AAAI 2024] FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning
Code for "Federated Recommender with Additive Personalization"
Train transformer language models with reinforcement learning.
Only one file needs to be configured to support model heterogeneity; GPU memory usage stays consistent whether a single client or multiple clients are run.
Official PyTorch code for the CVPR 2022 oral paper "Exact Feature Distribution Matching for Arbitrary Style Transfer and Domain Generalization"
A PyTorch toolbox for domain generalization, domain adaptation and semi-supervised learning.
Domain Generalization with MixStyle (ICLR'21)
[ACML 2022] Towards Data-Free Knowledge Distillation
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content…