
Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging


⚠️ Disclaimer

Experimental code for reproducing the results of our paper "Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging" (CVPR 2025). The code is provided as-is and has not been optimized; it is intended primarily for academic research and paper replication.

Setup

Install the same dependencies as CLAM, plus Lightning Fabric:

conda create -n cmil python=3.8
conda activate cmil
# Follow CLAM installation, then:
pip install lightning
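After installation, a quick sanity check that the environment imports cleanly can be useful. This sketch assumes `torch` comes from the CLAM installation step and `lightning` from the pip command above:

```python
# Environment check (sketch): verify that the packages installed above
# are importable. "torch" is assumed to come from the CLAM installation;
# "lightning" from the pip command above.
import importlib.util

def missing_packages(required=("torch", "lightning")):
    """Return the names from `required` that are not importable."""
    return [name for name in required
            if importlib.util.find_spec(name) is None]

if __name__ == "__main__":
    missing = missing_packages()
    print("missing:", missing or "none")
```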

Data Preparation

  • Skin Cancer: Download from MICIL, update data_root in configs/csc_*.yaml
  • Camelyon-TCGA: Follow CLAM preparation, update data_root in configs/c16_*.yaml
  • The expected locations of the extracted WSI features and data splits are defined in the source code; consult it to place them correctly
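Since the expected layout is only documented in the source, a hypothetical sanity check of a `data_root` can help. The subdirectory names `pt_files` and `splits` below are assumptions based on CLAM's conventions; adjust them to whatever this repository's code expects:

```python
import os

def check_data_root(data_root, feat_subdir="pt_files", split_subdir="splits"):
    """Hypothetical layout check: confirm that data_root contains a
    directory of per-slide .pt feature files and a directory of split
    files. The subdirectory names are guesses based on CLAM; the real
    locations are defined in this repository's source code."""
    feat_dir = os.path.join(data_root, feat_subdir)
    split_dir = os.path.join(data_root, split_subdir)
    problems = [d for d in (feat_dir, split_dir) if not os.path.isdir(d)]
    if not problems:
        n_feats = sum(f.endswith(".pt") for f in os.listdir(feat_dir))
        if n_feats == 0:
            problems.append("no .pt files in " + feat_dir)
    return problems  # empty list means the layout looks plausible
```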

Training

Skin Cancer

# TransMIL
python main_cl.py --preset configs/csc_transmil_cl.yaml --cl_method prev --buffer_size 42 --exp_name csc_transmil_cl_buf42_attn_logit

# CLAM
python main_cl.py --preset configs/csc_clam_cl.yaml --cl_method prev --buffer_size 42 --exp_name csc_clam_cl_buf42_attn_logit

Camelyon-TCGA

# TransMIL
python main_cl.py --preset configs/c16_lung_rcc_transmil_cl.yaml --cl_method prev --buffer_size 300 --buffer_slide_size 0.1 --distill_method maxrand --exp_name c16_lung_rcc_transmil_cl_pbbuf300_010_attn_logits_maxrand

# CLAM
python main_cl.py --preset configs/c16_lung_rcc_clam_cl.yaml --cl_method prev --buffer_size 300 --buffer_slide_size 0.1 --distill_method maxrand --exp_name c16_lung_rcc_clam_cl_pbbuf300_010_attn_logits_maxrand

Results are saved in logs/<exp_name>/
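The exact contents of each log directory depend on the training code, but a small stdlib-only helper (hypothetical) can enumerate experiments by their `--exp_name` directories:

```python
from pathlib import Path

def list_experiments(log_root="logs"):
    """Return the experiment names (one directory per --exp_name)
    found under log_root, or an empty list if it does not exist."""
    root = Path(log_root)
    if not root.is_dir():
        return []
    return sorted(p.name for p in root.iterdir() if p.is_dir())
```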

Citation

@inproceedings{li2025advancing,
  title={Advancing Multiple Instance Learning with Continual Learning for Whole Slide Imaging},
  author={Li, Xianrui and Cui, Yufei and Li, Jun and Chan, Antoni B},
  booktitle={Proceedings of the Computer Vision and Pattern Recognition Conference},
  pages={20800--20809},
  year={2025}
}
