
Knowledge Memorization and Rumination for Pre-trained Model-based Class-Incremental Learning (CVPR25)

Abstract

Class-Incremental Learning (CIL) enables models to continuously learn new classes while mitigating catastrophic forgetting. Recently, Pre-Trained Models (PTMs) have greatly enhanced CIL performance, even when fine-tuning is limited to the first task. This advantage is particularly beneficial for CIL methods that freeze the feature extractor after first-task fine-tuning, such as analytic learning-based approaches using a least squares solution-based classification head to acquire knowledge recursively. In this work, we revisit the analytical learning approach combined with PTMs and identify its limitations in adapting to new classes, leading to sub-optimal performance. To address this, we propose the Momentum-based Analytical Learning (MoAL) approach. MoAL achieves robust knowledge memorization via an analytical classification head and improves adaptivity to new classes through momentum-based adapter weight interpolation, also leading to forgetting outdated knowledge. Importantly, we introduce a knowledge rumination mechanism that leverages refined adaptivity, allowing the model to revisit and reinforce old knowledge, thereby improving performance on old classes. MoAL facilitates the acquisition of new knowledge and consolidates old knowledge, achieving a win-win outcome between plasticity and stability. Extensive experiments on multiple datasets and incremental settings demonstrate that MoAL significantly outperforms current state-of-the-art methods.
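
For reference, the "momentum-based adapter weight interpolation" mentioned above follows the familiar exponential-moving-average pattern of blending old and new weights. The sketch below is a generic illustration of that pattern, not the MoAL implementation: the function name, the momentum value m, and the use of plain nn.Module adapters are all illustrative assumptions.

  import copy
  import torch

  @torch.no_grad()
  def momentum_interpolate(old_adapter, new_adapter, m=0.9):
      """Blend adapter weights: theta <- m * theta_old + (1 - m) * theta_new (generic EMA sketch)."""
      merged = copy.deepcopy(new_adapter)
      old_params = dict(old_adapter.named_parameters())
      for name, p in merged.named_parameters():
          # Interpolate each parameter tensor in place on the copy.
          p.copy_(m * old_params[name] + (1.0 - m) * p)
      return merged

  # Toy usage with arbitrary shapes, just to show the call pattern.
  old, new = torch.nn.Linear(8, 8), torch.nn.Linear(8, 8)
  blended = momentum_interpolate(old, new, m=0.9)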

Run experiment

To get started, set up a conda environment and install the requirements listed in this repository:

conda env create -f environment.yml
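
Once the environment is created, activate it before running any scripts. The environment name below is an assumption; use whatever name environment.yml declares.

conda activate moal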

Datasets

We have implemented the pre-processing of the following datasets:

  • CIFAR100: will be automatically downloaded by the code.
  • ImageNet-R: Google Drive: link or Onedrive: link
  • ImageNet-A: Google Drive: link or Onedrive: link
  • OmniBenchmark: Google Drive: link or Onedrive: link
  • Car196: link
  • CUB: link
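
After downloading, each dataset is expected to provide separate train and val folders, matching the [DATA-PATH]/train/ and [DATA-PATH]/val/ paths used in utils/data.py. The class-subfolder layout sketched below is an assumption about the standard image-folder format, so adjust it to whatever the downloaded archives actually contain:

  [DATA-PATH]/
      train/
          <one folder per class containing images>
      val/
          <one folder per class containing images>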

When training, you should specify the folder of your dataset in utils/data.py.

  def download_data(self):
      assert 0, "You should specify the folder of your dataset"
      train_dir = '[DATA-PATH]/train/'
      test_dir = '[DATA-PATH]/val/'
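
For example, after downloading ImageNet-R the method could be edited as follows; the path below is purely illustrative, and the assert should be removed once the real paths are in place:

  def download_data(self):
      # Illustrative local paths; point these at your own copy of the dataset.
      train_dir = '/your/data/root/imagenet-r/train/'
      test_dir = '/your/data/root/imagenet-r/val/'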

Pre-trained weights

When using the pre-trained weights of iBOT and DINO, download the corresponding checkpoints into the Moal/checkpoints/ folder, then point the loading code below at them.

  def vit_base_patch16_224_adapter_dino(pretrained=False, **kwargs):
      model = VisionTransformer(patch_size=16, embed_dim=768, depth=12, num_heads=12, mlp_ratio=4, qkv_bias=True,
          norm_layer=partial(nn.LayerNorm, eps=1e-6), **kwargs)
      # Replace 'Your_path' with the path to the DINO checkpoint placed in Moal/checkpoints/.
      ckpt = torch.load('Your_path', map_location='cpu')

  def vit_base_patch16_224_adapter_ibot(pretrained=False, **kwargs):
      model = vit_base_patch16_224_adapter(False, **kwargs)
      # Replace 'Your_path' with the path to the iBOT checkpoint placed in Moal/checkpoints/;
      # iBOT releases wrap the weights under a 'state_dict' key.
      ckpt = torch.load('Your_path', map_location='cpu')['state_dict']
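
The excerpts above stop right after the checkpoint is read. If loading ever fails, a small standalone check like the one below can confirm that the file opens and show its top-level keys; the filename is illustrative:

  import torch

  ckpt = torch.load('./checkpoints/your_downloaded_checkpoint.pth', map_location='cpu')
  # For iBOT-style checkpoints the weights sit under ckpt['state_dict'].
  print(list(ckpt.keys())[:5])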

To Train

python main.py --config=exps/MoAL_[dataset_name].json
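
Replace the bracketed placeholder with a dataset name; the available configuration files live in the exps/ folder of this repository, so check there for the exact filenames. The example below is hypothetical:

python main.py --config=exps/MoAL_cifar.json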

Acknowledgments

We thank the following repositories for providing helpful components and functions used in our work.

Citation

If you find our work useful for your research, please cite it:

@inproceedings{moal,
  author    = {Gao, Zijian and Jia, Wangwang and Zhang, Xingxing and Zhou, Dulan and Xu, Kele and Feng, Dawei and Dou, Yong and Mao, Xinjun and Wang, Huaimin},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  title     = {Knowledge Memorization and Rumination for Pre-trained Model-based Class-Incremental Learning},
  year      = {2025}
}
