eTag: Class-Incremental Learning via Hierarchical Embedding Distillation and Task-Oriented Generation
This repository contains the training and evaluation code for the AAAI 2024 paper "eTag: Class-Incremental Learning via Hierarchical Embedding Distillation and Task-Oriented Generation".
To run the code, ensure the following dependencies are installed:
- Python 3.8.5
- PyTorch 1.7.1
- torchvision 0.8.2
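Assuming a pip-based environment (the conda channel or CUDA-specific wheel index is left to the reader), the pinned versions above might be installed with:

```shell
# Install the PyTorch/torchvision versions listed above.
# Add the appropriate CUDA wheel index for your GPU setup if needed.
pip install torch==1.7.1 torchvision==0.8.2
```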
Before running the code, ensure the dataset is downloaded to, or symbolically linked into, the `./dataset` directory.
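For example, if the dataset already lives elsewhere on disk, a symbolic link can be created like this (the source path `/data/cifar100` is a placeholder for your actual download location):

```shell
# Create the dataset directory expected by the code
mkdir -p ./dataset

# Link an existing CIFAR-100 download into place
# (replace /data/cifar100 with your actual download path)
ln -s /data/cifar100 ./dataset/cifar100
```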
You can test our method by executing the provided scripts in the `./scripts` directory, or by running the following commands there:
```shell
# CIFAR-100: 5, 10, and 25 incremental tasks
bash -i run.sh cifar 0 5
bash -i run.sh cifar 0 10
bash -i run.sh cifar 0 25

# ImageNet-Sub: 5, 10, and 25 incremental tasks
bash -i run.sh imagenet 0 5
bash -i run.sh imagenet 0 10
bash -i run.sh imagenet 0 25
```
- `-data`: dataset name; choose from `cifar100` or `imagenet_sub`.
- `-log_dir`: directory in which to save models, logs, and events.
- `-num_task`: number of tasks after the initial task.
- `-nb_cl_fg`: number of classes in the first task.
For additional tunable arguments, refer to the `opts_eTag.py` file.
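As a sketch of how these arguments fit together, a direct invocation might look like the following. The entry-point script name `main_eTag.py` and the flag values are assumptions for illustration; `opts_eTag.py` in the repository is the authoritative source for the argument list.

```shell
# Hypothetical entry point; check the repo for the real script name.
# -data:      dataset, cifar100 or imagenet_sub
# -log_dir:   directory for models, logs, and events
# -nb_cl_fg:  number of classes in the first (initial) task
# -num_task:  number of incremental tasks after the initial one
python main_eTag.py \
    -data cifar100 \
    -log_dir ./logs/cifar100_10tasks \
    -nb_cl_fg 50 \
    -num_task 10
```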
This project is licensed under the Apache License 2.0.
A permissive license that requires preservation of copyright and license notices. Contributors provide an express grant of patent rights. Licensed works, modifications, and larger works may be distributed under different terms and without source code.
| Permissions | Conditions | Limitations |
|---|---|---|
| ✅ Commercial use | ⓘ License and copyright notice | ❌ Trademark use |
| ✅ Modification | ⓘ State changes | ❌ Liability |
| ✅ Distribution | | ❌ Warranty |
| ✅ Patent use | | |
| ✅ Private use | | |