Here is the code for our work CoSDA: Continual Source-Free Domain Adaptation. To ensure a fair comparison, we build a unified codebase for source-free domain adaptation and continual DA methods, as shown in the supported methods below.
Continual source-free domain adaptation is a new and practical task in the field of domain adaptation. It seeks to preserve the performance of a model across all domains encountered during the adaptation process while also protecting the privacy of the source data, as illustrated in the following figure:
CoSDA is a continual source-free domain adaptation approach that employs a dual-speed optimized teacher-student model pair and is equipped with consistency learning, as shown in the following figure. The implementation details of CoSDA are shown in [train/cosda/cosda.py].
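The dual-speed design can be sketched in a few lines: the student is updated at every optimization step (fast), while the teacher only tracks the student through an exponential moving average (slow), providing stable targets for consistency learning. The snippet below is a minimal plain-Python illustration of this idea, not the actual implementation; the momentum value is illustrative.

```python
def ema_update(teacher, student, momentum=0.99):
    """Dual-speed update: the teacher moves slowly toward the fast student.

    The student's weights change at every optimization step, while the
    teacher only tracks them through an exponential moving average, which
    stabilizes the targets used for consistency learning.
    """
    return [momentum * t + (1.0 - momentum) * s for t, s in zip(teacher, student)]

# Toy weights: after a few steps the teacher has drifted only a small
# fraction of the way toward the student.
teacher, student = [0.0, 0.0], [1.0, 1.0]
for _ in range(3):
    teacher = ema_update(teacher, student)
print(teacher)
```

In practice both models start from the source-pretrained weights, and the teacher is the model evaluated after adaptation.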
First, download the datasets from the following links:
- DomainNet
- OfficeHome: (1) image_list (2) Art (3) Clipart (4) Product (5) Real_World
- Office31: (1) image_list (2) amazon (3) dslr (4) webcam
- VisDA17
Next, select a `base_path`, create a `dataset` folder within it, and place the downloaded files in this folder, as shown below:
    base_path
    └── dataset
        ├── DomainNet
        │   ├── splits
        │   ├── clipart
        │   ├── infograph
        │   ├── painting
        │   ├── quickdraw
        │   ├── real
        │   └── sketch
        ├── Office31
        │   ├── image_list
        │   ├── amazon
        │   ├── dslr
        │   └── webcam
        ├── OfficeHome
        │   ├── image_list
        │   ├── Art
        │   ├── Clipart
        │   ├── Product
        │   └── Real_World
        └── Visda2017
            ├── image_list
            ├── train
            └── validation
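Before moving on, it can help to verify the layout. The following is a small hypothetical helper (not part of the repository) that checks whether every folder from the tree above exists under `base_path`:

```python
import os

# Expected folder names, copied from the directory tree above.
EXPECTED = {
    "DomainNet": ["splits", "clipart", "infograph", "painting",
                  "quickdraw", "real", "sketch"],
    "Office31": ["image_list", "amazon", "dslr", "webcam"],
    "OfficeHome": ["image_list", "Art", "Clipart", "Product", "Real_World"],
    "Visda2017": ["image_list", "train", "validation"],
}

def missing_folders(base_path):
    """Return the list of expected dataset folders that are absent."""
    missing = []
    for dataset, subdirs in EXPECTED.items():
        for sub in subdirs:
            path = os.path.join(base_path, "dataset", dataset, sub)
            if not os.path.isdir(path):
                missing.append(path)
    return missing
```

An empty return value means the layout matches the tree above; otherwise the returned paths show what still needs to be downloaded or moved.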
- Dependencies and environment setup.

  `pip install -r requirements.txt`
- Pretrain.

  `python pretrain.py -bp [base_path] --config [config_file]`

  The `base_path` is the selected location where the dataset is stored. The `config_file` is stored in the [pretrain/config/backup] directory. We have designed specific configurations for GSFDA and SHOT++, while for the other methods we use the same pretrain configuration as SHOT. Once the pretraining process is complete, the model will be saved within the `base_path` directory. Below is an example of the resulting file structure for DomainNet:

      base_path
      ├── DomainNet
      │   └── pretrain_parameters_shot
      │       ├── source_{}_backbone_{}.pth.tar
- We supply pretrained parameters for all datasets using the SHOT, SHOT++, and GSFDA methods. Download Links: DomainNet, OfficeHome, Office31, VisDA2017.
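Under the naming scheme above, each checkpoint path can be assembled from the dataset, the pretraining method, and the two placeholder fields. The sketch below assumes the two `{}` slots are the source domain and the backbone architecture; both example values are hypothetical.

```python
import os

def checkpoint_path(base_path, dataset, method, domain, backbone):
    """Build the checkpoint path following the tree shown above.

    The placeholder fields of source_{}_backbone_{}.pth.tar are assumed to be
    the source domain and the backbone architecture (an assumption here).
    """
    filename = "source_{}_backbone_{}.pth.tar".format(domain, backbone)
    return os.path.join(base_path, dataset, "pretrain_parameters_" + method, filename)

print(checkpoint_path("/data", "DomainNet", "shot", "real", "resnet50"))
```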
- Single-target adaptation.

  `python single_tar.py -bp [base_path] --config [config_file] --writer [tensorboard / wandb]`

  We have created separate configuration files for each method, which can be found in the [adaptationcfg/backup] directory. The `source_domain` and `target_domain` can be manually specified under the `DAConfig` key in the configuration file. We support two ways of recording the training process: `tensorboard` and `wandb`. To use `tensorboard`, specify the `log_path` for storing the event files locally with `-lp [log_path]`. To use `wandb`, specify the `entity` with `-e [entity]`.
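For example, the relevant section of a configuration file might look like the following; only the `DAConfig` key and the two domain fields are taken from the description above, while the values and any other keys are illustrative:

```yaml
# Hypothetical excerpt of an adaptation config; exact keys may differ.
DAConfig:
  source_domain: Art
  target_domain: Clipart
```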
- Multi-target adaptation.

  `python multi_tar.py --config [config_file] -bp [base_path] --writer [tensorboard / wandb] (-lp [log_path] or -e [entity])`

  The settings for multi-target domain adaptation are the same as those for single-target adaptation. For DomainNet, the sequential adaptation order is Real → Infograph → Clipart → Painting → Sketch → Quickdraw. For OfficeHome, the sequential adaptation order is Art → Clipart → Product → Real_World.
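The sequential protocol above can be sketched as a plain loop over target domains, with a hypothetical `adapt(model, domain)` routine standing in for one single-target adaptation run:

```python
# Domain orders as stated above; the first entry is the source domain.
DOMAIN_ORDERS = {
    "DomainNet": ["real", "infograph", "clipart", "painting", "sketch", "quickdraw"],
    "OfficeHome": ["Art", "Clipart", "Product", "Real_World"],
}

def continual_adaptation(model, dataset, adapt):
    """Start from the source-pretrained model and adapt to each target in turn."""
    order = DOMAIN_ORDERS[dataset]
    for target in order[1:]:  # order[0] is the source domain, not a target
        model = adapt(model, target)
    return model
```

The goal of continual source-free DA is that, after this loop, the model keeps its accuracy on every earlier domain in the sequence.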
Apart from CoSDA, we also support the following methods:

- SHOT - Do We Really Need to Access the Source Data? Source Hypothesis Transfer for Unsupervised Domain Adaptation. (ICML'20) [train/shot/shot_plus.py]
- SHOT++ - Source Data-absent Unsupervised Domain Adaptation through Hypothesis Transfer and Labeling Transfer. (TPAMI) [train/shot/shot_plus.py]
- G-SFDA - Generalized Source-free Domain Adaptation. (ICCV'21) [train/gsfda/gsfda.py]
- NRC - Exploiting the Intrinsic Neighborhood Structure for Source-free Domain Adaptation. (NeurIPS'21) [train/nrc/nrc.py]
- AaD - Attracting and Dispersing: A Simple Approach for Source-free Domain Adaptation. (NeurIPS'22) [train/aad/aad.py]
- DaC - Divide and Contrast: Source-free Domain Adaptation via Adaptive Contrastive Learning. (NeurIPS'22) [train/dac/dac.py]
- Edgemix - Balancing Discriminability and Transferability for Source-Free Domain Adaptation. (ICML'22) [train/dataaug/edgemix.py]
- CoTTA - Continual Test-Time Domain Adaptation. (CVPR'22) [train/cotta/cotta.py]
If you find our paper helpful, consider citing us via:
@article{feng2023cosda,
title={CoSDA: Continual Source-Free Domain Adaptation},
author={Feng, Haozhe and Yang, Zhaorui and Chen, Hesun and Pang, Tianyu and Du, Chao and Zhu, Minfeng and Chen, Wei and Yan, Shuicheng},
journal={arXiv preprint arXiv:2304.06627},
year={2023}
}