[KDD2025] Mitigating Source Label Dependency in Time-Series Domain Adaptation under Label Shifts
This repository provides the implementation of TA4LS, published at KDD 2025.
Time-series unsupervised domain adaptation (TS-UDA) is essential in fields such as healthcare and manufacturing, where data often consists of distinct entities, such as individual patients or machines. This heterogeneity leads to discrepancies not only in feature distributions but also in label distributions, posing a significant challenge for domain adaptation. However, prior studies have mostly focused on alleviating covariate shifts, so the predicted target labels are often biased toward the source domain's label distribution. To address this issue, we propose Time-series domain Adaptation for mitigating Label Shifts (TA4LS), a novel label refinement approach. TA4LS leverages the consistency between predicted labels and clustering information obtained from the unique characteristics that differentiate each label in the target domain. Furthermore, as a plug-in module, our approach improves performance across diverse existing unsupervised domain adaptation methods, particularly in scenarios with significant discrepancies between the source and target label distributions. In experiments on four benchmark datasets with label shifts, TA4LS demonstrates superior performance across six unsupervised domain adaptation methods and six label shift handling modules.
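The core idea of the refinement step described above can be illustrated with a minimal sketch: cluster the target features with the same number of clusters as classes, then reconcile each point's predicted label with the majority prediction inside its cluster. This is a hypothetical, simplified illustration of the label-consistency idea, not the repository's actual algorithm; the function names (`kmeans`, `refine_labels`) and the majority-vote rule are assumptions for exposition.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal k-means for illustration: deterministic farthest-point
    initialization followed by standard Lloyd iterations."""
    centers = X[[0]].astype(float)
    for _ in range(1, k):
        # pick the point farthest from all current centers as the next center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).min(1)
        centers = np.vstack([centers, X[[d.argmax()]]])
    for _ in range(iters):
        assign = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        for c in range(k):
            if (assign == c).any():
                centers[c] = X[assign == c].mean(0)
    return assign

def refine_labels(target_features, predicted_labels, num_classes):
    """Hypothetical sketch of label refinement: cluster the target features,
    then relabel each cluster with the majority predicted class among its
    members, so predictions inconsistent with target structure are corrected."""
    clusters = kmeans(target_features, num_classes)
    refined = predicted_labels.copy()
    for c in range(num_classes):
        members = clusters == c
        if members.any():
            counts = np.bincount(predicted_labels[members],
                                 minlength=num_classes)
            refined[members] = counts.argmax()
    return refined
```

With two well-separated target clusters, a handful of predictions that disagree with their cluster's majority get flipped to the consistent class.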
- Python Version: 3.9.13
- Torch Version: 1.13.1+cu117
- Package List: please refer to `envs.txt` in the `config` folder
- Parameter options
--data_path : path of the datasets
--dataset : dataset name (e.g., HAR, HHAR_SA, WISDM, and EEG)
--device : GPU device
--label_shift : True (TA4LS), False (original UDA)
--backbone : backbone network (e.g., CNN, TCN, and ResNet18)
--exp_name : your experiment name
--da_method : existing UDA method
--num_runs : number of repetitions
- Script
python main.py --dataset WISDM --device cuda:0 --label_shift True --exp_name YOUR_EXP_NAME --da_method DSAN --num_runs 3
Our implementation builds on AdaTime (TKDD 2023).