Distributed training example.
Requirements:
- Python 3.8.6

Install dependencies:

```
pip install -r requirements.txt
```

Single GPU training:

```
CUDA_VISIBLE_DEVICES=0 python train.py
```

Distributed training using two GPUs:

```
CUDA_VISIBLE_DEVICES=0,1 python train_ddp.py -g 2
```

Distributed training using two GPUs with mixed precision:

```
CUDA_VISIBLE_DEVICES=0,1 python train_ddp_mp.py -g 2
```
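The training scripts themselves are not shown here. As a rough sketch of what a DDP training step with optional mixed precision involves (the function name, model, and single-process CPU setup are illustrative assumptions, not the repo's actual code — the real scripts would spawn one process per GPU and use the `nccl` backend):

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def train_step(use_amp: bool = False) -> float:
    """One DDP training step; single-process CPU demo (rank 0, world size 1)."""
    # In a real multi-GPU launch, rank and world size come from the launcher.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = torch.nn.Linear(8, 2)
    ddp_model = DDP(model)  # gradients are all-reduced across ranks on backward
    opt = torch.optim.SGD(ddp_model.parameters(), lr=0.1)

    x, y = torch.randn(4, 8), torch.randn(4, 2)
    # Mixed precision: autocast runs the forward pass in a lower-precision
    # dtype (bfloat16 here so the demo runs on CPU; a GPU script would
    # typically use float16 with a GradScaler).
    with torch.autocast("cpu", dtype=torch.bfloat16, enabled=use_amp):
        loss = torch.nn.functional.mse_loss(ddp_model(x), y)

    opt.zero_grad()
    loss.backward()  # backward runs outside autocast, as recommended
    opt.step()

    dist.destroy_process_group()
    return loss.item()
```

With multiple processes, each rank would wrap the same model and DDP keeps their gradients synchronized automatically.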