I am using a server with two 24GB RTX 3090 GPUs. When I run `bash run_baselines_lora.sh`, I get an error indicating insufficient GPU memory. How can the code be configured to use both GPUs, or, if dual-GPU training is not supported, what can I do to get it running on a single 3090?