disable use_pip by default for PyTorch, except for recent versions (>= 2.0) #3079
branfosj merged 2 commits into easybuilders:5.0.x

Conversation
@Flamefire Anything to add here?
Just to clarify: nothing wrong with that. AFAIK this means our use of [...].

I would even use 2.x as the cutoff version to simplify this to "PyTorch 2 uses pip by default", no need to go into minor versions here.
Title changed from "use_pip by default for PyTorch, except for recent versions (>= 2.1)" to "use_pip by default for PyTorch, except for recent versions (>= 2.0)"
@boegelbot please test @ jsc-zen3 |
@boegel: Request for testing this PR well received on jsczen3l1.int.jsc-zen3.fz-juelich.de, PR test command '

Test results coming soon (I hope)...
Test report by @boegelbot

Overview of tested easyconfigs (in order)

Build succeeded for 9 out of 11 (5 easyconfigs in total)
We haven't had test reports for [...]. For [...]. For [...].

I'm pretty sure that they are not caused by enabling `use_pip`.

@Flamefire Thoughts?
Generally those are test errors, not failures, so very likely the test ran into something it didn't expect in the environment. E.g. for the CUDA build I had that because PyTorch was built with NCCL but run without GPUs. But I can't tell from this alone.
@Flamefire There's indeed no GPU available in this test setup. In any case, you don't see any reason to block this PR, right? |
That was just an example that applies to the in-progress CUDA ECs. But yes, I agree that this shouldn't block this PR. Might be worth investigating the failures later though. |
Up until now, we've only enabled `use_pip` for the latest versions of PyTorch (>= 2.1). It's probably not worth the effort to also do that for older PyTorch versions, so we need to overrule the changing default in `PythonPackage` (cfr. #3022).

With this approach, old PyTorch easyconfigs could explicitly set `use_pip = False`, and future PyTorch easyconfigs could opt in to not using `pip` by using `use_pip = False`, should the need arise (since PyTorch currently still uses the legacy install procedure that involves `setup.py build`).
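The gist of the change can be sketched as a version-gated default. This is a hypothetical standalone helper for illustration only; the real change lives in the PyTorch easyblock, which hooks into EasyBuild's `PythonPackage` machinery rather than using a function like this:

```python
# Hypothetical sketch (not the actual easyblock code): pick the
# use_pip default based on the PyTorch version being installed.
def default_use_pip(pytorch_version):
    """use_pip defaults to True only for PyTorch >= 2.0."""
    major = int(pytorch_version.split('.')[0])
    return major >= 2

print(default_use_pip('1.13.1'))  # prints "False"
print(default_use_pip('2.0.1'))   # prints "True"
```

Easyconfigs can still override the default either way, which is what the opt-out via `use_pip = False` described above relies on.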