The current requirements-travis.txt in the root directory of the repository should be converted to a YAML file that complies with the Conda environment format. In particular, note the version range formatting: https://docs.conda.io/projects/conda-build/en/latest/resources/package-spec.html#package-match-specifications
requirements.yaml should contain the Conda dependencies for a generic (GPU?) platform.
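As a rough sketch, a generic requirements.yaml using the match-spec syntax linked above might look like the following (all package names and version pins here are illustrative placeholders, not the project's actual dependency set):

```yaml
# Hypothetical generic environment file; names and pins are placeholders.
name: frnn
channels:
  - defaults
dependencies:
  - python=3.7            # exact minor version
  - numpy>=1.16,<1.19     # bounded range (Conda match-spec syntax)
  - tensorflow-gpu=2.*    # fuzzy match on the major version
  - pip
  - pip:
      - some-pypi-only-pkg  # placeholder for PyPI-only dependencies
```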
We will probably want custom Conda environment files for each of the following computers, mostly to handle the non-default Conda channels that may be necessary (e.g. the IBM Watson AI channel for the 2x Power9 architectures with V100s). But we also want to set strict channel priority, be more specific about compatible dependency version ranges, etc.
- Princeton Research Computing
  - Tiger 2/TigerGPU P100s
  - Traverse V100s
- ALCF
- OLCF
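For the per-platform files, the main difference would be the channels block; a sketch for Traverse follows, where the channel name is only a placeholder for whichever IBM Power9 channel turns out to be needed:

```yaml
# Hypothetical envs/requirements-traverse.yaml; channel and pins are placeholders.
name: frnn
channels:
  - ibm-power-channel   # placeholder for the IBM channel serving Power9 builds
  - defaults
dependencies:
  - python=3.7
  - tensorflow-gpu>=2.0,<2.2   # placeholder range
```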
Also, I am in favor of storing files such as traverse-env.cmd containing lines like the following:
#!/usr/bin/env bash
module load anaconda3
conda activate frnn # must activate conda env before module loads
export OMPI_MCA_btl="tcp,self,vader"
module load cudatoolkit
module load cudnn/cuda-10.1/7.6.1
module load openmpi/gcc/3.1.4/64
module load hdf5/gcc/openmpi-3.1.4/1.10.5
This will make it easier for the user to build FRNN on each platform after creating the Conda environment, e.g. from the new directory named envs/ or environments/:
conda env create --file envs/requirements-traverse.yaml
# alternative: "conda create --name frnn --file traverse.yaml"
source traverse-env.cmd
python setup.py install
See Sample Installation on TigerGPU, for example.
Also, examples/slurm.cmd can source the exact same modules in the .cmd file for consistency.
However, this will require frequent updates to the *.cmd files as system admins upgrade the modules and libraries on the various platforms.
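If we do want examples/slurm.cmd to pick the right module file automatically, a hostname dispatch could look like the following sketch (the frnn_env_file helper and the envs/ file names are assumptions, not existing code):

```shell
#!/usr/bin/env bash
# Hypothetical sketch for examples/slurm.cmd: map the hostname to the
# matching per-platform module file. Helper name and envs/ paths are
# assumptions, not existing files in the repository.
frnn_env_file() {
  case "$1" in
    *traverse*) echo "envs/traverse-env.cmd" ;;
    *tiger*)    echo "envs/tiger-env.cmd" ;;
    *)          return 1 ;;
  esac
}

# Dispatch on the current host; fall back to a manual instruction.
if env_file=$(frnn_env_file "$(hostname)"); then
  source "$env_file"
else
  echo "Unrecognized host; source the matching envs/*-env.cmd manually" >&2
fi
```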
Open questions:
- Should the .yaml and .cmd pairs of files per platform all be stored in a top-level subdirectory, e.g. environments/, env/, or envs/? How do other projects handle this problem and organize such files? Or do they typically only define a single Conda YAML?
- Should examples/slurm.cmd check the hostname in order to automatically source the correct .cmd?
- Should the .cmd environment files assume that the Conda environment is named frnn? Or force the user to define an environment variable FRNN_CONDA_ENV?
Current limitations of Conda YAML format:
- Cannot set the Conda (strict) channel priority setting within these files. See 336135b. Follow "Allow configuration setting to be set in environment.yaml file" (conda/conda#8675) for updates regarding this requested feature. Until then, it is recommended that each user modify their local .condarc configuration with conda config --set channel_priority strict.
- Should not install mpi4py via pip within the Conda YAML file, since we need to source the modules to get the system MPI libraries before pip install mpi4py. There is currently no way to execute arbitrary shell commands, modify environment variables, etc. between the conda install and pip install stages of the environment creation process from a YAML file, as far as I am aware. See similar requests in https://github.com/conda/conda/issues?q=is%3Aopen+is%3Aissue+label%3Atag-environment_spec.
See f68f2f0.
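Until Conda grows such a hook, the mpi4py limitation can be worked around by hand: create the environment from a YAML whose pip section deliberately omits mpi4py, load the system MPI modules, and only then install mpi4py with pip. A sketch reusing the Traverse file names from above:

```shell
# Create the env from a YAML that omits mpi4py from its pip section
conda env create --file envs/requirements-traverse.yaml
# Load system MPI (and CUDA) modules so mpi4py links against them;
# this also activates the frnn environment (see traverse-env.cmd above)
source traverse-env.cmd
# Build mpi4py against the system MPI; skip any cached pre-built wheel
pip install --no-cache-dir mpi4py
```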