Add Hyperparameter Optimization (HPO) scripts using HyperNOs and Ray Tune #717
MaxGhi8 wants to merge 1 commit into neuraloperator:main from
Conversation
Pull request overview
This PR adds Ray Tune-based hyperparameter optimization (HPO) entry-point scripts for several NeuralOperator models, using HyperNOs for dataset/model integration, and updates the README to point users to these scripts.
Changes:
- Added Ray Tune + HyperNOs HPO scripts for TFNO (Darcy), UNO (Darcy), and SFNO (Spherical SWE).
- Updated README to mention the availability of HPO scripts in scripts/.
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| scripts/ray_tune_sfno_swe.py | Adds an HPO driver for SFNO on spherical SWE using Ray Tune + HyperNOs. |
| scripts/ray_tune_tfno_darcy.py | Adds an HPO driver for TFNO on Darcy Flow using Ray Tune + HyperNOs. |
| scripts/ray_tune_uno_darcy.py | Adds an HPO driver for UNO on Darcy Flow using Ray Tune + HyperNOs. |
| README.rst | Documents the new HPO scripts. |
```python
uno_n_modes=config["uno_n_modes"],
uno_scalings=[[1.0, 1.0], [0.5, 0.5], [1.0, 1.0], [2.0, 2.0]],
n_layers=4,
horizontal_skips_map={4: 0, 3: 1}
```
horizontal_skips_map is set to {4: 0, 3: 1} while n_layers=4 (valid layer indices are 0–3). The 4: 0 entry is unused and the skip pattern doesn’t match the default skip-map generation for 4 layers ({3: 0, 2: 1} in UNO). Consider removing horizontal_skips_map to use the model’s default, or update it to the correct indices for n_layers=4.
```diff
-horizontal_skips_map={4: 0, 3: 1}
```
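The default pattern the reviewer cites ({3: 0, 2: 1} for four layers) is a mirror-style mapping from late layers back to their symmetric early counterparts. A stdlib sketch of that pattern (the helper name is hypothetical, not UNO's actual source):

```python
def default_skip_map(n_layers: int) -> dict[int, int]:
    # Mirror-style horizontal skips: each of the last n_layers // 2
    # layers reads the output of its symmetric counterpart near the
    # input. Hypothetical helper reproducing the {3: 0, 2: 1}
    # default mentioned in the review for n_layers=4.
    return {n_layers - 1 - i: i for i in range(n_layers // 2)}

print(default_skip_map(4))  # {3: 0, 2: 1}
```

Under this pattern, the key `4` in the PR's map can never be used, since valid layer indices for `n_layers=4` are 0 through 3.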
```diff
 To use W&B logging features, simply create a file in ``neuraloperator/config``
 called ``wandb_api_key.txt`` and paste your W&B API key there.

+Hyperparameter optimization (HPO) scripts using `HyperNOs` and `Ray Tune` are also available in the ``scripts/`` directory (e.g. ``scripts/ray_tune_tfno_darcy.py``).
```
This README line uses single backticks around HyperNOs / Ray Tune, but the rest of this README uses double backticks for inline literals. In reStructuredText, single backticks are interpreted text and may not render as intended on GitHub/Sphinx. Use ``HyperNOs`` / ``Ray Tune`` (or proper links) for consistent rendering.
```diff
-Hyperparameter optimization (HPO) scripts using `HyperNOs` and `Ray Tune` are also available in the ``scripts/`` directory (e.g. ``scripts/ray_tune_tfno_darcy.py``).
+Hyperparameter optimization (HPO) scripts using ``HyperNOs`` and ``Ray Tune`` are also available in the ``scripts/`` directory (e.g. ``scripts/ray_tune_tfno_darcy.py``).
```
```python
"learning_rate": tune.loguniform(1e-4, 1e-2),
"hidden_channels": tune.choice([16, 32, 64]),
"uno_out_channels": tune.choice([[32, 64, 64, 32], [16, 32, 32, 16]]),
"uno_n_modes": tune.choice([[16, 16], [8, 8]]),
```
uno_n_modes search space is defined as a list of length 2, but neuralop.models.UNO asserts that len(uno_n_modes) == n_layers (here n_layers=4). As written, trials will fail at model construction. Define uno_n_modes as a list of length 4 (one entry per layer), or reduce n_layers accordingly.
```diff
-"uno_n_modes": tune.choice([[16, 16], [8, 8]]),
+"uno_n_modes": tune.choice([
+    [[16, 16], [16, 16], [16, 16], [16, 16]],
+    [[8, 8], [8, 8], [8, 8], [8, 8]],
+]),
```
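Shape mismatches like this one only surface once a trial tries to build the model. A small guard run on each sampled config can fail fast instead (a hypothetical addition to the script, not part of the PR; `N_LAYERS` stands in for the script's fixed layer count):

```python
N_LAYERS = 4  # the fixed n_layers passed to the UNO constructor

def validate_uno_config(config: dict) -> None:
    # Reject configs that would raise inside neuralop's UNO assert,
    # before Ray Tune spends a trial on them (hypothetical guard).
    n_modes = config["uno_n_modes"]
    if len(n_modes) != N_LAYERS:
        raise ValueError(
            f"uno_n_modes needs one entry per layer ({N_LAYERS}), "
            f"got {len(n_modes)}"
        )

validate_uno_config({"uno_n_modes": [[16, 16]] * 4})  # passes silently
```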
```python
n_layers=config["n_layers"],
in_channels=config["input_dim"],
out_channels=config["out_dim"],
spectral_channels=config["modes"],
```
SFNO (a ``partialclass`` of FNO) does not accept a spectral_channels argument (the FNO.__init__ signature has no such parameter). This will raise a TypeError when building the model. Remove spectral_channels=... or replace it with a supported FNO/SFNO argument (e.g., adjust hidden_channels or n_modes).
```diff
-spectral_channels=config["modes"],
```
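One defensive pattern for HPO drivers that fan one config dict into several model constructors is to filter the dict against the constructor's signature before calling it. A stdlib sketch (the helper and the `Demo` class are illustrative, not neuralop code; note this filter is a no-op for constructors that take `**kwargs`):

```python
import inspect

def supported_kwargs(cls, config: dict) -> dict:
    # Keep only the entries cls.__init__ actually accepts, so a stray
    # key like "spectral_channels" is dropped instead of raising a
    # TypeError mid-trial. Hypothetical guard, not part of the PR.
    params = inspect.signature(cls.__init__).parameters
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(config)  # constructor takes **kwargs; nothing to filter
    return {k: v for k, v in config.items() if k in params}

class Demo:
    """Stand-in for a model class with a fixed keyword signature."""
    def __init__(self, hidden_channels=32, n_modes=(16, 16)):
        self.hidden_channels = hidden_channels
        self.n_modes = n_modes

print(supported_kwargs(Demo, {"hidden_channels": 64, "spectral_channels": 8}))
# {'hidden_channels': 64}
```

Silently dropping keys can mask typos, though, so raising on unknown keys (as the reviewer's fix effectively does) is often the safer choice.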
This PR adds three new scripts to the scripts/ directory for performing hyperparameter optimization on SFNO, TFNO, and UNO models using HyperNOs and Ray Tune.
Changes
The new scripts provide a standardized framework for optimizing hyperparameters in neural operators. They utilize the HyperNOs library for model and dataset loading, while Ray Tune manages the search space and execution of trials. This addition facilitates more robust experimentation and model tuning for users of the library.
Users will need to install `hypernos` and `ray[tune]` to use these scripts.