Solves #2846: Refactor run_opts into a @dataclass (#2855)

pplantinga merged 32 commits into speechbrain:develop
Conversation
Hi @nouranali - thanks for your contribution. This is a step in the right direction, but I think there's more that's needed here. Just to warn you, the […] So to start with, I think […]
Great, will start working on your comments, how can I start testing along the way? |
@pplantinga Can you check the new updates?
I don't have a lot of time to review this right now, but you can check it yourself by running a template, e.g. […]
My only comments right now are: the […] https://docs.python.org/3/library/functions.html#classmethod
Done
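The `classmethod` docs linked above point at building the options object through an alternative constructor. A minimal sketch of what that could look like (the class name `RunOptions`, its fields, and the `from_dict` helper are illustrative assumptions, not the PR's actual code):

```python
from dataclasses import dataclass, fields
from typing import Optional


@dataclass
class RunOptions:
    """Hypothetical sketch of a run-options dataclass, for illustration only."""

    device: str = "cpu"
    precision: str = "fp32"
    debug: bool = False
    max_grad_norm: Optional[float] = None

    @classmethod
    def from_dict(cls, run_opts: Optional[dict]) -> "RunOptions":
        # Keep only keys matching declared fields, so stray CLI
        # arguments do not raise a TypeError in __init__.
        run_opts = run_opts or {}
        known = {f.name for f in fields(cls)}
        return cls(**{k: v for k, v in run_opts.items() if k in known})


opts = RunOptions.from_dict({"device": "cuda", "unknown_flag": 1})
print(opts.device)  # cuda
```

The classmethod keeps backward compatibility with call sites that still pass a plain `run_opts` dict.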
pplantinga
left a comment
At this point I'm unsure if this PR will need to change every recipe or not -- ideally we could get away without having to do this, but we'll see. Have you tried running a template yet?
I ran the checks and the linter failed. You can run it yourself, even automatically on every commit, by following the instructions here:
Co-authored-by: Peter Plantinga <[email protected]>
Yes, I ran this.
This used to touch only a few files but now touches over 500... did you run a different version of `black` on the whole repo?
What version do you use, and what files do you run it on?
If you install the […]
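The reply above is cut off, but it is presumably describing the repo's pre-commit setup. As a hedged illustration (the hook version shown here is an assumption, not taken from the SpeechBrain repo), pinning the formatter in `.pre-commit-config.yaml` keeps every contributor on the same `black` release and avoids huge reformatting diffs like the one discussed:

```yaml
# Hypothetical excerpt: pin the hook revision so everyone runs the
# same formatter release and the diff stays small.
repos:
  - repo: https://github.com/psf/black
    rev: 24.4.2   # example pin; match the version in the repo's own config
    hooks:
      - id: black
```

After `pip install pre-commit` and `pre-commit install`, the pinned hooks run automatically on every commit.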
This reverts commit caac7ab.
Adel-Moumen
left a comment
Overall LGTM! I think it's a great improvement on the core.py side.
```diff
 modules : dict[str, torch.nn.Module]
     These modules are passed to the optimizer by default if they have
     trainable parameters, and will have ``train()``/``eval()`` called on them.
-opt_class : torch.optim class
+opt_class : Optional[Type[torch.optim]]
     A torch optimizer constructor that takes only the list of
     parameters (e.g. a lambda or partial function definition). By default,
     this will be passed all modules in ``modules`` at the
     beginning of the ``fit()`` method. This behavior can be changed
     by overriding the ``configure_optimizers()`` method.
-hparams : dict
+hparams : Optional[dict]
```
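The docstring above describes `opt_class` as a constructor that takes only the list of parameters. A minimal sketch of how such a constructor is typically supplied (assuming `torch` is installed; the model and learning rate here are illustrative, not from the PR):

```python
import functools

import torch

# A constructor taking only the parameter list, as the docstring describes:
# all other optimizer arguments are frozen via functools.partial.
opt_class = functools.partial(torch.optim.Adam, lr=1e-3)

model = torch.nn.Linear(4, 2)
optimizer = opt_class(model.parameters())
print(type(optimizer).__name__)  # Adam
```

A lambda such as `lambda params: torch.optim.SGD(params, lr=0.1)` works the same way, which is why the docstring mentions both forms.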
I am wondering if we should maybe log this to the user, i.e. say that no opt_class / hparams were provided, etc., to make sure everything is transparent to the user.
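The suggestion above can be sketched as a small helper (the function name `check_run_inputs` and the messages are hypothetical, not SpeechBrain API):

```python
import logging

logger = logging.getLogger(__name__)


def check_run_inputs(opt_class=None, hparams=None):
    """Log which optional Brain arguments were omitted, so the user
    sees explicitly what behavior to expect. Returns the messages
    so callers (and tests) can inspect them."""
    messages = []
    if opt_class is None:
        messages.append("No opt_class provided; no optimizer will be created.")
    if hparams is None:
        messages.append("No hparams provided; hparams-based access is disabled.")
    for msg in messages:
        logger.info(msg)
    return messages


msgs = check_run_inputs()
```

Emitting these at `INFO` level keeps them visible in normal logs without turning a valid configuration into a warning.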
@pplantinga / @Adel-Moumen: if you both agree, please merge!
What does this PR do?
Fixes #2846
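As a hedged sketch of what this refactor enables (the field names are illustrative; the real definition lives in `speechbrain/core.py`), compare dict-style `run_opts` with a `@dataclass`:

```python
from dataclasses import asdict, dataclass, replace


@dataclass(frozen=True)
class RunOptions:
    # Hypothetical fields, for illustration only.
    device: str = "cpu"
    debug: bool = False


# Before: untyped dict access; a typo in the key fails only at lookup time.
legacy_opts = {"device": "cuda", "debug": False}
assert legacy_opts["device"] == "cuda"

# After: attribute access with defaults, type hints, and immutability.
opts = RunOptions(device="cuda")
debug_opts = replace(opts, debug=True)  # derive a variant without mutation
assert debug_opts.device == "cuda" and debug_opts.debug

# Code that still expects a dict can convert explicitly.
assert asdict(opts) == {"device": "cuda", "debug": False}
```

`frozen=True` is an extra assumption here; it makes accidental mutation of run options an error rather than a silent state change.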