
[tune](deps): Bump pytorch-lightning from 1.0.3 to 1.2.0 in /python/requirements #20

Closed
dependabot[bot] wants to merge 1 commit into master from dependabot/pip/python/requirements/pytorch-lightning-1.2.0

Conversation


@dependabot dependabot bot commented on behalf of github Feb 20, 2021

Bumps pytorch-lightning from 1.0.3 to 1.2.0.
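In a pinned requirements file, this bump amounts to a one-line version change. A minimal sketch, assuming /python/requirements pins an exact version (the actual pin style in that file is not shown here):

```diff
-pytorch-lightning==1.0.3
+pytorch-lightning==1.2.0
```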

Release notes

Sourced from pytorch-lightning's releases.

Pruning & Quantization & SWA

[1.2.0] - 2021-02-18

Added

  • Added DataType, AverageMethod and MDMCAverageMethod enum in metrics (#5657)
  • Added support for summarized model total params size in megabytes (#5590)
  • Added support for multiple train loaders (#1959)
  • Added Accuracy metric now generalizes to Top-k accuracy for (multi-dimensional) multi-class inputs using the top_k parameter (#4838)
  • Added Accuracy metric now enables the computation of subset accuracy for multi-label or multi-dimensional multi-class inputs with the subset_accuracy parameter (#4838)
  • Added HammingDistance metric to compute the hamming distance (loss) (#4838)
  • Added max_fpr parameter to auroc metric for computing partial auroc metric (#3790)
  • Added StatScores metric to compute the number of true positives, false positives, true negatives and false negatives (#4839)
  • Added R2Score metric (#5241)
  • Added LambdaCallback (#5347)
  • Added BackboneLambdaFinetuningCallback (#5377)
  • Accelerator all_gather supports collection (#5221)
  • Added image_gradients functional metric to compute the image gradients of a given input image. (#5056)
  • Added MetricCollection (#4318)
  • Added .clone() method to metrics (#4318)
  • Added IoU class interface (#4704)
  • Added support for tying weights after moving a model to TPU via the on_post_move_to_device hook
  • Added missing val/test hooks in LightningModule (#5467)
  • The Recall and Precision metrics (and their functional counterparts recall and precision) can now be generalized to Recall@K and Precision@K with the use of top_k parameter (#4842)
  • Added ModelPruning Callback (#5618, #5825, #6045)
  • Added PyTorchProfiler (#5560)
  • Added compositional metrics (#5464)
  • Added Trainer method predict(...) for high performance predictions (#5579)
  • Added on_before_batch_transfer and on_after_batch_transfer data hooks (#3671)
  • Added AUC/AUROC class interface (#5479)
  • Added PredictLoop object (#5752)
  • Added QuantizationAwareTraining callback (#5706, #6040)
  • Added LightningModule.configure_callbacks to enable the definition of model-specific callbacks (#5621)
  • Added dim to PSNR metric for mean-squared-error reduction (#5957)
  • Added proximal policy optimization template to pl_examples (#5394)
  • Added log_graph to CometLogger (#5295)
  • Added possibility for nested loaders (#5404)
  • Added sync_step to Wandb logger (#5351)
  • Added StochasticWeightAveraging callback (#5640)
  • Added LightningDataModule.from_datasets(...) (#5133)
  • Added PL_TORCH_DISTRIBUTED_BACKEND env variable to select backend (#5981)
  • Added Trainer flag to activate Stochastic Weight Averaging (SWA) Trainer(stochastic_weight_avg=True) (#6038)
  • Added DeepSpeed integration (#5954, #6042)

Changed

  • The stat_scores metric now calculates stat scores over all classes and gains new parameters, in line with the new StatScores metric (#4839)
  • Changed computer_vision_fine_tunning example to use BackboneLambdaFinetuningCallback (#5377)
  • Changed automatic casting for LoggerConnector metrics (#5218)
  • Changed iou [func] to allow float input (#4704)

... (truncated)

Changelog

Sourced from pytorch-lightning's changelog (same entries as the release notes above).

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
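
For example, to stop Dependabot from re-opening 1.x bumps of this dependency, a maintainer would leave a PR comment containing only:

```
@dependabot ignore this major version
```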

@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Feb 20, 2021

dependabot bot commented on behalf of github Feb 25, 2021

Superseded by #22.

@dependabot dependabot bot closed this Feb 25, 2021
@dependabot dependabot bot deleted the dependabot/pip/python/requirements/pytorch-lightning-1.2.0 branch February 25, 2021 03:51