torchood: A User-Friendly Wrapper for Torch, Making Neural Network Training Effortless
Jumpstart your deep learning endeavors and rapidly prototype custom projects with torchood 🚀⚡🔥
```
├───torchood
│   │   dataset.py
│   │   trainer.py
│   │   __init__.py
│   │
│   ├───models
│   │   │   common.py
│   │   │   custom_resnet.py
│   │   │   mini_resnet.py
│   │   │   resnet.py
│   │   │   __init__.py
│   │
│   ├───utils
│   │       gradcam.py
│   │       misc.py
│   │       plotting.py
│   │       __init__.py
```

Installation

```bash
# clone project
git clone https://github.com/anantgupta129/TorcHood.git
cd TorcHood

# [OPTIONAL] create a conda environment
conda create -n myenv python=3.10
conda activate myenv

# install PyTorch following the official instructions
# https://pytorch.org/get-started/

# build and install the package
python setup.py sdist
pip install .
```
LR Finder

```python
import torch
from torch.nn import CrossEntropyLoss

from torchood.utils.misc import find_lr

optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
kwargs = {"end_lr": 10, "num_iter": 200, "step_mode": "exp"}
find_lr(model, device, optimizer, criterion=CrossEntropyLoss(), dataloader=train_loader, **kwargs)
```
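The snippet above assumes a `model`, `device`, and `train_loader` are already defined. A minimal sketch of those prerequisites — the toy model and random stand-in data here are illustrative only, not part of torchood:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# toy CIFAR10-shaped model: 3x32x32 inputs -> 10 classes (placeholder for a real network)
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(3 * 32 * 32, 10),
).to(device)

# random stand-in data with CIFAR10 shapes; swap in a real dataloader in practice
images = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 10, (64,))
train_loader = DataLoader(TensorDataset(images, labels), batch_size=16)
```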
Grad-CAM

```python
from torchood.utils.gradcam import plot_cam_on_image

mean = (0.49139968, 0.48215841, 0.44653091)
std = (0.24703223, 0.24348513, 0.26158784)
plot_cam_on_image(model, [model.layer4[1]], imgs_list, {"mean": mean, "std": std})
```
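Grad-CAM heatmaps are typically overlaid on the un-normalized image, which is presumably why the CIFAR10 mean/std are passed in above. A standalone sketch of that de-normalization step (not torchood's internal code):

```python
import torch

mean = (0.49139968, 0.48215841, 0.44653091)
std = (0.24703223, 0.24348513, 0.26158784)

def denormalize(img: torch.Tensor) -> torch.Tensor:
    """Undo per-channel normalization on a (3, H, W) tensor."""
    m = torch.tensor(mean).view(3, 1, 1)
    s = torch.tensor(std).view(3, 1, 1)
    return img * s + m

normalized = torch.randn(3, 32, 32)  # stand-in for a normalized CIFAR10 image
restored = denormalize(normalized)   # values back on the original [0, 1]-ish scale
```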
Training & Training History

```python
from torchood.trainer import Trainer

trainer = Trainer(model, device, optimizer, scheduler)

for epoch in range(1, num_epochs + 1):
    print(f"Epoch {epoch}")
    trainer.train(train_loader)
    trainer.evaluate(test_loader)

trainer.plot_history()
```
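`Trainer` takes an `optimizer` and `scheduler`; a common pairing with an LR finder is PyTorch's `OneCycleLR`, with `max_lr` set near the value the finder suggests. A sketch with placeholder values (the model and numbers below are illustrative, not torchood defaults):

```python
import torch

model = torch.nn.Linear(10, 2)  # stand-in model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

num_epochs = 20
steps_per_epoch = 100  # use len(train_loader) in practice

# max_lr would come from the LR finder; 0.1 is a placeholder
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=0.1,
    epochs=num_epochs,
    steps_per_epoch=steps_per_epoch,
)
```

`scheduler.step()` is then called once per batch, ramping the learning rate up and back down over the full run.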
- Ships with the CIFAR10 dataset as a sample; support for additional datasets is planned.
- Ships with sample models (see `torchood/models`).
Contributions are welcome! Feel free to submit a pull request.