PyTorch, No Tears

Foundations

  • 1. Environment
  • 2. Tensor
  • 3. Autograd

Data

  • 1. Data
  • 2. Datasets and Caching
  • 3. Data Pipelines at Scale
  • 4. Transformation

Training

  • 1. Loss Functions
  • 2. Optimizer
  • 3. Scheduler
  • 4. Learning
  • 5. Training Recipe
  • 6. Early Stopping
  • 7. Class Imbalance
  • 8. Debugging Training
  • 9. Ragged Sequences
  • 10. Multi-Task Learning

Performance and Scaling

  • 1. Training Performance
  • 2. Memory Optimization
  • 3. Profiling
  • 4. Compiled Models
  • 5. Distributed Training

Models, Persistence, and Deployment

  • 1. Models and Convolutional Neural Networks
  • 2. Saving and Loading Models
  • 3. Checkpoint and Resume
  • 4. Model Zoo
  • 5. Parameter-Efficient Finetuning
  • 6. Fine-Tuning Patterns
  • 7. Custom Autograd
  • 8. Quantization
  • 9. Export and Deploy
  • 10. Serving API
  • 11. Inference Batching

Evaluation and Reliability

  • 1. Calibration and Metrics
  • 2. Experiment Tracking
  • 3. Uncertainty
  • 4. Reproducibility
  • 5. Testing PyTorch Code

Text and Language

  • 1. Text Data
  • 2. Embeddings
  • 3. Sequence Classification
  • 4. Transformer Language Model
  • 5. Recurrent Neural Networks

Graphs

  • 1. Graph Data
  • 2. Message Passing
  • 3. Graph Convolution
  • 4. Graph Classification
  • 5. Heterogeneous Graphs
  • 6. PyTorch Geometric

Forecasting and Recommendation

  • 1. Time Series
  • 2. Recommendation Systems

Vision and Generative Models

  • 1. Generative Adversarial Network
  • 2. Neural Transfer
  • 3. Object Detection
  • 4. TensorBoard


© Copyright 2019, One-Off Coder. Last updated on Apr 10, 2026, 11:42:18 PM.