PyTorch Modules and Classes
TL;DR
`torch.nn`

- `Module`: creates a callable which behaves like a function, but can also contain state (such as neural net layer weights). It knows what `Parameter`(s) it contains and can zero all their gradients, loop through them for weight updates, etc.
- `Parameter`: a wrapper for a tensor that tells a `Module` that it has weights that need updating during backprop. Only tensors with the `requires_grad` attribute set are updated.
- `functional`: a module (usually imported into the `F` namespace by convention) which contains activation functions, loss functions, etc., as well as non-stateful versions of layers such as convolutional and linear layers.
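
As a minimal sketch of how these three fit together (the class name `TinyLinear`, the shapes, and the init values are made up purely for illustration), a hand-rolled layer registers its weights as `Parameter`s and reaches into `functional` for the stateless pieces:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyLinear(nn.Module):
    """Hand-rolled linear layer + ReLU, just to show the moving parts."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # nn.Parameter wraps a tensor so the Module registers it as a weight
        # (requires_grad is True by default, so backprop will populate .grad)
        self.weight = nn.Parameter(torch.randn(in_features, out_features) * 0.1)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # F.relu is a stateless function from torch.nn.functional
        return F.relu(x @ self.weight + self.bias)

model = TinyLinear(4, 3)
out = model(torch.randn(2, 4))                 # a Module is callable, like a function
print([p.shape for p in model.parameters()])   # the Module knows its Parameters
model.zero_grad()                              # zero every Parameter's gradient in one call
```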

`torch.optim`: Contains optimizers such as `SGD`, which update the weights of `Parameter` during the backward step.
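
A sketch of one optimizer step, assuming a throwaway `nn.Linear` model; the learning rate and the loss are stand-ins chosen only for illustration:

```python
import torch
from torch import nn, optim

model = nn.Linear(4, 3)                      # any Module with Parameters would do
opt = optim.SGD(model.parameters(), lr=0.1)  # lr chosen only for illustration

pred = model(torch.randn(2, 4))
loss = pred.pow(2).mean()                    # stand-in loss, purely illustrative

loss.backward()   # compute .grad for every Parameter
opt.step()        # apply the SGD update to those Parameters
opt.zero_grad()   # reset gradients before the next batch
```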

`Dataset`: An abstract interface for objects with a `__len__` and a `__getitem__`, including classes provided with PyTorch such as `TensorDataset`.
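
A quick sketch, using random stand-in data; `SquaresDataset` is a hypothetical class included only to show the two required methods:

```python
import torch
from torch.utils.data import Dataset, TensorDataset

# TensorDataset pairs tensors up along their first dimension
x = torch.randn(100, 4)
y = torch.randint(0, 3, (100,))
train_ds = TensorDataset(x, y)
print(len(train_ds))   # __len__     -> 100
print(train_ds[0])     # __getitem__ -> (x[0], y[0])

# Any object exposing __len__ and __getitem__ satisfies the same interface
class SquaresDataset(Dataset):
    def __len__(self):
        return 10
    def __getitem__(self, idx):
        return idx, idx ** 2
```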

`DataLoader`: Takes any `Dataset` and creates an iterator which returns batches of data.
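
A sketch with stand-in tensors and an assumed batch size of 32:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

train_ds = TensorDataset(torch.randn(100, 4), torch.randint(0, 3, (100,)))
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

for xb, yb in train_dl:          # each iteration yields one shuffled batch
    print(xb.shape, yb.shape)    # e.g. torch.Size([32, 4]) torch.Size([32])
    break                        # (the final batch may be smaller than 32)
```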
Notebook
View in nbviewer