PyTorch Modules and Classes
TL;DR
torch.nn
- Module: creates a callable which behaves like a function, but can also contain state (such as neural net layer weights). It knows what Parameter(s) it contains and can zero all their gradients, loop through them for weight updates, etc.
- Parameter: a wrapper for a tensor that tells a Module that it has weights that need updating during backprop. Only tensors with the requires_grad attribute set are updated.
- functional: a module (usually imported into the F namespace by convention) which contains activation functions, loss functions, etc., as well as non-stateful versions of layers such as convolutional and linear layers.
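A minimal sketch of how these pieces fit together (the class name and sizes here are made up for illustration): a Module subclass that holds its weights as Parameter objects, so PyTorch can find them via .parameters(), and uses a stateless function from torch.nn.functional in forward.

```python
import torch
import torch.nn.functional as F
from torch import nn

class TinyLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        # nn.Parameter wraps a tensor and registers it with the Module,
        # so it appears in .parameters() and has requires_grad=True.
        self.weight = nn.Parameter(torch.randn(in_features, out_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # F.relu is a non-stateful function from torch.nn.functional
        return F.relu(x @ self.weight + self.bias)

model = TinyLinear(3, 2)
# The Module knows its Parameters: 3*2 weights + 2 biases = 8 values
print(sum(p.numel() for p in model.parameters()))
out = model(torch.randn(4, 3))
print(out.shape)
```

Calling `model(...)` works because nn.Module makes the instance callable and dispatches to `forward`.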
torch.optim: contains optimizers such as SGD, which update the weights of each Parameter during the backward step.
- Dataset: an abstract interface of objects with a __len__ and a __getitem__, including classes provided with PyTorch such as TensorDataset.
- DataLoader: takes any Dataset and creates an iterator which returns batches of data.
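Putting the data-side pieces together with an optimizer, here is a hedged sketch of a full training loop (the toy data, model shape, and learning rate are illustrative assumptions, not from the original):

```python
import torch
from torch import nn, optim
from torch.utils.data import TensorDataset, DataLoader

# Toy regression data (illustrative only)
X = torch.randn(100, 3)
y = torch.randn(100, 1)

ds = TensorDataset(X, y)            # a Dataset: has __len__ and __getitem__
dl = DataLoader(ds, batch_size=25)  # iterates over the Dataset in batches

model = nn.Linear(3, 1)
opt = optim.SGD(model.parameters(), lr=0.1)  # SGD updates every Parameter
loss_fn = nn.MSELoss()

for xb, yb in dl:
    loss = loss_fn(model(xb), yb)
    loss.backward()   # compute gradients for each Parameter
    opt.step()        # optimizer applies the SGD update
    opt.zero_grad()   # zero gradients before the next batch
```

Because the Module tracks its own Parameters, the optimizer only needs `model.parameters()`; there is no manual bookkeeping of which tensors to update.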
Notebook: View in nbviewer