Deep Learning

PyTorch Modules and Classes

Important PyTorch modules and classes for creating and training neural networks.

2020-09-10

Softmax and Its Derivative

Softmax — We use the softmax activation function to predict the probabilities assigned to $n$ classes. For example, the probability of assigning an input sample to the $j$-th class is: $$ p_j = \operatorname{softmax}(z_j) = \frac{e^{z_j}}{\sum_{k=1}^n e^{z_k}} $$ Furthermore, we use one-hot encoding to represent the ground truth $y$, which means $$ \sum_{k=1}^n y_k = 1 $$ Loss function (cross-entropy): $$ \begin{aligned} L &= -\sum_{k=1}^n y_k \log(p_k) \\ &= -\left(y_j \log(p_j) + \sum_{k \neq j} y_k \log(p_k)\right) \end{aligned} $$ Gradient w.r.t. …
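The excerpt above cuts off at the gradient. For reference, the standard result of that derivation — the gradient of the cross-entropy loss with respect to the logits $z_j$, using $\partial p_k / \partial z_j = p_k(\delta_{kj} - p_j)$ and the one-hot constraint $\sum_k y_k = 1$ — is:

```latex
\frac{\partial L}{\partial z_j}
  = -\sum_{k=1}^n \frac{y_k}{p_k} \cdot p_k(\delta_{kj} - p_j)
  = -y_j + p_j \sum_{k=1}^n y_k
  = p_j - y_j
```

This is why softmax and cross-entropy are usually fused (as in PyTorch's `nn.CrossEntropyLoss`): the combined gradient is simply the predicted probability minus the one-hot target.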

2020-09-08

‼️ Issues & Gotchas

Issues and gotchas which may occur in practice.

2020-09-07

📈 Training

Practical tips and tools for training of neural networks with PyTorch.

2020-09-07

PyTorch

PyTorch is an open source machine learning framework that accelerates the path from research prototyping to production deployment.

2020-09-07

🧾 PyTorch Recipes

Useful recipes that make use of specific PyTorch features.

2020-09-07

📚 PyTorch Resources

Useful PyTorch resources.

2020-09-07

Build and Train a Neural Network

A simple yet typical workflow for building and training a neural network using PyTorch.

2020-09-07

Autograd

PyTorch's built-in differentiation engine, which supports automatic computation of gradients for any computational graph.
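A minimal sketch of how autograd works in practice (assuming a standard `torch` install): mark a tensor with `requires_grad=True`, build a computation from it, and call `.backward()` to populate its `.grad` field.

```python
import torch

# Scalar example: L = (w * x - y)^2, differentiate w.r.t. w.
x = torch.tensor(2.0)
y = torch.tensor(1.0)
w = torch.tensor(3.0, requires_grad=True)  # track operations on w

loss = (w * x - y) ** 2
loss.backward()  # reverse-mode autodiff fills in w.grad

# Analytically: dL/dw = 2 * (w*x - y) * x = 2 * 5 * 2 = 20
print(w.grad)  # tensor(20.)
```

The same mechanism scales to any computational graph, which is what makes training loops as simple as `loss.backward()` followed by an optimizer step.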

2020-09-07

Tensor

A specialized data structure that is very similar to arrays and matrices. In PyTorch, tensors are used to encode the inputs and outputs of a model, as well as the model’s parameters.
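A short illustration of the array-like behavior described above (assuming `torch` and `numpy` are available): tensors follow the same elementwise and broadcasting semantics as NumPy arrays, and `torch.from_numpy` bridges between the two.

```python
import numpy as np
import torch

# Construct a tensor directly, and one from a NumPy array.
a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
b = torch.from_numpy(np.ones((2, 2), dtype=np.float32))

c = a + b  # elementwise addition, NumPy-style broadcasting rules

print(c.shape)    # torch.Size([2, 2])
print(c.tolist()) # [[2.0, 3.0], [4.0, 5.0]]
```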

2020-09-07