Notebooks

Many algorithmic and mathematical techniques have been introduced to better optimize neural networks. This is a compilation of articles and tutorials that dig into some of these finer details.

2020-01-02

Data Augmentation

Data augmentation is a strategy used in machine learning to increase the diversity and amount of training data without collecting new data. It involves creating modified versions of existing data using techniques like rotation, scaling, flipping, cropping, and brightness or color adjustments.
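
To make this concrete, here is a minimal sketch of a few such transforms using NumPy alone; the specific parameters (flip probability, crop ratio, brightness range) are illustrative assumptions, not values from the article.

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Randomly flip, crop, and brighten an HxWxC image with values in [0, 1]."""
    # Horizontal flip with probability 0.5 (assumed; tune per task).
    if rng.random() < 0.5:
        image = image[:, ::-1, :]
    # Random crop to 90% of the original height and width (assumed ratio).
    h, w, _ = image.shape
    ch, cw = int(h * 0.9), int(w * 0.9)
    top = rng.integers(0, h - ch + 1)
    left = rng.integers(0, w - cw + 1)
    image = image[top:top + ch, left:left + cw, :]
    # Brightness shift drawn from [-0.1, 0.1], clipped back to a valid range.
    return np.clip(image + rng.uniform(-0.1, 0.1), 0.0, 1.0)

rng = np.random.default_rng(0)
img = rng.random((32, 32, 3))      # stand-in for a real training image
print(augment(img, rng).shape)     # (28, 28, 3) after the 90% crop
```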

Read More
2020-01-02

Quantization

Quantization reduces the precision of a model's weights and biases to decrease its computational requirements. It involves converting full-precision 32-bit weights into lower-precision formats. Typically 16-bit or 8-bit quantization is used, but research on ternary and binary networks has shown promise in resource-constrained environments.
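
As a rough illustration, the sketch below applies symmetric post-training 8-bit quantization to a weight tensor; scaling by the maximum absolute value is one common convention, assumed here rather than taken from the article.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float32 weights to int8 values plus a scale for dequantization."""
    scale = float(np.max(np.abs(w))) / 127.0     # symmetric range that fits int8
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(w)
# Round-trip error is bounded by roughly scale / 2 per weight.
print(np.max(np.abs(w - dequantize(q, scale))))
```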

Read More
2020-01-02

Weight Pruning

Pruning reduces the complexity and size of a model by removing weights or neurons. Pruning methods typically select weights to remove according to importance heuristics such as magnitude or gradient saliency, although even random pruning has been shown to produce accurate models at significant levels of sparsity. While pruning can cause some loss in accuracy, this can be mitigated by fine-tuning the pruned model on the original dataset.
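
Here is a minimal sketch of magnitude pruning, the importance heuristic named above, in NumPy; the 50% sparsity level is an arbitrary illustration.

```python
import numpy as np

def magnitude_prune(w: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the fraction `sparsity` of weights with the smallest magnitude."""
    k = int(w.size * sparsity)                   # number of weights to remove
    threshold = np.sort(np.abs(w), axis=None)[k] # magnitude cutoff
    return np.where(np.abs(w) < threshold, 0.0, w)

w = np.random.randn(8, 8)
pruned = magnitude_prune(w, sparsity=0.5)
print((pruned == 0).mean())                      # roughly 0.5 of weights are zero
```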

Read More
2020-01-02

Learning Rate Schedules

These strategies adjust the learning rate, and sometimes other optimizer parameters, over the course of training. Generally, large learning rates are used in the early stages, where large steps improve efficiency, and are then slowly decayed to small values to encourage fine-grained convergence.
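
One common shape for this "large early, small late" pattern is linear warmup followed by cosine decay; the sketch below is a standalone illustration with assumed constants, not a schedule prescribed by the article.

```python
import math

def lr_at(step: int, total_steps: int, base_lr: float = 0.1,
          warmup_steps: int = 100) -> float:
    """Linear warmup to base_lr, then cosine decay toward zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

for step in (0, 100, 5_000, 10_000):
    print(step, round(lr_at(step, total_steps=10_000), 4))
```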

Read More
2020-01-02

Convolutions

Convolutions are mathematical operations that process input data using filters, or kernels. The operation slides a filter over the input and computes an element-wise product between the filter weights and each input window, summing the result. In image data these filters often learn to detect edges, shapes, or textures, and because the same filter is applied at every position, a pattern is detected wherever it appears in the input.
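
The sketch below spells out that sliding-window computation for a single 2D filter on a single-channel input (strictly speaking cross-correlation, as in most deep learning frameworks); padding and stride are omitted for clarity.

```python
import numpy as np

def conv2d(x: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Slide filter k over input x, taking an element-wise product and sum."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

x = np.arange(25, dtype=float).reshape(5, 5)
sobel_x = np.array([[-1., 0., 1.],
                    [-2., 0., 2.],
                    [-1., 0., 1.]])   # classic horizontal edge detector
print(conv2d(x, sobel_x))             # constant 8s: a uniform horizontal gradient
```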

Read More
2020-01-01

Activation Functions

Activation functions play a crucial role in determining the output of neurons in neural networks. They introduce non-linearity into the network, enabling it to model complex relationships between inputs and outputs.
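
For reference, here are three standard activation functions applied to a few arbitrary sample inputs; the definitions are textbook.

```python
import numpy as np

def relu(x):    return np.maximum(0.0, x)          # zeroes negatives
def sigmoid(x): return 1.0 / (1.0 + np.exp(-x))    # squashes to (0, 1)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print("relu:   ", relu(x))
print("sigmoid:", sigmoid(x))
print("tanh:   ", np.tanh(x))                      # squashes to (-1, 1)
```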

Read More

Hero images generated with neural networks via Midjourney.