This file gives an overview of layer types and how they can be combined. Model notes should explain the abstract idea of what the model should be used for and why. Furthermore, they should explain the architecture by linking to the individual layers. Finally, they should explain why, from an equivariance/scaling standpoint, the model deals with the curse of dimensionality.

Activation Functions (activation layers)

Theory

Activation Functions

Specific layers

ReLU
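As a minimal sketch of the idea, ReLU simply zeroes out negative values and passes positive values through unchanged (the function name below is illustrative, not from a specific library):

```python
import numpy as np

def relu(x):
    # Keep positive values, replace negatives with 0
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))  # → [0.  0.  0.  1.5]
```

Because the positive branch is the identity, the gradient there is exactly 1, which helps avoid vanishing gradients in deep networks.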

Normalization Layers

Centered and normalized data remains important not only at the input, but also between layers.

[figure: normalisation comparisons.png — comparison of normalization methods]

Goal:

A model deals best with data that is centered and normalized (ref). This holds not only for the input but also between layers, since each layer can be considered its own model taking input from the previous layer.
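The per-feature centering and scaling described above can be sketched as follows (a minimal NumPy illustration; the helper name `standardize` and the epsilon for numerical stability are my own choices, not from the source):

```python
import numpy as np

def standardize(x, eps=1e-5):
    # Center each feature (column) to zero mean and scale to unit variance.
    # eps guards against division by zero for constant features.
    mean = x.mean(axis=0)
    std = x.std(axis=0)
    return (x - mean) / (std + eps)

# Two features on very different scales
data = np.array([[1.0, 100.0],
                 [2.0, 200.0],
                 [3.0, 300.0]])
z = standardize(data)
# After standardization, both columns have (approximately)
# zero mean and unit variance.
```

Normalization layers (e.g. batch norm or layer norm) apply essentially this operation to the activations between layers, typically followed by learnable scale and shift parameters.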

Methods

Convolutional Layers