Description
Multi-layer Perceptrons (MLPs) are standard neural networks built from fully connected layers, in which every input unit is connected to every output unit. We rarely use this kind of layer when working with…
Summary
- ?
- It is interesting to note that this does not require any change to the convolutional part of the model, since convolutions are independent of the spatial size of the input image.
- As we can see, a stack of convolutional layers produces a translation-equivariant mapping.
- Then, ϕ(0,0)(p) is nothing other than the “impulse response”: the output of our operator when applied to a Dirac delta at the origin, which we defined as h.
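The point that convolutions are independent of the input size can be illustrated with a minimal NumPy sketch (the kernel values and input lengths here are invented for the example): the same convolution weights apply to inputs of any size, whereas a fully connected layer would need a new weight matrix per size.

```python
import numpy as np

# Illustrative 1D "convolutional layer": one fixed kernel, reused for any input length.
kernel = np.array([0.25, 0.5, 0.25])

def conv_layer(x, h=kernel):
    # Slide the same weights over the input; nothing here depends on len(x).
    return np.convolve(x, h, mode="valid")

small = conv_layer(np.ones(10))    # 10 inputs  -> 8 outputs
large = conv_layer(np.ones(100))   # 100 inputs -> 98 outputs
```

Only the output size changes with the input size; the learned parameters stay the same.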
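The equivariance claim can also be checked numerically. Below is a sketch using a circular 1D convolution (chosen here to sidestep boundary effects; the signal and filter values are invented): shifting the input and then convolving gives the same result as convolving and then shifting the output.

```python
import numpy as np

def circ_conv(x, h):
    # Circular convolution: y[i] = sum_k h[k] * x[(i - k) mod N].
    N = len(x)
    return np.array([sum(h[k] * x[(i - k) % N] for k in range(len(h)))
                     for i in range(N)])

x = np.array([1.0, 2.0, 0.0, -1.0, 3.0, 0.5])
h = np.array([0.5, 0.25, 0.25])
shift = 2

# Shift first, then convolve...
a = circ_conv(np.roll(x, shift), h)
# ...versus convolve first, then shift: identical, i.e. the map is equivariant.
b = np.roll(circ_conv(x, h), shift)
```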
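Finally, the impulse-response identity in the last bullet can be verified directly: convolving a Dirac delta at the origin with a filter h returns h itself (a NumPy sketch in 1D; the filter values are invented).

```python
import numpy as np

h = np.array([0.5, 0.25, 0.25])   # the filter ("h" in the text)

delta = np.zeros(8)
delta[0] = 1.0                    # Dirac impulse at the origin

# (delta * h)[n] = sum_m delta[m] h[n - m] = h[n]: the output is h, zero-padded.
y = np.convolve(delta, h)
```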