ResNets
ResNet stands for Residual Network. ResNets were introduced by He et al. in 2015.
They introduced the concept of skip-connections, which feed the activation of one layer directly to a layer deeper in the network. This, in turn, allows us to train significantly deeper neural networks.
ResNets are built out of residual blocks:
Basically, the activation of one layer skips over the intermediate layers and is added to the output of a deeper layer just before that layer's ReLU is applied.
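As a minimal sketch of this idea (assuming a PyTorch-style implementation of the "basic" two-convolution block; the class name `ResidualBlock` and the single `channels` parameter are illustrative, not from the original paper):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """A basic residual block: two 3x3 convolutions plus a skip-connection."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        identity = x                           # save the input (the skip-connection)
        out = F.relu(self.bn1(self.conv1(x)))  # first conv layer with its own ReLU
        out = self.bn2(self.conv2(out))        # second conv layer, no activation yet
        out = out + identity                   # add the skipped activation *before* ReLU
        return F.relu(out)
```

Note that the addition requires the skip and the block output to have the same shape; when the dimensions change between layers, the original paper applies a 1×1 convolution on the shortcut to match them.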
The following image shows a regular "plain" network vs. a residual network: