Residual neural networks are an exciting area of deep learning research — Init.ai Decoded

Residual neural networks are an exciting area of #deeplearning research. 1000 layers! #AI

  • The paper Deep Residual Networks with Exponential Linear Unit, by Shah et al., combines exponential linear units (ELUs), an alternative to rectified linear units, with ResNets and shows improved performance even without batch normalization.
  • ResNets will be important to enable complex models of the world.
  • ResNets tweak the mathematical formulation of a deep neural network: each block adds an identity skip connection, so it learns a residual function on top of its input rather than a full transformation.
  • The paper enables practical training of neural networks with thousands of layers.
  • I am highlighting several recent papers that show the potential of residual neural networks.




The identity function is simply id(x) = x: given an input x, it returns the same value x as output.
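The identity skip connection is what distinguishes a residual block: the block's layers learn a residual F(x), and the identity of the input is added back, giving an output of x + F(x). A minimal NumPy sketch (the two-layer F with an ELU nonlinearity is an illustrative assumption, not the exact architecture from the paper):

```python
import numpy as np

def elu(x, alpha=1.0):
    # Exponential linear unit: identity for x > 0, smooth negative saturation otherwise.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def residual_block(x, w1, w2):
    # The layers compute a residual F(x); the identity skip connection adds x back,
    # so the block outputs x + F(x) instead of replacing x entirely.
    f = elu(x @ w1) @ w2
    return x + f

# With zero weights, F(x) = 0 and the block reduces to the identity function id(x) = x.
x = np.ones((1, 4))
w = np.zeros((4, 4))
assert np.allclose(residual_block(x, w, w), x)
```

Because the block defaults to the identity when the residual is zero, stacking many such blocks leaves gradients a direct path through the skip connections, which is what makes networks with hundreds or thousands of layers trainable.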

