Machine Learning is Fun! Part 3: Deep Learning and Convolutional Neural Networks — Medium

  • More data makes the problem harder for our neural network to solve, but we can compensate by making the network bigger so it can learn more complicated patterns.
  • We need to be smarter about how we feed images into our neural network.
  • But now we want to process images with our neural network.
  • Our program can now recognize birds in images!
  • Step 1: Break the image into overlapping image tiles
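
Step 1 can be sketched in a few lines of NumPy. This is a minimal illustration of sliding a fixed-size window across an image with a stride smaller than the window, so neighboring tiles overlap; the function name and parameters are my own, not from the article.

```python
import numpy as np

def make_tiles(image, tile_size, stride):
    """Return a list of overlapping tile_size x tile_size tiles.

    A stride smaller than tile_size makes adjacent tiles overlap,
    as described in Step 1.
    """
    h, w = image.shape[:2]
    tiles = []
    for top in range(0, h - tile_size + 1, stride):
        for left in range(0, w - tile_size + 1, stride):
            tiles.append(image[top:top + tile_size, left:left + tile_size])
    return tiles

image = np.arange(64).reshape(8, 8)        # toy 8x8 "image"
tiles = make_tiles(image, tile_size=4, stride=2)
print(len(tiles))  # 3 window positions per axis -> 9 tiles
```

Each tile is then fed through the same small neural network, which is what makes the convolution step translation-aware.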

To read the full article, click here.


@MikeTamir: “Machine Learning is Fun! Part 3: Deep Learning and Convolutional Neural Networks — Medium”


Update: Machine Learning is Fun! Part 4 is now available!


Residual neural networks are an exciting area of deep learning research — Init.ai Decoded

Residual neural networks are an exciting area of #deeplearning research. 1000 layers! #AI

  • The paper Deep Residual Networks with Exponential Linear Unit, by Shah et al., combines exponential linear units, an alternative to rectified linear units, with ResNets to show improved performance, even without batch normalization.
  • ResNets will be important to enable complex models of the world.
  • ResNets tweak the mathematical formula for a deep neural network.
  • The paper enables practical training of neural networks with thousands of layers.
  • I am highlighting several recent papers that show the potential of residual neural networks.
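
The "tweak" mentioned above is the residual connection: instead of computing y = F(x), a block computes y = F(x) + x, so each layer only has to learn a correction to its input. Here is a hedged sketch of one such block; the two-layer form of F, the ReLU placement, and the weight shapes are illustrative choices, not taken from the papers.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def residual_block(x, W1, W2):
    """One residual block: compute F(x), then add the identity shortcut."""
    f = relu(x @ W1) @ W2   # the residual function F(x)
    return relu(f + x)      # "+ x" is the skip connection

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W1 = rng.standard_normal((4, 4)) * 0.1
W2 = rng.standard_normal((4, 4)) * 0.1
y = residual_block(x, W1, W2)
print(y.shape)  # (4,)
```

Note that if the weights drive F(x) toward zero, the block degenerates to relu(x), which is why very deep stacks of such blocks remain trainable.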

To read the full article, click here.


@StartupYou: “Residual neural networks are an exciting area of #deeplearning research. 1000 layers! #AI”


The identity function is simply id(x) = x; given an input x, it returns the same value x as output.
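
As a trivial sketch in Python (my illustration, not from the article):

```python
# The identity function described above: it returns its input unchanged.
def identity(x):
    return x

# In a residual network, the skip connection applies exactly this function,
# so a block computes F(x) + identity(x) rather than F(x) alone.
print(identity(42))
```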

