
Visualizing the output of activation functions of CNNs: Comments #DeepLearning #ML #AI

  • From Andrej Karpathy’s course cs231n: CNNs for Visual Recognition.

    All the plots were generated with one full forward pass across all the layers of the network, with the same activation function used in every layer.

    The network consists of multiple layers, each layer having the same number of units.

  • Random data points for the training examples are generated from a univariate “normal” (Gaussian) distribution with mean 0 and variance 1.
  • Weights for each layer were generated from the same distribution as the data points, but their scale was later varied to obtain the different plots; a sketch of this setup follows the list.
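
Below is a minimal Python/NumPy sketch of that setup, in the spirit of the cs231n demo: unit-Gaussian data is pushed through a stack of tanh layers whose weights are drawn from the same unit Gaussian and then scaled. The depth (10 layers), width (500 units), and weight scale (0.01) are illustrative assumptions, not values taken from the repository.

    import numpy as np

    num_layers = 10      # assumed depth, not taken from the repo
    width = 500          # assumed units per layer
    weight_scale = 0.01  # one of the scales varied between plots

    # Training examples from a univariate Gaussian with mean 0 and variance 1.
    h = np.random.randn(1000, width)

    for layer in range(num_layers):
        # Weights drawn from the same unit Gaussian, then scaled down.
        W = np.random.randn(width, width) * weight_scale
        h = np.tanh(h.dot(W))  # the same activation in every layer
        # With a small weight scale, activation statistics collapse
        # toward zero in the deeper layers.
        print(f"layer {layer}: mean={h.mean():+.5f}, std={h.std():.5f}")

Re-running this with weight_scale = 1.0 instead saturates the tanh units at ±1, which is the other failure mode the plots illustrate.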



GitHub – kvmanohar22/DeepNets: How weight initialization affects forward and backward passes of a deep neural network




How does weight initialization affect the forward and backward passes of a deep neural network?

All the plots were generated with one full forward pass across all the layers of the network. There are multiple layers, each layer having the same number of units, and Tanh, ReLU, and Sigmoid activation functions were used. Random data points for the training examples are generated from a univariate “normal” (Gaussian) distribution with mean 0 and variance 1, and the weights for each layer were generated from the same distribution as the data points, but their scale was later varied to obtain the different plots.
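
As a rough illustration of the backward half of the story, the sketch below pushes an all-ones gradient back through the same kind of tanh stack and reports the mean absolute gradient reaching the input. The derivative of tanh is 1 − h², so near-zero and saturated activations both shrink the signal; the depth, width, and the two weight scales compared are again assumptions made for illustration.

    import numpy as np

    def grad_stats(weight_scale, num_layers=10, width=500):
        # Forward pass, caching activations and weights for backprop.
        h = np.random.randn(1000, width)
        hs, Ws = [h], []
        for _ in range(num_layers):
            W = np.random.randn(width, width) * weight_scale
            h = np.tanh(h.dot(W))
            Ws.append(W)
            hs.append(h)
        # Backward pass with an all-ones upstream gradient.
        grad = np.ones_like(h)
        for layer in reversed(range(num_layers)):
            grad = grad * (1.0 - hs[layer + 1] ** 2)  # through tanh
            grad = grad.dot(Ws[layer].T)              # through the linear map
        return np.abs(grad).mean()

    for scale in (0.01, 1.0):
        print(f"scale={scale}: mean |grad| at the input = {grad_stats(scale):.3e}")

With the small scale the gradient vanishes long before it reaches the first layer; with the large one the saturated tanh derivatives kill it instead, which is the trade-off the varied weight scales in the plots are probing.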

