srez/README.md at master · david-gpu/srez · GitHub

This is CSI-level: facial reconstruction via machine learning DCGAN networks
 HT @diogomonica

  • The generator network relies on ResNet modules, as we’ve found them to train substantially faster than more old-fashioned architectures (a sketch of such a block follows this list).
  • Here’s a random, non-cherry-picked example of what this network can do.
  • The adversarial term of the loss function ensures the generator produces plausible faces, while the L1 term ensures that those faces resemble the low-res input data.
  • In addition, the generator’s loss function has a term that measures the L1 difference between the 16×16 input and a downscaled version of the image produced by the generator.
  • Extract all images to a subfolder named dataset.
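
The README doesn’t reproduce the module itself, so here is a minimal sketch of a ResNet-style block of the kind such a generator stacks, written against a TensorFlow 1.x-style API. The function name, kernel sizes, and activation placement are illustrative assumptions, not srez’s actual code.

```python
import tensorflow as tf  # TensorFlow 1.x-style API

def residual_block(x, filters, kernel_size=3):
    """Illustrative ResNet-style block; not srez's actual implementation."""
    skip = x  # identity shortcut carried around the convolutions
    y = tf.layers.conv2d(x, filters, kernel_size, padding='same')
    y = tf.nn.relu(y)
    y = tf.layers.conv2d(y, filters, kernel_size, padding='same')
    # Assumes x already has `filters` channels so the addition is
    # shape-compatible. The convolutions learn only a residual, which is
    # added back onto the input; this is what lets deep stacks train fast.
    return tf.nn.relu(y + skip)
```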

srez – Image super-resolution through deep learning

@Reza_Zadeh: “Code for super-resolution of faces, model is DCGAN, implementation TensorFlow:”

  • The resulting 64×64 images display sharp features that are plausible based on the dataset that was used to train the neural net.
  • We have found that this L1 term greatly accelerates the convergence of the network during the first batches and also appears to prevent the generator from getting stuck in a poor local solution (a sketch of the combined loss follows this list).
  • Download the zip file titled “Align&Cropped Images” and extract all images to a subfolder named dataset.
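
As a rough illustration of how the adversarial and L1 terms described above might be combined, here is a hedged sketch in TensorFlow 1.x style. The tensor names, the relative weighting, and the use of bicubic downscaling are assumptions made for the example, not the repository’s actual code.

```python
import tensorflow as tf  # TensorFlow 1.x-style API

def generator_loss(disc_fake_logits, gene_output, lowres_input, l1_weight=0.9):
    """Illustrative two-term generator loss; names and weighting are assumed."""
    # Adversarial term: push the discriminator to score the generated
    # 64x64 faces as real.
    adversarial = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(
            logits=disc_fake_logits,
            labels=tf.ones_like(disc_fake_logits)))
    # L1 term: downscale the generated image back to 16x16 and compare it
    # to the low-res input, so outputs stay faithful to the conditioning.
    downscaled = tf.image.resize_bicubic(gene_output, [16, 16])
    l1 = tf.reduce_mean(tf.abs(downscaled - lowres_input))
    return (1.0 - l1_weight) * adversarial + l1_weight * l1
```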


Residual neural networks are an exciting area of deep learning research — Init.ai Decoded

@StartupYou: “Residual neural networks are an exciting area of #deeplearning research. 1000 layers! #AI”

  • The paper Deep Residual Networks with Exponential Linear Unit, by Shah et al., combines exponential linear units, an alternative to rectified linear units, with ResNets to show improved performance, even without batch normalization.
  • ResNets will be important to enable complex models of the world.
  • ResNets tweak the mathematical formula for a deep neural network, as shown after this list.
  • The paper enables practical training of neural networks with thousands of layers.
  • I am highlighting several recent papers that show the potential of residual neural networks.
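
For concreteness, the tweak (in He et al.’s original ResNet formulation, which these papers build on) is to learn a residual on top of an identity shortcut rather than the target mapping itself:

```latex
% A plain stack of layers tries to learn a target mapping H(x) directly.
% A residual block instead learns the residual F(x) = H(x) - x, and the
% shortcut connection adds the input back:
\[
  \mathbf{y} = \mathcal{F}(\mathbf{x}, \{W_i\}) + \mathbf{x}
\]
```

Here F is whatever the block’s weight layers compute, and the trailing + x is the shortcut connection.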


The identity function is simply id(x) = x; given an input x it returns the same value x as output.
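
That identity is the heart of the shortcut connection: if a block’s layers learn a zero residual, the block reduces to id(x) = x and passes its input through untouched. A minimal numpy sketch (illustrative, not code from the article):

```python
import numpy as np

def residual_block(x, residual_fn):
    # Output = learned residual + identity shortcut.
    return residual_fn(x) + x

x = np.array([1.0, 2.0, 3.0])
# With a zero residual the block is exactly the identity function, so
# stacking more blocks need not make the network harder to optimize.
print(residual_block(x, np.zeros_like))  # -> [1. 2. 3.]
```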

