Microsoft’s BrainWave is going to supercharge AI

Microsoft's #BrainWave is going to supercharge #AI - and it's coming to the cloud.

  • Microsoft is using custom hardware to realize a 50-100x speed-up in how quickly it can run the AI algorithms that power its Bing search engine, and will make the tech available to all customers next year.
  • The acceleration is being powered by the BrainWave platform, a network of customizable chips known as Field-Programmable Gate Arrays (FPGAs), tailored to efficiently handle deep neural networks.
  • “The power of the BrainWave platform is FPGAs, field-programmable gate arrays, which are really executing these AI algorithms in hardware,” said Joseph Sirosh, corporate VP for artificial intelligence research at Microsoft.
  • A large share of Bing search queries are served through BrainWave, with Sirosh saying the FPGAs are “especially good” for accelerating text-based applications.
  • Because AI evolves so rapidly and new algorithms keep arriving, it is possible to build the custom gate logic required to execute them in hardware on the FPGA.

Microsoft’s AI chief talks about the speed up in AI performance being realized by its BrainWave platform.
Continue reading “Microsoft’s BrainWave is going to supercharge AI”

GitHub

  • This is the code for my article “Coloring B&W portraits with neural networks.” Earlier this year, Amir Avni used neural networks to troll the subreddit /r/Colorization.
  • For those who are not familiar, this subreddit is focused on “hand” coloring historical images using Photoshop.
  • Coloring images with neural networks creates some very interesting results.
  • Read the article to understand the context of the code.
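
The idea behind the repo can be sketched in a few lines of Keras: in Lab colour space, a small convolutional network learns to predict the two colour (a/b) channels from the lightness (L) channel alone. The layer sizes and the random stand-in data below are illustrative, not the repo's actual architecture.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Minimal colorization sketch: predict the a/b colour channels of a Lab
# image from its greyscale lightness (L) channel.
def build_colorizer(size=32):
    inp = keras.Input(shape=(size, size, 1))  # L channel in
    x = layers.Conv2D(8, 3, padding="same", activation="relu")(inp)
    x = layers.Conv2D(8, 3, padding="same", activation="relu")(x)
    # a/b channels out, squashed to [-1, 1] by tanh
    out = layers.Conv2D(2, 3, padding="same", activation="tanh")(x)
    return keras.Model(inp, out)

model = build_colorizer()
model.compile(optimizer="adam", loss="mse")

# Stand-in data; in practice these pairs come from colour photos
# converted to Lab space and split into L and a/b planes.
L = np.random.rand(4, 32, 32, 1).astype("float32")
ab = np.random.rand(4, 32, 32, 2).astype("float32") * 2 - 1
model.fit(L, ab, epochs=1, verbose=0)
print(model.predict(L, verbose=0).shape)  # (4, 32, 32, 2)
```

At inference time the predicted a/b planes are recombined with the input L channel and converted back to RGB to produce the coloured portrait.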

Coloring-greyscale-images-in-Keras – Coloring B&W portraits with neural networks.
Continue reading “GitHub”

Artificial intelligence now powers all of Facebook’s translation

Artificial intelligence now powers all of Facebook’s translation

  • On Thursday, Facebook announced that all of its user translation services—those little magic tricks that happen when you click “see translation” beneath a post or comment—are now powered by neural networks, which are a form of artificial intelligence.
  • Back in May, the company’s artificial intelligence division, called Facebook AI Research, announced that they had developed a kind of neural network called a CNN (that stands for convolutional neural network, not the news organization where Wolf Blitzer works) that was a fast, accurate translator.
  • Now, Facebook says that they have incorporated that CNN tech into their translation system, as well as another type of neural network, called an RNN (the R is for recurrent).
  • Facebook says that the new AI-powered translation is 11 percent more accurate than the old-school approach, which is what they call a “phrase-based machine translation” technique that wasn’t powered by neural networks.
  • As an example of the difference between the two translation systems, Facebook demonstrated how the old approach would have translated a sentence from Turkish into English, and then showed how the new AI-powered system would do it.
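
The RNN half of that story can be made concrete with a toy recurrent encoder-decoder, the basic shape of RNN-based neural machine translation. This is a generic Keras sketch, not Facebook's actual system; the vocabulary sizes and dimensions are arbitrary.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

src_vocab, tgt_vocab, dim = 100, 120, 32

# Encoder: read the source sentence and keep only the final LSTM state.
enc_in = keras.Input(shape=(None,))
enc_emb = layers.Embedding(src_vocab, dim)(enc_in)
_, state_h, state_c = layers.LSTM(dim, return_state=True)(enc_emb)

# Decoder: generate target tokens, seeded with the encoder's state.
dec_in = keras.Input(shape=(None,))
dec_emb = layers.Embedding(tgt_vocab, dim)(dec_in)
dec_out = layers.LSTM(dim, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c]
)
probs = layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = keras.Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Random stand-in token IDs: 2 source sentences of 7 tokens,
# 2 target sentences of 5 tokens.
src = np.random.randint(0, src_vocab, (2, 7))
tgt = np.random.randint(0, tgt_vocab, (2, 5))
print(model.predict([src, tgt], verbose=0).shape)  # (2, 5, 120)
```

The CNN variant Facebook described replaces the recurrent encoder with stacked convolutions over the token embeddings, which processes all positions in parallel rather than one at a time.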

On Thursday, Facebook announced that all of its user translation services—those little magic tricks that happen when you click “see translation” beneath a post or comment—are now powered by neural networks, which are a form of artificial intelligence.
Continue reading “Artificial intelligence now powers all of Facebook’s translation”

tensorflow/tensorflow/contrib/kfac at master · tensorflow/tensorflow · GitHub

TensorFlow has a great new K-FAC library in tf.contrib by the authors and collaborators:

  • K-FAC, short for “Kronecker-factored Approximate Curvature,” is an approximation to the Natural Gradient algorithm designed specifically for neural networks.
  • When applied to feedforward and convolutional neural networks, K-FAC can converge in fewer iterations than SGD with Momentum.
  • K-FAC can be used in place of SGD, Adam, and other optimizers.
  • For an 8-layer autoencoder, K-FAC converges to the same loss as SGD with Momentum in 3.8x less wall-clock time and 14.7x fewer updates.
  • See how training loss changes as a function of the number of epochs, steps, and seconds. If you have a feedforward or convolutional model for classification that is converging too slowly, K-FAC is for you.
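
The “approximate natural gradient” claim can be unpacked with a small NumPy toy (a conceptual sketch only, not the tf.contrib.kfac API): natural gradient rescales the ordinary gradient by an inverse curvature (Fisher) matrix, and Kronecker factoring makes that inversion cheap, because the inverse of a Kronecker product is the Kronecker product of the inverses.

```python
import numpy as np

# (1) Natural gradient: precondition the gradient g with the inverse
# of a curvature (Fisher) matrix F, i.e. step along F^{-1} g.
F = np.array([[4.0, 0.0], [0.0, 0.25]])  # stand-in curvature matrix
g = np.array([2.0, 1.0])                 # ordinary gradient
natural_step = np.linalg.solve(F, g)     # F^{-1} g
print(natural_step)                      # [0.5 4. ] - flat directions get bigger steps

# (2) The Kronecker trick: if a layer's curvature factors as kron(A, G),
# inverting the two small factors gives the inverse of the big matrix.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
G = np.array([[3.0, 0.0], [0.0, 1.0]])
big = np.kron(A, G)                      # 4x4 "layer" curvature
cheap_inverse = np.kron(np.linalg.inv(A), np.linalg.inv(G))
print(np.allclose(np.linalg.inv(big), cheap_inverse))  # True
```

For real layers the factors come from statistics of layer inputs and back-propagated gradients, so K-FAC only ever inverts matrices the size of a single layer, not the full parameter count.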

tensorflow – Computation using data flow graphs for scalable machine learning
Continue reading “tensorflow/tensorflow/contrib/kfac at master · tensorflow/tensorflow · GitHub”

Free Machine Learning eBooks

Free #MachineLearning eBooks - March 2017 #abdsc

  • Machine learning is one of the fastest growing areas of computer science, with far-reaching applications.
  • The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way.
  • The book provides an extensive theoretical account of the fundamental ideas underlying machine learning and the mathematical derivations that transform these principles into practical algorithms.
  • These include a discussion of the computational complexity of learning and the concepts of convexity and stability; important algorithmic paradigms including stochastic gradient descent, neural networks, and structured output learning; and emerging theoretical concepts such as the PAC-Bayes approach and compression-based bounds.
  • Designed for an advanced undergraduate or beginning graduate course, the text makes the fundamentals and algorithms of machine learning accessible to students and non-expert readers in statistics, computer science, mathematics, and engineering.

Here are three eBooks available for free.
MACHINE LEARNING
Edited by Abdelhamid Mellouk and Abdennacer Chebira
Machine Learning can be defined in various ways…
Continue reading “Free Machine Learning eBooks”

Learning Deep Learning with Keras

Learning #DeepLearning with #Keras  #NeuralNetworks @pmigdal

  • Deep learning is a name for machine learning techniques using many-layered artificial neural networks.
  • For that reason, I suggest starting with image recognition tasks in Keras, a popular neural network library in Python.
  • See a plot of AUC score for logistic regression, random forest and deep learning on Higgs dataset (data points are in millions):

    In general there is no guarantee that, even with a lot of data, deep learning does better than other techniques, for example tree-based such as random forest or boosted trees.

  • Deep learning (that is, neural networks with many layers) uses mostly very simple mathematical operations, just many of them.
  • Its mathematics is simple to the point that a convolutional neural network for digit recognition can be implemented in a spreadsheet (with no macros), see: Deep Spreadsheets with ExcelNet.
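
A starter of the kind the article recommends might look like the following: a minimal Keras convolutional network for digit-style image recognition. The layer sizes are arbitrary and the data here is a random stand-in; for a real run you would load something like keras.datasets.mnist instead.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Tiny convnet: convolution -> pooling -> flatten -> 10-way softmax.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(8, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in images and labels, just to exercise the pipeline.
x = np.random.rand(8, 28, 28, 1).astype("float32")
y = np.random.randint(0, 10, 8)
model.fit(x, y, epochs=1, verbose=0)
print(model.predict(x, verbose=0).shape)  # (8, 10): class probabilities
```

Each of the operations involved (convolution, max-pooling, matrix multiply, softmax) is simple enough on its own to fit in a spreadsheet cell, which is exactly the point the ExcelNet reference makes.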

I teach deep learning both for a living (as the main deepsense.io instructor, in a Kaggle-winning team1) and as a part of my volunteering with the Polish Chi…
Continue reading “Learning Deep Learning with Keras”