An Overview of Python Deep Learning Frameworks

#ICYMI An Overview of #Python #DeepLearning Frameworks

  • I recently stumbled across an old Data Science Stack Exchange answer of mine on the topic of the “Best Python library for neural networks”, and it struck me how much the Python deep learning ecosystem has evolved over the course of the past 2.5 years.
  • Since Theano aims first and foremost to be a library for symbolic mathematics, Lasagne offers abstractions on top of Theano that make it more suitable for deep learning.
  • Similar to Lasagne, Blocks is a shot at adding a layer of abstraction on top of Theano to facilitate cleaner, simpler, more standardized definitions of deep learning models than writing raw Theano.
  • More recently, the TensorFlow team decided to incorporate support for Keras, the next deep learning library on our list.
  • PyTorch is a loose port of Lua’s Torch library to Python, and is notable both because it’s backed by the Facebook Artificial Intelligence Research (FAIR) team and because it’s designed to handle dynamic computation graphs, a feature absent from the likes of Theano, TensorFlow, and their derivatives.
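The static-versus-dynamic distinction in that last point can be made concrete with a plain-Python sketch. Nothing below uses a real framework API; it only contrasts the two styles: a "define-then-run" graph fixed ahead of time (Theano/TensorFlow style) versus "define-by-run," where ordinary host-language control flow shapes the computation on every call (PyTorch style).

```python
# Static ("define-then-run") style: the sequence of operations is built
# once as a data structure, then executed later with concrete inputs.
def build_static_graph():
    # The "graph": an ordered list of operations fixed ahead of time.
    ops = [
        lambda x, w, b, acc: x * w,    # first node: multiply
        lambda x, w, b, acc: acc + b,  # second node: add bias
    ]
    def run(x, w, b):
        acc = None
        for op in ops:          # execution just walks the fixed graph
            acc = op(x, w, b, acc)
        return acc
    return run

# Dynamic ("define-by-run") style: the graph *is* the host-language
# control flow, so its shape can change on every call.
def dynamic_forward(x, w, b, n_steps):
    result = x
    for _ in range(n_steps):    # loop length decided at run time
        result = result * w + b
    return result

static = build_static_graph()
print(static(3, 2, 1))                       # 3 * 2 + 1 = 7
print(dynamic_forward(3, 2, 1, n_steps=2))   # (3*2+1) * 2 + 1 = 15
```

The dynamic version is what makes variable-length computations (for example, recursion over parse trees, or loops in deep reinforcement learning) natural to express.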


Read this concise overview of leading Python deep learning frameworks, including Theano, Lasagne, Blocks, TensorFlow, Keras, MXNet, and PyTorch.

Continue reading “An Overview of Python Deep Learning Frameworks”


TensorKart: self-driving MarioKart with TensorFlow


  • After playing way too much MarioKart and writing an emulator plugin in C, I managed to get some decent results.
  • When the plugin is loaded, the emulator checks for several function definitions and errors if any are missing.
  • With this in mind I played more MarioKart to record new training data.
  • Rabbit Hole – writing a mupen64plus input plugin
  • I started by modifying the TensorFlow tutorial for a character recognizer using the MNIST dataset.
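The core of the TensorKart approach described above is supervised behavioral cloning: pair each captured game frame with the joystick state at the same instant, then train a model to regress from frame to steering. The sketch below illustrates only the data-recording step; the function names (`capture_frame`, `read_joystick`) are hypothetical stand-ins, not the actual TensorKart code.

```python
import random

def capture_frame():
    # Stand-in for a real screen grab: a flat list of pixel intensities.
    return [random.random() for _ in range(64)]

def read_joystick():
    # Stand-in for reading the controller: steering value in [-1, 1].
    return random.uniform(-1.0, 1.0)

def record_session(n_samples):
    # Each sample pairs the screen with the human's steering input at
    # that moment, giving (input, target) examples for regression.
    return [(capture_frame(), read_joystick()) for _ in range(n_samples)]

dataset = record_session(100)
frames, targets = zip(*dataset)
print(len(frames), len(frames[0]))  # 100 samples, 64 "pixels" each
```

A real recorder would capture the emulator window and the mupen64plus controller state instead of random values, but the shape of the dataset is the same.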

Kevin Hughes’ Blog
Continue reading “TensorKart: self-driving MarioKart with TensorFlow”

GitHub

  • TensorFlow is an open source software library for numerical computation using data flow graphs.
  • Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) that flow between them.
  • TensorFlow also includes TensorBoard, a data visualization toolkit.
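The graph description in the bullets above can be made concrete with a toy dataflow graph in plain Python: nodes are operations, and edges carry the values ("tensors") flowing between them. This is an illustration of the idea only, not the TensorFlow API.

```python
class Node:
    def __init__(self, op, *inputs):
        self.op = op          # callable: the mathematical operation
        self.inputs = inputs  # upstream nodes whose outputs flow in

    def evaluate(self):
        # Pull values along the incoming edges, then apply the op.
        return self.op(*(n.evaluate() for n in self.inputs))

def constant(value):
    return Node(lambda: value)

# Build the graph first (nothing is computed yet)...
x = constant([1.0, 2.0, 3.0])
w = constant([4.0, 5.0, 6.0])
mul = Node(lambda a, b: [ai * bi for ai, bi in zip(a, b)], x, w)
total = Node(sum, mul)

# ...then run it, which triggers the actual numerical computation.
print(total.evaluate())  # 1*4 + 2*5 + 3*6 = 32.0
```

Separating graph construction from execution is what lets a real system optimize the graph, place nodes on GPUs, or distribute them across machines before any numbers move.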

tensorflow – Computation using data flow graphs for scalable machine learning
Continue reading “GitHub”

TensorFlow: A system for large-scale machine learning

An overview of TensorFlow and where the Google Brain team might be taking it next...

  • We face the intriguing problem of providing a system that transparently and efficiently uses distributed resources, even when the structure of the computation unfolds dynamically.
  • Some users have begun to chafe at the limitations of a static dataflow graph, especially for algorithms like deep reinforcement learning.
  • If you want to model a sparse tensor, you need to encode it somehow at the application level.
  • Operations take one or more tensors as input and produce one or more tensors as output.
  • It uses a dataflow graph to represent the computation at each worker, and uses a parameter server to scale training across multiple machines.
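The parameter-server pattern in that last bullet can be sketched in a few lines: each worker computes a gradient on its own shard of the data, and a central server applies the updates to the shared parameters. Real systems do this over the network and often asynchronously; this single-process sketch only shows the division of labor.

```python
class ParameterServer:
    def __init__(self, w, lr=0.1):
        self.w = w      # shared model parameter
        self.lr = lr    # learning rate

    def pull(self):
        # Workers fetch the current parameters before computing.
        return self.w

    def push(self, grad):
        # Workers send gradients; the server applies the update.
        self.w -= self.lr * grad

def worker_gradient(w, shard):
    # Gradient of mean squared error for the model y = w * x.
    n = len(shard)
    return sum(2 * (w * x - y) * x for x, y in shard) / n

# Data generated from y = 3x, split across two workers.
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
server = ParameterServer(w=0.0, lr=0.01)

for step in range(200):
    for shard in shards:          # each worker trains on its own shard
        w = server.pull()
        server.push(worker_gradient(w, shard))

print(round(server.w, 2))  # converges toward the true slope, 3.0
```

Scaling out then amounts to running the per-shard loop on separate machines that all pull from and push to the same server.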

TensorFlow: A system for large-scale machine learning Abadi et al. (Google Brain) OSDI 2016 This is my last paper review for 2016! The Morning Paper will be taking a two-week break for the holidays, resuming again on the 2nd January. Sometime in between I’ll do a short retrospective on the year. It seems fitting to…
Continue reading “TensorFlow: A system for large-scale machine learning”

The Good, Bad, & Ugly of TensorFlow

The Good, Bad, and Ugly of #TensorFlow. #BigData #DeepLearning #MachineLearning  #AI

  • If you are deploying a model to a cloud environment, you want to know that your model can execute on the hardware available to it, without unpredictable interactions with other code that may access the same hardware.
  • For example, the Udacity tutorials and the RNN tutorial using Penn TreeBank data to build a language model are very illustrative, thanks to their simplicity.
  • For me, holding mental context for a new framework and model I’m building to solve a hard problem is already pretty taxing, so it can be really helpful to inspect a totally different representation of a model; the TensorBoard graph visualization is great for this.
  • But good programmers know it is much harder to write code that humans will use, versus code that a machine can compile and execute.
  • We appreciate their strategy of integrating new features and tests first so early adopters can try things before they are documented.

A survey of six months rapid evolution (+ tips/hacks and code to fix the ugly stuff) from Dan Kuster, one of indico’s deep learning researchers.
Continue reading “The Good, Bad, & Ugly of TensorFlow”