- I recently stumbled across an old Data Science Stack Exchange answer of mine on the topic of the “Best Python library for neural networks”, and it struck me how much the Python deep learning ecosystem has evolved over the course of the past 2.5 years.
- Since Theano aims first and foremost to be a library for symbolic mathematics, Lasagne offers abstractions on top of Theano that make it more suitable for deep learning.
- Similar to Lasagne, Blocks is a shot at adding a layer of abstraction on top of Theano to facilitate cleaner, simpler, more standardized definitions of deep learning models than writing raw Theano.
- More recently, the TensorFlow team decided to incorporate support for Keras, the next deep learning library on our list.
- PyTorch is a loose port of Lua’s Torch library to Python, and is notable because it’s backed by the Facebook Artificial Intelligence Research team (FAIR), and because it’s designed to handle dynamic computation graphs — a feature absent from the likes of Theano, TensorFlow, and their derivatives.
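The dynamic-graph point is easiest to see in code. The sketch below (my own minimal example, not taken from the article) shows that in PyTorch the graph is built by ordinary Python control flow at run time, so its depth can depend on the data itself, and autograd differentiates through whatever path was actually executed:

```python
import torch

def dynamic_halve(x):
    # The number of loop iterations depends on the *value* of x, so the
    # computation graph has a different depth for different inputs. In a
    # static-graph framework this control flow would need special ops
    # (e.g. scan/while_loop); here it is plain Python.
    steps = 0
    while x.norm() > 1.0:
        x = x / 2
        steps += 1
    return x, steps

x = torch.tensor([3.0, 4.0], requires_grad=True)
y, steps = dynamic_halve(x)   # norm 5.0 -> 2.5 -> 1.25 -> 0.625, so 3 steps
loss = y.sum()
loss.backward()               # autograd traces the graph that actually ran
```

Since three halvings were executed, `y` equals `x / 8` and the gradient of the sum is 0.125 per element.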
Read this concise overview of leading Python deep learning frameworks, including Theano, Lasagne, Blocks, TensorFlow, Keras, MXNet, and PyTorch.
Continue reading “An Overview of Python Deep Learning Frameworks”
- Google Brain’s new super fast and highly accurate AI: the Mixture of Experts Layer — conditional training of unreasonably large networks. One of the big problems in artificial intelligence is the gigantic number of GPUs (or computers) needed to train large networks. The training time of neural networks grows quadratically (think squared) as a function of their size.
- Therefore, we have to build giant neural networks to process the tons of data that corporations like Google and Microsoft have. Well, that was the case until Google released their Mixture of Experts Layer paper. (Figure: the Mixture of Experts Layer, as shown in the original paper.) The rough concept is to keep multiple experts inside the network.
- Each expert is itself a neural network.
- This does look similar to the PathNet paper; however, in this case we only have one layer of modules. You can think of the experts as multiple humans specialized in different tasks. In front of those experts stands the Gating Network, which chooses which experts to consult for a given input (named x in the figure).
- The Gating Network also decides on output weights for each expert. The output of the MoE is then the gate-weighted sum of the selected experts’ outputs. Results: it works surprisingly well. Take, for example, machine translation from English to French: the MoE with experts shows higher accuracy (or lower perplexity) than the state of the art using only 16% of the training time. Conclusion: this technique lowers the training time while achieving better-than-state-of-the-art accuracy.
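The mechanism described above can be sketched in a few lines of NumPy. This is my own illustrative toy (the expert and gate shapes, and the `top_k=2` sparsity, are assumptions in the spirit of the paper, not its actual implementation): a gating network scores every expert, only the top-k experts are evaluated, and the output is their gate-weighted sum.

```python
import numpy as np

rng = np.random.default_rng(0)

def expert(x, W):
    # Each expert is itself a (tiny) neural network; here, one ReLU layer.
    return np.maximum(0.0, x @ W)

def moe_forward(x, expert_weights, gate_W, top_k=2):
    logits = x @ gate_W                      # gating network scores each expert
    top = np.argsort(logits)[-top_k:]        # consult only the top_k experts
    masked = np.full_like(logits, -np.inf)   # non-selected experts get zero gate
    masked[top] = logits[top]
    gates = np.exp(masked - masked[top].max())
    gates /= gates.sum()                     # softmax over the selected experts
    # Output: gate-weighted sum of the selected experts' outputs. Experts
    # outside `top` are never even evaluated -- the conditional-computation
    # trick that keeps training cheap despite a huge total parameter count.
    out = sum(gates[i] * expert(x, expert_weights[i]) for i in top)
    return out, gates

d_in, d_out, n_experts = 8, 4, 4
experts = [rng.normal(size=(d_in, d_out)) for _ in range(n_experts)]
gate_W = rng.normal(size=(d_in, n_experts))
x = rng.normal(size=d_in)
y, gates = moe_forward(x, experts, gate_W)
```

The gate values sum to 1 and are nonzero for exactly `top_k` experts, so adding more experts grows capacity without growing the per-example compute.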
One of the big problems in Artificial Intelligence is the gigantic amount of GPUs (or computers) needed to train large networks. The training time of neural networks grows quadratically (think…
Continue reading “Google Brain’s new super fast and highly accurate AI: the Mixture of Experts Layer.”
- IBM has dubbed its AI offering Watson – a powerful cognitive computing engine that pervades everything IBM is bringing to customers.
- David Kenny, Senior Vice President of IBM Watson and Cloud Platform, laid out IBM’s strategic ONE Architecture consisting of four layers: Cloud, Data, AI, and Applications, as the image below shows.
- The ONE Architecture is remarkable in two ways: first, AI (in the form of Watson) represents an entire layer, indicating the breadth of IBM’s bet on the technology.
- What, then, does Watson truly bring to customers of the IBM Cloud?
- This enormous commitment of person-hours from highly qualified professionals, along with vast quantities of data, makes Watson more of a consultant’s tool, best suited for selling IBM consultants’ time, than a modular, LEGO-block-style plug-in for customers to incorporate directly into their own applications.
It’s clear that Rometty has taken the lesson of the Innovator’s Dilemma to heart, and is willing to jeopardize existing revenue streams in order to place large bets on innovation. The big picture here, of course, is business transformation – most notably for IBM, but also for its customers.
Continue reading “IBM Bets The Company On Cloud, AI, And Blockchain”
- Now, the federal government has committed billions of dollars in new funding to make sure that future climate disasters aren’t so completely devastating.
- Mitigation efforts take a significant amount of money, and last year an audit by the Commissioner of the Environment and Sustainable Development stated that despite increased levels of funding for disaster recovery, Canada is still not prepared for the intensity of future climate disasters.
- Read More: Vancouver Considers Abandoning Parts of the Coast Because of Climate Change
Beyond the climate disaster mitigation fund, Canada has committed to funding machine-learning research, helping out on one of NASA’s Mars missions, and paying for research into stem cell therapies.
- The 2017 budget adds $150 million from the Public Transit and Green Infrastructure fund, bringing the total to $950 million over five years.
- Another new budget effort to attract more forward-looking companies is the Smart Cities Challenge Fund.
In its new federal budget, Canada invests in machine learning and teaching kids to code, but offers no real plan for how to address workplace automation.
Continue reading “Canada Is Investing $2 Billion in a Climate Disaster Mitigation Fund”
- When Unanimous AI developed UNU in 2015, the goal was to create artificial intelligence (AI) systems that “keep people in the loop,” amplifying human intelligence instead of replacing it.
- Unanimous AI’s March Madness bracket was able to beat all but three percent of ESPN brackets across the country on the first day of the tournament.
- The technology makes use of the collective intelligence of people — combining “knowledge, insights, and intuitions,” as Unanimous AI puts it — to develop a kind of artificial intelligence that’s inherently human.
- As Unanimous AI explains, “We empower people to act as ‘data processors’ that come together online and form an intelligent system, connected by AI algorithms.”
- One day, perhaps we’ll be able to combine the intelligence of Watson with that of a swarm of medical professionals to improve healthcare, or combine the insights of an investment-making AI with a swarm of finance experts.
The collective intelligence is killing it on ESPN.
Continue reading “March Madness: A Swarm Intelligence Is Predicting the Future”
- In case you missed it, there’s a post on Medium by Steven Levy that explains everything you might want to know about how machine learning works at Apple.
- It’s a fascinating account of the Apple Brain, the A.I. hidden inside your iPhone.
- And yes, when Apple buys a company, it is usually doing that to hire the people.
- Is this the new Apple, a company that allows people to get an inside view of what they are doing?
- And we at least get some juicy information, like this line about how the Apple Pencil works with the iPad Pro: “Using a machine learning model for ‘palm rejection’ enabled the screen sensor to detect the difference between a swipe, a touch, and a pencil input with a very high degree of accuracy.”
In case you missed it, there’s a post on Medium by Steven Levy that explains everything you might want to know about how machine learning works at Apple.
Continue reading “How Apple uses machine learning”
A couple of weeks ago, Coca-Cola’s global senior digital director Mariano Bosaz told Adweek he wanted “to start experimenting” with “automated narratives,” including using bots for music and editing the closing credits of commercials.
Continue reading “How 4 Agencies Are Using Artificial Intelligence as Part of the Creative Process – Adweek”