Try Deep Learning in Python now with a fully pre-configured VM

Try #DeepLearning in #Python now with a fully pre-configured VM

  • I love to write about face recognition, image recognition, and all the other cool things you can build with machine learning.
  • If you aren’t a long-time Linux user, it can be really hard to figure out how to get a system fully configured with all the required machine learning libraries and tools like TensorFlow, Theano, Keras, OpenCV, and dlib.
  • To make it simple for anyone to play around with machine learning, I’ve put together a simple virtual machine image that you can download and run without any complicated installation steps. The virtual machine image has Ubuntu Linux Desktop 16.04 LTS 64-bit pre-installed with the following machine learning tools: Python 3.5, OpenCV 3.2 with Python 3 bindings, dlib 19.4 with Python 3 bindings, TensorFlow 1.0 for Python 3, Keras 2.0 for Python 3, Theano, face_recognition for Python 3 (for playing around with face recognition), PyCharm Community Edition already set up and ready to go for all these libraries, and convenient code examples ready to run, right on the desktop! Even the webcam is preconfigured to work inside the Linux VM for OpenCV / face_recognition examples (as long as you set up your webcam to be accessible in the VMware settings).
  • You need VMware to run this virtual machine image, so don’t use the VirtualBox version unless you have no other choice.
  • Right-click on the code window and choose “Run” to run the current file in PyCharm. If you configure your webcam in VMware settings, you can access your webcam from inside the Linux virtual machine! (A minimal webcam face detection sketch follows this list.)
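
For a taste of what the pre-installed libraries can do together, here is a minimal face detection sketch of the kind the VM is set up for. It is an illustrative sketch only, not one of the examples shipped on the VM’s desktop: the webcam device index (0) and the window name are assumptions, and it uses the standard OpenCV and face_recognition APIs.

```python
# Minimal sketch: detect faces from the webcam using OpenCV + face_recognition.
# Assumes the webcam has been passed through to the VM in the VMware settings.
import cv2
import face_recognition

video = cv2.VideoCapture(0)  # device index 0 is an assumption

while True:
    ok, frame = video.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # face_recognition expects RGB
    for top, right, bottom, left in face_recognition.face_locations(rgb):
        cv2.rectangle(frame, (left, top), (right, bottom), (0, 255, 0), 2)
    cv2.imshow("Faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

video.release()
cv2.destroyAllWindows()
```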

I love to write about face recognition, image recognition and all the other cool things you can build with machine learning. Whenever possible, I try to include code examples or even write libraries…
Continue reading “Try Deep Learning in Python now with a fully pre-configured VM”

Blockchains are a data buffet for AIs – Fred Ehrsam – Medium

  • And while many of the tech giants working on AI like Google and Facebook have open sourced some of their algorithms, they hold back most of their data. In contrast, blockchains represent and even incent open data.
  • For example: creating a decentralized Uber requires a relatively open dataset of riders and drivers available to coordinate the network. The network effects and economic incentives around these open systems and their data can be more powerful than those of current centralized companies, because they are open standards that anyone can build on, in the same way that internet protocols like TCP/IP, HTML, and SMTP have achieved far greater scale than any company that sits atop them.
  • And oracle systems (a fancy way of saying getting people all over the world to report real-world information to the blockchain in a way we can trust) like Augur will inject more data. This open data has the potential to commoditize the data silos that most tech companies like Google, Facebook, Uber, LinkedIn, and Amazon are built on and extract rent from.
  • AIs trained on open data are more likely to be neutral and trustworthy instead of biased by the interests of the corporation that created and trained them. Since blockchains allow us to explicitly program incentive structures, they may make the incentives of AI more transparent. Simplified, AI is driven by three things: tools, compute power, and training data.
  • My guess is they shift to 1) creating blockchain protocols and their native tokens and 2) AIs that leverage the open, global data layer of the blockchain.

Sam Altman recently wrote that we are entering an era of hyperscale technology companies. These companies own massive troves of data with strong network effects around them and they are only getting…
Continue reading “Blockchains are a data buffet for AIs – Fred Ehrsam – Medium”

Google Is Making AI That Can Make More AI

Google Is Making #AI That Can Make More AI

  • What’s more, using AIs to build more AIs may also increase the speed at which new AIs can be made.
  • Once you’ve trained an AI to accomplish a certain goal, you can’t necessarily crack it open and see how it is doing it.
  • The downside is that AI building more AIs sure seems like it’s inviting a runaway cascade and, eventually, Skynet.
  • The lab is reportedly building AI software that can build more AI software, with the goal of making future AI cheaper and easier to develop. (A toy sketch of the idea appears after this list.)
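
To make the idea of software choosing models instead of a human a bit more concrete, here is a toy sketch using plain random search over small Keras architectures. This is only an illustration of the concept, not Google’s method; the toy dataset, layer sizes, and search budget are all made up for the example.

```python
# Toy illustration of "AI building AI": a program randomly proposes small
# Keras models, trains each one briefly, and keeps the best -- a crude stand-in
# for the far more sophisticated search Google is reported to be working on.
import random
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Made-up toy data: predict whether 10 random features sum to more than 5.
X = np.random.rand(500, 10)
y = (X.sum(axis=1) > 5).astype(int)

def random_model():
    model = Sequential()
    model.add(Dense(random.choice([8, 16, 32]), activation="relu", input_shape=(10,)))
    for _ in range(random.randint(0, 2)):  # randomly chosen depth
        model.add(Dense(random.choice([8, 16, 32]), activation="relu"))
    model.add(Dense(1, activation="sigmoid"))
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

best_acc = 0.0
for trial in range(5):  # tiny search budget for the example
    model = random_model()
    model.fit(X, y, epochs=5, verbose=0)
    _, acc = model.evaluate(X, y, verbose=0)
    best_acc = max(best_acc, acc)
    print("trial %d: accuracy %.3f (best so far %.3f)" % (trial, acc, best_acc))
```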

There’s no way this could go wrong.
Continue reading “Google Is Making AI That Can Make More AI”

7 AI trends to watch in 2017

Seven Artificial Intelligence trends to watch in 2017 - #AI

  • A recent Forrester survey of business and technology professionals found that 58% of them are researching AI, but only 12% are using AI systems.
  • Concerns about AI stealing jobs are nothing new, but we anticipate deeper, more nuanced conversations on what AI will mean economically.
  • Expect to hear (a little) less about malevolent AI taking over the world and more about the economic impacts of AI.
  • Most AI systems are black boxes, and they are immensely complex.

From tools, to research, to ethics, Ben Lorica looks at what’s in store for artificial intelligence in 2017.
Continue reading “7 AI trends to watch in 2017”

GitHub

  • TensorFlow is an open source software library for numerical computation using data flow graphs.
  • Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) that flow between them. (A minimal example of such a graph follows this list.)
  • TensorFlow also includes TensorBoard, a data visualization toolkit.
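
As a quick illustration of the data-flow-graph idea described above, here is a minimal example using the TensorFlow 1.0-era Session API (the version this repository snapshot corresponds to): two constant tensors flow along the graph’s edges into a single matrix-multiplication node.

```python
# Minimal data flow graph: nodes are operations, edges carry tensors.
# Uses the TensorFlow 1.x Session API matching this snapshot of the repo.
import tensorflow as tf

a = tf.constant([[1.0, 2.0]])    # 1x2 tensor flowing into the graph
b = tf.constant([[3.0], [4.0]])  # 2x1 tensor
product = tf.matmul(a, b)        # a single matmul node connecting the two edges

with tf.Session() as sess:
    print(sess.run(product))     # [[11.]]
```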

tensorflow – Computation using data flow graphs for scalable machine learning
Continue reading “GitHub”

Using SQL Server 2016 with R Services for Campaign Optimization

Using #SQLServer 2016 w/ R Services for campaign optimization

  • Completed solutions are deployed to SQL Server 2016 by embedding calls to R in stored procedures (see the sketch after this list).
  • The solution provides a hands-on experience by deploying into your Azure subscription.
  • The post contains more information about this new solution.
  • We have published the solution in the Cortana Intelligence Solutions Gallery.
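
For readers who have not seen R embedded in SQL Server before, the sketch below shows the general mechanism this kind of solution relies on: T-SQL’s sp_execute_external_script running an R snippet inside the database, called here from Python via pyodbc. The connection string, the dbo.Leads table, and the R snippet are illustrative assumptions, not part of the published Campaign Optimization solution.

```python
# Illustrative sketch only: run a tiny R script inside SQL Server 2016 via
# sp_execute_external_script (the mechanism R Services solutions build on).
# The connection string and the dbo.Leads table below are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 13 for SQL Server};"
    "SERVER=localhost;DATABASE=CampaignDB;Trusted_Connection=yes;"
)

tsql = """
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(mean_score = mean(InputDataSet$Score))',
    @input_data_1 = N'SELECT Score FROM dbo.Leads'
    WITH RESULT SETS ((mean_score FLOAT));
"""

cursor = conn.cursor()
cursor.execute(tsql)
print(cursor.fetchall())  # average lead score computed by R inside SQL Server
conn.close()
```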

We are happy to announce a new Campaign Optimization solution based on R Services in SQL Server 2016, designed to help customers apply machine learning to increase response rates from their leads. This post contains more information about this new solution.
Continue reading “Using SQL Server 2016 with R Services for Campaign Optimization”