- IBM and the USAF announced on Friday that the machine will run on an array of 64 TrueNorth Neurosynaptic chips.
- The TrueNorth chips are wired together, and operate, much like the synapses within a biological brain.
- That is, these chips don’t require a clock to function, as conventional CPUs do. What’s more, because of the system’s distributed nature, even if one core fails, the rest of the array will continue to work.
- This 64-chip array will contain the processing equivalent of 64 million neurons and 16 billion synapses, yet absolutely sips energy: each processor consumes just 10 watts of electricity.
- Like other neural networks, this system will be put to use in pattern-recognition and sensory-processing roles.
- The Air Force wants to combine the TrueNorth’s ability to convert multiple data feeds, whether audio, video or text, into machine-readable symbols with a conventional supercomputer’s ability to crunch data.
- This isn’t the first time that IBM’s neural chip system has been integrated into cutting-edge technology.
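A quick back-of-the-envelope sketch, using only the figures quoted above (64 chips, 64 million neurons, 16 billion synapses, 10 watts per processor), gives the per-chip breakdown of the array:

```python
# Per-chip breakdown of the 64-chip TrueNorth array, derived from the
# totals quoted in the article. These are simple divisions, not
# independently verified hardware specifications.
chips = 64
total_neurons = 64_000_000
total_synapses = 16_000_000_000
watts_per_chip = 10  # as quoted above

neurons_per_chip = total_neurons // chips    # 1,000,000 neurons per chip
synapses_per_chip = total_synapses // chips  # 250,000,000 synapses per chip
array_power_watts = chips * watts_per_chip   # 640 W for the whole array

print(neurons_per_chip, synapses_per_chip, array_power_watts)
```

Even at 640 W for the full array, that is a tiny power budget compared with a conventional supercomputer rack.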
Supercomputers today are capable of performing incredible feats, from accurately predicting the weather to uncovering insights into climate change, but they sti…
Continue reading “The Air Force and IBM are building an AI supercomputer”
- In a new blog post, Facebook explains that existing chatbots can hold short conversations and perform simple tasks such as booking a restaurant. Building machines that can hold meaningful conversations with people is harder, because it requires a bot to combine its understanding of the conversation with its knowledge of the world, and then produce a new sentence that helps it achieve its goals.
- To help build their training set, the team created an interface with multi-issue bargaining scenarios and recruited workers on Amazon Mechanical Turk to negotiate in natural language over how to divide a random set of objects.
- The models were trained end-to-end from the language and decisions that humans made, meaning that the approach can easily be adapted to other tasks.
- Reinforcement learning was then used to reward the model when it achieved a good outcome, in a way that also prevented the AI bot from developing its own language.
- In their experiments, the majority of people didn’t know they were talking to a bot, and FAIR’s best reinforcement-learning negotiation agent matched the performance of human negotiators, achieving better deals about as often as worse deals.
Researchers at Facebook Artificial Intelligence Research (FAIR) published a paper introducing AI-based dialog agents that can negotiate and compromise.
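As a rough illustration of the multi-issue bargaining setup described above, the sketch below scores one possible split of a shared item pool. The item names, valuations, and scoring rule are assumptions for illustration only, not FAIR's actual implementation; the key idea is that each agent has private values for the items, and a negotiated outcome can be rewarded by the total value an agent secures.

```python
# Minimal sketch of a multi-issue bargaining scenario: two agents divide
# a shared pool of items, each scoring the outcome by its own private
# valuation. All concrete numbers here are illustrative assumptions.
pool = {"book": 1, "hat": 2, "ball": 3}       # items up for division
values_a = {"book": 6, "hat": 2, "ball": 0}   # agent A's private valuation
values_b = {"book": 0, "hat": 2, "ball": 2}   # agent B's private valuation

def score(values, allocation):
    """Reward an agent by the total value of the items it receives."""
    return sum(values[item] * count for item, count in allocation.items())

# One possible negotiated split: A takes the book, B takes hats and balls.
alloc_a = {"book": 1, "hat": 0, "ball": 0}
alloc_b = {"book": 0, "hat": 2, "ball": 3}

reward_a = score(values_a, alloc_a)  # 6
reward_b = score(values_b, alloc_b)  # 10
print(reward_a, reward_b)
```

A reinforcement-learning agent in this setting would be trained to produce dialogue that steers the final allocation toward a higher score for itself.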
Continue reading “Facebook Training AI Bots to Negotiate with Humans – News Center”
- Sharing our vision of the future, including advances in research and innovation.
- Our goal is to make our innovations accessible to as many people as possible.
- When technology starts with people and serves people, progress follows.
- We also want to promote innovative and responsible uses of digital technologies – prudent innovation – that improve everyday life for everyone.
- Innovation and periods of rapid change have always raised questions.
Continue reading “Innovation endorsed by Orange: welcome to tomorrow’s world”
Ray Kurzweil, Rodney Brooks, and others weigh in on the future of artificial intelligence
Continue reading “Human-Level AI Is Right Around the Corner—or Hundreds of Years Away”
A quarter of Dubai’s police will be robots by 2030.
Continue reading “RoboCop on the beat”
- Our goal is to ensure that the most promising researchers in the world have access to enough compute power to imagine, implement, and publish the next wave of ML breakthroughs.
- We’re setting up a program to accept applications for access to the TensorFlow Research Cloud and will evaluate applications on a rolling basis.
- The program will be highly selective since demand for ML compute is overwhelming, but we specifically encourage individuals with a wide range of backgrounds, affiliations, and interests to apply.
- The program will start small and scale up.
Researchers need enormous computational resources to train the machine learning models that have delivered
recent advances in medical imaging, speech recognition, game playing, and many other domains. The TensorFlow
Research Cloud is a cluster of 1,000 Cloud TPUs that provides the machine learning research community with
a total of 180 petaflops of raw compute power — at no charge — to support the next wave of breakthroughs.
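Taken together, the headline numbers imply roughly 180 teraflops of raw compute per Cloud TPU; a quick check of that arithmetic:

```python
# Implied per-device compute for the TensorFlow Research Cloud, using
# only the two figures quoted above (1,000 Cloud TPUs, 180 petaflops).
tpus = 1_000
total_petaflops = 180

per_tpu_teraflops = total_petaflops * 1_000 / tpus  # 180.0 TFLOPS per TPU
print(per_tpu_teraflops)
```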
Continue reading “Accelerating open machine learning research with Cloud TPUs”