dnc – A TensorFlow implementation of the Differentiable Neural Computer.


This package provides an implementation of the Differentiable Neural Computer, as published in Nature.

Any publication that discloses findings arising from using this source code must cite “Hybrid computing using a neural network with dynamic external memory”, Nature 538, 471–476 (October 2016) doi:10.1038/nature20101.

The Differentiable Neural Computer is a recurrent neural network. At each timestep, it has state consisting of the current memory contents (and auxiliary information such as memory usage), and maps input at time `t` to output at time `t`. It is implemented as a collection of `RNNCore` modules, which allows plugging together the different modules to experiment with variations on the architecture.
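
A minimal sketch of a single recurrent step, assuming the `DNC` core defined in `dnc.py` (the sizes below are illustrative, not prescribed values):

```python
import tensorflow as tf
import dnc

# Build the DNC core; the config dictionaries parameterize the memory and
# the controller (all sizes here are illustrative).
dnc_core = dnc.DNC(
    access_config={"memory_size": 16, "word_size": 16,
                   "num_reads": 4, "num_writes": 1},
    controller_config={"hidden_size": 64},
    output_size=10)

# The state carries the memory contents plus auxiliary information such as
# memory usage.
prev_state = dnc_core.initial_state(batch_size=4)

input_t = tf.placeholder(tf.float32, shape=[4, 10])

# One step: input at time t and the previous state map to output at time t
# and the next state.
output_t, next_state = dnc_core(input_t, prev_state)
```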

The access module is where the main DNC logic happens, as this is where memory is written to and read from. At every timestep, the input to an access module is a vector passed from the `controller`, and its output is the contents read from memory. It uses two further `RNNCore`s: `TemporalLinkage`, which tracks the order of memory writes, and `Freeness`, which tracks which memory locations have been written to and not yet subsequently “freed”. These are both defined in `addressing.py`.

The controller module “controls” memory access. Typically, it is just a feedforward or (possibly deep) LSTM network, whose inputs are the inputs to the overall recurrent network at that time, concatenated with the memory read output from the access module at the previous timestep.
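
A minimal sketch of that concatenation, assuming hypothetical tensors `inputs` (the network input at time `t`) and `prev_read_words` (the words read from memory at time `t-1`):

```python
import tensorflow as tf

batch_size, input_size = 4, 10   # illustrative sizes
num_reads, word_size = 4, 16

inputs = tf.placeholder(tf.float32, [batch_size, input_size])
prev_read_words = tf.placeholder(tf.float32, [batch_size, num_reads, word_size])

# Flatten the previous read words and concatenate them with the current
# input; this combined vector is what the controller network consumes.
controller_input = tf.concat(
    [inputs, tf.reshape(prev_read_words, [batch_size, num_reads * word_size])],
    axis=1)

# The controller itself can be a feedforward network or a (possibly deep)
# LSTM; a single-layer LSTM cell is shown here.
controller = tf.nn.rnn_cell.LSTMCell(num_units=64)
```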

The `DNC` simply wraps the access module and the controller module, and forms the basic `RNNCore` unit of the overall architecture. This is defined in `dnc.py`.

The `DNC` requires an installation of TensorFlow and Sonnet. An example training script is provided for the algorithmic task of repeatedly copying a given input string. This can be executed from a python interpreter:
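
```shell
# run the provided training script (train.py in this repository)
$ ipython train.py
```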

You can specify training options, including parameters to the model and optimizer, via flags:
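
```shell
# example flags (values are illustrative)
$ python train.py --memory_size=64 --num_bits=8 --max_length=3

# Or with ipython:
$ ipython train.py -- --memory_size=64 --num_bits=8 --max_length=3
```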

Periodically saving, or ‘checkpointing’, the model is disabled by default. To enable, use the `checkpoint_interval` flag. E.g. `--checkpoint_interval=10000` will ensure a checkpoint is created every 10,000 steps. The model will be checkpointed to `/tmp/tf/dnc` by default. From there training can be resumed. To specify an alternate checkpoint directory, use the `checkpoint_dir` flag. Note: ensure that `/tmp/tf/dnc` is deleted before training is resumed with different model parameters, to avoid shape inconsistency errors.
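
For example, to checkpoint every 10,000 steps into a custom directory (the path below is illustrative):

```shell
$ python train.py --checkpoint_interval=10000 --checkpoint_dir=/tmp/dnc_experiment
```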

More generally, the `DNC` class found within `dnc.py` can be used as a standard TensorFlow RNN core and unrolled with TensorFlow RNN ops, such as `tf.nn.dynamic_rnn`, on any sequential task.
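
A minimal sketch of such an unroll, with the core constructed as in the earlier single-step example (all sizes are illustrative):

```python
import tensorflow as tf
import dnc

dnc_core = dnc.DNC(
    access_config={"memory_size": 16, "word_size": 16,
                   "num_reads": 4, "num_writes": 1},
    controller_config={"hidden_size": 64},
    output_size=10)

# A batch of input sequences, laid out time-major: [time, batch, features].
input_sequence = tf.placeholder(tf.float32, shape=[20, 4, 10])

initial_state = dnc_core.initial_state(batch_size=4)

# Unroll the DNC over the sequence with a standard TensorFlow RNN op.
output_sequence, final_state = tf.nn.dynamic_rnn(
    cell=dnc_core,
    inputs=input_sequence,
    initial_state=initial_state,
    time_major=True)
```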

Disclaimer: This is not an official Google product.
