Mastering TensorFlow 1.x

Sonnet

Sonnet is an object-oriented library written in Python, released by DeepMind in 2017. Sonnet aims to cleanly separate the following two aspects of building computation graphs from objects (a short sketch after the list illustrates this separation):

  • The configuration of objects called modules
  • The connection of objects to computation graphs
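Here is a minimal sketch of that separation, assuming Sonnet 1.x and TensorFlow 1.x (the module name and placeholder shapes are purely illustrative): the module is configured once when it is constructed, and every time the same module object is connected to the graph it reuses the variables created on the first connection.

import tensorflow as tf
import sonnet as snt

# configuration: the module is constructed with its hyperparameters
linear = snt.Linear(output_size=10, name="shared_linear")

# connection: the same module object is connected to the graph twice;
# the second connection reuses the weights created by the first
x1 = tf.placeholder(tf.float32, [None, 784])
x2 = tf.placeholder(tf.float32, [None, 784])
y1 = linear(x1)
y2 = linear(x2)

Because the configuration lives in the constructor and the variables belong to the module object, variable sharing happens automatically whenever the same module is connected again.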

Sonnet can be installed in Python 3 with the following command:

pip3 install dm-sonnet

Sonnet can be installed from the source by following directions from the following link: https://github.com/deepmind/sonnet/blob/master/docs/INSTALL.md.

The modules are defined as subclasses of the abstract class sonnet.AbstractModule. At the time of writing this book, Sonnet ships with a number of ready-made modules, such as linear, convolutional, and recurrent modules.

We can define our own new modules by creating a subclass of sonnet.AbstractModule. An alternative, non-recommended way of creating a module from a function is to create an object of the sonnet.Module class, passing the function to be wrapped as the module's build function.
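The following is a minimal sketch of both approaches, assuming Sonnet 1.x and TensorFlow 1.x; the MyLinear module and the wrapped function are hypothetical and used only to illustrate the pattern. A custom module does its configuration in __init__ and creates its variables in _build:

import tensorflow as tf
import sonnet as snt

class MyLinear(snt.AbstractModule):
    """Hypothetical linear module, shown only to illustrate the pattern."""
    def __init__(self, output_size, name="my_linear"):
        super(MyLinear, self).__init__(name=name)   # configuration happens here
        self._output_size = output_size

    def _build(self, inputs):
        # variables are created the first time the module is connected
        input_size = inputs.get_shape().as_list()[-1]
        w = tf.get_variable("w", shape=[input_size, self._output_size])
        b = tf.get_variable("b", shape=[self._output_size])
        return tf.matmul(inputs, w) + b

# the non-recommended alternative: wrap a plain function as a module
doubler = snt.Module(build=lambda x: 2.0 * x, name="doubler")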

The workflow to build a model in the Sonnet library is as follows:

  1. Create classes for the dataset and the network architecture that inherit from sonnet.AbstractModule. In our example, we create an MNIST class and an MLP class.
  2. Define the parameters and hyperparameters.
  3. Define the test and train datasets from the dataset classes defined in the preceding step.
  4. Define the model using the network class defined. As an example, model = MLP([20, n_classes]) in our case creates an MLP network with two layers, of 20 and n_classes neurons respectively.
  5. Define the y_hat placeholders for the train and test sets using the model.
  6. Define the loss placeholders for the train and test sets.
  7. Define the optimizer using the train loss placeholder.
  8. Execute the loss function in a TensorFlow session for the desired number of epochs to optimize the parameters (a condensed sketch of these steps follows this list).
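The following is a condensed sketch of that workflow, assuming Sonnet 1.x and TensorFlow 1.x. It is not the notebook code: the MLP module below is a simplified stand-in for the book's MLP class, and the data pipeline uses the Keras MNIST loader with full-batch gradient descent instead of the book's MNIST dataset class and mini-batches.

import tensorflow as tf
import sonnet as snt

class MLP(snt.AbstractModule):
    """Simplified stand-in for the book's MLP class."""
    def __init__(self, output_sizes, name="mlp"):
        super(MLP, self).__init__(name=name)
        self._output_sizes = output_sizes

    def _build(self, inputs):
        net = snt.BatchFlatten()(inputs)                       # [batch, 784]
        for i, size in enumerate(self._output_sizes[:-1]):
            net = tf.nn.relu(snt.Linear(size, name="hidden_{}".format(i))(net))
        return snt.Linear(self._output_sizes[-1], name="logits")(net)

# parameters and hyperparameters
n_classes = 10
learning_rate = 0.01
n_epochs = 10

# placeholders for images and labels
x = tf.placeholder(tf.float32, [None, 28, 28])
y = tf.placeholder(tf.int64, [None])

# configure the model once, then connect it to the graph
model = MLP([20, n_classes])
y_hat = model(x)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=y_hat))
optimizer = tf.train.GradientDescentOptimizer(learning_rate)
train_op = optimizer.minimize(loss)

# MNIST data via the Keras loader (a stand-in for the book's MNIST class)
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for epoch in range(n_epochs):
        # full-batch updates for brevity; the notebook trains on mini-batches
        _, train_loss = sess.run([train_op, loss],
                                 feed_dict={x: x_train, y: y_train})
        print("Epoch : {} Training Loss : {}".format(epoch, train_loss))
    test_loss = sess.run(loss, feed_dict={x: x_test, y: y_test})
    print("Test loss : {}".format(test_loss))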

The complete code for the Sonnet MNIST classification example is provided in the notebook ch-02_TF_High_Level_Libraries. The __init__ method in each class initializes the class and the related superclass. The _build method creates and returns the dataset or the model objects when the class is called. The output from the Sonnet MNIST example is as follows:

Epoch : 0 Training Loss : 236.79913330078125
Epoch : 1 Training Loss : 227.3693084716797
Epoch : 2 Training Loss : 221.96337890625
Epoch : 3 Training Loss : 220.99142456054688
Epoch : 4 Training Loss : 215.5921173095703
Epoch : 5 Training Loss : 213.88958740234375
Epoch : 6 Training Loss : 203.7091064453125
Epoch : 7 Training Loss : 204.57427978515625
Epoch : 8 Training Loss : 196.17218017578125
Epoch : 9 Training Loss : 192.3954315185547
Test loss : 192.8847198486328

Your output might vary slightly due to the stochastic nature of the computations in neural networks. This concludes our overview of the Sonnet library.

For more details on Sonnet, you can explore the following link: https://deepmind.github.io/sonnet/.