# Theano

## Abstraction

Theano is a progenitor of the modern deep learning frameworks.

We find that Theano's compile fundamentals are well suited to wrapping the primitives of a computation graph. Rather than defining new primitives that are hard to remember, we therefore keep the Theano interface.

We implement it on top of the Dragon backend in native Python. All operators are compiled offline, which is closer to TensorFlow than to the original Theano.
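To make the define-then-compile model concrete, here is a minimal pure-Python sketch of the workflow: a symbolic graph is built first, compiled once into a callable, and only then executed. The names (`Symbol`, `compile_function`) are hypothetical stand-ins, not Dragon's actual API.

```python
# Toy sketch of offline compilation: build a symbolic graph, then
# compile it once into a callable before any data flows through it.
# All names here are illustrative, not Dragon's real interface.

class Symbol:
    """A node in a symbolic computation graph."""
    def __init__(self, name=None, op=None, inputs=()):
        self.name, self.op, self.inputs = name, op, inputs

    def __add__(self, other):
        return Symbol(op='add', inputs=(self, other))

    def __mul__(self, other):
        return Symbol(op='mul', inputs=(self, other))

def compile_function(inputs, output):
    """'Offline compilation': the graph is fixed before execution."""
    def evaluate(node, env):
        if node.op is None:            # a placeholder input
            return env[node.name]
        args = [evaluate(i, env) for i in node.inputs]
        return {'add': lambda a, b: a + b,
                'mul': lambda a, b: a * b}[node.op](*args)
    def fn(*values):
        env = {s.name: v for s, v in zip(inputs, values)}
        return evaluate(output, env)
    return fn

x, y = Symbol('x'), Symbol('y')
f = compile_function([x, y], x * y + x)   # compile once
result = f(2, 3)                          # run many times afterwards
```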

Our Theano implementation is also easy to debug and extend, and it can serve as a substitute for running early deep learning code.

## Architectures

### Compile

This module consists of several crucial primitives that drive the Dragon backend. These primitives let us inject programming intentions (such as Feed/Fetch, net topology, and control flow) into the virtual machine, instead of declaring and creating them explicitly.

Unlike Caffe2, we prefer these primitives to declaring a graph and stuffing it with operators: a graph can be finalized and further optimized as long as its inputs and outputs are deterministic.
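A small sketch of why deterministic inputs and outputs matter: once the fetched outputs are known, any operator they do not depend on can be pruned before execution. The dict-based graph representation below is purely illustrative, not Dragon's internal format.

```python
# Toy sketch of graph pruning: keep only the operators whose results
# are (transitively) needed by the requested outputs. Hypothetical
# representation, not Dragon's actual graph structure.

def prune(ops, outputs):
    """`ops` maps each produced tensor name to its input tensor names."""
    needed, stack = set(), list(outputs)
    while stack:
        name = stack.pop()
        if name in needed:
            continue
        needed.add(name)
        stack.extend(ops.get(name, ()))
    return {k: v for k, v in ops.items() if k in needed}

ops = {
    'hidden': ['data', 'weights'],
    'loss':   ['hidden', 'labels'],
    'debug':  ['hidden'],          # never fetched
}
kept = prune(ops, ['loss'])        # 'debug' is dropped
```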

We also remove the naive arithmetic update operations and use the fine-grained Updater instead. Our Updater speeds up large-scale distributed training, benefiting all frameworks in the VirtualBox.
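As a rough illustration of the idea, a fine-grained updater registers every parameter/gradient pair once and applies the whole update in a single pass, rather than emitting one naive arithmetic op per parameter. The `SGDUpdater` below is a hypothetical pure-Python sketch, not Dragon's Updater.

```python
# Illustrative sketch of a fine-grained updater: all slots are updated
# in one pass, which is also the natural place to average gradients
# across nodes in a distributed run. Hypothetical names throughout.

class SGDUpdater:
    def __init__(self, lr=0.1):
        self.lr = lr
        self.slots = []                 # (param, grad) pairs, registered once

    def append(self, param, grad):
        self.slots.append((param, grad))

    def apply(self, workspace):
        for param, grad in self.slots:
            workspace[param] = workspace[param] - self.lr * workspace[grad]

ws = {'w': 1.0, 'w_grad': 0.5}
upd = SGDUpdater(lr=0.1)
upd.append('w', 'w_grad')
upd.apply(ws)                          # w <- 1.0 - 0.1 * 0.5
```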

This primitive is a simple wrapper around FeedTensor.

We remove the SharedVariable mechanism, since memory storage is handled by the backend. Following Caffe2 and TensorFlow, we attribute it to the feeding of data streams.
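The feed-based alternative to SharedVariable can be sketched in a few lines: values enter and leave the graph through named tensors in a backend-owned workspace. The `feed_tensor`/`fetch_tensor` helpers and the dict workspace below are illustrative assumptions, not Dragon's real implementation.

```python
# Toy sketch of the Feed/Fetch mechanism that replaces SharedVariable:
# the backend owns the storage; Python only moves named values in and
# out. Hypothetical helper names, not Dragon's actual API.

_workspace = {}

def feed_tensor(name, value):
    """Copy `value` into the backend workspace under `name`."""
    _workspace[name] = value

def fetch_tensor(name):
    """Read a named tensor back out of the workspace."""
    return _workspace[name]

feed_tensor('data', [1, 2, 3])     # stream a batch in
batch = fetch_tensor('data')       # read results back out
```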

For detailed Documentation, see: Compile.

### Tensor

This module provides a large collection of methods based on Tensor.

The Tensor structure itself is obscure in Theano, where the symbolic Variable is the more common representation. We simplify these messy representations with a unified VirtualTensor structure, in which a Tensor generally represents any n-dimensional array.
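The unification can be sketched as a single structure whose dimensionality is just data, in contrast to Theano's separate scalar/vector/matrix symbolic types. The class below is a hypothetical stand-in for VirtualTensor, not its actual definition.

```python
# Illustrative sketch: one Tensor structure covers all ranks, instead
# of distinct symbolic types per rank. Hypothetical stand-in class.

class VirtualTensor:
    def __init__(self, name, shape=None):
        self.name = name
        self.shape = shape             # None entries mean "unknown size"

    @property
    def ndim(self):
        return len(self.shape) if self.shape is not None else None

x = VirtualTensor('x', shape=(None,))      # plays the role of a vector
W = VirtualTensor('W', shape=(256, 128))   # plays the role of a matrix
```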

All methods in this module share Dragon's Ops library. Note that some uncommon implementations supported by the original Theano have been removed.

Following the hierarchy of Theano, we categorize these methods as follows:

- NNet
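As an example of what the NNet category covers, here is a pure-Python reference of the computation performed by a softmax operator; in the framework itself such methods dispatch to Dragon's compiled Ops library, so this snippet is only a mathematical illustration.

```python
# Reference implementation of softmax, a typical NNet-category method.
# Pure Python for illustration; real calls run through Dragon's Ops.
import math

def softmax(xs):
    m = max(xs)                        # subtract the max for stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

probs = softmax([1.0, 2.0, 3.0])       # sums to 1, ordered like inputs
```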

For detailed Documentation, see: Tensor.