In this work we develop an algebraic language that represents a formal calculus for deep learning and is, at the same time, a model that enables the implementation and investigation of programs.
To this end, we develop an abstract computational model of automatically differentiable programs. In this model, programs are elements of so-called programming spaces and are viewed as maps from a finite-dimensional vector space, the virtual memory space, to itself. The virtual memory space is an algebra of programs: an algebraic data structure one can calculate with. The elements of the virtual memory space give the expansion of a program into an infinite tensor series. We define a differential operator on programming spaces and, using its powers, implement the generalized shift operator and the operator of program composition.
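To make these notions concrete, the following is a minimal one-dimensional sketch, not the model itself: a "program" is expanded into a truncated Taylor series (a toy stand-in for the tensor series), and the shift operator e^{h d/dx} applied to that series evaluates the program at a shifted point. The names `Jet`, `series`, and `shift` are illustrative choices, not notation from this work.

```python
class Jet:
    """Truncated Taylor series of a program at a point x0:
    coeffs[k] = f^(k)(x0) / k!  (a 1-D toy stand-in for the tensor series)."""
    def __init__(self, coeffs, order):
        self.c = (list(coeffs) + [0.0] * order)[: order + 1]
        self.n = order

    def __add__(self, o):
        o = o if isinstance(o, Jet) else Jet([o], self.n)
        return Jet([a + b for a, b in zip(self.c, o.c)], self.n)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Jet) else Jet([o], self.n)
        c = [0.0] * (self.n + 1)
        for i in range(self.n + 1):          # truncated convolution of series
            for j in range(self.n + 1 - i):
                c[i + j] += self.c[i] * o.c[j]
        return Jet(c, self.n)
    __rmul__ = __mul__

def series(f, x0, order):
    # seed x = x0 + h; derivative coefficients propagate through f automatically
    return f(Jet([x0, 1.0], order)).c

def shift(coeffs, h):
    # generalized shift operator e^{h d/dx}: evaluates the series at x0 + h
    return sum(c * h ** k for k, c in enumerate(coeffs))

f = lambda x: x * x * x + 2.0 * x + 1.0   # a toy "program"
s = series(f, 1.0, order=3)               # expansion of f at x0 = 1
print(s)                                  # [4.0, 5.0, 3.0, 1.0]
print(shift(s, 0.5), f(1.5))              # both 7.375
```

The shift operator applied to the series reproduces the program's value at the shifted argument, which is the 1-D shadow of the role it plays in the full model.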
The algebraic language constructed in this way is a complete model of deep learning. It enables us to express programs in such a way that their properties can be derived from their source code.